The Cursor feature I rely on most is one most tutorials do not cover properly: Custom Project Rules
There is a lot written about Cursor's AI agents and the Composer feature. I want to talk about Custom Project Rules because I think it is underappreciated, and because it is the feature that most improved the quality of AI suggestions on my actual codebases.
Custom Project Rules let you set binding instructions that the AI follows for every prompt in that project. Not suggestions, not preferences you have to repeat each time, but actual rules that apply automatically. I have rules set for each project: the TypeScript version, the component structure pattern, which styling system to use, how to handle state, and what not to do. When the AI generates code, the output fits the existing conventions rather than being technically correct but stylistically inconsistent with the rest of the codebase.
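For a concrete picture, here is a minimal sketch of what one of these rule files can look like. Cursor reads project rules from `.mdc` files under `.cursor/rules/` (a single `.cursorrules` file at the repo root is the older equivalent); the frontmatter fields shown match Cursor's rule format, but every convention listed in the body is a hypothetical example, not a recommendation:

```markdown
---
description: Frontend component conventions
globs: ["src/components/**/*.tsx"]
alwaysApply: false
---

- Use TypeScript with strict mode enabled; never use `any`.
- Components are function components with named exports, one component per file.
- Style with Tailwind utility classes; do not introduce CSS modules or styled-components.
- Keep local UI state in useState; shared state goes through the existing store.
- Do not add new dependencies without flagging it in the response.
```

The `globs` field scopes the rule so it only attaches to prompts touching matching files, which keeps frontend conventions from leaking into, say, backend code in the same repo.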
On a large team codebase where consistency matters, this is the difference between AI output you can merge and AI output you have to refactor before merging. The time saved on review and cleanup is significant.
Documentation indexing via @Docs is the other underrated feature. You paste a link to the documentation for a library or API you are using, and Cursor indexes it. From that point on, it generates code using the current syntax rather than hallucinating outdated patterns or inventing function names that do not exist. This matters most with fast-moving libraries, where the model's training data is already stale.
The Composer for multi-file edits and the autonomous agents are powerful and worth learning. But the Project Rules and @Docs features are the ones that made the output consistently good rather than occasionally good.