These notes are for using AI services and tools. For working with APIs, please refer to this note.
This is based on my personal experience with the current versions. The list may change significantly in the future.
- Summarize YouTube videos: Notebook LM or ask Grok with a URL.
- Check references/sources: Perplexity then Grok/ChatGPT.
- Record and summarize live meetings: ChatGPT Pro.
- All-in-one chatbot models: Monica (affordable option).
- Work with personal files/sources: Use Project or Spaces features in AI services and upload your resources.
- Voice Mode (for English speaking practice): ChatGPT (has memory), Grok (for creative conversations).
- AI IDE: Cursor, then VSCode with GitHub Copilot. Both use Claude models.
- Image editing/generation: Gemini Banana.
- Video generation: Grok Imagine.
- How I use LLMs by Andrej Karpathy
- Note: VSCode.
- Different from VSCode, all `cmd+k` shortcuts are replaced by `cmd+r`!
- If you prefer a vertical activity bar like VSCode's (for search, extensions, and other icons) instead of the horizontal layout, navigate to Settings → Workbench → Activity Bar → Orientation and change it there.
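Or edit `settings.json` directly; the equivalent entry should be roughly this (the exact setting ID is an assumption, confirm it via the Settings UI search):

```jsonc
{
  // Assumed setting ID for the activity bar orientation toggle in Cursor.
  "workbench.activityBar.orientation": "vertical"
}
```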
This method works with both the Claude Code CLI and the latest Claude Code extension in the IDE.
- Create or update `~/.claude/settings.json` with the following hook:
- Create a script `~/.claude/scripts/notify-end.sh` with the following content:
- Then run `chmod +x ~/.claude/scripts/notify-end.sh`.
- Restart Claude Code (CLI or extension) to see the result!
- Use Anthropic's prompt improver / generator.
- ❤️ Alternative to (and compatible with) Claude Code: Z.ai's GLM Coding, which is much cheaper.
- Instruction prompt: add instructions to `~/.claude/CLAUDE.md` for global guides (for any project). For a single project, add `CLAUDE.md` in the root of the project!
- When coding in the Claude CLI, if you want to add something to `CLAUDE.md`, just add `#` before what you want to say, e.g. `# Respond in Vietnamese to every question`.
- For reading images/photos, you should use Gemini with CC (Claude Code).
- Set a keyboard shortcut for "Claude Code: Open in Side Bar" (I use `ctrl+ESC`) to quickly open Claude Code in the Side Bar.
- When Claude Code is open in the Sidebar, an icon will appear. Drag and drop this icon into the other sidebar (for example, into the same area as Cursor Chat or GitHub Chat).
- Once configured, Claude Code will automatically open in the chat panel whenever you access it from the sidebar.
- sniffly – a Claude Code dashboard with usage stats, error analysis, and a sharing feature.
- spec-kit – a toolkit to help you get started with Spec-Driven Development.
Best practice: always use your Claude Code login in the VSCode extension (the usage limit resets every 5 hours), and always use the Claude Code CLI with GLM in a new Terminal tab (next to the extension tab). That way, whenever the Claude usage runs out, you can easily switch to GLM in the Terminal tab.
- Check the official guide, but it's not enough.
- Modify your `~/.zshrc` or `~/.bashrc` as sketched below.
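A minimal sketch of the two helper functions, assuming Z.ai's Anthropic-compatible endpoint (double-check the base URL in Z.ai's docs and use your own API key):

```bash
# Switch Claude Code to GLM via Z.ai's Anthropic-compatible API.
vb_glm() {
  export ANTHROPIC_BASE_URL="https://api.z.ai/api/anthropic"   # assumed endpoint
  export ANTHROPIC_AUTH_TOKEN="<your Z.ai API key>"
  echo "Claude Code will now use GLM."
}

# Switch back to the default Anthropic backend.
vb_claude() {
  unset ANTHROPIC_BASE_URL
  unset ANTHROPIC_AUTH_TOKEN
  echo "Claude Code will now use the default Anthropic models."
}
```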
- Then run `source ~/.zshrc` or `source ~/.bashrc` in the current terminal to make them work!
- To switch between services, run `vb_glm` to use GLM or `vb_claude` to use the default Claude Code.
- Verify the configuration by typing `claude` in the terminal, then running `/status`.
- You can also simply ask "Who r u? Which model r u?"
- To make GLM work with the latest VSCode extension: open the terminal, switch to GLM with `vb_glm`, then open the current folder using `cursor .` or `code .`.
- Test the Claude Code extension by asking: "Who r u? Which model r u?" (you may need to ask several times until you see an answer containing "glm-4.6").
- Another way: open IDE Settings → search for "Claude Code" → click to open the `settings.json` file and add the following:
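Roughly like this (only `"selectedModel"` is certain here; the `claude-code.` prefix and the environment-variables key are assumptions, so confirm the exact keys your extension version suggests):

```jsonc
{
  // Pick a concrete model; as noted below, "default" will not work.
  "claude-code.selectedModel": "opus",
  // Assumption: the extension forwards these env vars to the CLI.
  // Reuse the same values as the vb_glm function above.
  "claude-code.environmentVariables": [
    { "name": "ANTHROPIC_BASE_URL", "value": "https://api.z.ai/api/anthropic" },
    { "name": "ANTHROPIC_AUTH_TOKEN", "value": "<your Z.ai API key>" }
  ]
}
```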
Then reload the current IDE window.
⚠️ Note that "default" for "selectedModel" will not work! You can also type `/model` and then select "opus".
- Cursor Directory/MCPs
- Install MCP for Claude Desktop: Settings → Developer → Edit Config
- Install MCP for Cursor:
- Global: Cursor → Settings… → Cursor Settings → Tools & Integrations
- Project: `.cursor/mcp.json`
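For reference, a minimal project-level `.cursor/mcp.json` (the filesystem server is just an illustrative choice):

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/your/project"]
    }
  }
}
```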
- In your IDE (VSCode or Cursor), install the Continue extension.
- In LM Studio, navigate to the Developer tab, select your downloaded model → Settings → enable "Serve on Local Network" → enable the server.
- In your IDE, select the "Continue" tab on the left sidebar → choose "Or, configure your own model" → "Click here to view more providers" (or select the Ollama icon tab if you're using Ollama) → in the provider list, select LM Studio → set Model to "Autodetect" → Connect → a config file will open at `~/.continue/config.yaml`; keep the default settings and save.
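The generated file looks roughly like this (a sketch assuming Continue's LM Studio provider; keep whatever the extension writes for you):

```yaml
# Sketch of ~/.continue/config.yaml — keep the generated defaults.
name: Local Assistant
version: 1.0.0
schema: v1
models:
  - name: LM Studio
    provider: lmstudio
    model: AUTODETECT
    # apiBase: http://localhost:1234/v1   # adjust if LM Studio serves on another host/port
```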
- That's it!
- As another option, you can use Granite.code (from IBM)