These notes are for using AI services and tools. For working with APIs, please refer to this note.
This is based on my personal experience with the current versions. The list may change significantly in the future.
- Summarize YouTube videos: Notebook LM or ask Grok with a URL.
- Check references/sources: Perplexity then Grok/ChatGPT.
- Record and summarize live meetings: ChatGPT Pro.
- All-in-one chatbot models: Monica (affordable option).
- Work with personal files/sources: Use Project or Spaces features in AI services and upload your resources.
- Voice Mode (for English speaking practice): ChatGPT (has memory), Grok (for creative conversations).
- AI IDE: Cursor, then VSCode with GitHub Copilot. Both use Claude models.
- Image editing/generation: Gemini (Nano Banana).
- Video generation (photo to video): Grok Imagine.
- Always look for the `llms.txt` of a site. This doc is LLM-friendly for AI services. For example, AI SDK's llms.txt.
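If a site publishes one, you can pull it down and paste it (or just give its URL) to your AI chat as context. A quick sketch; the URL below is only a placeholder, check the site's docs for the real path:

```bash
# Fetch a site's LLM-friendly docs index (placeholder URL)
curl -s https://example.com/llms.txt | head -n 20
```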
- How I use LLMs by Andrej Karpathy
- Cannot sign in to Google Account
- Note: VSCode.
- Different from VSCode, all `cmd+k` shortcuts are replaced by `cmd+r`!
- If you prefer a vertical activity bar like VSCode’s (for search, extensions, and other icons) instead of the horizontal layout, navigate to Settings → Workbench → Activity Bar → Orientation and change it there.
```markdown
- **[IMPORTANT]** Do not just simulate the implementation or mock it; always implement the real code.
- Use the file system (in markdown format) to hand over reports in the `./plans/reports` directory from agent to agent with this file name format: `NNN-from-agent-name-to-agent-name-task-name-report.md`.

**Task Completeness Verification**

- Verify all tasks in the TODO list of the given plan are completed
- Check for any remaining TODO comments
- Update the given plan file with task status and next steps
```

This method works with both the Claude Code CLI and the latest Claude Code extension in the IDE.
- Create or update `~/.claude/settings.json` with the following hook:
```json
{
  "hooks": {
    "Stop": [{
      "matcher": "",
      "hooks": [{
        "type": "command",
        "command": "bash ~/.claude/scripts/notify-end.sh"
      }]
    }]
  }
}
```

- Create a script `~/.claude/scripts/notify-end.sh` with the following content:
```bash
#!/bin/bash
# macOS notification
# osascript -e 'display notification "Task completed" with title "Claude Code" sound name "Glass"'

# Or just play a custom sound file
afplay /System/Library/Sounds/Glass.aiff
```

- Then run `chmod +x ~/.claude/scripts/notify-end.sh`.
- Restart Claude Code (either the CLI or the extension) to see the result!
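To check the sound before relying on the hook, you can run the script directly (assuming macOS, since it uses `afplay`):

```bash
# Should play the Glass sound immediately
bash ~/.claude/scripts/notify-end.sh
```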
- Use Anthropic’s prompt improver / generator.
- ❤️ Alternative to (and compatible with) Claude Code: Z.ai's GLM Coding, which is much cheaper.
- Best practices for the code should live in Skills.
- The magic: Skills handle all the "how to write code" guidelines, and CLAUDE.md handles "how this specific project works." Separation of concerns for the win.
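As a sketch of what that separation can look like, here is a minimal personal Skill, assuming the Agent Skills convention of a `SKILL.md` file with YAML frontmatter under `~/.claude/skills/` (the skill name and rules below are made up):

```bash
# Create a personal skill that encodes coding guidelines (hypothetical example)
mkdir -p ~/.claude/skills/python-style
cat > ~/.claude/skills/python-style/SKILL.md <<'EOF'
---
name: python-style
description: House rules for writing Python (typing, naming, error handling)
---

- Add type hints to all public functions.
- Prefer small, pure functions; raise specific exceptions instead of returning None on error.
EOF
```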
- Dev Docs System: track the TODOs.
```markdown
### Starting Large Tasks

When exiting plan mode with an accepted plan:

1. **Create Task Directory**:
   mkdir -p ~/git/project/dev/active/[task-name]/

2. **Create Documents**:
   - `[task-name]-plan.md` - The accepted plan
   - `[task-name]-context.md` - Key files, decisions
   - `[task-name]-tasks.md` - Checklist of work

3. **Update Regularly**: Mark tasks complete immediately

### Continuing Tasks

- Check `/dev/active/` for existing tasks
- Read all three files before proceeding
- Update "Last Updated" timestamps
```

An example from this post.
- Create custom slash commands to avoid repeating tasks and prompts.
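A minimal sketch of such a command, assuming the convention of markdown prompt files under `.claude/commands/` with `$ARGUMENTS` standing in for whatever you type after the command (the command name and prompt are made up):

```bash
# Define a project-level /review command (hypothetical example)
mkdir -p .claude/commands
cat > .claude/commands/review.md <<'EOF'
Review the changes in $ARGUMENTS.
Check for bugs, missing tests, and violations of the guidelines in CLAUDE.md.
Report findings as a prioritized list.
EOF
# Inside Claude Code, run it as: /review src/auth/
```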
- To use `shift+enter`, run `/terminal-setup` in Claude. Note that iTerm supports `shift+enter`, but Terminal.app supports `alt+enter` instead.
- Always start with plan mode before asking CC to do things.
- Agent Skills Best Practices - official docs.
- Instruction prompt: Add instructions to `~/.claude/CLAUDE.md` for global guides (for any project). For a single project, add `CLAUDE.md` in the root of the project!
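A tiny sketch of what a project-level `CLAUDE.md` might contain (the rules below are made-up examples):

```bash
# Seed a project CLAUDE.md with a few project-specific rules (hypothetical contents)
cat > CLAUDE.md <<'EOF'
# Project notes for Claude
- Package manager: pnpm (do not use npm).
- Run `pnpm test` before marking a task done.
- Respond in Vietnamese.
EOF
```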
- When coding with the Claude CLI, if you want to add something to `CLAUDE.md`, just type `#` before what you want to say, e.g. `# Respond in Vietnamese to every question`.
- For reading images/photos, you should use Gemini with CC.
- PM2: a production process manager with a built-in load balancer (basic commands are sketched below the examples).
Before PM2:

```
Me: "The email service is throwing errors"
Me: [Manually finds and copies logs]
Me: [Pastes into chat]
Claude: "Let me analyze this..."
```
After PM2:

```
Me: "The email service is throwing errors"
Claude: [Runs] pm2 logs email --lines 200
Claude: [Reads the logs] "I see the issue - database connection timeout..."
Claude: [Runs] pm2 restart email
Claude: "Restarted the service, monitoring for errors..."
```
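For reference, the PM2 commands behind that workflow are simple; a quick sketch (the `email` process name and `server.js` entry point are just examples):

```bash
npm install -g pm2                  # install PM2 globally
pm2 start server.js --name email    # run an app as a managed process
pm2 logs email --lines 200          # tail recent logs for that process
pm2 restart email                   # restart it after a fix
pm2 list                            # overview of all managed processes
```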
- Set a keyboard shortcut for "Claude Code: Open in Side Bar" (I use `ctrl+ESC`) to quickly open Claude Code in the Side Bar.
- When Claude Code is open in the Sidebar, an icon will appear. Drag and drop this icon to the other sidebar (for example, in the same area as Cursor Chat or GitHub Chat).
- 🎉 Once configured, Claude Code will automatically open in the chat panel whenever you access it from the sidebar.
- You can drag and drop the CC extension menu back into the sidebar to display the CC icon and open it on demand.
- sniffly — Claude Code dashboard with usage stats, error analysis, and shareable features.
- spec-kit — Toolkit to help you get started with Spec-Driven Development.
Best practice: always use your Claude Code login in the VSCode extension (its usage resets every 5 hours), and always use the Claude Code CLI with GLM in a Terminal tab opened next to the extension tab. That way, whenever the Claude usage runs out, you can easily switch to GLM in the Terminal tab.
CCS (Claude Code Switch) — instantly switch between the Claude Subscription profile and the GLM Coding Plan profile with one command.
- Check the official guide, but it's not enough on its own.
- Modify your `~/.zshrc` or `~/.bashrc`:
```bash
export GCL_API_KEY="xxx"

# GLM Setup
vb_glm() {
  echo "🚀 Using GLM for claude code"
  export ANTHROPIC_BASE_URL=https://api.z.ai/api/anthropic
  export ANTHROPIC_AUTH_TOKEN=$GCL_API_KEY
  export ANTHROPIC_DEFAULT_OPUS_MODEL="GLM-4.6"
  export ANTHROPIC_DEFAULT_SONNET_MODEL="GLM-4.6"
  export ANTHROPIC_DEFAULT_HAIKU_MODEL="GLM-4.5-Air"
  echo "✅ Done"
}

# Claude Setup (unset the variables)
vb_claude() {
  echo "🚀 Using default claude for claude code"
  unset ANTHROPIC_BASE_URL
  unset ANTHROPIC_AUTH_TOKEN
  unset ANTHROPIC_DEFAULT_OPUS_MODEL
  unset ANTHROPIC_DEFAULT_SONNET_MODEL
  unset ANTHROPIC_DEFAULT_HAIKU_MODEL
  echo "✅ Done"
}
```

Then `source ~/.zshrc` or `source ~/.bashrc` in the current terminal to make them work!

- To switch between services, run `vb_glm` to use GLM or `vb_claude` to use the default Claude Code.
- Verify the configuration by typing `claude` in the terminal, then running `/status`.
- You can also simply ask "Who r u? Which model r u?"
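Putting the pieces together, a typical switch-and-verify session looks roughly like this (the function names come from the snippet above):

```bash
vb_glm    # export the GLM env vars defined in ~/.zshrc
claude    # start Claude Code in this shell
# inside Claude Code, run /status (or ask which model it is) to confirm GLM-4.6 is active
```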
- To make GLM work with the latest VSCode extension: open the terminal, switch to GLM with `vb_glm`, then open the current folder using `cursor .` or `code .`.
- Test the Claude Code extension by asking: "Who r u? Which model r u?" (you may need to ask several times until you see an answer containing "glm-4.6").
- ⭐ Another way: open IDE Settings → search for "Claude Code" → click to open the `settings.json` file and add the following:
1"claude-code.environmentVariables": [
2 {
3 "name": "ANTHROPIC_AUTH_TOKEN",
4 "value": "xxx"
5 },
6 {
7 "name": "ANTHROPIC_BASE_URL",
8 "value": "https://api.z.ai/api/anthropic"
9 },
10 {
11 "name": "ANTHROPIC_DEFAULT_OPUS_MODEL",
12 "value": "glm-4.6"
13 },
14 {
15 "name": "ANTHROPIC_DEFAULT_SONNET_MODEL",
16 "value": "glm-4.6"
17 },
18 {
19 "name": "ANTHROPIC_DEFAULT_HAIKU_MODEL",
20 "value": "glm-4.5-air"
21 }
22],
23"claude-code.selectedModel": "opus"Then reload the current IDE windows.
⚠️ Note that "default" for "selectedModel" will not work! You can also type `/model` and then select "opus".

- Cursor Directory/MCPs
- Install MCP for Claude Desktop: Settings → Developer → Edit Config
- Install MCP for Cursor:
    - Global: Cursor → Settings… → Cursor Settings → Tools & Integrations
    - Project: `.cursor/mcp.json` (see the sketch below)
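For the project-level file, a minimal sketch of `.cursor/mcp.json`, assuming the usual `mcpServers` MCP config shape; the filesystem server and path are only examples:

```bash
# Create a project-scoped MCP config (example server and path)
mkdir -p .cursor
cat > .cursor/mcp.json <<'EOF'
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "./"]
    }
  }
}
EOF
```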
- In your IDE (VSCode or Cursor), install the Continue extension.
- In LM Studio, navigate to the Developer tab, select your downloaded model → Settings → enable "Serve on Local Network" → enable the server.
- In your IDE, select the "Continue" tab on the left sidebar → choose "Or, configure your own model" → "Click here to view more providers" (or select the Ollama icon tab if you're using Ollama) → in the provider list, select LM Studio → set Model to "Autodetect" → Connect → a config file will open at `~/.continue/config.yaml`; keep the default settings and save.
- That's it!
- As another option, you can use Granite.code (from IBM).
I'm using Claude Code; if you use another coding CLI service, modify the code accordingly. Insert the code below into `.bashrc` or `.zshrc` and then `source ~/.zshrc`:

```bash
claude_execute() {
  emulate -L zsh
  setopt NO_GLOB
  local query="$*"
  local prompt="You are a command line expert. The user wants to run a command but they don't know how. Here is what they asked: ${query}. Return ONLY the exact shell command needed. Do not prepend with an explanation, no markdown, no code blocks - just return the raw command you think will solve their query."
  local cmd
  # use Claude Code
  cmd=$(claude --dangerously-skip-permissions --disallowedTools "Bash(*)" --model default -p "$prompt" --output-format text | tr -d '\000-\037' | sed 's/^[[:space:]]*//;s/[[:space:]]*$//')
  if [[ -z "$cmd" ]]; then
    echo "claude_execute: No command found"
    return 1
  fi
  echo -e "$ \033[0;36m$cmd\033[0m"
  eval "$cmd"
}

alias ask="noglob claude_execute"
```

```bash
# Usage
ask "List all conda env in this computer"
```