Setting Up Free AI Code Review with GLM and Ollama

A step-by-step guide to running Open Code Review with zero cost using GLM (free cloud) and Ollama (free local). Full L3 deep scan included.

Raye Deng · 2026-03-10 · 4 min read
tutorial · free · glm · ollama

Zero-Cost AI Code Review

One of the most common questions: "Do I need to pay for an API key?" No. Open Code Review supports two completely free providers.

Option 1: GLM (Cloud, Free)

GLM, from Zhipu AI (智谱), offers a free tier that's more than enough for code review.

# 1. Get your free API key at https://open.bigmodel.cn
# 2. Run scan
npx @opencodereview/cli@latest scan ./src \
  --level l3 \
  --provider glm \
  --api-key your-free-glm-key
# 3. Auto-heal
npx @opencodereview/cli@latest heal ./src \
  --provider glm \
  --api-key your-free-glm-key
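Passing the key with --api-key works, but it also lands in your shell history. A safer pattern is to read it from a file you keep out of version control. This is a sketch: the ~/.glm_key path and GLM_KEY_FILE variable are illustrative choices, not locations the CLI looks for on its own.

```shell
# Read the key from a local file kept out of git (~/.glm_key and
# GLM_KEY_FILE are illustrative, not names the CLI requires).
GLM_API_KEY="$(cat "${GLM_KEY_FILE:-$HOME/.glm_key}" 2>/dev/null || true)"

if [ -z "$GLM_API_KEY" ]; then
  echo "No GLM key found; create ${GLM_KEY_FILE:-$HOME/.glm_key} first" >&2
else
  npx @opencodereview/cli@latest scan ./src \
    --level l3 \
    --provider glm \
    --api-key "$GLM_API_KEY"
fi
```

Add the key file to .gitignore so it never reaches the repository.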

Option 2: Ollama (Local, Free)

Run everything locally — no API key, no internet, no data leaving your machine.

# 1. Install Ollama
curl -fsSL https://ollama.com/install.sh | sh
# 2. Pull a model
ollama pull llama3
# 3. Run the Open Code Review scan (no API key needed)
npx @opencodereview/cli@latest scan ./src \
  --level l3 \
  --provider ollama
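Before scanning, it helps to confirm the Ollama server is actually up. The sketch below probes Ollama's local HTTP API (11434 is its default port; /api/tags lists the models you have pulled) and only then runs the scan:

```shell
# Probe the local Ollama server (default port 11434; /api/tags lists
# the models you have pulled).
ollama_ready() {
  curl -fsS "${OLLAMA_HOST:-http://localhost:11434}/api/tags" >/dev/null 2>&1
}

if ollama_ready; then
  npx @opencodereview/cli@latest scan ./src --level l3 --provider ollama
else
  echo "Ollama is not running; start it with: ollama serve" >&2
fi
```

If the check fails, `ollama serve` (or launching the desktop app) brings the server up.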

Environment Variables

For CI/CD, use environment variables instead of CLI flags:

export OCR_PROVIDER=glm
export OCR_API_KEY=your-free-glm-key
export OCR_MODEL=glm-4

npx @opencodereview/cli@latest scan ./src --level l3
npx @opencodereview/cli@latest heal ./src
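A quick pre-flight check in the CI script catches a missing variable before the CLI runs. This is a sketch using the three OCR_* names from the exports above; check_ocr_env is a helper defined here, not part of the CLI:

```shell
# Fail fast in CI if any of the OCR_* variables from above is missing.
check_ocr_env() {
  for var in OCR_PROVIDER OCR_API_KEY OCR_MODEL; do
    eval "val=\${$var:-}"
    if [ -z "$val" ]; then
      echo "missing environment variable: $var" >&2
      return 1
    fi
  done
}

check_ocr_env || echo "set the OCR_* variables before scanning" >&2
```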

Config File

For project-level configuration:

{
  "level": "l3",
  "provider": "glm",
  "apiKey": "your-free-glm-key",
  "paths": ["src/**/*.ts"],
  "threshold": 70
}
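Committing an apiKey in plain text is risky. One workaround, sketched below with illustrative filenames (your project's actual config filename may differ), is to commit a template with a placeholder and substitute the real key from the environment at CI time:

```shell
# Commit a template with a placeholder instead of the real key
# (both filenames here are illustrative, not names the CLI requires).
cat > ocr.config.template.json <<'EOF'
{
  "level": "l3",
  "provider": "glm",
  "apiKey": "__OCR_API_KEY__",
  "paths": ["src/**/*.ts"],
  "threshold": 70
}
EOF

# At CI time, fill in the key from the environment.
sed "s/__OCR_API_KEY__/${OCR_API_KEY:-your-free-glm-key}/" \
  ocr.config.template.json > ocr.config.json
```

The generated file should be listed in .gitignore so only the placeholder version is ever committed.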

Which Should You Choose?

                    GLM               Ollama
Setup time          2 min             5 min
Internet required   Yes               No
Data privacy        Cloud-processed   100% local
Model quality       High (GLM-4)      Varies by model
Best for            CI/CD pipelines   Privacy-sensitive projects

Next Steps

  • Pick a provider (GLM for easy start, Ollama for privacy)
  • Run your first scan
  • Add to your CI/CD pipeline
  • Enable auto-heal

Total cost: $0. Total setup time: 5 minutes.

Ready to detect AI code hallucinations? Get started for free in 30 seconds.