For many users, Ollama is the fastest honest answer to “how do I run Gemma 4 locally without building an entire stack first?”
ollama --version
ollama pull gemma4
ollama list
ollama run gemma4 "roses are red"

The official Google AI for Developers integration page also lists the available tags:
gemma4:e2b
gemma4:e4b
gemma4:26b
gemma4:31b

# simple text generation
curl http://localhost:11434/api/generate -d '{
"model": "gemma4",
"prompt": "roses are red"
}'
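By default, /api/generate streams its reply as newline-delimited JSON, one small object per token chunk, with the final object carrying "done": true (pass "stream": false to get a single object instead). Below is a minimal sketch of reassembling such a stream; the sample chunks are illustrative, not a captured server response:

```python
import json

def assemble_stream(ndjson_lines):
    """Concatenate the "response" fragments from a streamed NDJSON reply."""
    parts = []
    for line in ndjson_lines:
        chunk = json.loads(line)
        parts.append(chunk.get("response", ""))
        if chunk.get("done"):
            break
    return "".join(parts)

# Illustrative chunks in the shape /api/generate streams back
sample = [
    '{"model":"gemma4","response":"violets","done":false}',
    '{"model":"gemma4","response":" are blue","done":true}',
]
print(assemble_stream(sample))  # → violets are blue
```

In a real client you would iterate over the HTTP response body line by line instead of a list, but the parsing logic is the same.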
# multimodal caption example; "images" entries are base64-encoded image data
curl http://localhost:11434/api/generate -d '{
"model": "gemma4",
"prompt": "caption this image",
"images": ["<base64-encoded image data>"]
}'
}'

Remember to ollama pull a tag (for example, gemma4:e4b) before running it.
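The HTTP API does not read files off disk for you: each entry in "images" must be the base64-encoded bytes of the image itself. A short sketch of building that request body in Python; the byte string stands in for a real image file's contents:

```python
import base64
import json

def generate_payload(model, prompt, image_bytes):
    """Build an /api/generate request body with one base64-encoded image."""
    return json.dumps({
        "model": model,
        "prompt": prompt,
        "images": [base64.b64encode(image_bytes).decode("ascii")],
    })

# Illustrative bytes standing in for open("image.png", "rb").read()
payload = generate_payload("gemma4", "caption this image", b"\x89PNG fake bytes")
print(payload)
```

The resulting string can be POSTed to http://localhost:11434/api/generate exactly like the curl examples above.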