Ollama
Run LLMs locally with one command. Llama, Mistral, Gemma, and more. Zero cloud dependency.
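Ollama exposes its local models through an HTTP API on `localhost:11434` in addition to the `ollama run <model>` CLI. As a minimal sketch of how a client might call that API, here is a small Python example; the helper names (`build_generate_payload`, `generate`) are illustrative, not part of Ollama itself, and the request assumes an Ollama server is already running locally with the model pulled (e.g. via `ollama pull llama3.2`).

```python
import json
import urllib.request

# Default local endpoint for Ollama's text-generation API.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_generate_payload(model: str, prompt: str, stream: bool = False) -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint.

    stream=False asks the server to return one complete JSON object
    instead of a stream of partial responses.
    """
    return {"model": model, "prompt": prompt, "stream": stream}

def generate(model: str, prompt: str) -> str:
    """Send a completion request to a locally running Ollama server."""
    body = json.dumps(build_generate_payload(model, prompt)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    # Requires `ollama serve` running and the model pulled locally.
    print(generate("llama3.2", "Why is the sky blue?"))
```

Because everything runs against `localhost`, no prompt or completion data ever leaves the machine, which is the "zero cloud dependency" point the listing makes.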
Community · Originally by Ollama
0 installs · Listed 3/24/2026
Tags: claude-code, codex, cursor, llm, local, ollama, self-hosted, privacy
Security
Passed security verification
This artifact was automatically scanned for malicious patterns, credential access, code execution risks, and source authenticity. All checks passed.
Reviews (0)
No reviews yet.
Publisher
Skill Shope
Verified Publisher