
Ollama Local LLM

Official · Low Risk

MCP server for running and interacting with local LLMs through Ollama
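Under the hood, this server talks to a locally running Ollama instance over its HTTP API, which listens on port 11434 by default. As a minimal sketch of that interaction (not the MCP server's own code), the snippet below sends a non-streaming prompt to Ollama's documented `/api/generate` endpoint using only the Python standard library; the model name `llama3` is an assumption and should be replaced with whatever model you have pulled locally.

```python
import json
import urllib.request

# Default endpoint for a local Ollama server (assumed standard install).
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_request(model: str, prompt: str) -> bytes:
    """Serialize a non-streaming generate request for Ollama's HTTP API."""
    return json.dumps(
        {"model": model, "prompt": prompt, "stream": False}
    ).encode("utf-8")


def generate(model: str, prompt: str) -> str:
    """Send a prompt to the local Ollama server and return its reply text."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_request(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # Ollama returns the completed text under the "response" key.
        return json.loads(resp.read())["response"]


if __name__ == "__main__":
    # Requires Ollama running locally with the model already pulled.
    print(generate("llama3", "Summarize what an MCP server does in one sentence."))
```

Because everything stays on localhost, no prompt data leaves the machine, which is what keeps the risk profile of this skill low.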

Category: AI & LLM
Author: ollama
Last Updated: Unknown
Source: VoltAgent

Security Analysis

Risk Score: 1/10 (Low Risk; scale runs from 1 = safe to 10 = risky)

Required Permissions

Network: can make network requests
Security Notes

Runs locally without sending data to external servers. May consume significant CPU/GPU resources.

Minimal system access, safe to use

Source Code

Available on GitHub
