RL-MCP-Server

RL-Slave/RL-MCP-Server


The Ollama MCP Server is a comprehensive Model Context Protocol (MCP) server that makes Ollama models available to MCP clients, offering a wide range of tools for model management, text generation, chat, and embeddings.
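
Beyond wiring the server into an MCP-enabled client, you can exercise it programmatically over stdio with the official MCP Python SDK. The sketch below is illustrative only: the launch command and entry-point script name are assumptions, since this listing does not state how the server process is started.

```python
# Minimal sketch: connect to the server over stdio and list its tools,
# using the official `mcp` Python SDK. The command/args below are
# placeholders; substitute however this repository actually starts its server.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server_params = StdioServerParameters(
    command="python",      # assumption: interpreter used to launch the server
    args=["server.py"],    # assumption: entry-point script name
)

async def main() -> None:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Discover the ollama_* tools listed in the Tools section below.
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

asyncio.run(main())
```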

Tools

Functions exposed to the LLM so it can take actions (a call sketch follows the list below)

ollama_list_models: Lists all models
ollama_show_model: Shows model details
ollama_pull_model: Downloads a model
ollama_delete_model: Deletes a model
ollama_copy_model: Copies a model
ollama_create_model: Creates a model from a Modelfile
ollama_generate: Generates text
ollama_generate_stream: Streaming text generation
ollama_chat: Chat completion
ollama_chat_stream: Streaming chat completion
ollama_embeddings: Generates embeddings
ollama_create_embeddings: Batch embeddings
ollama_check_health: Health check for the Ollama server
ollama_get_version: Reports the Ollama version
ollama_list_processes: Lists running model processes
ollama_get_models_info: Returns info for all models
ollama_update_model: Updates a model
ollama_get_modelfile: Retrieves a model's Modelfile
ollama_validate_model: Validates a model
ollama_get_model_size: Reports a model's size
ollama_search_models: Searches models
ollama_check_blobs: Checks blob status
ollama_save_context: Saves a conversation context
ollama_load_context: Loads a saved context
ollama_clear_context: Clears a saved context
ollama_batch_generate: Batch text generation
ollama_compare_models: Compares models
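
As a usage sketch for the tools above, the snippet below calls ollama_generate through an active ClientSession (see the connection sketch earlier). The argument names "model" and "prompt" are assumptions about the tool's input schema; check the schema returned by list_tools() for the actual parameter names.

```python
# Minimal sketch: invoke one of the listed tools via the MCP Python SDK.
from mcp import ClientSession

async def generate_once(session: ClientSession) -> str:
    result = await session.call_tool(
        "ollama_generate",
        arguments={
            "model": "llama3",                          # assumption: model parameter name
            "prompt": "Explain MCP in one sentence.",   # assumption: prompt parameter name
        },
    )
    # Tool results come back as a list of content blocks; text blocks carry .text.
    return "".join(
        block.text for block in result.content if getattr(block, "text", None)
    )
```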

Prompts

Interactive templates invoked by user choice

No prompts

Resources

Contextual data attached and managed by the client

No resources