mnemo

Logos-Flux/mnemo

Mnemo is an extended-memory solution for AI assistants. It leverages Gemini's context caching to provide access to large bodies of content, such as codebases and documentation.
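
For background, the sketch below shows the Gemini explicit context caching pattern that mnemo builds on: cache a large body of text once, then answer queries against it without resending the content. It assumes the google-genai Python SDK with an API key in the environment; the model name, TTL, and placeholder content are illustrative, not mnemo's actual defaults.

```python
# Minimal sketch of Gemini explicit context caching (the mechanism mnemo builds on).
# Assumes the google-genai Python SDK and GEMINI_API_KEY in the environment;
# model name and TTL are illustrative choices, not mnemo's defaults.
from google import genai
from google.genai import types

client = genai.Client()

# Cache a large document once; Gemini stores the tokenized content server-side.
# In practice the content must exceed the model's minimum cacheable token count.
cache = client.caches.create(
    model="gemini-2.0-flash-001",
    config=types.CreateCachedContentConfig(
        display_name="project-docs",
        contents=["<large documentation text goes here>"],
        ttl="3600s",  # cache expires after one hour
    ),
)

# Later queries reference the cache instead of resending the full content.
answer = client.models.generate_content(
    model="gemini-2.0-flash-001",
    contents="How is authentication configured?",
    config=types.GenerateContentConfig(cached_content=cache.name),
)
print(answer.text)
```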

Tools

Functions exposed to the LLM so it can take actions. A client-side usage sketch follows the list.

context_load: Load GitHub repos, URLs, PDFs, or local directories into a Gemini cache.

context_query: Query a cached context with natural language.

context_list: List all active caches with token counts and expiry times.

context_evict: Remove a cache.

context_stats: Get usage statistics with cost tracking.

context_refresh: Reload a cache with fresh content.
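
As a rough illustration of how an MCP client might drive these tools, the sketch below uses the MCP Python SDK to load a repository and then query it. The launch command and the tool argument names (source, context_id, question) are assumptions made for the example; mnemo's actual tool schemas may differ.

```python
# Hypothetical MCP client session against a locally launched mnemo server.
# The launch command and the tool argument names ("source", "context_id",
# "question") are assumptions for illustration; check mnemo's tool schemas.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    server = StdioServerParameters(command="mnemo")  # assumed launch command
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Load a repository into a Gemini cache.
            loaded = await session.call_tool(
                "context_load",
                arguments={"source": "https://github.com/Logos-Flux/mnemo"},
            )
            print(loaded.content)

            # Ask a natural-language question against the cached context.
            answer = await session.call_tool(
                "context_query",
                arguments={
                    "context_id": "mnemo",
                    "question": "What tools does this server expose?",
                },
            )
            print(answer.content)

asyncio.run(main())
```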

Prompts

Interactive templates invoked by user choice

No prompts

Resources

Contextual data attached to and managed by the client

No resources