llm_forensics

vsn411/llm_forensics

3.2


The LLM Trace Server is a lightweight, fast Model Context Protocol (MCP) server designed for storing, searching, retrieving, and exporting LLM traces, making it ideal for audit logging and forensic analysis of AI interactions.

Tools

Functions exposed to the LLM so it can take actions

store_trace

Stores a trace with a given prompt and response.

search_traces

Searches stored traces based on a query.

export_traces_to_csv

Exports all stored traces to a CSV file.
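The three tools above can be sketched as plain functions over an in-memory trace store. This is a hypothetical illustration of the operations the server exposes, not the actual implementation: the real server's storage backend, tool signatures, and search semantics may differ.

```python
import csv
import io

# Hypothetical in-memory store; the real server may persist traces elsewhere.
_traces: list[dict] = []

def store_trace(prompt: str, response: str) -> int:
    """Store a prompt/response pair and return its trace id."""
    trace = {"id": len(_traces) + 1, "prompt": prompt, "response": response}
    _traces.append(trace)
    return trace["id"]

def search_traces(query: str) -> list[dict]:
    """Case-insensitive substring search over prompts and responses
    (an assumed search strategy for this sketch)."""
    q = query.lower()
    return [t for t in _traces
            if q in t["prompt"].lower() or q in t["response"].lower()]

def export_traces_to_csv() -> str:
    """Render all stored traces as CSV text."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["id", "prompt", "response"])
    writer.writeheader()
    writer.writerows(_traces)
    return buf.getvalue()
```

In an MCP server these functions would be registered as tools so a connected LLM can invoke them; the sketch only shows the store/search/export behavior itself.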

Prompts

Interactive templates invoked by user choice

No prompts

Resources

Contextual data attached and managed by the client

No resources