# OpenAI Apps Template (Todo App)
Build, preview, and ship a ChatGPT App that renders a responsive React widget and talks to a Model Context Protocol (MCP) server—all from one repository.
- **Single-file widget delivery** – every widget is bundled into `dist/<name>.html` with inline JS + CSS so ChatGPT can embed it via `ui://widget/...`.
- **Batteries-included MCP server** – exposes SSE + POST endpoints, advertises tools/resources, and returns structured todo data plus widget metadata.
- **Local-first DX** – Vite dev server with HMR, Tailwind styles, React 19, TypeScript, and helper hooks that mirror the ChatGPT `window.openai` runtime.
Use this template as a starting point for any small app that needs both an interactive UI and tools the ChatGPT agent can call.
## At a Glance

| Piece | Tech | Purpose |
|---|---|---|
| `src/widgets/todo/index.tsx` | React + hooks | Interactive todo widget that optimistically updates local state while syncing with the agent |
| `server.ts` | Node + `@modelcontextprotocol/sdk` | Minimal MCP server exposing the `list_todos`, `add_todo_item`, `toggle_todo_item`, and `delete_todo_item` tools plus the widget resource |
| `build.ts` | Vite + esbuild | Discovers widget entries, bundles them, and emits inline HTML files ready for `ui://widget/...` |
| `dev.ts` | Vite loader | Lets you hot-reload any widget via `http://localhost:5173?entry=<name>` |
## Prerequisites

- Node.js 18+
- npm (or pnpm/yarn if you adapt the scripts)
- (Optional) ngrok or a similar tunnel to share the MCP server with ChatGPT

Install dependencies:

```bash
npm install
```
## Quick Start

1. **Build the widget bundle**

   ```bash
   npm run build   # → dist/todo.html (inline CSS + JS)
   ```

2. **Run the MCP server**

   ```bash
   npm start
   # SSE:  GET  http://localhost:8000/mcp
   # POST: POST http://localhost:8000/mcp/messages?sessionId=...
   ```

3. **Preview the widget locally (optional)**

   ```bash
   npm run dev   # open http://localhost:5173?entry=todo
   ```

4. **Expose to ChatGPT (optional)**

   ```bash
   ngrok http 8000
   # use https://<subdomain>.ngrok-free.app/mcp as the connector URL
   ```
## Development Workflow

### Widget iteration

- Widgets live under `src/widgets/<name>.tsx`. Each file should call `createRoot` and mount itself into `__WIDGET_ROOT_ID__`.
- `npm run dev` spins up Vite with HMR. Navigate to `http://localhost:5173?entry=<name>` to load that widget.
- Hooks under `src/hooks` provide typed access to `window.openai` globals:
  - `useWidgetProps` pulls structured tool output (`toolOutput.todoList` in this case).
  - `useWidgetState` keeps local widget state in sync with the ChatGPT host via `window.openai.setWidgetState`.
  - `useCallTool`, `useRequestDisplayMode`, etc. wrap MCP APIs.
- The todo widget reads `theme`, `safeArea.insets`, `displayMode`, and `maxHeight` so it can match ChatGPT's light/dark palette, respect safe areas, and resize itself for PiP, inline, or fullscreen layouts without extra work on your end.
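The prop-reading side of this can be sketched as plain data access. Below is a hypothetical helper showing how `useWidgetProps`-style code might prefer live `window.openai.toolOutput` and fall back to embedded preview data; the `OpenAiHost` type and everything beyond `toolOutput.todoList` are illustrative assumptions, not the template's actual types:

```typescript
// Hypothetical sketch of the fallback logic a useWidgetProps-style hook
// could use. The OpenAiHost shape and field names beyond toolOutput.todoList
// are assumptions for illustration.
type TodoItem = { id: string; title: string; completed: boolean };
type TodoList = { title: string; todos: TodoItem[] };
type OpenAiHost = { toolOutput?: { todoList?: TodoList } };

// Prefer live structured output from the ChatGPT host; fall back to the
// inline preview data embedded for standalone viewing.
function readTodoList(host: OpenAiHost | undefined, fallback: TodoList): TodoList {
  return host?.toolOutput?.todoList ?? fallback;
}

const preview: TodoList = { title: "Demo", todos: [] };
const live: OpenAiHost = {
  toolOutput: {
    todoList: { title: "My Todos", todos: [{ id: "1", title: "Review PR", completed: false }] },
  },
};

console.log(readTodoList(live, preview).title);      // → "My Todos"
console.log(readTodoList(undefined, preview).title); // → "Demo"
```

The same null-safe pattern extends to `theme`, `displayMode`, and the other host globals listed above.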
### Building for ChatGPT

- `npm run build` auto-discovers every file in `src/widgets` (or nested `index.tsx` files inside directories) and emits `dist/<name>.html`.
- Each HTML file inlines:
  - Shared Tailwind styles from `src/styles/main.css`
  - A fallback `window.todoData` block for standalone preview
  - The minified widget bundle
- The build script keeps the output self-contained so ChatGPT can fetch `ui://widget/<name>.html` with no extra assets.
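Conceptually, the inlining step boils down to string assembly. The sketch below is illustrative only (the real `build.ts` uses Vite + esbuild, and the root element id and markup details here are assumptions):

```typescript
// Illustrative only: assembles a self-contained HTML string with inlined
// styles, fallback data, and the widget bundle, mirroring what build.ts
// emits. The "widget-root" id and exact markup are assumptions.
function inlineWidgetHtml(name: string, css: string, js: string, fallbackData: unknown): string {
  return [
    "<!doctype html>",
    `<html><head><style>${css}</style></head><body>`,
    `<div id="widget-root" data-widget="${name}"></div>`,
    // Fallback data lets the file render standalone, outside ChatGPT.
    `<script>window.todoData = ${JSON.stringify(fallbackData)};</script>`,
    `<script>${js}</script>`,
    "</body></html>",
  ].join("\n");
}

const html = inlineWidgetHtml(
  "todo",
  "body{margin:0}",
  "console.log('todo widget')",
  { todoList: { title: "Demo", todos: [] } },
);
console.log(html.includes("window.todoData")); // → true
```

Because everything lives in one string, the emitted file needs no follow-up asset requests, which is exactly what the `ui://` delivery model requires.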
### MCP server

- `server.ts` wires up `@modelcontextprotocol/sdk` with an `SSEServerTransport`:
  - `GET /mcp` starts the SSE session.
  - `POST /mcp/messages?sessionId=<id>` streams tool messages/responses.
- Tools shipped by default:

  | Name | Description | Arguments |
  |---|---|---|
  | `list_todos` | Return the current todo list | none |
  | `add_todo_item` | Insert a new todo | `{ title }` |
  | `toggle_todo_item` | Flip completion state | `{ todoId }` |
  | `delete_todo_item` | Remove a todo | `{ todoId }` |

- All tool responses include:
  - `structuredContent.todoList` (mirrors the widget data contract)
  - `_meta.openai/*` descriptors so ChatGPT knows to render the widget
- The server keeps a simple in-memory `globalTodoState`. Swap this out for a real database or API when you graduate from the demo.
## Connecting to ChatGPT Apps

1. Enable Developer Mode in ChatGPT → Settings → Connectors.
2. Create a new “Model Context Protocol” connector.
3. Point the MCP URL at your server (local tunnel or deployed host):

   ```
   https://<your-ngrok-subdomain>.ngrok-free.app/mcp
   ```

4. In a ChatGPT conversation, ask something like “Show my todo list” or “Add ‘Review PR’ to my todos.” ChatGPT will call the appropriate tool(s), and the widget will render using the `ui://` resource the server exposes.
## Project Structure

```
openai-apps-template/
├─ src/
│  ├─ widgets/
│  │  └─ todo/
│  │     ├─ index.tsx        # Widget entry point (self-mounting)
│  │     └─ components/
│  │        ├─ NewTodo.tsx   # Input row for creating todos
│  │        └─ TodoItem.tsx  # Presentational list item
│  ├─ hooks/                 # window.openai + MCP helpers
│  ├─ styles/main.css        # Tailwind layer shared by widgets
│  └─ utils/                 # UI utilities (media queries, etc.)
├─ build.ts                  # Inline widget bundler
├─ dev.ts                    # Vite preview loader (?entry=<name>)
├─ server.ts                 # MCP SSE server + todo tools
├─ dist/                     # Generated ui:// HTML widgets
├─ package.json
└─ README.md
```
## Scripts & Configuration

| Command | Description |
|---|---|
| `npm run build` | Bundle every widget into `dist/*.html` |
| `npm start` | Launch the MCP server (defaults to port 8000) |
| `npm run start:all` | Build widgets, then start the server |
| `npm run dev` | Vite dev server for widget HMR preview |

Environment variables:

- `PORT` – overrides the MCP server port (default `8000`).
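The fallback behaves like the hypothetical helper below (the real `server.ts` may parse the variable differently):

```typescript
// Illustrative PORT resolution with the documented default of 8000.
function resolvePort(env: Record<string, string | undefined>): number {
  const parsed = Number(env.PORT);
  return Number.isInteger(parsed) && parsed > 0 ? parsed : 8000;
}

console.log(resolvePort({}));               // → 8000
console.log(resolvePort({ PORT: "9000" })); // → 9000
```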
## Extending the Template

- **Add another widget** – drop in `src/widgets/notes.tsx`, ensure it mounts to `__WIDGET_ROOT_ID__`, run `npm run build`, then expose it as a new resource/tool in `server.ts`.
- **Persist todo data** – replace `globalTodoState` with a database call or REST client. The only contract the widget needs is `{ todoList: TodoList }`, where `TodoList` is just `{ title: string; todos: TodoItem[] }`.
- **Enhance UI state** – use `useWidgetState` for optimistic updates; it already mirrors state back to ChatGPT so the host stays in sync between agent turns.
- **Add more tools** – register descriptors in `server.ts` (the `ListToolsRequestSchema` handler) and implement the behavior inside the `CallToolRequestSchema` handler.
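For the persistence swap, any backend works as long as its rows can be mapped into the stated `{ todoList: TodoList }` contract. A hypothetical adapter, where the `DbRow` shape and field names are invented for illustration:

```typescript
// The widget's data contract, as stated above.
type TodoItem = { id: string; title: string; completed: boolean };
type TodoList = { title: string; todos: TodoItem[] };
type WidgetData = { todoList: TodoList };

// Hypothetical database row shape; map whatever your store returns
// into the contract the widget consumes.
type DbRow = { id: string; title: string; done: boolean };

function fromRows(title: string, rows: DbRow[]): WidgetData {
  return {
    todoList: {
      title,
      todos: rows.map((r) => ({ id: r.id, title: r.title, completed: r.done })),
    },
  };
}

const data = fromRows("My Todos", [{ id: "1", title: "Review PR", done: false }]);
console.log(data.todoList.todos.length); // → 1
```

As long as your adapter returns this shape from each tool handler, neither the widget nor the MCP wiring needs to change.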
## Troubleshooting

- **Widget shows stale data** – ensure the MCP response includes fresh `structuredContent.todoList` and `_meta.openai/outputTemplate`. The widget only resyncs when the host sends new structured data.
- **“Build not found” error** – run `npm run build` before `npm start` so `dist/todo.html` exists.
- **No widget in ChatGPT** – verify the connector is allowed to render widgets (`openai/widgetAccessible: true`) and that the `ui://widget/<name>.html` resource is listed via `ListResources`.
- **CORS or tunnel issues** – the server enables `Access-Control-Allow-Origin: *`, but your tunnel must forward both `GET /mcp` and `POST /mcp/messages`.
Happy shipping! Open issues or PRs if you add new widgets, storage adapters, or deployment recipes.