
Conversation

@Obad94 (Contributor) commented Oct 15, 2025

Implements parametrized resource URIs for MCP widgets, enabling runtime template resolution and SSR-like functionality.

Changes

  • Unified template-URI handling between Node and Python demos
  • Added Pizzaz toolset with toppings-search and pizza-detail handlers using _meta.openai/outputTemplate* fields
  • Streamlined asset delivery with auto-inlining and CDN fallback
  • Updated build pipeline for hashed bundles and clean URLs
  • Refreshed README and .env samples with new workflow documentation

How it works

Tools can now return parametrized templates like ui://widget/pizza-list/{pizzaTopping} that get resolved dynamically at runtime instead of being statically cached.
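
A minimal sketch of the resolution step, assuming a small helper like the one below (the helper names are illustrative, not the exact code in this PR; the _meta field names and example URI match the inspector output later in this thread):

// Hypothetical helper: expands "{pizzaTopping}"-style placeholders in a
// ui:// template URI and builds the _meta.openai/outputTemplate* fields.
type TemplateValues = Record<string, string>;

function resolveTemplateUri(template: string, values: TemplateValues): string {
  // Replace each {param} placeholder with its URI-encoded value;
  // unknown placeholders are left untouched.
  return template.replace(/\{(\w+)\}/g, (match, key: string) =>
    key in values ? encodeURIComponent(values[key]) : match,
  );
}

function buildOutputTemplateMeta(template: string, values: TemplateValues) {
  return {
    "openai/outputTemplate": template,
    "openai/outputTemplateValues": values,
    "openai/outputTemplateResolved": resolveTemplateUri(template, values),
  };
}

// Example, matching the tools/call result shown below:
const meta = buildOutputTemplateMeta(
  "ui://widget/pizza-list/{pizzaTopping}.html?v=dev-hgxti",
  { pizzaTopping: "mushroom" },
);
// meta["openai/outputTemplateResolved"]
//   === "ui://widget/pizza-list/mushroom.html?v=dev-hgxti"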

Fixes #47

- align template URI handling across node and python demos

- add pizzaz toppings and detail tools returning parameterized widgets

- inline dev/CDN asset handling with template metadata

- update readme and env samples for SSR resource templates

Fixes: openai#47
@rinormaloku commented Oct 16, 2025

Hi @Obad94

How are you testing this? I gave it a try, and although I defined the resource template, it is never used by ChatGPT.
I validated that by connecting the app and then looking at the calls in the ngrok interface, where you can see the following sequence:

  1. Initialize the mcp session
  2. List all tools
  3. Query all Resources that are in the Meta of those tools

At no point are ResourceTemplates called. This is easy to miss, because a plain Resource is also registered that is identical to the ResourceTemplate, which masks the missing call.

As I understand it, this is a limitation on the OpenAI/ChatGPT side (possibly an intended one). I would like input from @katia-openai, or at the very least a pointer to where I should file a ticket to get an answer on this.

@Obad94 (Contributor, Author) commented Oct 16, 2025

@rinormaloku, in my implementation, the MCP server advertises templated URIs, returns _meta.openai/outputTemplate*, and serves the resolved text/html+skybridge resource with the injected parameters.
The current gap is on the OpenAI/ChatGPT client: it still caches the static URI and doesn’t re-fetch the resolved template, so the widget won’t render until runtime template fetching ships.
(Looping in @katia-openai to confirm the client-side timeline.)
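
For reference, a simplified sketch of how the resolved text/html+skybridge contents can be assembled with the injected parameters. The function name and asset base URL are illustrative; the HTML structure and the __PIZZAZ_TEMPLATE_PARAMS__ global mirror the resources/read payload shown further down:

// Illustrative sketch: assemble the text/html+skybridge payload for a
// resolved template URI, injecting the resolved parameters for the widget.
function renderPizzaListHtml(
  params: Record<string, string>,
  assetBase: string = "http://localhost:4444", // dev asset server; a CDN URL is assumed for production builds
): string {
  return [
    '<div id="pizzaz-list-root"></div>',
    `<link rel="stylesheet" href="${assetBase}/pizzaz-list.css">`,
    `<script type="module" src="${assetBase}/pizzaz-list.js"></script>`,
    // The widget bundle reads its template parameters from this global.
    `<script>window.__PIZZAZ_TEMPLATE_PARAMS__ = ${JSON.stringify(params)};</script>`,
  ].join("\n");
}

// A resources/read for ui://widget/pizza-list/mushroom.html?v=dev-hgxti would then
// return { mimeType: "text/html+skybridge", text: renderPizzaListHtml({ pizzaTopping: "mushroom" }) }.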


Inspector proof and logs

Server Logs

Pizzaz MCP server listening on http://localhost:8000
  SSE stream: GET http://localhost:8000/mcp
  Message post endpoint: POST http://localhost:8000/mcp/messages?sessionId=...
templated ui://widget/pizza-list/%7BpizzaTopping%7D.html?v=dev-hgxti { pizzaTopping: '{pizzaTopping}' }
templated ui://widget/pizza-list/mushroom.html?v=dev-hgxti { pizzaTopping: 'mushroom' }

List tools

npx @modelcontextprotocol/inspector --cli https://1e1216949a29.ngrok-free.app/mcp \
  --transport sse --method tools/list

Key output:

{
  "name": "pizza-list",
  "title": "Show Pizza List",
  "_meta": {
    "openai/outputTemplate": "ui://widget/pizza-list/{pizzaTopping}.html?v=dev-hgxti",
    "openai/toolInvocation/invoking": "Hand-tossing a list",
    "openai/toolInvocation/invoked": "Served a fresh list",
    "openai/widgetAccessible": true,
    "openai/resultCanProduceWidget": true,
    "openai/outputTemplateSchema": {
      "pizzaTopping": {
        "type": "string",
        "description": "Name of the topping to highlight in the list."
      }
    }
  }
}

Call the tool with a parameter

npx @modelcontextprotocol/inspector --cli https://1e1216949a29.ngrok-free.app/mcp \
  --transport sse \
  --method tools/call \
  --tool-name pizza-list \
  --tool-arg pizzaTopping=mushroom

Result:

{
  "_meta": {
    "openai/outputTemplate": "ui://widget/pizza-list/{pizzaTopping}.html?v=dev-hgxti",
    "openai/outputTemplateValues": { "pizzaTopping": "mushroom" },
    "openai/outputTemplateResolved": "ui://widget/pizza-list/mushroom.html?v=dev-hgxti"
  },
  "content": [
    { "type": "text", "text": "Rendered a pizza list! Filtered by “mushroom”." }
  ],
  "structuredContent": {
    "pizzaTopping": "mushroom",
    "availableToppings": [
      "basil", "bianca", "burrata", "deep-dish", "detroit",
      "four-cheese", "margherita", "marinara", "mozzarella",
      "mushroom", "pepperoni", "prosciutto", "sausage",
      "seasonal", "sourdough", "spinach", "truffle", "veggie", "white"
    ],
    "filterApplied": true,
    "requestedTopping": "mushroom"
  }
}

Read the resolved resource

npx @modelcontextprotocol/inspector --cli https://1e1216949a29.ngrok-free.app/mcp \
  --transport sse \
  --method resources/read \
  --uri ui://widget/pizza-list/mushroom.html?v=dev-hgxti

Output:

{
  "contents": [
    {
      "uri": "ui://widget/pizza-list/mushroom.html?v=dev-hgxti",
      "mimeType": "text/html+skybridge",
      "_meta": {
        "openai/outputTemplate": "ui://widget/pizza-list/{pizzaTopping}.html?v=dev-hgxti",
        "openai/outputTemplateValues": { "pizzaTopping": "mushroom" },
        "openai/outputTemplateResolved": "ui://widget/pizza-list/mushroom.html?v=dev-hgxti"
      },
      "text": "<div id=\"pizzaz-list-root\"></div>\n<link rel=\"stylesheet\" href=\"http://localhost:4444/pizzaz-list.css\">\n<script type=\"module\" src=\"http://localhost:4444/pizzaz-list.js\"></script>\n<script>window.__PIZZAZ_TEMPLATE_PARAMS__ = {\"pizzaTopping\":\"mushroom\"};</script>"
    }
  ]
}

Conclusion

  • The MCP server advertises, resolves, and serves parametrized templates as expected.
  • The Inspector confirms _meta.openai/outputTemplate* and the HTML payload are generated dynamically.
  • The widget doesn’t render in ChatGPT today only because the client still prefetches static URIs.
  • Once runtime template fetching is supported, this implementation will work automatically without further changes.
