feature: custom ai executor #1766


Draft · wants to merge 1 commit into main
Conversation

@YousefED (Collaborator) commented Jun 17, 2025

This adds support for a custom AI executor instead of always relying on the Vercel AI SDK to execute requests.

See this example code for usage.

You can test this using npm i https://pkg.pr.new/TypeCellOS/BlockNote/@blocknote/xl-ai@1766

TODO:

  • Implement an easier way to create an LLMResponse from a custom streaming response (for now, use LLMResponse.fromArray for testing)
  • We could move "model", "stream", "maxRetries", etc. to become responsibilities of the executor
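To make the idea concrete, here is a minimal sketch of what a custom executor built on LLMResponse.fromArray could look like. The AIExecutor type, the Operation shape, and the LLMResponse stand-in below are illustrative assumptions, not the actual @blocknote/xl-ai API:

```typescript
// Hypothetical sketch: these names mirror the concepts in this PR but are
// NOT the real @blocknote/xl-ai API.

// One document operation produced by the LLM (shape assumed for illustration).
type Operation = { type: string; [key: string]: unknown };

// Minimal stand-in for the PR's LLMResponse, built from a fixed array of
// operations (mirroring the LLMResponse.fromArray helper mentioned above).
class LLMResponse {
  private constructor(private readonly operations: Operation[]) {}

  // Wrap a pre-computed array of operations, as the TODO suggests for testing.
  static fromArray(operations: Operation[]): LLMResponse {
    return new LLMResponse(operations);
  }

  // Stream operations one at a time, as a streaming executor eventually would.
  async *stream(): AsyncGenerator<Operation> {
    for (const op of this.operations) {
      yield op;
    }
  }
}

// The executor replaces the built-in Vercel AI SDK call: given a prompt, it
// can produce an LLMResponse however it likes (custom backend, local model, ...).
type AIExecutor = (prompt: string) => Promise<LLMResponse>;

// Example executor that ignores the prompt and replays canned operations.
const cannedExecutor: AIExecutor = async (_prompt) =>
  LLMResponse.fromArray([
    { type: "update", id: "block-1", block: { content: "Hello from a custom executor" } },
  ]);
```

The point of the abstraction is that the editor only consumes the operation stream; where the operations come from (AI SDK, a custom REST call, or a canned array in tests) becomes the executor's concern.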

Alternatives

This PR essentially lets you "opt out" of using the AI SDK as the layer that calls the underlying LLMs.

We could also explore:
a) (lean in on the AI SDK) making it easy to implement a custom Model / Provider that you can then use as the "executor" (i.e., we consider the Model to be the point that defines how LLMs are called)
b) (lean "out" of the AI SDK on the client side) instead of using the AI SDK heavily on the client with a lightweight proxy on the server, we could follow its default design more closely and recommend that users expose REST endpoints, which in turn use the AI SDK
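Alternative (b) can be sketched as a plain REST endpoint the editor calls. The route (/ai/execute), the payload shape, and the stubbed runLLM function below are assumptions for illustration; a real server would invoke the Vercel AI SDK inside the handler:

```typescript
// Hypothetical sketch of alternative (b): the client posts a prompt to a
// REST endpoint and receives document operations back. Nothing here is part
// of BlockNote; the route and payload shape are assumptions.
import { createServer } from "node:http";

// Stub for the server-side LLM call; a real server would use the AI SDK here.
async function runLLM(prompt: string): Promise<object[]> {
  return [{ type: "update", id: "block-1", comment: `echo: ${prompt}` }];
}

const server = createServer(async (req, res) => {
  if (req.method === "POST" && req.url === "/ai/execute") {
    // Collect the JSON request body.
    let body = "";
    for await (const chunk of req) body += chunk;
    const { prompt } = JSON.parse(body);

    // Run the LLM server-side and return the resulting operations.
    const operations = await runLLM(prompt);
    res.setHeader("content-type", "application/json");
    res.end(JSON.stringify({ operations }));
    return;
  }
  res.statusCode = 404;
  res.end();
});
```

With this design the client stays thin (a single fetch call), and concerns like model choice, streaming, and retries naturally live on the server, in line with the second TODO item above.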

vercel bot commented Jun 17, 2025

The latest updates on your projects. Learn more about Vercel for Git ↗︎

Name | Status | Updated (UTC)
blocknote | ✅ Ready (preview available) | Jun 17, 2025 2:28pm
blocknote-website | ❌ Failed | Jun 17, 2025 2:28pm

@YousefED YousefED mentioned this pull request Jun 17, 2025

pkg-pr-new bot commented Jun 17, 2025

@blocknote/ariakit

npm i https://pkg.pr.new/TypeCellOS/BlockNote/@blocknote/ariakit@1766

@blocknote/code-block

npm i https://pkg.pr.new/TypeCellOS/BlockNote/@blocknote/code-block@1766

@blocknote/core

npm i https://pkg.pr.new/TypeCellOS/BlockNote/@blocknote/core@1766

@blocknote/mantine

npm i https://pkg.pr.new/TypeCellOS/BlockNote/@blocknote/mantine@1766

@blocknote/react

npm i https://pkg.pr.new/TypeCellOS/BlockNote/@blocknote/react@1766

@blocknote/server-util

npm i https://pkg.pr.new/TypeCellOS/BlockNote/@blocknote/server-util@1766

@blocknote/shadcn

npm i https://pkg.pr.new/TypeCellOS/BlockNote/@blocknote/shadcn@1766

@blocknote/xl-ai

npm i https://pkg.pr.new/TypeCellOS/BlockNote/@blocknote/xl-ai@1766

@blocknote/xl-docx-exporter

npm i https://pkg.pr.new/TypeCellOS/BlockNote/@blocknote/xl-docx-exporter@1766

@blocknote/xl-multi-column

npm i https://pkg.pr.new/TypeCellOS/BlockNote/@blocknote/xl-multi-column@1766

@blocknote/xl-odt-exporter

npm i https://pkg.pr.new/TypeCellOS/BlockNote/@blocknote/xl-odt-exporter@1766

@blocknote/xl-pdf-exporter

npm i https://pkg.pr.new/TypeCellOS/BlockNote/@blocknote/xl-pdf-exporter@1766

commit: 2040f86

@nperez0111 (Contributor) commented:

I would prefer for the client to do less work, and given that backends will have much more context than a frontend would, I'd personally lean more toward moving the work into the backend and making that easier.

It's also worth documenting what the format should look like over the wire, even if only through examples.
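As a starting point for that documentation, here is a purely hypothetical shape such a wire format could take; none of the field names below come from the actual implementation, they are assumptions for discussion:

```typescript
// Illustrative only: a possible over-the-wire format between editor and
// AI endpoint. Field names are assumptions, not the BlockNote protocol.

// Request sent from the editor to the AI endpoint.
interface AIRequest {
  prompt: string;    // the user's instruction
  document: unknown; // current document state (e.g. BlockNote blocks)
  stream?: boolean;  // whether the server should stream operations
}

// One document operation in the response; could be streamed as NDJSON
// or returned all at once as an array.
interface AIOperation {
  type: "add" | "update" | "delete";
  id: string;        // target block id
  block?: unknown;   // new block content for add/update
}

const exampleRequest: AIRequest = {
  prompt: "Make the first paragraph more formal",
  document: [],
  stream: true,
};

const exampleResponse: AIOperation[] = [
  { type: "update", id: "block-1", block: { content: "Formalized text" } },
];
```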
