2 changes: 1 addition & 1 deletion .bumpversion.cfg
@@ -1,5 +1,5 @@
[bumpversion]
current_version = 5.0.0a1
current_version = 4.5.1a1
commit = False
tag = False
parse = (?P<major>\d+)\.(?P<minor>\d+)\.(?P<patch>\d+)(rc(?P<build>\d+))?
134 changes: 134 additions & 0 deletions docs/getting-started/agentic.md
@@ -0,0 +1,134 @@
# Agentic application
!!! danger
    The features and API described in this section are under heavy development and therefore subject to change.
    Be prepared for things to break.

**However, if you don't mind an unstable API, the features in this mode of the
orchestrator will unlock quite a bit of potential.**


The Agentic mode of the Orchestrator can be enabled by following the steps below.

### Prerequisites
- The `pgvector` extension installed in your PostgreSQL database
- At minimum, an `api_key` to talk to an OpenAI model (e.g. ChatGPT)
- The UI configured with the LLM integration branch (still WIP): https://github.com/workfloworchestrator/example-orchestrator-ui/pull/72/files

### Step 1 - Install the package:

Create a virtualenv and install the core including the LLM dependencies.

<div class="termy">

```shell
python -m venv .venv
source .venv/bin/activate
pip install orchestrator-core[llm]
```

</div>
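To verify that the optional LLM dependencies were installed correctly, you can mirror the import check that `orchestrator/__init__.py` performs when `LLM_ENABLED` is set (a minimal sketch; only the `pydantic_ai` import is exercised here):

```python
# Sanity check that the llm extra is installed; mirrors the import check
# in orchestrator/__init__.py. If this raises ImportError, re-run
# `pip install orchestrator-core[llm]` inside the virtualenv.
from importlib import import_module

import_module("pydantic_ai")
print("LLM dependencies are available")
```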

### Step 2 - Set up the database:

Create a PostgreSQL database and make sure your PostgreSQL installation has the `pgvector` extension installed:

<div class="termy">

```shell
createuser -sP nwa
createdb orchestrator-core -O nwa
```

</div>

Choose a password and remember it for later steps.

As an example, you can run these Docker commands in separate shells to start a temporary PostgreSQL instance. Note that the stock `postgres` image does not ship the `pgvector` extension; an image that bundles it, such as `pgvector/pgvector:pg15`, is assumed here:

```shell
docker run --rm --name temp-orch-db -e POSTGRES_PASSWORD=rootpassword -p 5432:5432 pgvector/pgvector:pg15

docker exec -it temp-orch-db su - postgres -c 'createuser -sP nwa && createdb orchestrator-core -O nwa'
```
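If you want to confirm that the extension can actually be enabled in the new database, a minimal sketch using SQLAlchemy (already a dependency of `orchestrator-core`) could look like the following; the connection string reuses the `nwa` user and database created above and is otherwise an assumption:

```python
# Minimal sketch: confirm the pgvector extension is available on the server.
# The credentials below assume the nwa user and database created above.
from sqlalchemy import create_engine, text

engine = create_engine("postgresql://nwa:PASSWORD_FROM_STEP_2@localhost:5432/orchestrator-core")

with engine.begin() as conn:
    # Raises an error if the pgvector extension is not installed on the server.
    conn.execute(text("CREATE EXTENSION IF NOT EXISTS vector"))
    print("pgvector extension is available")
```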

### Step 3 - Create the main.py:

Create a `main.py` file.

```python
from orchestrator import AgenticOrchestratorCore
from orchestrator.cli.main import app as core_cli
from orchestrator.settings import app_settings
from orchestrator.llm_settings import llm_settings

# Enable the LLM integration and configure the model and API key.
# Prefer loading the API key from an environment variable rather than
# hard-coding it.
llm_settings.LLM_ENABLED = True
llm_settings.AGENT_MODEL = 'gpt-4o-mini'
llm_settings.OPENAI_API_KEY = 'xxxxx'


app = AgenticOrchestratorCore(
    base_settings=app_settings,
    llm_settings=llm_settings,
    llm_model=llm_settings.AGENT_MODEL,
    agent_tools=[],
)

if __name__ == "__main__":
    core_cli()
```
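The `agent_tools` list is empty in the example above. If you want to expose your own functions to the agent, a hedged sketch using `pydantic_ai`'s `FunctionToolset` (the type accepted by the `AgenticOrchestratorCore` constructor) could look like this; `lookup_customer` is a hypothetical tool, not something shipped with `orchestrator-core`:

```python
# Hypothetical example of passing a custom toolset to the agent; add this to
# the main.py shown above. The tool function and its return value are
# illustrative only.
from pydantic_ai.toolsets import FunctionToolset


def lookup_customer(customer_id: str) -> str:
    """Return a short description for a customer id (dummy implementation)."""
    return f"Customer {customer_id}: example customer record"


customer_toolset = FunctionToolset(tools=[lookup_customer])

app = AgenticOrchestratorCore(
    base_settings=app_settings,
    llm_settings=llm_settings,
    llm_model=llm_settings.AGENT_MODEL,
    agent_tools=[customer_toolset],
)
```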

### Step 4 - Run the database migrations:

Initialize the migration environment and database tables.

<div class="termy">

```shell
export DATABASE_URI=postgresql://nwa:PASSWORD_FROM_STEP_2@localhost:5432/orchestrator-core

python main.py db init
python main.py db upgrade heads
```

</div>

### Step 5 - Run the app:

<div class="termy">

```shell
export DATABASE_URI=postgresql://nwa:PASSWORD_FROM_STEP_2@localhost:5432/orchestrator-core
export OAUTH2_ACTIVE=False

uvicorn --reload --host 127.0.0.1 --port 8080 main:app
```

</div>

### Step 6 - Index all your current subscriptions, processes, workflows and products:

!!! warning
    This will call out to external LLM services and will cost money.


<div class="termy">

```shell
python main.py index subscriptions
python main.py index products
python main.py index processes
python main.py index workflows
```

</div>

### Step 7 - Profit :boom: :grin:

Visit the [ReDoc](http://127.0.0.1:8080/api/redoc) or [OpenAPI](http://127.0.0.1:8080/api/docs) documentation to view and interact with the API.


### Next:

- [Create a product.](../workshops/advanced/domain-models.md)
- [Create a workflow for a product.](./workflows.md)
- [Generate products and workflows](../reference-docs/cli.md#generate)
4 changes: 2 additions & 2 deletions docs/getting-started/base.md
@@ -58,9 +58,9 @@ Create a `main.py` file.
```python
from orchestrator import OrchestratorCore
from orchestrator.cli.main import app as core_cli
from orchestrator.settings import AppSettings
from orchestrator.settings import app_settings

app = OrchestratorCore(base_settings=AppSettings())
app = OrchestratorCore(base_settings=app_settings)

if __name__ == "__main__":
    core_cli()
2 changes: 1 addition & 1 deletion docs/index.md
@@ -76,7 +76,7 @@ There are a number of options for getting started:
- For those who are more adventurous, follow the guide on the [next page](getting-started/base.md) to
start coding right away.

<!-- Followinh line are not visible? -->
<!-- Following lines are not visible? -->
[//]: # (- If you would like to see the workflow engine in action, click [here]&#40;https://demo.workfloworchestrator.org&#41; this )

[//]: # (will take you to our demo environment, where you can see some of our examples in action.)
11 changes: 11 additions & 0 deletions docs/reference-docs/app/agentic-app.md
@@ -0,0 +1,11 @@
# agentic_app.py

The `agentic_app.py` module is used in `orchestrator-core` to run the agentic WFO FastAPI backend and the CLI.

## FastAPI Backend

The code for the WFO's FastAPI backend is well documented, so look through the functions used in this module here:

::: orchestrator.agentic_app
    options:
      heading_level: 3
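For a minimal usage sketch that mirrors the getting-started guide (the model name and empty toolset are placeholders), the application can also be served programmatically:

```python
# Minimal sketch mirroring the getting-started example: build the agentic
# FastAPI app and serve it with uvicorn programmatically.
import uvicorn

from orchestrator.agentic_app import AgenticOrchestratorCore
from orchestrator.llm_settings import llm_settings
from orchestrator.settings import app_settings

llm_settings.LLM_ENABLED = True

app = AgenticOrchestratorCore(
    base_settings=app_settings,
    llm_settings=llm_settings,
    llm_model="gpt-4o-mini",
    agent_tools=[],
)

if __name__ == "__main__":
    uvicorn.run(app, host="127.0.0.1", port=8080)
```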
4 changes: 3 additions & 1 deletion mkdocs.yml
@@ -148,9 +148,10 @@ nav:
- Backfilling Existing Subscriptions: architecture/product_modelling/backfilling.md
- Getting Started:
- Prerequisites: getting-started/versions.md
- Base Application:
- Application:
- Preparing source folder: getting-started/prepare-source-folder.md
- Base application: getting-started/base.md
- Agentic application: getting-started/agentic.md
- Workflows:
- Creating a workflow: getting-started/workflows.md
- Registering a workflow: getting-started/workflows#register-workflows
@@ -183,6 +184,7 @@ nav:
- Forms: reference-docs/forms.md
- Running the App:
- App.py: reference-docs/app/app.md
- Agentic App.py: reference-docs/app/agentic-app.md
- Python Version: reference-docs/python.md
- Scaling: reference-docs/app/scaling.md
- Settings: reference-docs/app/settings-overview.md
28 changes: 26 additions & 2 deletions orchestrator/__init__.py
@@ -13,15 +13,39 @@

"""This is the orchestrator workflow engine."""

__version__ = "5.0.0a1"
__version__ = "4.5.1a1"

from orchestrator.app import OrchestratorCore

from structlog import get_logger

logger = get_logger(__name__)

logger.info("Starting the orchestrator", version=__version__)

from orchestrator.llm_settings import llm_settings
from orchestrator.settings import app_settings

if llm_settings.LLM_ENABLED:
    try:
        from importlib import import_module

        import_module("pydantic_ai")
        from orchestrator.agentic_app import AgenticOrchestratorCore as OrchestratorCore

    except ImportError:
        logger.error(
            "Unable to import the 'pydantic_ai' module, please install the orchestrator with the llm dependencies: `pip install orchestrator-core[llm]`",
        )
        exit(1)
else:
    from orchestrator.app import OrchestratorCore  # type: ignore[assignment]

from orchestrator.workflow import begin, conditional, done, focussteps, inputstep, retrystep, step, steplens, workflow

__all__ = [
    "OrchestratorCore",
    "app_settings",
    "llm_settings",
    "step",
    "inputstep",
    "workflow",
Expand Down
84 changes: 84 additions & 0 deletions orchestrator/agentic_app.py
@@ -0,0 +1,84 @@
#!/usr/bin/env python3
"""The main application module.

This module contains the main `AgenticOrchestratorCore` class for the `FastAPI` backend and
provides the ability to run the CLI.
"""
# Copyright 2019-2025 SURF
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from typing import Any

import typer
from pydantic_ai.models.openai import OpenAIModel
from pydantic_ai.toolsets import FunctionToolset
from structlog import get_logger

from orchestrator.app import OrchestratorCore
from orchestrator.cli.main import app as cli_app
from orchestrator.llm_settings import LLMSettings, llm_settings

logger = get_logger(__name__)


class AgenticOrchestratorCore(OrchestratorCore):
    def __init__(
        self,
        *args: Any,
        llm_model: OpenAIModel | str = "gpt-4o-mini",
        llm_settings: LLMSettings = llm_settings,
        agent_tools: list[FunctionToolset] | None = None,
        **kwargs: Any,
    ) -> None:
        """Initialize the `AgenticOrchestratorCore` class.

        This class takes the same arguments as the `OrchestratorCore` class.

        Args:
            *args: All the normal arguments passed to the `OrchestratorCore` class.
            llm_model: An `OpenAIModel` instance or model name string (e.g. `gpt-4o-mini`); not limited to OpenAI models.
            llm_settings: The settings class for the LLM.
            agent_tools: A list of toolsets that can be used by the agent.
            **kwargs: Additional arguments passed to the `OrchestratorCore` class.

        Returns:
            None
        """
        self.llm_model = llm_model
        self.agent_tools = agent_tools
        self.llm_settings = llm_settings

        super().__init__(*args, **kwargs)

        logger.info("Mounting the agent")
        self.register_llm_integration()

    def register_llm_integration(self) -> None:
        """Mount the agent endpoint.

        This helper mounts the agent endpoint on the application.

        Returns:
            None

        """
        from orchestrator.search.agent import build_agent_app

        agent_app = build_agent_app(self.llm_model, self.agent_tools)
        self.mount("/agent", agent_app)


main_typer_app = typer.Typer()
main_typer_app.add_typer(cli_app, name="orchestrator", help="The orchestrator CLI commands")

if __name__ == "__main__":
    main_typer_app()
15 changes: 9 additions & 6 deletions orchestrator/api/api_v1/api.py
@@ -22,7 +22,6 @@
    product_blocks,
    products,
    resource_types,
    search,
    settings,
    subscription_customer_descriptions,
    subscriptions,
@@ -31,6 +30,7 @@
    workflows,
    ws,
)
from orchestrator.llm_settings import llm_settings
from orchestrator.security import authorize

api_router = APIRouter()
Expand Down Expand Up @@ -85,8 +85,11 @@
)
api_router.include_router(ws.router, prefix="/ws", tags=["Core", "Events"])

api_router.include_router(
    search.router,
    prefix="/search",
    tags=["Core", "Search"],
)
if llm_settings.LLM_ENABLED:
    from orchestrator.api.api_v1.endpoints import search

    api_router.include_router(
        search.router,
        prefix="/search",
        tags=["Core", "Search"],
    )