diff --git a/.github/workflows/ci.yml b/.github/workflows/ci.yml index 4a101818..eec273a4 100644 --- a/.github/workflows/ci.yml +++ b/.github/workflows/ci.yml @@ -33,6 +33,13 @@ jobs: - uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6 with: python-version: ${{ matrix.python }} + # protoc is required to build temporalio from the sdk-python git source + # (see [tool.uv.sources] in pyproject.toml). Drop this step once the SDK + # ships a release that includes the LangGraph plugin. + - uses: arduino/setup-protoc@c65c819552d16ad3c9b72d9dfd5ba5237b9c906b # v3 + with: + version: "23.x" + repo-token: ${{ secrets.GITHUB_TOKEN }} - run: uv tool install poethepoet - run: uv sync --group=dsl --group=encryption --group=trio-async - run: poe lint diff --git a/README.md b/README.md index d4d6a61b..a94d7b7f 100644 --- a/README.md +++ b/README.md @@ -72,6 +72,7 @@ Some examples require extra dependencies. See each sample's directory for specif * [gevent_async](gevent_async) - Combine gevent and Temporal. * [hello_standalone_activity](hello_standalone_activity) - Use activities without using a workflow. * [langchain](langchain) - Orchestrate workflows for LangChain. +* [langgraph_plugin](langgraph_plugin) - Run LangGraph workflows as durable Temporal workflows (Graph API and Functional API). * [message_passing/introduction](message_passing/introduction/) - Introduction to queries, signals, and updates. * [message_passing/safe_message_handlers](message_passing/safe_message_handlers/) - Safely handling updates and signals. * [message_passing/update_with_start/lazy_initialization](message_passing/update_with_start/lazy_initialization/) - Use update-with-start to update a Shopping Cart, starting it if it does not exist. 
diff --git a/langgraph_plugin/README.md b/langgraph_plugin/README.md new file mode 100644 index 00000000..bc3c1233 --- /dev/null +++ b/langgraph_plugin/README.md @@ -0,0 +1,73 @@ +# LangGraph Plugin Samples + +These samples demonstrate the [Temporal LangGraph plugin](https://github.com/temporalio/sdk-python/pull/1448), which runs LangGraph workflows as durable Temporal workflows. Each LangGraph graph node (Graph API) or `@task` (Functional API) executes as a Temporal activity with automatic retries, timeouts, and crash recovery. + +Samples are organized by API style: + +- **Graph API** (`graph_api/`) -- Define workflows as `StateGraph` with nodes and edges. +- **Functional API** (`functional_api/`) -- Define workflows with `@task` and `@entrypoint` decorators for an imperative programming style. + +## Samples + +| Sample | Graph API | Functional API | Description | +|--------|:---------:|:--------------:|-------------| +| **Hello World** | [graph_api/hello_world](graph_api/hello_world) | [functional_api/hello_world](functional_api/hello_world) | Minimal sample -- single node/task that processes a query string. Start here. | +| **Human-in-the-loop** | [graph_api/human_in_the_loop](graph_api/human_in_the_loop) | [functional_api/human_in_the_loop](functional_api/human_in_the_loop) | Chatbot that uses `interrupt()` to pause for human approval, Temporal signals to receive feedback, and queries to expose the pending draft. | +| **Continue-as-new** | [graph_api/continue_as_new](graph_api/continue_as_new) | [functional_api/continue_as_new](functional_api/continue_as_new) | Multi-stage data pipeline that uses `continue-as-new` with task result caching so previously-completed stages are not re-executed. | +| **ReAct Agent** | [graph_api/react_agent](graph_api/react_agent) | [functional_api/react_agent](functional_api/react_agent) | Tool-calling agent loop. Graph API uses conditional edges; Functional API uses a `while` loop. 
| +| **Control Flow** | -- | [functional_api/control_flow](functional_api/control_flow) | Demonstrates parallel task execution, `for` loops, and `if/else` branching -- patterns that are natural in the Functional API. | +| **LangSmith Tracing** | [graph_api/langsmith_tracing](graph_api/langsmith_tracing) | [functional_api/langsmith_tracing](functional_api/langsmith_tracing) | Combines `LangGraphPlugin` with Temporal's `LangSmithPlugin` for durable execution + full observability of LLM calls. Requires API keys. | + +## Prerequisites + +1. Install dependencies: + + ```bash + uv sync --group langgraph + ``` + +2. Start a [Temporal dev server](https://docs.temporal.io/cli#start-dev-server): + + ```bash + temporal server start-dev + ``` + +## Running a Sample + +Most samples have two scripts -- start the Worker first, then the Workflow starter in a separate terminal. Substitute `<api>` and `<sample>` with a directory pair from the table above. + +```bash +# Terminal 1: start the Worker +uv run langgraph_plugin/<api>/<sample>/run_worker.py + +# Terminal 2: start the Workflow +uv run langgraph_plugin/<api>/<sample>/run_workflow.py +``` + +For example, to run the Graph API human-in-the-loop chatbot: + +```bash +# Terminal 1 +uv run langgraph_plugin/graph_api/human_in_the_loop/run_worker.py + +# Terminal 2 +uv run langgraph_plugin/graph_api/human_in_the_loop/run_workflow.py +``` + +The LangSmith Tracing samples bundle the Worker and Workflow execution into a single `main.py`: + +```bash +uv run langgraph_plugin/<api>/langsmith_tracing/main.py +``` + +## Key Features Demonstrated + +- **Durable execution** -- Every graph node / `@task` runs as a Temporal activity with configurable timeouts and retry policies. +- **Human-in-the-loop** -- LangGraph's `interrupt()` pauses the graph; Temporal signals deliver human input; queries expose pending state to UIs. +- **Continue-as-new with caching** -- `cache()` captures completed task results; passing the cache to the next execution avoids re-running them.
+- **Conditional routing** -- Graph API's `add_conditional_edges` and Functional API's native `if/else`/`while` for agent loops. +- **Parallel execution** -- Functional API launches multiple tasks concurrently by creating futures before awaiting them. + +## Related + +- [SDK PR: LangGraph plugin](https://github.com/temporalio/sdk-python/pull/1448) diff --git a/langgraph_plugin/__init__.py b/langgraph_plugin/__init__.py new file mode 100644 index 00000000..a8523185 --- /dev/null +++ b/langgraph_plugin/__init__.py @@ -0,0 +1 @@ +"""Temporal LangGraph plugin samples.""" diff --git a/langgraph_plugin/functional_api/__init__.py b/langgraph_plugin/functional_api/__init__.py new file mode 100644 index 00000000..6caae83e --- /dev/null +++ b/langgraph_plugin/functional_api/__init__.py @@ -0,0 +1 @@ +"""LangGraph Functional API samples using @task and @entrypoint.""" diff --git a/langgraph_plugin/functional_api/continue_as_new/README.md b/langgraph_plugin/functional_api/continue_as_new/README.md new file mode 100644 index 00000000..1571e11f --- /dev/null +++ b/langgraph_plugin/functional_api/continue_as_new/README.md @@ -0,0 +1,36 @@ +# Continue-as-New with Caching (Functional API) + +Demonstrates combining Temporal's continue-as-new with LangGraph's task result caching to avoid re-executing completed `@task` functions across workflow boundaries. + +## What This Sample Demonstrates + +- Task result caching across continue-as-new boundaries with `cache()` +- Restoring cached results with `entrypoint(name, cache=...)` +- Each `@task` executes exactly once despite multiple workflow invocations + +## How It Works + +1. Three tasks run sequentially: `double` (x2) -> `add_50` (+50) -> `triple` (x3). +2. After the first invocation, the workflow continues-as-new with the cache. +3. On subsequent invocations, all tasks return cached results instantly. +4. Input 10 -> 20 -> 70 -> 210. 
+ +## Running the Sample + +Prerequisites: `uv sync --group langgraph` and a running Temporal dev server (`temporal server start-dev`). + +```bash +# Terminal 1 +uv run langgraph_plugin/functional_api/continue_as_new/run_worker.py + +# Terminal 2 +uv run langgraph_plugin/functional_api/continue_as_new/run_workflow.py +``` + +## Files + +| File | Description | +|------|-------------| +| `workflow.py` | `@task` functions, `@entrypoint`, `PipelineInput`, and `PipelineFunctionalWorkflow` | +| `run_worker.py` | Registers tasks and entrypoint with `LangGraphPlugin`, starts Worker | +| `run_workflow.py` | Executes the pipeline Workflow and prints the result | diff --git a/langgraph_plugin/functional_api/continue_as_new/__init__.py b/langgraph_plugin/functional_api/continue_as_new/__init__.py new file mode 100644 index 00000000..5b90703e --- /dev/null +++ b/langgraph_plugin/functional_api/continue_as_new/__init__.py @@ -0,0 +1 @@ +"""Continue-as-new pipeline with task result caching.""" diff --git a/langgraph_plugin/functional_api/continue_as_new/run_worker.py b/langgraph_plugin/functional_api/continue_as_new/run_worker.py new file mode 100644 index 00000000..a6b6a9d7 --- /dev/null +++ b/langgraph_plugin/functional_api/continue_as_new/run_worker.py @@ -0,0 +1,37 @@ +"""Worker for the continue-as-new pipeline (Functional API).""" + +import asyncio +import os + +from temporalio.client import Client +from temporalio.contrib.langgraph import LangGraphPlugin +from temporalio.worker import Worker + +from langgraph_plugin.functional_api.continue_as_new.workflow import ( + PipelineFunctionalWorkflow, + activity_options, + all_tasks, + pipeline_entrypoint, +) + + +async def main() -> None: + client = await Client.connect(os.environ.get("TEMPORAL_ADDRESS", "localhost:7233")) + plugin = LangGraphPlugin( + entrypoints={"pipeline": pipeline_entrypoint}, + tasks=all_tasks, + activity_options=activity_options, + ) + + worker = Worker( + client, + 
task_queue="langgraph-pipeline-functional", + workflows=[PipelineFunctionalWorkflow], + plugins=[plugin], + ) + print("Worker started. Ctrl+C to exit.") + await worker.run() + + +if __name__ == "__main__": + asyncio.run(main()) diff --git a/langgraph_plugin/functional_api/continue_as_new/run_workflow.py b/langgraph_plugin/functional_api/continue_as_new/run_workflow.py new file mode 100644 index 00000000..563aaba6 --- /dev/null +++ b/langgraph_plugin/functional_api/continue_as_new/run_workflow.py @@ -0,0 +1,31 @@ +"""Start the continue-as-new pipeline workflow (Functional API).""" + +import asyncio +import os +from datetime import timedelta + +from temporalio.client import Client + +from langgraph_plugin.functional_api.continue_as_new.workflow import ( + PipelineFunctionalWorkflow, + PipelineInput, +) + + +async def main() -> None: + client = await Client.connect(os.environ.get("TEMPORAL_ADDRESS", "localhost:7233")) + + result = await client.execute_workflow( + PipelineFunctionalWorkflow.run, + PipelineInput(data=10), + id="pipeline-functional-workflow", + task_queue="langgraph-pipeline-functional", + execution_timeout=timedelta(seconds=60), + ) + + # 10*2=20 -> 20+50=70 -> 70*3=210 + print(f"Pipeline result: {result}") + + +if __name__ == "__main__": + asyncio.run(main()) diff --git a/langgraph_plugin/functional_api/continue_as_new/workflow.py b/langgraph_plugin/functional_api/continue_as_new/workflow.py new file mode 100644 index 00000000..c8594b34 --- /dev/null +++ b/langgraph_plugin/functional_api/continue_as_new/workflow.py @@ -0,0 +1,84 @@ +"""Continue-as-new with caching using the LangGraph Functional API with Temporal. + +Demonstrates Temporal's continue-as-new with LangGraph's task result caching +to avoid re-executing completed @task functions across workflow boundaries. 
+""" + +from dataclasses import dataclass +from datetime import timedelta +from typing import Any + +from langgraph.func import entrypoint, task +from temporalio import workflow +from temporalio.contrib.langgraph import cache +from temporalio.contrib.langgraph import entrypoint as temporal_entrypoint + + +@task +def double(data: int) -> int: + """Stage 1: double the input.""" + return data * 2 + + +@task +def add_50(data: int) -> int: + """Stage 2: add 50.""" + return data + 50 + + +@task +def triple(data: int) -> int: + """Stage 3: triple the result.""" + return data * 3 + + +@entrypoint() +async def pipeline_entrypoint(data: int) -> dict: + """Run the 3-stage pipeline: double -> add_50 -> triple.""" + doubled = await double(data) + plus_50 = await add_50(doubled) + tripled = await triple(plus_50) + return {"result": tripled} + + +all_tasks: list[Any] = [double, add_50, triple] + +activity_options = { + t.func.__name__: { + "execute_in": "activity", + "start_to_close_timeout": timedelta(seconds=30), + } + for t in all_tasks +} + + +@dataclass +class PipelineInput: + data: int + cache: dict[str, Any] | None = None + phase: int = 1 + + +@workflow.defn +class PipelineFunctionalWorkflow: + """Runs the pipeline, continuing-as-new after each phase. + + Input 10: 10*2=20 -> 20+50=70 -> 70*3=210 + Each task executes once; phases 2 and 3 use cached results. 
+ """ + + @workflow.run + async def run(self, input_data: PipelineInput) -> dict[str, Any]: + app = temporal_entrypoint("pipeline", cache=input_data.cache) + result = await app.ainvoke(input_data.data) + + if input_data.phase < 3: + workflow.continue_as_new( + PipelineInput( + data=input_data.data, + cache=cache(), + phase=input_data.phase + 1, + ) + ) + + return result diff --git a/langgraph_plugin/functional_api/control_flow/README.md b/langgraph_plugin/functional_api/control_flow/README.md new file mode 100644 index 00000000..e6e1ec49 --- /dev/null +++ b/langgraph_plugin/functional_api/control_flow/README.md @@ -0,0 +1,37 @@ +# Control Flow (Functional API) + +Demonstrates the Functional API's strength for complex control flow: parallel execution, sequential loops, and conditional branching — all as natural Python code. + +## What This Sample Demonstrates + +- **Parallel execution**: launching multiple tasks concurrently by creating futures before awaiting +- **For loops**: processing items sequentially with `for item in items` +- **If/else branching**: routing items based on classification results +- Why the Functional API is ideal for programmatic composition patterns + +## How It Works + +1. A batch of items is validated **in parallel** — all `validate_item` tasks launch concurrently. +2. Valid items are processed **sequentially** in a for loop. +3. Each item is classified, then routed via **if/else** to `process_urgent` or `process_normal`. +4. Results are aggregated with a `summarize` task. + +## Running the Sample + +Prerequisites: `uv sync --group langgraph` and a running Temporal dev server (`temporal server start-dev`). 
+ +```bash +# Terminal 1 +uv run langgraph_plugin/functional_api/control_flow/run_worker.py + +# Terminal 2 +uv run langgraph_plugin/functional_api/control_flow/run_workflow.py +``` + +## Files + +| File | Description | +|------|-------------| +| `workflow.py` | `@task` functions (validate, classify, process, summarize), `@entrypoint`, and `ControlFlowWorkflow` | +| `run_worker.py` | Registers tasks and entrypoint with `LangGraphPlugin`, starts worker | +| `run_workflow.py` | Sends a batch of items and prints processing results | diff --git a/langgraph_plugin/functional_api/control_flow/__init__.py b/langgraph_plugin/functional_api/control_flow/__init__.py new file mode 100644 index 00000000..e78d3ea4 --- /dev/null +++ b/langgraph_plugin/functional_api/control_flow/__init__.py @@ -0,0 +1 @@ +"""Control flow: parallel execution, for loops, and if/else branching.""" diff --git a/langgraph_plugin/functional_api/control_flow/run_worker.py b/langgraph_plugin/functional_api/control_flow/run_worker.py new file mode 100644 index 00000000..0a73c19f --- /dev/null +++ b/langgraph_plugin/functional_api/control_flow/run_worker.py @@ -0,0 +1,37 @@ +"""Worker for the control flow pipeline (Functional API).""" + +import asyncio +import os + +from temporalio.client import Client +from temporalio.contrib.langgraph import LangGraphPlugin +from temporalio.worker import Worker + +from langgraph_plugin.functional_api.control_flow.workflow import ( + ControlFlowWorkflow, + activity_options, + all_tasks, + control_flow_pipeline, +) + + +async def main() -> None: + client = await Client.connect(os.environ.get("TEMPORAL_ADDRESS", "localhost:7233")) + plugin = LangGraphPlugin( + entrypoints={"control-flow": control_flow_pipeline}, + tasks=all_tasks, + activity_options=activity_options, + ) + + worker = Worker( + client, + task_queue="langgraph-control-flow", + workflows=[ControlFlowWorkflow], + plugins=[plugin], + ) + print("Worker started. 
Ctrl+C to exit.") + await worker.run() + + +if __name__ == "__main__": + asyncio.run(main()) diff --git a/langgraph_plugin/functional_api/control_flow/run_workflow.py b/langgraph_plugin/functional_api/control_flow/run_workflow.py new file mode 100644 index 00000000..2784415b --- /dev/null +++ b/langgraph_plugin/functional_api/control_flow/run_workflow.py @@ -0,0 +1,38 @@ +"""Start the control flow pipeline workflow (Functional API).""" + +import asyncio +import os + +from temporalio.client import Client + +from langgraph_plugin.functional_api.control_flow.workflow import ( + ControlFlowWorkflow, +) + + +async def main() -> None: + client = await Client.connect(os.environ.get("TEMPORAL_ADDRESS", "localhost:7233")) + + items = [ + "Fix login bug", + "URGENT: Production outage in payments", + "Update README", + "INVALID:", + "Urgent: Security patch needed", + "Refactor test suite", + ] + + result = await client.execute_workflow( + ControlFlowWorkflow.run, + items, + id="control-flow-workflow", + task_queue="langgraph-control-flow", + ) + + print(f"Summary: {result['summary']}") + for r in result["results"]: + print(f" {r}") + + +if __name__ == "__main__": + asyncio.run(main()) diff --git a/langgraph_plugin/functional_api/control_flow/workflow.py b/langgraph_plugin/functional_api/control_flow/workflow.py new file mode 100644 index 00000000..5db75ce1 --- /dev/null +++ b/langgraph_plugin/functional_api/control_flow/workflow.py @@ -0,0 +1,99 @@ +"""Control flow sample using the LangGraph Functional API with Temporal. 
+ +Demonstrates the Functional API's advantage for complex control flow: + - Parallel task execution (launch multiple tasks concurrently) + - Sequential for-loop processing + - Conditional if/else branching based on intermediate results +""" + +from datetime import timedelta +from typing import Any + +from langgraph.func import entrypoint, task +from temporalio import workflow +from temporalio.contrib.langgraph import entrypoint as temporal_entrypoint + + +@task +def validate_item(item: str) -> bool: + """Validate an item. Returns True if the item is non-empty and well-formed.""" + return len(item.strip()) > 0 and not item.startswith("INVALID:") + + +@task +def classify_item(item: str) -> str: + """Classify an item as 'urgent' or 'normal' based on its content.""" + return "urgent" if "urgent" in item.lower() else "normal" + + +@task +def process_urgent(item: str) -> str: + """Process an urgent item with priority handling.""" + return f"[PRIORITY] Processed: {item}" + + +@task +def process_normal(item: str) -> str: + """Process a normal item with standard handling.""" + return f"[STANDARD] Processed: {item}" + + +@task +def summarize(results: list[str]) -> str: + """Produce a summary of all processed results.""" + urgent_count = sum(1 for r in results if r.startswith("[PRIORITY]")) + normal_count = sum(1 for r in results if r.startswith("[STANDARD]")) + return ( + f"Processed {len(results)} items ({urgent_count} urgent, {normal_count} normal)" + ) + + +@entrypoint() +async def control_flow_pipeline(items: list[str]) -> dict: + """Process a batch of items with parallel validation, sequential + classification, and conditional routing. + """ + # PARALLEL: Validate all items concurrently. + # Creating task futures without awaiting launches them in parallel. 
+ validation_futures = [validate_item(item) for item in items] + valid_flags = [await f for f in validation_futures] + valid_items = [item for item, is_valid in zip(items, valid_flags) if is_valid] + + # SEQUENTIAL + CONDITIONAL: Process each valid item + results = [] + for item in valid_items: + category = await classify_item(item) + if category == "urgent": + result = await process_urgent(item) + else: + result = await process_normal(item) + results.append(result) + + # Aggregate all results + summary_text = await summarize(results) + + return {"results": results, "summary": summary_text, "total": len(results)} + + +all_tasks: list[Any] = [ + validate_item, + classify_item, + process_urgent, + process_normal, + summarize, +] + +activity_options = { + t.func.__name__: { + "execute_in": "activity", + "start_to_close_timeout": timedelta(seconds=30), + } + for t in all_tasks +} + + +@workflow.defn +class ControlFlowWorkflow: + @workflow.run + async def run(self, items: list[str]) -> dict: + return await temporal_entrypoint("control-flow").ainvoke(items) diff --git a/langgraph_plugin/functional_api/hello_world/README.md b/langgraph_plugin/functional_api/hello_world/README.md new file mode 100644 index 00000000..4f66cf89 --- /dev/null +++ b/langgraph_plugin/functional_api/hello_world/README.md @@ -0,0 +1,29 @@ +# Hello World (Functional API) + +The simplest possible LangGraph Functional API + Temporal sample: a single `@task` called from an `@entrypoint`. + +## What This Sample Demonstrates + +- Defining a `@task` and `@entrypoint` +- Wrapping them with `LangGraphPlugin` so the task runs as a Temporal activity +- Invoking the entrypoint from a Temporal workflow + +## Running the Sample + +Prerequisites: `uv sync --group langgraph` and a running Temporal dev server (`temporal server start-dev`). 
+ +```bash +# Terminal 1 +uv run langgraph_plugin/functional_api/hello_world/run_worker.py + +# Terminal 2 +uv run langgraph_plugin/functional_api/hello_world/run_workflow.py +``` + +## Files + +| File | Description | +|------|-------------| +| `workflow.py` | `process_query` task, `hello_entrypoint`, and `HelloWorldFunctionalWorkflow` | +| `run_worker.py` | Registers task and entrypoint with `LangGraphPlugin`, starts worker | +| `run_workflow.py` | Executes the workflow and prints the result | diff --git a/langgraph_plugin/functional_api/hello_world/__init__.py b/langgraph_plugin/functional_api/hello_world/__init__.py new file mode 100644 index 00000000..97766011 --- /dev/null +++ b/langgraph_plugin/functional_api/hello_world/__init__.py @@ -0,0 +1 @@ +"""Minimal hello world — @task and @entrypoint.""" diff --git a/langgraph_plugin/functional_api/hello_world/run_worker.py b/langgraph_plugin/functional_api/hello_world/run_worker.py new file mode 100644 index 00000000..7039ee02 --- /dev/null +++ b/langgraph_plugin/functional_api/hello_world/run_worker.py @@ -0,0 +1,37 @@ +"""Worker for the hello world sample (Functional API).""" + +import asyncio +import os + +from temporalio.client import Client +from temporalio.contrib.langgraph import LangGraphPlugin +from temporalio.worker import Worker + +from langgraph_plugin.functional_api.hello_world.workflow import ( + HelloWorldFunctionalWorkflow, + activity_options, + all_tasks, + hello_entrypoint, +) + + +async def main() -> None: + client = await Client.connect(os.environ.get("TEMPORAL_ADDRESS", "localhost:7233")) + plugin = LangGraphPlugin( + entrypoints={"hello-world": hello_entrypoint}, + tasks=all_tasks, + activity_options=activity_options, + ) + + worker = Worker( + client, + task_queue="langgraph-hello-world-functional", + workflows=[HelloWorldFunctionalWorkflow], + plugins=[plugin], + ) + print("Worker started. 
Ctrl+C to exit.") + await worker.run() + + +if __name__ == "__main__": + asyncio.run(main()) diff --git a/langgraph_plugin/functional_api/hello_world/run_workflow.py b/langgraph_plugin/functional_api/hello_world/run_workflow.py new file mode 100644 index 00000000..77cc92fb --- /dev/null +++ b/langgraph_plugin/functional_api/hello_world/run_workflow.py @@ -0,0 +1,27 @@ +"""Start the hello world workflow (Functional API).""" + +import asyncio +import os + +from temporalio.client import Client + +from langgraph_plugin.functional_api.hello_world.workflow import ( + HelloWorldFunctionalWorkflow, +) + + +async def main() -> None: + client = await Client.connect(os.environ.get("TEMPORAL_ADDRESS", "localhost:7233")) + + result = await client.execute_workflow( + HelloWorldFunctionalWorkflow.run, + "Hello, Temporal + LangGraph!", + id="hello-world-functional-workflow", + task_queue="langgraph-hello-world-functional", + ) + + print(f"Result: {result}") + + +if __name__ == "__main__": + asyncio.run(main()) diff --git a/langgraph_plugin/functional_api/hello_world/workflow.py b/langgraph_plugin/functional_api/hello_world/workflow.py new file mode 100644 index 00000000..36552806 --- /dev/null +++ b/langgraph_plugin/functional_api/hello_world/workflow.py @@ -0,0 +1,40 @@ +"""Hello world using the LangGraph Functional API with Temporal. + +The simplest possible sample: a single task called from an entrypoint. 
+""" + +from datetime import timedelta + +from langgraph.func import entrypoint, task +from temporalio import workflow +from temporalio.contrib.langgraph import entrypoint as temporal_entrypoint + + +@task +def process_query(query: str) -> str: + """Process a query and return a response.""" + return f"Processed: {query}" + + +@entrypoint() +async def hello_entrypoint(query: str) -> dict: + """Process the query and return it in a result dict.""" + result = await process_query(query) + return {"result": result} + + +all_tasks = [process_query] + +activity_options = { + "process_query": { + "execute_in": "activity", + "start_to_close_timeout": timedelta(seconds=10), + }, +} + + +@workflow.defn +class HelloWorldFunctionalWorkflow: + @workflow.run + async def run(self, query: str) -> dict: + return await temporal_entrypoint("hello-world").ainvoke(query) diff --git a/langgraph_plugin/functional_api/human_in_the_loop/README.md b/langgraph_plugin/functional_api/human_in_the_loop/README.md new file mode 100644 index 00000000..01aeb2fa --- /dev/null +++ b/langgraph_plugin/functional_api/human_in_the_loop/README.md @@ -0,0 +1,37 @@ +# Human-in-the-Loop Chatbot (Functional API) + +Demonstrates using LangGraph's `interrupt()` to pause an entrypoint for human review, combined with Temporal signals and queries for asynchronous feedback, using the imperative `@task`/`@entrypoint` style. + +## What This Sample Demonstrates + +- Using `interrupt()` inside a `@task` to pause for human input +- Temporal signals and queries for asynchronous human feedback +- Resuming with `Command(resume=...)` via the v2 API +- Setting a checkpointer on the entrypoint for interrupt/resume support + +## How It Works + +1. The `generate_draft` task produces a draft response. +2. The `request_human_review` task calls `interrupt(draft)`, pausing the entrypoint. +3. The workflow stores the draft and waits for a signal. +4. After receiving feedback, the entrypoint resumes and returns the result. 
+ +## Running the Sample + +Prerequisites: `uv sync --group langgraph` and a running Temporal dev server (`temporal server start-dev`). + +```bash +# Terminal 1 +uv run langgraph_plugin/functional_api/human_in_the_loop/run_worker.py + +# Terminal 2 +uv run langgraph_plugin/functional_api/human_in_the_loop/run_workflow.py +``` + +## Files + +| File | Description | +|------|-------------| +| `workflow.py` | `@task` functions, `@entrypoint`, and `ChatbotFunctionalWorkflow` | +| `run_worker.py` | Registers tasks and entrypoint with `LangGraphPlugin`, starts worker | +| `run_workflow.py` | Starts workflow, polls draft via query, sends approval via signal | diff --git a/langgraph_plugin/functional_api/human_in_the_loop/__init__.py b/langgraph_plugin/functional_api/human_in_the_loop/__init__.py new file mode 100644 index 00000000..72da02f8 --- /dev/null +++ b/langgraph_plugin/functional_api/human_in_the_loop/__init__.py @@ -0,0 +1 @@ +"""Human-in-the-loop chatbot using interrupt() and Temporal signals.""" diff --git a/langgraph_plugin/functional_api/human_in_the_loop/run_worker.py b/langgraph_plugin/functional_api/human_in_the_loop/run_worker.py new file mode 100644 index 00000000..c8f84b49 --- /dev/null +++ b/langgraph_plugin/functional_api/human_in_the_loop/run_worker.py @@ -0,0 +1,37 @@ +"""Worker for the human-in-the-loop chatbot (Functional API).""" + +import asyncio +import os + +from temporalio.client import Client +from temporalio.contrib.langgraph import LangGraphPlugin +from temporalio.worker import Worker + +from langgraph_plugin.functional_api.human_in_the_loop.workflow import ( + ChatbotFunctionalWorkflow, + activity_options, + all_tasks, + chatbot_entrypoint, +) + + +async def main() -> None: + client = await Client.connect(os.environ.get("TEMPORAL_ADDRESS", "localhost:7233")) + plugin = LangGraphPlugin( + entrypoints={"chatbot": chatbot_entrypoint}, + tasks=all_tasks, + activity_options=activity_options, + ) + + worker = Worker( + client, + 
task_queue="langgraph-chatbot-functional", + workflows=[ChatbotFunctionalWorkflow], + plugins=[plugin], + ) + print("Worker started. Ctrl+C to exit.") + await worker.run() + + +if __name__ == "__main__": + asyncio.run(main()) diff --git a/langgraph_plugin/functional_api/human_in_the_loop/run_workflow.py b/langgraph_plugin/functional_api/human_in_the_loop/run_workflow.py new file mode 100644 index 00000000..5074b937 --- /dev/null +++ b/langgraph_plugin/functional_api/human_in_the_loop/run_workflow.py @@ -0,0 +1,39 @@ +"""Start the human-in-the-loop chatbot workflow (Functional API).""" + +import asyncio +import os + +from temporalio.client import Client + +from langgraph_plugin.functional_api.human_in_the_loop.workflow import ( + ChatbotFunctionalWorkflow, +) + + +async def main() -> None: + client = await Client.connect(os.environ.get("TEMPORAL_ADDRESS", "localhost:7233")) + + handle = await client.start_workflow( + ChatbotFunctionalWorkflow.run, + "What is the meaning of life?", + id="chatbot-functional-workflow", + task_queue="langgraph-chatbot-functional", + ) + + # Poll until the draft is ready for review + draft = None + while draft is None: + await asyncio.sleep(0.5) + draft = await handle.query(ChatbotFunctionalWorkflow.get_draft) + + print(f"Draft for review: {draft}") + + # Send approval via signal + await handle.signal(ChatbotFunctionalWorkflow.provide_feedback, "approve") + + result = await handle.result() + print(f"Final response: {result}") + + +if __name__ == "__main__": + asyncio.run(main()) diff --git a/langgraph_plugin/functional_api/human_in_the_loop/workflow.py b/langgraph_plugin/functional_api/human_in_the_loop/workflow.py new file mode 100644 index 00000000..0798b838 --- /dev/null +++ b/langgraph_plugin/functional_api/human_in_the_loop/workflow.py @@ -0,0 +1,90 @@ +"""Human-in-the-loop chatbot using the LangGraph Functional API with Temporal. + +Same pattern as the Graph API version, but using @task and @entrypoint decorators. 
+""" + +from datetime import timedelta +from typing import Any + +from langchain_core.runnables import RunnableConfig +from langgraph.checkpoint.memory import InMemorySaver +from langgraph.func import entrypoint, task +from langgraph.types import Command, interrupt +from temporalio import workflow +from temporalio.contrib.langgraph import entrypoint as temporal_entrypoint + + +@task +def generate_draft(message: str) -> str: + """Generate a draft response. Replace with an LLM call in production.""" + return ( + f"Here's my response to '{message}': " + "The answer is 42. Let me know if this helps!" + ) + + +@task +def request_human_review(draft: str) -> str: + """Pause execution to request human review of the draft.""" + feedback = interrupt(draft) + if feedback == "approve": + return draft + return f"[Revised] {draft} (incorporating feedback: {feedback})" + + +@entrypoint() +async def chatbot_entrypoint(user_message: str) -> dict: + """Chatbot entrypoint: generate a draft, get human review, return result.""" + draft = await generate_draft(user_message) + final_response = await request_human_review(draft) + return {"response": final_response} + + +all_tasks: list[Any] = [generate_draft, request_human_review] + +activity_options = { + t.func.__name__: { + "execute_in": "activity", + "start_to_close_timeout": timedelta(seconds=30), + } + for t in all_tasks +} + + +@workflow.defn +class ChatbotFunctionalWorkflow: + def __init__(self) -> None: + self._human_input: str | None = None + self._draft: str | None = None + + @workflow.signal + async def provide_feedback(self, feedback: str) -> None: + """Signal handler: receives human feedback.""" + self._human_input = feedback + + @workflow.query + def get_draft(self) -> str | None: + """Query handler: returns the pending draft for review, or None.""" + return self._draft + + @workflow.run + async def run(self, user_message: str) -> dict[str, Any]: + app = temporal_entrypoint("chatbot") + app.checkpointer = InMemorySaver() + 
config = RunnableConfig( + {"configurable": {"thread_id": workflow.info().workflow_id}} + ) + + # First invocation: runs until interrupt() pauses for human review + result = await app.ainvoke(user_message, config, version="v2") + + self._draft = result.interrupts[0].value + + # Wait for human feedback via Temporal signal + await workflow.wait_condition(lambda: self._human_input is not None) + + # Resume with the human's feedback + resumed = await app.ainvoke( + Command(resume=self._human_input), config, version="v2" + ) + return resumed.value diff --git a/langgraph_plugin/functional_api/langsmith_tracing/README.md b/langgraph_plugin/functional_api/langsmith_tracing/README.md new file mode 100644 index 00000000..fa7f3b80 --- /dev/null +++ b/langgraph_plugin/functional_api/langsmith_tracing/README.md @@ -0,0 +1,37 @@ +# LangSmith Tracing (Functional API) + +Demonstrates combining `LangGraphPlugin` (durable task execution) with Temporal's `LangSmithPlugin` (observability) for full tracing of LLM calls through Temporal workflows, using LangGraph's `@task` and `@entrypoint` decorators. + +## What This Sample Demonstrates + +- Using `LangSmithPlugin` on the Temporal client for automatic trace propagation +- Using `LangGraphPlugin` on the Worker for durable LangGraph execution +- `@traceable` in three places: on the `@task` (Activity) itself, on a helper called from inside the `@task`, and on a helper called from inside the `@entrypoint` (Workflow) +- Both plugins working together: durability + observability + +## How It Works + +1. The Temporal client is created with `LangSmithPlugin(add_temporal_runs=True)`. +2. A Worker registers the `chat` task with `LangGraphPlugin`. +3. When the Workflow runs, the `chat` task executes as a Temporal Activity. +4. `@traceable` decorators emit trace data to LangSmith for the task, an in-task helper, and an in-entrypoint helper. 
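The layering of the three `@traceable` placements can be illustrated with a plain-Python stand-in sketch. The `traceable` and `task` decorators below are dummies that only record calls (they are **not** the real `langsmith`/`langgraph` APIs; see `workflow.py` in this directory for the actual sample), but they show why decorator order matters: `@task` must be outermost so the plugin registers the traced wrapper, while `functools.wraps` keeps the wrapper's `__name__` as `"chat"` for lookup in `activity_options`.

```python
import functools

TRACE: list[str] = []

def traceable(name: str, run_type: str):
    """Dummy stand-in for langsmith.traceable: records each call in TRACE."""
    def deco(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            TRACE.append(name)
            return fn(*args, **kwargs)
        return wrapper
    return deco

def task(fn):
    """Dummy stand-in for langgraph.func.task: just tags the function."""
    fn.is_task = True
    return fn

@traceable(name="format_prompt", run_type="prompt")
def format_prompt(message: str) -> str:
    # Helper called from inside the task: traced as a child run.
    return f"Please respond concisely to: {message}"

@task
@traceable(name="chat_task", run_type="chain")
def chat(message: str) -> str:
    # The task itself: @task outermost, @traceable inner.
    return f"(reply to) {format_prompt(message)}"

@traceable(name="summarize_for_log", run_type="chain")
def summarize_for_log(response: str) -> str:
    # Helper called from inside the entrypoint: traced separately.
    return f"{len(response)} chars"

result = chat("hi")
summary = summarize_for_log(result)
print(TRACE)  # -> ['chat_task', 'format_prompt', 'summarize_for_log']
```

Because `@functools.wraps` preserves the inner function's name, `chat.__name__` is still `"chat"` after both decorators are applied, which is what lets the sample key `activity_options` by task name.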
+ +## Running the Sample + +Prerequisites: `uv sync --group langgraph` and a running Temporal dev server (`temporal server start-dev`). + +```bash +export ANTHROPIC_API_KEY='your-key' +export LANGCHAIN_API_KEY='your-key' + +uv run langgraph_plugin/functional_api/langsmith_tracing/main.py +``` + +Traces will appear in your [LangSmith](https://smith.langchain.com/) dashboard. + +## Files + +| File | Description | +|------|-------------| +| `workflow.py` | `@traceable` chat task + helpers, `@entrypoint`, and `ChatFunctionalWorkflow` | +| `main.py` | Starts a Worker and executes the Workflow in a single process | diff --git a/langgraph_plugin/functional_api/langsmith_tracing/__init__.py b/langgraph_plugin/functional_api/langsmith_tracing/__init__.py new file mode 100644 index 00000000..4cb57b73 --- /dev/null +++ b/langgraph_plugin/functional_api/langsmith_tracing/__init__.py @@ -0,0 +1 @@ +"""LangSmith tracing with LangGraph Functional API and Temporal.""" diff --git a/langgraph_plugin/functional_api/langsmith_tracing/main.py b/langgraph_plugin/functional_api/langsmith_tracing/main.py new file mode 100644 index 00000000..0d29afb7 --- /dev/null +++ b/langgraph_plugin/functional_api/langsmith_tracing/main.py @@ -0,0 +1,51 @@ +"""Run the LangSmith tracing chat sample (Functional API). + +Single-process driver: starts a Worker, executes the Workflow once, prints +the result, then shuts down. Requires ANTHROPIC_API_KEY and LANGCHAIN_API_KEY. 
+""" + +import asyncio +import os + +from temporalio.client import Client +from temporalio.contrib.langgraph import LangGraphPlugin +from temporalio.contrib.langsmith import LangSmithPlugin +from temporalio.worker import Worker + +from langgraph_plugin.functional_api.langsmith_tracing.workflow import ( + ChatFunctionalWorkflow, + activity_options, + all_tasks, + chat_entrypoint, +) + + +async def main() -> None: + client = await Client.connect( + os.environ.get("TEMPORAL_ADDRESS", "localhost:7233"), + plugins=[LangSmithPlugin(add_temporal_runs=True)], + ) + + async with Worker( + client, + task_queue="langgraph-langsmith-functional", + workflows=[ChatFunctionalWorkflow], + plugins=[ + LangGraphPlugin( + entrypoints={"chat": chat_entrypoint}, + tasks=all_tasks, + activity_options=activity_options, + ) + ], + ): + result = await client.execute_workflow( + ChatFunctionalWorkflow.run, + "What is the meaning of life?", + id="langsmith-chat-functional-workflow", + task_queue="langgraph-langsmith-functional", + ) + print(f"Response: {result}") + + +if __name__ == "__main__": + asyncio.run(main()) diff --git a/langgraph_plugin/functional_api/langsmith_tracing/workflow.py b/langgraph_plugin/functional_api/langsmith_tracing/workflow.py new file mode 100644 index 00000000..ddea44bc --- /dev/null +++ b/langgraph_plugin/functional_api/langsmith_tracing/workflow.py @@ -0,0 +1,66 @@ +"""LangSmith tracing with LangGraph Functional API and Temporal. + +Demonstrates combining LangGraphPlugin (durable task execution) with +LangSmithPlugin (observability) for full tracing of LLM calls through +Temporal workflows. + +Three @traceable use cases are demonstrated: +1. The @task (Activity) function itself: `chat`. +2. A helper called from inside the @task: `format_prompt`. +3. A helper called from inside the @entrypoint (Workflow): `summarize_for_log`. + +Requires ANTHROPIC_API_KEY and LANGCHAIN_API_KEY environment variables. 
+""" + +from datetime import timedelta + +from langchain.chat_models import init_chat_model +from langgraph.func import entrypoint, task +from langsmith import traceable +from temporalio import workflow +from temporalio.contrib.langgraph import entrypoint as temporal_entrypoint + + +@traceable(name="format_prompt", run_type="prompt") +def format_prompt(message: str) -> str: + """Helper called from inside the @task. Traced by LangSmith.""" + return f"Please respond concisely to: {message}" + + +@task +@traceable(name="chat_task", run_type="chain") +def chat(message: str) -> str: + """Call an LLM to respond to the message. Traced by LangSmith.""" + prompt = format_prompt(message) + response = init_chat_model("claude-sonnet-4-6").invoke(prompt) + return str(response.content) + + +@traceable(name="summarize_for_log", run_type="chain") +def summarize_for_log(response: str) -> str: + """Helper called from inside the @entrypoint. Traced by LangSmith.""" + return f"Got {len(response)}-char response: {response[:60]}..." 
+ + +@entrypoint() +async def chat_entrypoint(message: str) -> dict: + """Chat entrypoint: call the LLM and return the response.""" + response = await chat(message) + return {"response": response, "summary": summarize_for_log(response)} + + +all_tasks = [chat] + +activity_options = { + "chat": { + "execute_in": "activity", + "start_to_close_timeout": timedelta(seconds=30), + }, +} + + +@workflow.defn +class ChatFunctionalWorkflow: + @workflow.run + async def run(self, message: str) -> dict: + return await temporal_entrypoint("chat").ainvoke(message) diff --git a/langgraph_plugin/functional_api/react_agent/README.md b/langgraph_plugin/functional_api/react_agent/README.md new file mode 100644 index 00000000..79d1b96e --- /dev/null +++ b/langgraph_plugin/functional_api/react_agent/README.md @@ -0,0 +1,44 @@ +# ReAct Agent (Functional API) + +Demonstrates the ReAct agent pattern (think -> act -> observe -> repeat) as a natural `while` loop using LangGraph's `@task` and `@entrypoint` decorators. + +## What This Sample Demonstrates + +- The ReAct loop as a natural `while True` loop in Python +- `@task` functions for agent thinking and tool execution +- How the Functional API makes agent loops readable and extensible + +## How It Works + +1. The `agent_think` task examines the query and tool history, deciding the next action. +2. If a tool is needed, `execute_tool` runs it and the result is appended to history. +3. The loop continues until `agent_think` returns a final answer. + +```python +while True: + decision = await agent_think(query, history) + if decision["action"] == "final": + return decision["answer"] + result = await execute_tool(decision["tool_name"], decision["tool_input"]) + history.append(result) +``` + +## Running the Sample + +Prerequisites: `uv sync --group langgraph` and a running Temporal dev server (`temporal server start-dev`). 
+ +```bash +# Terminal 1 +uv run langgraph_plugin/functional_api/react_agent/run_worker.py + +# Terminal 2 +uv run langgraph_plugin/functional_api/react_agent/run_workflow.py +``` + +## Files + +| File | Description | +|------|-------------| +| `workflow.py` | `@task` functions (agent_think, execute_tool), `@entrypoint`, and `ReactAgentFunctionalWorkflow` | +| `run_worker.py` | Registers tasks and entrypoint with `LangGraphPlugin`, starts worker | +| `run_workflow.py` | Executes the agent workflow and prints the answer | diff --git a/langgraph_plugin/functional_api/react_agent/__init__.py b/langgraph_plugin/functional_api/react_agent/__init__.py new file mode 100644 index 00000000..bd34ac2e --- /dev/null +++ b/langgraph_plugin/functional_api/react_agent/__init__.py @@ -0,0 +1 @@ +"""ReAct agent with while-loop tool dispatch.""" diff --git a/langgraph_plugin/functional_api/react_agent/run_worker.py b/langgraph_plugin/functional_api/react_agent/run_worker.py new file mode 100644 index 00000000..465e90ad --- /dev/null +++ b/langgraph_plugin/functional_api/react_agent/run_worker.py @@ -0,0 +1,37 @@ +"""Worker for the ReAct agent (Functional API).""" + +import asyncio +import os + +from temporalio.client import Client +from temporalio.contrib.langgraph import LangGraphPlugin +from temporalio.worker import Worker + +from langgraph_plugin.functional_api.react_agent.workflow import ( + ReactAgentFunctionalWorkflow, + activity_options, + all_tasks, + react_agent_entrypoint, +) + + +async def main() -> None: + client = await Client.connect(os.environ.get("TEMPORAL_ADDRESS", "localhost:7233")) + plugin = LangGraphPlugin( + entrypoints={"react-agent": react_agent_entrypoint}, + tasks=all_tasks, + activity_options=activity_options, + ) + + worker = Worker( + client, + task_queue="langgraph-react-agent-functional", + workflows=[ReactAgentFunctionalWorkflow], + plugins=[plugin], + ) + print("Worker started. 
Ctrl+C to exit.") + await worker.run() + + +if __name__ == "__main__": + asyncio.run(main()) diff --git a/langgraph_plugin/functional_api/react_agent/run_workflow.py b/langgraph_plugin/functional_api/react_agent/run_workflow.py new file mode 100644 index 00000000..8888cbd1 --- /dev/null +++ b/langgraph_plugin/functional_api/react_agent/run_workflow.py @@ -0,0 +1,28 @@ +"""Start the ReAct agent workflow (Functional API).""" + +import asyncio +import os + +from temporalio.client import Client + +from langgraph_plugin.functional_api.react_agent.workflow import ( + ReactAgentFunctionalWorkflow, +) + + +async def main() -> None: + client = await Client.connect(os.environ.get("TEMPORAL_ADDRESS", "localhost:7233")) + + result = await client.execute_workflow( + ReactAgentFunctionalWorkflow.run, + "Tell me about San Francisco", + id="react-agent-functional-workflow", + task_queue="langgraph-react-agent-functional", + ) + + print(f"Agent answer: {result['answer']}") + print(f"Tool calls made: {result['steps']}") + + +if __name__ == "__main__": + asyncio.run(main()) diff --git a/langgraph_plugin/functional_api/react_agent/workflow.py b/langgraph_plugin/functional_api/react_agent/workflow.py new file mode 100644 index 00000000..e1f5f488 --- /dev/null +++ b/langgraph_plugin/functional_api/react_agent/workflow.py @@ -0,0 +1,87 @@ +"""ReAct agent using the LangGraph Functional API with Temporal. + +Same pattern as the Graph API version, but using @task and @entrypoint. +The Functional API naturally expresses the ReAct loop as a while loop, +making the control flow explicit and easy to extend. +""" + +from datetime import timedelta +from typing import Any + +from langgraph.func import entrypoint, task +from temporalio import workflow +from temporalio.contrib.langgraph import entrypoint as temporal_entrypoint + + +@task +def agent_think(query: str, history: list[str]) -> dict: + """The agent decides the next action based on query and tool history. 
+ + In production, replace this with an LLM call (e.g., Claude with tools). + """ + tool_results = [h for h in history if h.startswith("[Tool]")] + + if len(tool_results) == 0: + return { + "action": "tool", + "tool_name": "get_weather", + "tool_input": "San Francisco", + } + elif len(tool_results) == 1: + return { + "action": "tool", + "tool_name": "get_population", + "tool_input": "San Francisco", + } + else: + facts = "; ".join(tool_results) + return { + "action": "final", + "answer": (f"Here's what I found about San Francisco: {facts}"), + } + + +@task +def execute_tool(tool_name: str, tool_input: str) -> str: + """Execute a tool by name. In production, dispatch to real implementations.""" + tool_registry = { + "get_weather": lambda inp: f"[Tool] Weather in {inp}: 72°F and sunny.", + "get_population": lambda inp: f"[Tool] {inp} population: ~870,000 residents.", + } + handler = tool_registry.get(tool_name) + if handler: + return handler(tool_input) + return f"[Tool] Unknown tool: {tool_name}" + + +@entrypoint() +async def react_agent_entrypoint(query: str) -> dict: + """ReAct agent loop: think -> act -> observe -> repeat.""" + history: list[str] = [] + + while True: + decision = await agent_think(query, history) + + if decision["action"] == "final": + return {"answer": decision["answer"], "steps": len(history)} + + result = await execute_tool(decision["tool_name"], decision["tool_input"]) + history.append(result) + + +all_tasks: list[Any] = [agent_think, execute_tool] + +activity_options = { + t.func.__name__: { + "execute_in": "activity", + "start_to_close_timeout": timedelta(seconds=30), + } + for t in all_tasks +} + + +@workflow.defn +class ReactAgentFunctionalWorkflow: + @workflow.run + async def run(self, query: str) -> dict: + return await temporal_entrypoint("react-agent").ainvoke(query) diff --git a/langgraph_plugin/graph_api/__init__.py b/langgraph_plugin/graph_api/__init__.py new file mode 100644 index 00000000..579887b6 --- /dev/null +++ 
b/langgraph_plugin/graph_api/__init__.py @@ -0,0 +1 @@ +"""LangGraph Graph API samples using StateGraph.""" diff --git a/langgraph_plugin/graph_api/continue_as_new/README.md b/langgraph_plugin/graph_api/continue_as_new/README.md new file mode 100644 index 00000000..ea80671f --- /dev/null +++ b/langgraph_plugin/graph_api/continue_as_new/README.md @@ -0,0 +1,37 @@ +# Continue-as-New with Caching (Graph API) + +Demonstrates Temporal's continue-as-new with LangGraph's task result caching to avoid re-executing completed graph nodes across workflow boundaries. + +## What This Sample Demonstrates + +- Using `workflow.continue_as_new()` to reset event history for long-running pipelines +- Capturing task results with `cache()` before continuing +- Restoring cached results with `temporal_graph(name, cache=...)` so completed nodes are skipped +- Each node executes exactly once despite multiple workflow invocations + +## How It Works + +1. A 3-stage pipeline runs: `double` (x2) -> `add_50` (+50) -> `triple` (x3). +2. After the first invocation, the workflow continues-as-new with the cached results. +3. On the second and third invocations, all three nodes return cached results instantly. +4. The final result is returned: input 10 -> 20 -> 70 -> 210. + +## Running the Sample + +Prerequisites: `uv sync --group langgraph` and a running Temporal dev server (`temporal server start-dev`). 
+ +```bash +# Terminal 1 +uv run langgraph_plugin/graph_api/continue_as_new/run_worker.py + +# Terminal 2 +uv run langgraph_plugin/graph_api/continue_as_new/run_workflow.py +``` + +## Files + +| File | Description | +|------|-------------| +| `workflow.py` | Pipeline node functions, graph definition, `PipelineInput`, and `PipelineWorkflow` | +| `run_worker.py` | Builds graph, registers with `LangGraphPlugin`, starts Worker | +| `run_workflow.py` | Executes the pipeline Workflow and prints the result | diff --git a/langgraph_plugin/graph_api/continue_as_new/__init__.py b/langgraph_plugin/graph_api/continue_as_new/__init__.py new file mode 100644 index 00000000..5b90703e --- /dev/null +++ b/langgraph_plugin/graph_api/continue_as_new/__init__.py @@ -0,0 +1 @@ +"""Continue-as-new pipeline with task result caching.""" diff --git a/langgraph_plugin/graph_api/continue_as_new/run_worker.py b/langgraph_plugin/graph_api/continue_as_new/run_worker.py new file mode 100644 index 00000000..653874a6 --- /dev/null +++ b/langgraph_plugin/graph_api/continue_as_new/run_worker.py @@ -0,0 +1,31 @@ +"""Worker for the continue-as-new pipeline (Graph API).""" + +import asyncio +import os + +from temporalio.client import Client +from temporalio.contrib.langgraph import LangGraphPlugin +from temporalio.worker import Worker + +from langgraph_plugin.graph_api.continue_as_new.workflow import ( + PipelineWorkflow, + make_pipeline_graph, +) + + +async def main() -> None: + client = await Client.connect(os.environ.get("TEMPORAL_ADDRESS", "localhost:7233")) + plugin = LangGraphPlugin(graphs={"pipeline": make_pipeline_graph()}) + + worker = Worker( + client, + task_queue="langgraph-pipeline", + workflows=[PipelineWorkflow], + plugins=[plugin], + ) + print("Worker started. 
Ctrl+C to exit.") + await worker.run() + + +if __name__ == "__main__": + asyncio.run(main()) diff --git a/langgraph_plugin/graph_api/continue_as_new/run_workflow.py b/langgraph_plugin/graph_api/continue_as_new/run_workflow.py new file mode 100644 index 00000000..a52d1716 --- /dev/null +++ b/langgraph_plugin/graph_api/continue_as_new/run_workflow.py @@ -0,0 +1,31 @@ +"""Start the continue-as-new pipeline workflow (Graph API).""" + +import asyncio +import os +from datetime import timedelta + +from temporalio.client import Client + +from langgraph_plugin.graph_api.continue_as_new.workflow import ( + PipelineInput, + PipelineWorkflow, +) + + +async def main() -> None: + client = await Client.connect(os.environ.get("TEMPORAL_ADDRESS", "localhost:7233")) + + result = await client.execute_workflow( + PipelineWorkflow.run, + PipelineInput(data=10), + id="pipeline-workflow", + task_queue="langgraph-pipeline", + execution_timeout=timedelta(seconds=60), + ) + + # 10*2=20 -> 20+50=70 -> 70*3=210 + print(f"Pipeline result: {result}") + + +if __name__ == "__main__": + asyncio.run(main()) diff --git a/langgraph_plugin/graph_api/continue_as_new/workflow.py b/langgraph_plugin/graph_api/continue_as_new/workflow.py new file mode 100644 index 00000000..1475e63f --- /dev/null +++ b/langgraph_plugin/graph_api/continue_as_new/workflow.py @@ -0,0 +1,85 @@ +"""Continue-as-new with caching using the LangGraph Graph API with Temporal. + +Demonstrates how to use Temporal's continue-as-new with LangGraph's task result +caching to avoid re-executing already-completed graph nodes across workflow +boundaries. 
+""" + +from dataclasses import dataclass +from datetime import timedelta +from typing import Any + +from langgraph.graph import START, StateGraph +from temporalio import workflow +from temporalio.contrib.langgraph import cache +from temporalio.contrib.langgraph import graph as temporal_graph +from typing_extensions import TypedDict + + +class State(TypedDict): + value: int + + +async def double(state: State) -> dict[str, int]: + """Stage 1: double the input.""" + return {"value": state["value"] * 2} + + +async def add_50(state: State) -> dict[str, int]: + """Stage 2: add 50.""" + return {"value": state["value"] + 50} + + +async def triple(state: State) -> dict[str, int]: + """Stage 3: triple the result.""" + return {"value": state["value"] * 3} + + +def make_pipeline_graph() -> StateGraph: + node_metadata = { + "execute_in": "activity", + "start_to_close_timeout": timedelta(seconds=30), + } + g = StateGraph(State) + g.add_node("double", double, metadata=node_metadata) + g.add_node("add_50", add_50, metadata=node_metadata) + g.add_node("triple", triple, metadata=node_metadata) + g.add_edge(START, "double") + g.add_edge("double", "add_50") + g.add_edge("add_50", "triple") + return g + + +@dataclass +class PipelineInput: + data: int + cache: dict[str, Any] | None = None + phase: int = 1 # continues-as-new after phases 1 and 2 + + +@workflow.defn +class PipelineWorkflow: + """Runs a 3-stage pipeline, continuing-as-new after each phase. + + Phase 1: all 3 stages execute, continues-as-new with cache. + Phase 2: all 3 stages cached (instant), continues-as-new. + Phase 3: all 3 stages cached (instant), returns final result. 
+ + Input 10: 10*2=20 -> 20+50=70 -> 70*3=210 + """ + + @workflow.run + async def run(self, input_data: PipelineInput) -> int: + app = temporal_graph("pipeline", cache=input_data.cache).compile() + result = await app.ainvoke({"value": input_data.data}) + + if input_data.phase < 3: + workflow.continue_as_new( + PipelineInput( + data=input_data.data, + cache=cache(), + phase=input_data.phase + 1, + ) + ) + + return result["value"] diff --git a/langgraph_plugin/graph_api/hello_world/README.md b/langgraph_plugin/graph_api/hello_world/README.md new file mode 100644 index 00000000..8d0ebdf1 --- /dev/null +++ b/langgraph_plugin/graph_api/hello_world/README.md @@ -0,0 +1,29 @@ +# Hello World (Graph API) + +The simplest possible LangGraph + Temporal sample: a single-node graph that processes a query string. + +## What This Sample Demonstrates + +- Defining a `StateGraph` with a single node +- Wrapping it with `LangGraphPlugin` so the node runs as a Temporal activity +- Invoking the graph from a Temporal workflow + +## Running the Sample + +Prerequisites: `uv sync --group langgraph` and a running Temporal dev server (`temporal server start-dev`). 
+ +```bash +# Terminal 1 +uv run langgraph_plugin/graph_api/hello_world/run_worker.py + +# Terminal 2 +uv run langgraph_plugin/graph_api/hello_world/run_workflow.py +``` + +## Files + +| File | Description | +|------|-------------| +| `workflow.py` | `process_query` node, graph definition and `HelloWorldWorkflow` | +| `run_worker.py` | Registers graph with `LangGraphPlugin`, starts worker | +| `run_workflow.py` | Executes the workflow and prints the result | diff --git a/langgraph_plugin/graph_api/hello_world/__init__.py b/langgraph_plugin/graph_api/hello_world/__init__.py new file mode 100644 index 00000000..166b2c69 --- /dev/null +++ b/langgraph_plugin/graph_api/hello_world/__init__.py @@ -0,0 +1 @@ +"""Minimal hello world — single-node StateGraph.""" diff --git a/langgraph_plugin/graph_api/hello_world/run_worker.py b/langgraph_plugin/graph_api/hello_world/run_worker.py new file mode 100644 index 00000000..bf2ece96 --- /dev/null +++ b/langgraph_plugin/graph_api/hello_world/run_worker.py @@ -0,0 +1,31 @@ +"""Worker for the hello world sample (Graph API).""" + +import asyncio +import os + +from temporalio.client import Client +from temporalio.contrib.langgraph import LangGraphPlugin +from temporalio.worker import Worker + +from langgraph_plugin.graph_api.hello_world.workflow import ( + HelloWorldWorkflow, + make_hello_graph, +) + + +async def main() -> None: + client = await Client.connect(os.environ.get("TEMPORAL_ADDRESS", "localhost:7233")) + plugin = LangGraphPlugin(graphs={"hello-world": make_hello_graph()}) + + worker = Worker( + client, + task_queue="langgraph-hello-world", + workflows=[HelloWorldWorkflow], + plugins=[plugin], + ) + print("Worker started. 
Ctrl+C to exit.") + await worker.run() + + +if __name__ == "__main__": + asyncio.run(main()) diff --git a/langgraph_plugin/graph_api/hello_world/run_workflow.py b/langgraph_plugin/graph_api/hello_world/run_workflow.py new file mode 100644 index 00000000..e1a02a43 --- /dev/null +++ b/langgraph_plugin/graph_api/hello_world/run_workflow.py @@ -0,0 +1,25 @@ +"""Start the hello world workflow (Graph API).""" + +import asyncio +import os + +from temporalio.client import Client + +from langgraph_plugin.graph_api.hello_world.workflow import HelloWorldWorkflow + + +async def main() -> None: + client = await Client.connect(os.environ.get("TEMPORAL_ADDRESS", "localhost:7233")) + + result = await client.execute_workflow( + HelloWorldWorkflow.run, + "Hello, Temporal + LangGraph!", + id="hello-world-workflow", + task_queue="langgraph-hello-world", + ) + + print(f"Result: {result}") + + +if __name__ == "__main__": + asyncio.run(main()) diff --git a/langgraph_plugin/graph_api/hello_world/workflow.py b/langgraph_plugin/graph_api/hello_world/workflow.py new file mode 100644 index 00000000..84eafe67 --- /dev/null +++ b/langgraph_plugin/graph_api/hello_world/workflow.py @@ -0,0 +1,42 @@ +"""Hello world using the LangGraph Graph API with Temporal. + +The simplest possible sample: a single-node graph that processes a query string. 
+""" + +from datetime import timedelta + +from langgraph.graph import START, StateGraph +from temporalio import workflow +from temporalio.contrib.langgraph import graph as temporal_graph +from typing_extensions import TypedDict + + +class State(TypedDict): + value: str + + +async def process_query(state: State) -> dict[str, str]: + """Process a query and return a response.""" + return {"value": f"Processed: {state['value']}"} + + +def make_hello_graph() -> StateGraph: + g = StateGraph(State) + g.add_node( + "process_query", + process_query, + metadata={ + "execute_in": "activity", + "start_to_close_timeout": timedelta(seconds=10), + }, + ) + g.add_edge(START, "process_query") + return g + + +@workflow.defn +class HelloWorldWorkflow: + @workflow.run + async def run(self, query: str) -> str: + result = await temporal_graph("hello-world").compile().ainvoke({"value": query}) + return result["value"] diff --git a/langgraph_plugin/graph_api/human_in_the_loop/README.md b/langgraph_plugin/graph_api/human_in_the_loop/README.md new file mode 100644 index 00000000..72b13d8c --- /dev/null +++ b/langgraph_plugin/graph_api/human_in_the_loop/README.md @@ -0,0 +1,38 @@ +# Human-in-the-Loop Chatbot (Graph API) + +Demonstrates using LangGraph's `interrupt()` to pause a workflow for human review, combined with Temporal signals and queries for asynchronous feedback. + +## What This Sample Demonstrates + +- Pausing a graph mid-execution with `interrupt()` to wait for human input +- Using Temporal **signals** to deliver human feedback to a running workflow +- Using Temporal **queries** to expose pending review state to external UIs +- Resuming the graph with `Command(resume=...)` after receiving input + +## How It Works + +1. The workflow starts and the `generate_draft` node produces a response. +2. The `human_review` node calls `interrupt(draft)`, pausing execution. +3. The workflow exposes the draft via a query and waits for a signal. +4. An external process (UI, CLI, etc.) 
queries the draft and sends approval via signal. +5. The graph resumes — `interrupt()` returns the signal value and the node completes. + +## Running the Sample + +Prerequisites: `uv sync --group langgraph` and a running Temporal dev server (`temporal server start-dev`). + +```bash +# Terminal 1: start the worker +uv run langgraph_plugin/graph_api/human_in_the_loop/run_worker.py + +# Terminal 2: start the workflow (polls for draft, then auto-approves) +uv run langgraph_plugin/graph_api/human_in_the_loop/run_workflow.py +``` + +## Files + +| File | Description | +|------|-------------| +| `workflow.py` | Graph node functions, graph definition, and `ChatbotWorkflow` definition | +| `run_worker.py` | Builds graph, registers with `LangGraphPlugin`, starts worker | +| `run_workflow.py` | Starts workflow, polls draft via query, sends approval via signal | diff --git a/langgraph_plugin/graph_api/human_in_the_loop/__init__.py b/langgraph_plugin/graph_api/human_in_the_loop/__init__.py new file mode 100644 index 00000000..72da02f8 --- /dev/null +++ b/langgraph_plugin/graph_api/human_in_the_loop/__init__.py @@ -0,0 +1 @@ +"""Human-in-the-loop chatbot using interrupt() and Temporal signals.""" diff --git a/langgraph_plugin/graph_api/human_in_the_loop/run_worker.py b/langgraph_plugin/graph_api/human_in_the_loop/run_worker.py new file mode 100644 index 00000000..2090366c --- /dev/null +++ b/langgraph_plugin/graph_api/human_in_the_loop/run_worker.py @@ -0,0 +1,31 @@ +"""Worker for the human-in-the-loop chatbot (Graph API).""" + +import asyncio +import os + +from temporalio.client import Client +from temporalio.contrib.langgraph import LangGraphPlugin +from temporalio.worker import Worker + +from langgraph_plugin.graph_api.human_in_the_loop.workflow import ( + ChatbotWorkflow, + make_chatbot_graph, +) + + +async def main() -> None: + client = await Client.connect(os.environ.get("TEMPORAL_ADDRESS", "localhost:7233")) + plugin = LangGraphPlugin(graphs={"chatbot": 
make_chatbot_graph()}) + + worker = Worker( + client, + task_queue="langgraph-chatbot", + workflows=[ChatbotWorkflow], + plugins=[plugin], + ) + print("Worker started. Ctrl+C to exit.") + await worker.run() + + +if __name__ == "__main__": + asyncio.run(main()) diff --git a/langgraph_plugin/graph_api/human_in_the_loop/run_workflow.py b/langgraph_plugin/graph_api/human_in_the_loop/run_workflow.py new file mode 100644 index 00000000..7abfcdcf --- /dev/null +++ b/langgraph_plugin/graph_api/human_in_the_loop/run_workflow.py @@ -0,0 +1,38 @@ +"""Start the human-in-the-loop chatbot workflow (Graph API).""" + +import asyncio +import os + +from temporalio.client import Client + +from langgraph_plugin.graph_api.human_in_the_loop.workflow import ChatbotWorkflow + + +async def main() -> None: + client = await Client.connect(os.environ.get("TEMPORAL_ADDRESS", "localhost:7233")) + + handle = await client.start_workflow( + ChatbotWorkflow.run, + "What is the meaning of life?", + id="chatbot-workflow", + task_queue="langgraph-chatbot", + ) + + # Poll until the draft is ready for review. + # In a real app, a UI would call this query endpoint. + draft = None + while draft is None: + await asyncio.sleep(0.5) + draft = await handle.query(ChatbotWorkflow.get_draft) + + print(f"Draft for review: {draft}") + + # Send approval via signal (a UI would trigger this) + await handle.signal(ChatbotWorkflow.provide_feedback, "approve") + + result = await handle.result() + print(f"Final response: {result}") + + +if __name__ == "__main__": + asyncio.run(main()) diff --git a/langgraph_plugin/graph_api/human_in_the_loop/workflow.py b/langgraph_plugin/graph_api/human_in_the_loop/workflow.py new file mode 100644 index 00000000..f39275a6 --- /dev/null +++ b/langgraph_plugin/graph_api/human_in_the_loop/workflow.py @@ -0,0 +1,89 @@ +"""Human-in-the-loop chatbot using the LangGraph Graph API with Temporal. 
+ +Demonstrates using LangGraph's interrupt() to pause a workflow for human input, +combined with Temporal signals to receive the input asynchronously. +""" + +from datetime import timedelta + +from langchain_core.runnables import RunnableConfig +from langgraph.checkpoint.memory import InMemorySaver +from langgraph.graph import START, StateGraph +from langgraph.types import Command, interrupt +from temporalio import workflow +from temporalio.contrib.langgraph import graph as temporal_graph +from typing_extensions import TypedDict + + +class State(TypedDict): + value: str + + +async def generate_draft(state: State) -> dict[str, str]: + """Generate a draft response. Replace with an LLM call in production.""" + return { + "value": ( + f"Here's my response to '{state['value']}': " + "The answer is 42. Let me know if this helps!" + ) + } + + +async def human_review(state: State) -> dict[str, str]: + """Present draft to human for review via interrupt.""" + feedback = interrupt(state["value"]) + if feedback == "approve": + return {"value": state["value"]} + return {"value": f"[Revised] {state['value']} (incorporating feedback: {feedback})"} + + +def make_chatbot_graph() -> StateGraph: + node_metadata = { + "execute_in": "activity", + "start_to_close_timeout": timedelta(seconds=30), + } + g = StateGraph(State) + g.add_node("generate_draft", generate_draft, metadata=node_metadata) + g.add_node("human_review", human_review, metadata=node_metadata) + g.add_edge(START, "generate_draft") + g.add_edge("generate_draft", "human_review") + return g + + +@workflow.defn +class ChatbotWorkflow: + def __init__(self) -> None: + self._human_input: str | None = None + self._draft: str | None = None + + @workflow.signal + async def provide_feedback(self, feedback: str) -> None: + """Signal handler: receives human feedback (approval or revision).""" + self._human_input = feedback + + @workflow.query + def get_draft(self) -> str | None: + """Query handler: returns the pending draft for 
review, or None.""" + return self._draft + + @workflow.run + async def run(self, user_message: str) -> str: + app = temporal_graph("chatbot").compile(checkpointer=InMemorySaver()) + config = RunnableConfig( + {"configurable": {"thread_id": workflow.info().workflow_id}} + ) + + # First invocation: runs generate_draft, then pauses at interrupt() + result = await app.ainvoke({"value": user_message}, config, version="v2") + + # Store the draft from the interrupt for the query handler + self._draft = result.interrupts[0].value + + # Wait for human feedback via Temporal signal + await workflow.wait_condition(lambda: self._human_input is not None) + + # Resume the graph with the human's feedback + resumed = await app.ainvoke( + Command(resume=self._human_input), config, version="v2" + ) + return resumed.value["value"] diff --git a/langgraph_plugin/graph_api/langsmith_tracing/README.md b/langgraph_plugin/graph_api/langsmith_tracing/README.md new file mode 100644 index 00000000..47165ee8 --- /dev/null +++ b/langgraph_plugin/graph_api/langsmith_tracing/README.md @@ -0,0 +1,38 @@ +# LangSmith Tracing (Graph API) + +Demonstrates combining the LangGraph plugin (durable execution) with Temporal's LangSmith plugin (observability) for full tracing of LLM calls through Temporal workflows. + +## What This Sample Demonstrates + +- Using `LangSmithPlugin` on the Temporal client for automatic trace propagation +- Using `LangGraphPlugin` on the Worker for durable LangGraph execution +- `@traceable` in three places: on the Activity itself, on a helper called from inside the Activity, and on a helper called from inside the Workflow +- Both plugins working together: durability + observability + +## How It Works + +1. The Temporal client is created with `LangSmithPlugin(add_temporal_runs=True)`. +2. A Worker is created with `LangGraphPlugin` wrapping the chat graph. +3. When the Workflow runs, the `chat` node executes as a Temporal Activity. +4. 
`@traceable` decorators emit trace data to LangSmith for the Activity, an in-Activity helper, and an in-Workflow helper. +5. The `LangSmithPlugin` adds Temporal-specific metadata to the traces. + +## Running the Sample + +Prerequisites: `uv sync --group langgraph` and a running Temporal dev server (`temporal server start-dev`). + +```bash +export ANTHROPIC_API_KEY='your-key' +export LANGCHAIN_API_KEY='your-key' + +uv run langgraph_plugin/graph_api/langsmith_tracing/main.py +``` + +Traces will appear in your [LangSmith](https://smith.langchain.com/) dashboard. + +## Files + +| File | Description | +|------|-------------| +| `workflow.py` | `@traceable` chat node + helpers, graph definition, and `ChatWorkflow` | +| `main.py` | Starts a Worker and executes the Workflow in a single process | diff --git a/langgraph_plugin/graph_api/langsmith_tracing/__init__.py b/langgraph_plugin/graph_api/langsmith_tracing/__init__.py new file mode 100644 index 00000000..dab692ec --- /dev/null +++ b/langgraph_plugin/graph_api/langsmith_tracing/__init__.py @@ -0,0 +1 @@ +"""LangSmith tracing with LangGraph Graph API and Temporal.""" diff --git a/langgraph_plugin/graph_api/langsmith_tracing/main.py b/langgraph_plugin/graph_api/langsmith_tracing/main.py new file mode 100644 index 00000000..71294f4a --- /dev/null +++ b/langgraph_plugin/graph_api/langsmith_tracing/main.py @@ -0,0 +1,43 @@ +"""Run the LangSmith tracing chat sample (Graph API). + +Single-process driver: starts a Worker, executes the Workflow once, prints +the result, then shuts down. Requires ANTHROPIC_API_KEY and LANGCHAIN_API_KEY. 
+""" + +import asyncio +import os + +from temporalio.client import Client +from temporalio.contrib.langgraph import LangGraphPlugin +from temporalio.contrib.langsmith import LangSmithPlugin +from temporalio.worker import Worker + +from langgraph_plugin.graph_api.langsmith_tracing.workflow import ( + ChatWorkflow, + make_chat_graph, +) + + +async def main() -> None: + client = await Client.connect( + os.environ.get("TEMPORAL_ADDRESS", "localhost:7233"), + plugins=[LangSmithPlugin(add_temporal_runs=True)], + ) + + async with Worker( + client, + task_queue="langgraph-langsmith", + workflows=[ChatWorkflow], + plugins=[LangGraphPlugin(graphs={"chat": make_chat_graph()})], + ): + result = await client.execute_workflow( + ChatWorkflow.run, + "What is the meaning of life?", + id="langsmith-chat-workflow", + task_queue="langgraph-langsmith", + ) + print(f"Response: {result}") + + +if __name__ == "__main__": + asyncio.run(main()) diff --git a/langgraph_plugin/graph_api/langsmith_tracing/workflow.py b/langgraph_plugin/graph_api/langsmith_tracing/workflow.py new file mode 100644 index 00000000..50649f31 --- /dev/null +++ b/langgraph_plugin/graph_api/langsmith_tracing/workflow.py @@ -0,0 +1,70 @@ +"""LangSmith tracing with LangGraph Graph API and Temporal. + +Demonstrates combining LangGraphPlugin (durable graph execution) with +LangSmithPlugin (observability) for full tracing of LLM calls through +Temporal workflows. + +Three @traceable use cases are demonstrated: +1. The Activity (graph node) function itself: `chat`. +2. A helper called from inside the Activity: `format_prompt`. +3. A helper called from inside the Workflow: `summarize_for_log`. + +Requires ANTHROPIC_API_KEY and LANGCHAIN_API_KEY environment variables. 
+""" + +from datetime import timedelta + +from langchain.chat_models import init_chat_model +from langgraph.graph import START, StateGraph +from langsmith import traceable +from temporalio import workflow +from temporalio.contrib.langgraph import graph as temporal_graph +from typing_extensions import TypedDict + + +class State(TypedDict): + value: str + + +@traceable(name="format_prompt", run_type="prompt") +def format_prompt(message: str) -> str: + """Helper called from inside the Activity. Traced by LangSmith.""" + return f"Please respond concisely to: {message}" + + +@traceable(name="chat_activity", run_type="chain") +async def chat(state: State) -> dict[str, str]: + """Call an LLM to respond to the message. Traced by LangSmith.""" + prompt = format_prompt(state["value"]) + response = await init_chat_model("claude-sonnet-4-6").ainvoke(prompt) + return {"value": str(response.content)} + + +@traceable(name="summarize_for_log", run_type="chain") +def summarize_for_log(response: str) -> str: + """Helper called from inside the Workflow. Traced by LangSmith.""" + return f"Got {len(response)}-char response: {response[:60]}..." 
+ + +def make_chat_graph() -> StateGraph: + g = StateGraph(State) + g.add_node( + "chat", + chat, + metadata={ + "execute_in": "activity", + "start_to_close_timeout": timedelta(seconds=30), + }, + ) + g.add_edge(START, "chat") + return g + + +@workflow.defn +class ChatWorkflow: + @workflow.run + async def run(self, message: str) -> str: + result = await temporal_graph("chat").compile().ainvoke({"value": message}) + response = result["value"] + workflow.logger.info(summarize_for_log(response)) + return response diff --git a/langgraph_plugin/graph_api/react_agent/README.md b/langgraph_plugin/graph_api/react_agent/README.md new file mode 100644 index 00000000..de5149ad --- /dev/null +++ b/langgraph_plugin/graph_api/react_agent/README.md @@ -0,0 +1,42 @@ +# ReAct Agent (Graph API) + +Demonstrates the most common LangGraph pattern: a tool-calling agent that loops between deciding and acting, using conditional edges for routing. + +## What This Sample Demonstrates + +- Defining a `StateGraph` with an agent->tools loop +- Using `add_conditional_edges` for conditional routing (call tool or finish) +- Accumulating conversation history with `Annotated[list, operator.add]` +- The full ReAct cycle: think -> act -> observe -> repeat + +## How It Works + +1. The `agent` node examines the conversation history and decides the next action. +2. If a tool is needed, `should_continue` routes to the `tools` node. +3. The `tools` node executes the tool and appends the result to history. +4. Control returns to `agent`, which decides again — loop or finish. +5. When the agent has enough information, `should_continue` routes to `END`. + +``` +START -> agent -> tools -> agent -> tools -> agent -> END +``` + +## Running the Sample + +Prerequisites: `uv sync --group langgraph` and a running Temporal dev server (`temporal server start-dev`). 
+ +```bash +# Terminal 1 +uv run langgraph_plugin/graph_api/react_agent/run_worker.py + +# Terminal 2 +uv run langgraph_plugin/graph_api/react_agent/run_workflow.py +``` + +## Files + +| File | Description | +|------|-------------| +| `workflow.py` | `AgentState`, node functions, `should_continue` router, graph definition, and `ReactAgentWorkflow` | +| `run_worker.py` | Builds graph, registers with `LangGraphPlugin`, starts worker | +| `run_workflow.py` | Executes the agent workflow and prints the answer | diff --git a/langgraph_plugin/graph_api/react_agent/__init__.py b/langgraph_plugin/graph_api/react_agent/__init__.py new file mode 100644 index 00000000..64c24c36 --- /dev/null +++ b/langgraph_plugin/graph_api/react_agent/__init__.py @@ -0,0 +1 @@ +"""ReAct agent with conditional edges and tool-calling loop.""" diff --git a/langgraph_plugin/graph_api/react_agent/run_worker.py b/langgraph_plugin/graph_api/react_agent/run_worker.py new file mode 100644 index 00000000..04532a32 --- /dev/null +++ b/langgraph_plugin/graph_api/react_agent/run_worker.py @@ -0,0 +1,31 @@ +"""Worker for the ReAct agent (Graph API).""" + +import asyncio +import os + +from temporalio.client import Client +from temporalio.contrib.langgraph import LangGraphPlugin +from temporalio.worker import Worker + +from langgraph_plugin.graph_api.react_agent.workflow import ( + ReactAgentWorkflow, + make_agent_graph, +) + + +async def main() -> None: + client = await Client.connect(os.environ.get("TEMPORAL_ADDRESS", "localhost:7233")) + plugin = LangGraphPlugin(graphs={"react-agent": make_agent_graph()}) + + worker = Worker( + client, + task_queue="langgraph-react-agent", + workflows=[ReactAgentWorkflow], + plugins=[plugin], + ) + print("Worker started. 
Ctrl+C to exit.") + await worker.run() + + +if __name__ == "__main__": + asyncio.run(main()) diff --git a/langgraph_plugin/graph_api/react_agent/run_workflow.py b/langgraph_plugin/graph_api/react_agent/run_workflow.py new file mode 100644 index 00000000..d4649f1e --- /dev/null +++ b/langgraph_plugin/graph_api/react_agent/run_workflow.py @@ -0,0 +1,25 @@ +"""Start the ReAct agent workflow (Graph API).""" + +import asyncio +import os + +from temporalio.client import Client + +from langgraph_plugin.graph_api.react_agent.workflow import ReactAgentWorkflow + + +async def main() -> None: + client = await Client.connect(os.environ.get("TEMPORAL_ADDRESS", "localhost:7233")) + + result = await client.execute_workflow( + ReactAgentWorkflow.run, + "Tell me about San Francisco", + id="react-agent-workflow", + task_queue="langgraph-react-agent", + ) + + print(f"Agent answer: {result}") + + +if __name__ == "__main__": + asyncio.run(main()) diff --git a/langgraph_plugin/graph_api/react_agent/workflow.py b/langgraph_plugin/graph_api/react_agent/workflow.py new file mode 100644 index 00000000..1bcae10d --- /dev/null +++ b/langgraph_plugin/graph_api/react_agent/workflow.py @@ -0,0 +1,109 @@ +"""ReAct agent using the LangGraph Graph API with Temporal. + +Demonstrates the most common LangGraph pattern: a tool-calling agent that loops +between "thinking" (deciding the next action) and "acting" (executing a tool), +using conditional edges to control the loop. + +Graph topology: + START -> agent -> (tools -> agent)* -> END +""" + +import operator +from datetime import timedelta +from typing import Annotated, Any, TypedDict + +from langgraph.graph import END, START, StateGraph +from temporalio import workflow +from temporalio.contrib.langgraph import graph as temporal_graph + + +class AgentState(TypedDict): + """State for the ReAct agent. + + 'messages' uses operator.add so each node appends to the list rather + than replacing it, accumulating the full conversation history. 
+ """ + + input: str + messages: Annotated[list[str], operator.add] + final_answer: str + + +async def agent(state: AgentState) -> dict[str, Any]: + """The agent decides what to do next based on the conversation history. + + In production, replace this with an LLM call (e.g., Claude with tools). + This stub simulates a 2-step research process. + """ + messages = state.get("messages", []) + tool_results = [m for m in messages if m.startswith("[Tool]")] + + if len(tool_results) == 0: + return { + "messages": [ + "[Agent] I need weather data. Calling get_weather for San Francisco." + ] + } + elif len(tool_results) == 1: + return { + "messages": [ + "[Agent] Now I need population data. " + "Calling get_population for San Francisco." + ] + } + else: + facts = "; ".join(tool_results) + return { + "messages": ["[Agent] I have all the information I need."], + "final_answer": (f"Here's what I found about San Francisco: {facts}"), + } + + +async def tools(state: AgentState) -> dict[str, Any]: + """Execute the tool requested by the agent.""" + last_msg = state["messages"][-1] + + if "get_weather" in last_msg: + return {"messages": ["[Tool] Weather in San Francisco: 72°F and sunny."]} + elif "get_population" in last_msg: + return {"messages": ["[Tool] San Francisco population: ~870,000 residents."]} + else: + return {"messages": ["[Tool] Unknown tool requested."]} + + +async def should_continue(state: AgentState) -> str: + """Route: if the agent requested a tool, go to 'tools'. Otherwise, end. + + Must be async to avoid run_in_executor inside Temporal's workflow sandbox. 
+ """ + last_msg = state["messages"][-1] + if last_msg.startswith("[Agent]") and "Calling" in last_msg: + return "tools" + return END + + +def make_agent_graph() -> StateGraph: + node_metadata = { + "execute_in": "activity", + "start_to_close_timeout": timedelta(seconds=30), + } + g = StateGraph(AgentState) + g.add_node("agent", agent, metadata=node_metadata) + g.add_node("tools", tools, metadata=node_metadata) + g.add_edge(START, "agent") + g.add_conditional_edges("agent", should_continue) + g.add_edge("tools", "agent") + return g + + +@workflow.defn +class ReactAgentWorkflow: + @workflow.run + async def run(self, query: str) -> str: + initial_state: AgentState = { + "input": query, + "messages": [], + "final_answer": "", + } + result = await temporal_graph("react-agent").compile().ainvoke(initial_state) + return result["final_answer"] diff --git a/pyproject.toml b/pyproject.toml index 0927eea3..d742ff7e 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -35,6 +35,12 @@ langsmith-tracing = [ "langsmith>=0.7.0", "temporalio[pydantic,langsmith]>=1.26.0", ] +langgraph = [ + "langgraph>=1.1.3", + "langchain>=0.3.0", + "langchain-anthropic>=0.3.0", + "temporalio[langgraph,langsmith]>=1.26.0", +] nexus = ["nexus-rpc>=1.1.0,<2"] open-telemetry = [ "temporalio[opentelemetry]", @@ -75,6 +81,7 @@ packages = [ "encryption", "gevent_async", "hello", + "langgraph_plugin", "langsmith_tracing", "message_passing", "nexus", @@ -103,6 +110,9 @@ packages = [ requires = ["hatchling"] build-backend = "hatchling.build" +[tool.uv.sources] +temporalio = { git = "https://github.com/temporalio/sdk-python.git", branch = "main" } + [tool.poe.tasks] format = [ { cmd = "uv run ruff check --select I --fix" }, @@ -113,8 +123,8 @@ lint = [ { cmd = "uv run ruff format --check" }, { ref = "lint-types" }, ] -lint-types = "uv run --all-groups mypy --check-untyped-defs --namespace-packages ." 
-test = "uv run --all-groups pytest" +lint-types = "uv run --all-groups --no-group cloud-export-to-parquet mypy --check-untyped-defs --namespace-packages ." +test = "uv run --all-groups --no-group cloud-export-to-parquet pytest" [tool.pytest.ini_options] asyncio_mode = "auto" diff --git a/tests/langgraph_plugin/__init__.py b/tests/langgraph_plugin/__init__.py new file mode 100644 index 00000000..133a43fd --- /dev/null +++ b/tests/langgraph_plugin/__init__.py @@ -0,0 +1 @@ +"""LangGraph plugin sample tests.""" diff --git a/tests/langgraph_plugin/continue_as_new_test.py b/tests/langgraph_plugin/continue_as_new_test.py new file mode 100644 index 00000000..71392f5f --- /dev/null +++ b/tests/langgraph_plugin/continue_as_new_test.py @@ -0,0 +1,34 @@ +import uuid +from datetime import timedelta + +from temporalio.client import Client +from temporalio.contrib.langgraph import LangGraphPlugin +from temporalio.worker import Worker + +from langgraph_plugin.graph_api.continue_as_new.workflow import ( + PipelineInput, + PipelineWorkflow, + make_pipeline_graph, +) + + +async def test_continue_as_new_graph_api(client: Client) -> None: + """Input 10: 10*2=20 -> 20+50=70 -> 70*3=210.""" + task_queue = f"continue-as-new-test-{uuid.uuid4()}" + plugin = LangGraphPlugin(graphs={"pipeline": make_pipeline_graph()}) + + async with Worker( + client, + task_queue=task_queue, + workflows=[PipelineWorkflow], + plugins=[plugin], + ): + result = await client.execute_workflow( + PipelineWorkflow.run, + PipelineInput(data=10), + id=f"continue-as-new-{uuid.uuid4()}", + task_queue=task_queue, + execution_timeout=timedelta(seconds=60), + ) + + assert result == 210 diff --git a/tests/langgraph_plugin/functional_continue_as_new_test.py b/tests/langgraph_plugin/functional_continue_as_new_test.py new file mode 100644 index 00000000..9d627d62 --- /dev/null +++ b/tests/langgraph_plugin/functional_continue_as_new_test.py @@ -0,0 +1,47 @@ +import sys +import uuid +from datetime import timedelta + +import 
pytest +from temporalio.client import Client +from temporalio.contrib.langgraph import LangGraphPlugin +from temporalio.worker import Worker + +from langgraph_plugin.functional_api.continue_as_new.workflow import ( + PipelineFunctionalWorkflow, + PipelineInput, + activity_options, + all_tasks, + pipeline_entrypoint, +) + +pytestmark = pytest.mark.skipif( + sys.version_info < (3, 11), + reason="LangGraph Functional API requires Python >= 3.11 for async context propagation", +) + + +async def test_functional_continue_as_new(client: Client) -> None: + """Input 10: 10*2=20 -> 20+50=70 -> 70*3=210.""" + task_queue = f"functional-continue-test-{uuid.uuid4()}" + plugin = LangGraphPlugin( + entrypoints={"pipeline": pipeline_entrypoint}, + tasks=all_tasks, + activity_options=activity_options, + ) + + async with Worker( + client, + task_queue=task_queue, + workflows=[PipelineFunctionalWorkflow], + plugins=[plugin], + ): + result = await client.execute_workflow( + PipelineFunctionalWorkflow.run, + PipelineInput(data=10), + id=f"functional-continue-{uuid.uuid4()}", + task_queue=task_queue, + execution_timeout=timedelta(seconds=60), + ) + + assert result == {"result": 210} diff --git a/tests/langgraph_plugin/functional_control_flow_test.py b/tests/langgraph_plugin/functional_control_flow_test.py new file mode 100644 index 00000000..a8fd337a --- /dev/null +++ b/tests/langgraph_plugin/functional_control_flow_test.py @@ -0,0 +1,59 @@ +import sys +import uuid + +import pytest +from temporalio.client import Client +from temporalio.contrib.langgraph import LangGraphPlugin +from temporalio.worker import Worker + +from langgraph_plugin.functional_api.control_flow.workflow import ( + ControlFlowWorkflow, + activity_options, + all_tasks, + control_flow_pipeline, +) + +pytestmark = pytest.mark.skipif( + sys.version_info < (3, 11), + reason="LangGraph Functional API requires Python >= 3.11 for async context propagation", +) + + +async def test_functional_control_flow(client: Client) -> 
None: + task_queue = f"functional-control-flow-test-{uuid.uuid4()}" + plugin = LangGraphPlugin( + entrypoints={"control-flow": control_flow_pipeline}, + tasks=all_tasks, + activity_options=activity_options, + ) + + items = [ + "Fix login bug", + "URGENT: Production outage", + "Update README", + "INVALID:", + "Urgent: Security patch", + ] + + async with Worker( + client, + task_queue=task_queue, + workflows=[ControlFlowWorkflow], + plugins=[plugin], + ): + result = await client.execute_workflow( + ControlFlowWorkflow.run, + items, + id=f"functional-control-flow-{uuid.uuid4()}", + task_queue=task_queue, + ) + + # "INVALID:" should be filtered out + assert result["total"] == 4 + # Check urgent vs normal routing + urgent = [r for r in result["results"] if r.startswith("[PRIORITY]")] + normal = [r for r in result["results"] if r.startswith("[STANDARD]")] + assert len(urgent) == 2 + assert len(normal) == 2 + assert "2 urgent" in result["summary"] + assert "2 normal" in result["summary"] diff --git a/tests/langgraph_plugin/functional_hello_world_test.py b/tests/langgraph_plugin/functional_hello_world_test.py new file mode 100644 index 00000000..4603ddba --- /dev/null +++ b/tests/langgraph_plugin/functional_hello_world_test.py @@ -0,0 +1,43 @@ +import sys +import uuid + +import pytest +from temporalio.client import Client +from temporalio.contrib.langgraph import LangGraphPlugin +from temporalio.worker import Worker + +from langgraph_plugin.functional_api.hello_world.workflow import ( + HelloWorldFunctionalWorkflow, + activity_options, + all_tasks, + hello_entrypoint, +) + +pytestmark = pytest.mark.skipif( + sys.version_info < (3, 11), + reason="LangGraph Functional API requires Python >= 3.11 for async context propagation", +) + + +async def test_functional_hello_world(client: Client) -> None: + task_queue = f"functional-hello-test-{uuid.uuid4()}" + plugin = LangGraphPlugin( + entrypoints={"hello-world": hello_entrypoint}, + tasks=all_tasks, + 
activity_options=activity_options, + ) + + async with Worker( + client, + task_queue=task_queue, + workflows=[HelloWorldFunctionalWorkflow], + plugins=[plugin], + ): + result = await client.execute_workflow( + HelloWorldFunctionalWorkflow.run, + "test query", + id=f"functional-hello-{uuid.uuid4()}", + task_queue=task_queue, + ) + + assert result == {"result": "Processed: test query"} diff --git a/tests/langgraph_plugin/functional_human_in_the_loop_test.py b/tests/langgraph_plugin/functional_human_in_the_loop_test.py new file mode 100644 index 00000000..746647f8 --- /dev/null +++ b/tests/langgraph_plugin/functional_human_in_the_loop_test.py @@ -0,0 +1,58 @@ +import asyncio +import sys +import uuid + +import pytest +from temporalio.client import Client +from temporalio.contrib.langgraph import LangGraphPlugin +from temporalio.worker import Worker + +from langgraph_plugin.functional_api.human_in_the_loop.workflow import ( + ChatbotFunctionalWorkflow, + activity_options, + all_tasks, + chatbot_entrypoint, +) + +pytestmark = pytest.mark.skipif( + sys.version_info < (3, 11), + reason="LangGraph Functional API and interrupt() require Python >= 3.11 for async context propagation", +) + + +async def test_functional_human_in_the_loop_approve(client: Client) -> None: + task_queue = f"functional-hitl-test-{uuid.uuid4()}" + plugin = LangGraphPlugin( + entrypoints={"chatbot": chatbot_entrypoint}, + tasks=all_tasks, + activity_options=activity_options, + ) + + async with Worker( + client, + task_queue=task_queue, + workflows=[ChatbotFunctionalWorkflow], + plugins=[plugin], + ): + handle = await client.start_workflow( + ChatbotFunctionalWorkflow.run, + "test message", + id=f"functional-hitl-{uuid.uuid4()}", + task_queue=task_queue, + ) + + # Poll for draft to be ready + draft = None + for _ in range(40): + await asyncio.sleep(0.25) + draft = await handle.query(ChatbotFunctionalWorkflow.get_draft) + if draft is not None: + break + assert draft is not None + assert "test message" in 
draft + + # Approve + await handle.signal(ChatbotFunctionalWorkflow.provide_feedback, "approve") + result = await handle.result() + + assert result["response"] == draft diff --git a/tests/langgraph_plugin/functional_react_agent_test.py b/tests/langgraph_plugin/functional_react_agent_test.py new file mode 100644 index 00000000..088cd5c4 --- /dev/null +++ b/tests/langgraph_plugin/functional_react_agent_test.py @@ -0,0 +1,44 @@ +import sys +import uuid + +import pytest +from temporalio.client import Client +from temporalio.contrib.langgraph import LangGraphPlugin +from temporalio.worker import Worker + +from langgraph_plugin.functional_api.react_agent.workflow import ( + ReactAgentFunctionalWorkflow, + activity_options, + all_tasks, + react_agent_entrypoint, +) + +pytestmark = pytest.mark.skipif( + sys.version_info < (3, 11), + reason="LangGraph Functional API requires Python >= 3.11 for async context propagation", +) + + +async def test_functional_react_agent(client: Client) -> None: + task_queue = f"functional-react-agent-test-{uuid.uuid4()}" + plugin = LangGraphPlugin( + entrypoints={"react-agent": react_agent_entrypoint}, + tasks=all_tasks, + activity_options=activity_options, + ) + + async with Worker( + client, + task_queue=task_queue, + workflows=[ReactAgentFunctionalWorkflow], + plugins=[plugin], + ): + result = await client.execute_workflow( + ReactAgentFunctionalWorkflow.run, + "Tell me about San Francisco", + id=f"functional-react-agent-{uuid.uuid4()}", + task_queue=task_queue, + ) + + assert "San Francisco" in result["answer"] + assert result["steps"] == 2 # two tool calls diff --git a/tests/langgraph_plugin/hello_world_test.py b/tests/langgraph_plugin/hello_world_test.py new file mode 100644 index 00000000..430989e5 --- /dev/null +++ b/tests/langgraph_plugin/hello_world_test.py @@ -0,0 +1,50 @@ +import uuid + +from temporalio.client import Client +from temporalio.contrib.langgraph import LangGraphPlugin +from temporalio.worker import Worker + +from 
langgraph_plugin.graph_api.hello_world.workflow import ( + HelloWorldWorkflow, + make_hello_graph, +) + + +async def test_hello_world_graph_api(client: Client) -> None: + task_queue = f"hello-world-test-{uuid.uuid4()}" + plugin = LangGraphPlugin(graphs={"hello-world": make_hello_graph()}) + + async with Worker( + client, + task_queue=task_queue, + workflows=[HelloWorldWorkflow], + plugins=[plugin], + ): + result = await client.execute_workflow( + HelloWorldWorkflow.run, + "test query", + id=f"hello-world-{uuid.uuid4()}", + task_queue=task_queue, + ) + + assert result == "Processed: test query" + + +async def test_hello_world_empty_string(client: Client) -> None: + task_queue = f"hello-world-empty-test-{uuid.uuid4()}" + plugin = LangGraphPlugin(graphs={"hello-world": make_hello_graph()}) + + async with Worker( + client, + task_queue=task_queue, + workflows=[HelloWorldWorkflow], + plugins=[plugin], + ): + result = await client.execute_workflow( + HelloWorldWorkflow.run, + "", + id=f"hello-world-empty-{uuid.uuid4()}", + task_queue=task_queue, + ) + + assert result == "Processed: " diff --git a/tests/langgraph_plugin/human_in_the_loop_test.py b/tests/langgraph_plugin/human_in_the_loop_test.py new file mode 100644 index 00000000..ad81e492 --- /dev/null +++ b/tests/langgraph_plugin/human_in_the_loop_test.py @@ -0,0 +1,86 @@ +import asyncio +import sys +import uuid + +import pytest +from temporalio.client import Client +from temporalio.contrib.langgraph import LangGraphPlugin +from temporalio.worker import Worker + +from langgraph_plugin.graph_api.human_in_the_loop.workflow import ( + ChatbotWorkflow, + make_chatbot_graph, +) + +pytestmark = pytest.mark.skipif( + sys.version_info < (3, 11), + reason="langgraph.types.interrupt() requires Python >= 3.11 for async context propagation", +) + + +async def test_human_in_the_loop_approve(client: Client) -> None: + task_queue = f"hitl-test-{uuid.uuid4()}" + plugin = LangGraphPlugin(graphs={"chatbot": make_chatbot_graph()}) + + 
async with Worker( + client, + task_queue=task_queue, + workflows=[ChatbotWorkflow], + plugins=[plugin], + ): + handle = await client.start_workflow( + ChatbotWorkflow.run, + "test message", + id=f"hitl-{uuid.uuid4()}", + task_queue=task_queue, + ) + + # Poll for draft to be ready + draft = None + for _ in range(40): + await asyncio.sleep(0.25) + draft = await handle.query(ChatbotWorkflow.get_draft) + if draft is not None: + break + assert draft is not None + assert "test message" in draft + + # Approve + await handle.signal(ChatbotWorkflow.provide_feedback, "approve") + result = await handle.result() + + assert result == draft # approved draft returned as-is + + +async def test_human_in_the_loop_revise(client: Client) -> None: + task_queue = f"hitl-revise-test-{uuid.uuid4()}" + plugin = LangGraphPlugin(graphs={"chatbot": make_chatbot_graph()}) + + async with Worker( + client, + task_queue=task_queue, + workflows=[ChatbotWorkflow], + plugins=[plugin], + ): + handle = await client.start_workflow( + ChatbotWorkflow.run, + "test message", + id=f"hitl-revise-{uuid.uuid4()}", + task_queue=task_queue, + ) + + # Poll for draft + draft = None + for _ in range(40): + await asyncio.sleep(0.25) + draft = await handle.query(ChatbotWorkflow.get_draft) + if draft is not None: + break + assert draft is not None + + # Send revision feedback + await handle.signal(ChatbotWorkflow.provide_feedback, "please be more concise") + result = await handle.result() + + assert "[Revised]" in result + assert "please be more concise" in result diff --git a/tests/langgraph_plugin/react_agent_test.py b/tests/langgraph_plugin/react_agent_test.py new file mode 100644 index 00000000..e9db3ec6 --- /dev/null +++ b/tests/langgraph_plugin/react_agent_test.py @@ -0,0 +1,32 @@ +import uuid + +from temporalio.client import Client +from temporalio.contrib.langgraph import LangGraphPlugin +from temporalio.worker import Worker + +from langgraph_plugin.graph_api.react_agent.workflow import ( + 
ReactAgentWorkflow, + make_agent_graph, +) + + +async def test_react_agent_graph_api(client: Client) -> None: + task_queue = f"react-agent-test-{uuid.uuid4()}" + plugin = LangGraphPlugin(graphs={"react-agent": make_agent_graph()}) + + async with Worker( + client, + task_queue=task_queue, + workflows=[ReactAgentWorkflow], + plugins=[plugin], + ): + result = await client.execute_workflow( + ReactAgentWorkflow.run, + "Tell me about San Francisco", + id=f"react-agent-{uuid.uuid4()}", + task_queue=task_queue, + ) + + assert "San Francisco" in result + assert "72" in result or "weather" in result.lower() + assert "870,000" in result or "population" in result.lower() diff --git a/uv.lock b/uv.lock index 3c9990eb..8fc2d318 100644 --- a/uv.lock +++ b/uv.lock @@ -125,6 +125,25 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/78/b6/6307fbef88d9b5ee7421e68d78a9f162e0da4900bc5f5793f6d3d0e34fb8/annotated_types-0.7.0-py3-none-any.whl", hash = "sha256:1f02e8b43a8fbbc3f3e0d4f0f4bfc8131bcb4eebe8849b8e5c773f3a1c582a53", size = 13643, upload-time = "2024-05-20T21:33:24.1Z" }, ] +[[package]] +name = "anthropic" +version = "0.97.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "anyio" }, + { name = "distro" }, + { name = "docstring-parser" }, + { name = "httpx" }, + { name = "jiter" }, + { name = "pydantic" }, + { name = "sniffio" }, + { name = "typing-extensions" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/14/93/f66ea8bfe39f2e6bb9da8e27fa5457ad2520e8f7612dfc547b17fad55c4d/anthropic-0.97.0.tar.gz", hash = "sha256:021e79fd8e21e90ad94dc5ba2bbbd8b1599f424f5b1fab6c06204009cab764be", size = 669502, upload-time = "2026-04-23T20:52:34.445Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/53/b6/8e851369fa661ad0fef2ae6266bf3b7d52b78ccf011720058f4adaca59e2/anthropic-0.97.0-py3-none-any.whl", hash = "sha256:8a1a472dfabcfc0c52ff6a3eecf724ac7e07107a2f6e2367be55ceb42f5d5613", size = 662126, upload-time = 
"2026-04-23T20:52:32.377Z" }, +] + [[package]] name = "anyio" version = "4.9.0" @@ -375,6 +394,15 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/12/b3/231ffd4ab1fc9d679809f356cebee130ac7daa00d6d6f3206dd4fd137e9e/distro-1.9.0-py3-none-any.whl", hash = "sha256:7bffd925d65168f85027d8da9af6bddab658135b840670a223589bc0c8ef02b2", size = 20277, upload-time = "2023-12-24T09:54:30.421Z" }, ] +[[package]] +name = "docstring-parser" +version = "0.18.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/e0/4d/f332313098c1de1b2d2ff91cf2674415cc7cddab2ca1b01ae29774bd5fdf/docstring_parser-0.18.0.tar.gz", hash = "sha256:292510982205c12b1248696f44959db3cdd1740237a968ea1e2e7a900eeb2015", size = 29341, upload-time = "2026-04-14T04:09:19.867Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/a7/5f/ed01f9a3cdffbd5a008556fc7b2a08ddb1cc6ace7effa7340604b1d16699/docstring_parser-0.18.0-py3-none-any.whl", hash = "sha256:b3fcbed555c47d8479be0796ef7e19c2670d428d72e96da63f3a40122860374b", size = 22484, upload-time = "2026-04-14T04:09:18.638Z" }, +] + [[package]] name = "exceptiongroup" version = "1.3.0" @@ -950,6 +978,27 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/31/b4/b9b800c45527aadd64d5b442f9b932b00648617eb5d63d2c7a6587b7cafc/jmespath-1.0.1-py3-none-any.whl", hash = "sha256:02e2e4cc71b5bcab88332eebf907519190dd9e6e82107fa7f83b1003a6252980", size = 20256, upload-time = "2022-06-17T18:00:10.251Z" }, ] +[[package]] +name = "jsonpatch" +version = "1.33" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "jsonpointer" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/42/78/18813351fe5d63acad16aec57f94ec2b70a09e53ca98145589e185423873/jsonpatch-1.33.tar.gz", hash = "sha256:9fcd4009c41e6d12348b4a0ff2563ba56a2923a7dfee731d004e212e1ee5030c", size = 21699, upload-time = "2023-06-26T12:07:29.144Z" } +wheels = [ + { url = 
"https://files.pythonhosted.org/packages/73/07/02e16ed01e04a374e644b575638ec7987ae846d25ad97bcc9945a3ee4b0e/jsonpatch-1.33-py2.py3-none-any.whl", hash = "sha256:0ae28c0cd062bbd8b8ecc26d7d164fbbea9652a1a3693f3b956c1eae5145dade", size = 12898, upload-time = "2023-06-16T21:01:28.466Z" }, +] + +[[package]] +name = "jsonpointer" +version = "3.1.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/18/c7/af399a2e7a67fd18d63c40c5e62d3af4e67b836a2107468b6a5ea24c4304/jsonpointer-3.1.1.tar.gz", hash = "sha256:0b801c7db33a904024f6004d526dcc53bbb8a4a0f4e32bfd10beadf60adf1900", size = 9068, upload-time = "2026-03-23T22:32:32.458Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/9e/6a/a83720e953b1682d2d109d3c2dbb0bc9bf28cc1cbc205be4ef4be5da709d/jsonpointer-3.1.1-py3-none-any.whl", hash = "sha256:8ff8b95779d071ba472cf5bc913028df06031797532f08a7d5b602d8b2a488ca", size = 7659, upload-time = "2026-03-23T22:32:31.568Z" }, +] + [[package]] name = "jsonschema" version = "4.24.0" @@ -977,6 +1026,122 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/01/0e/b27cdbaccf30b890c40ed1da9fd4a3593a5cf94dae54fb34f8a4b74fcd3f/jsonschema_specifications-2025.4.1-py3-none-any.whl", hash = "sha256:4653bffbd6584f7de83a67e0d620ef16900b390ddc7939d56684d6c81e33f1af", size = 18437, upload-time = "2025-04-23T12:34:05.422Z" }, ] +[[package]] +name = "langchain" +version = "1.2.15" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "langchain-core" }, + { name = "langgraph" }, + { name = "pydantic" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/98/3f/888a7099d2bd2917f8b0c3ffc7e347f1e664cf64267820b0b923c4f339fc/langchain-1.2.15.tar.gz", hash = "sha256:1717b6719daefae90b2728314a5e2a117ff916291e2862595b6c3d6fba33d652", size = 574732, upload-time = "2026-04-03T14:26:03.994Z" } +wheels = [ + { url = 
"https://files.pythonhosted.org/packages/3f/e8/a3b8cb0005553f6a876865073c81ef93bd7c5b18381bcb9ba4013af96ebc/langchain-1.2.15-py3-none-any.whl", hash = "sha256:e349db349cb3e9550c4044077cf90a1717691756cc236438404b23500e615874", size = 112714, upload-time = "2026-04-03T14:26:02.557Z" }, +] + +[[package]] +name = "langchain-anthropic" +version = "1.4.2" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "anthropic" }, + { name = "langchain-core" }, + { name = "pydantic" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/d2/57/1f6be85f408d28864ec9b04c989011915f48ddbd43545d29e56a1e6818f2/langchain_anthropic-1.4.2.tar.gz", hash = "sha256:cc75cf4facfaf9f345e789b22986fdf23c6524921d4453a75bba32251e9fb0ec", size = 685094, upload-time = "2026-04-28T20:50:39.068Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/07/66/2b26b1f7949a7484bce68832a1e4f558bb24f9e2e8ebced504a19351133c/langchain_anthropic-1.4.2-py3-none-any.whl", hash = "sha256:f48e340dffcac2b760d2024ca6b3f83647782929bd01e62199a4b29d72e18c3e", size = 50423, upload-time = "2026-04-28T20:50:37.684Z" }, +] + +[[package]] +name = "langchain-core" +version = "1.3.2" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "jsonpatch" }, + { name = "langchain-protocol" }, + { name = "langsmith" }, + { name = "packaging" }, + { name = "pydantic" }, + { name = "pyyaml" }, + { name = "tenacity" }, + { name = "typing-extensions" }, + { name = "uuid-utils" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/a8/03/7219502e8ca728d65eb44d7a3eb60239230742a70dbfc9241b9bfd61c4ab/langchain_core-1.3.2.tar.gz", hash = "sha256:fd7a50b2f28ba561fd9d7f5d2760bc9e06cf00cdf820a3ccafe88a94ffa8d5b7", size = 911813, upload-time = "2026-04-24T15:49:23.699Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/7d/d5/8fa4431007cbb7cfed7590f4d6a5dea3ad724f4174d248f6642ef5ce7d05/langchain_core-1.3.2-py3-none-any.whl", hash = 
"sha256:d44a66127f9f8db735bdfd0ab9661bccb47a97113cfd3f2d89c74864422b7274", size = 542390, upload-time = "2026-04-24T15:49:21.991Z" }, +] + +[[package]] +name = "langchain-protocol" +version = "0.0.13" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "typing-extensions" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/ba/7d/ac74b64f9d3150cc90c172490e7321b26ccc76ecef87c3629c83eda6948f/langchain_protocol-0.0.13.tar.gz", hash = "sha256:7448ca507407a6aaa28a73884d74765540e65da891a14a6e062a196412bc554c", size = 5713, upload-time = "2026-04-28T21:08:11.584Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/d2/04/89db2cf7d839aef9c6544b9e3495f522ec3e408cbab34c9c05e41a3d87b0/langchain_protocol-0.0.13-py3-none-any.whl", hash = "sha256:47d4f2a05827bf3a66b238082bf59e8313fd3a14e1b268bdd65c85b67a6b1f6c", size = 6832, upload-time = "2026-04-28T21:08:10.564Z" }, +] + +[[package]] +name = "langgraph" +version = "1.1.10" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "langchain-core" }, + { name = "langgraph-checkpoint" }, + { name = "langgraph-prebuilt" }, + { name = "langgraph-sdk" }, + { name = "pydantic" }, + { name = "xxhash" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/9a/b3/7dec224369c7938eb3227ff69542a0d0f517862a0d27945b8c395f2a781f/langgraph-1.1.10.tar.gz", hash = "sha256:3115beb58203283c98d8752a90c034f3432177d2979a1fe205f76e5f1b744500", size = 560685, upload-time = "2026-04-27T17:19:10.426Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/80/07/057dc1aa7991115fca53f1fa6573a7cc0dd296c05360c672cc67fdb6245b/langgraph-1.1.10-py3-none-any.whl", hash = "sha256:8a4f163f72f4401648d0c11b48ee906947d938ba8cf1f474540fe591534f0d17", size = 173750, upload-time = "2026-04-27T17:19:09.073Z" }, +] + +[[package]] +name = "langgraph-checkpoint" +version = "4.0.3" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "langchain-core" }, + { 
name = "ormsgpack" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/7c/e1/885e49cdafceb4c74dae4573bc5dd6054c6c640382ee73104532f33dca46/langgraph_checkpoint-4.0.3.tar.gz", hash = "sha256:a7b5e2ca18fb79b55edf19396d4ee446f8a53dcb7a4ec62ce6f1c7e00bb5af7f", size = 174009, upload-time = "2026-04-27T14:34:02.777Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/19/ee/ecd3fa2e893746dde3b768daca2a4935208bc77d09445437ccfffb4a8c9b/langgraph_checkpoint-4.0.3-py3-none-any.whl", hash = "sha256:b91b765712a2311a5b198760f714b7ab9b376d01c047ed78d9b9a3e80df802a3", size = 51682, upload-time = "2026-04-27T14:34:01.51Z" }, +] + +[[package]] +name = "langgraph-prebuilt" +version = "1.0.12" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "langchain-core" }, + { name = "langgraph-checkpoint" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/ed/8b/5fff4c63bbfef1475d577e13f5970f91955a4069d8dc4adbaeef92f36732/langgraph_prebuilt-1.0.12.tar.gz", hash = "sha256:edcb11ff29996def816243f267fb2c85c0a2e4fb618c275f3d238aee8dd6a5ec", size = 172831, upload-time = "2026-04-27T17:14:27.152Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/53/75/1e6e6fd478a1b1e643de03505570103dcb89c57c429c0fd3084d521e522e/langgraph_prebuilt-1.0.12-py3-none-any.whl", hash = "sha256:ab83822d2724d434d3536dc127b86c7d16fe3fb8dc02a89a683bc77b2e55f6e9", size = 37195, upload-time = "2026-04-27T17:14:25.788Z" }, +] + +[[package]] +name = "langgraph-sdk" +version = "0.3.13" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "httpx" }, + { name = "orjson" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/0e/db/77a45127dddcfea5e4256ba916182903e4c31dc4cfca305b8c386f0a9e53/langgraph_sdk-0.3.13.tar.gz", hash = "sha256:419ca5663eec3cec192ad194ac0647c0c826866b446073eb40f384f950986cd5", size = 196360, upload-time = "2026-04-07T20:34:18.766Z" } +wheels = [ + { url = 
"https://files.pythonhosted.org/packages/fe/ef/64d64e9f8eea47ce7b939aa6da6863b674c8d418647813c20111645fcc62/langgraph_sdk-0.3.13-py3-none-any.whl", hash = "sha256:aee09e345c90775f6de9d6f4c7b847cfc652e49055c27a2aed0d981af2af3bd0", size = 96668, upload-time = "2026-04-07T20:34:17.866Z" }, +] + [[package]] name = "langsmith" version = "0.7.32" @@ -1454,83 +1619,139 @@ wheels = [ [[package]] name = "orjson" -version = "3.11.4" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/c6/fe/ed708782d6709cc60eb4c2d8a361a440661f74134675c72990f2c48c785f/orjson-3.11.4.tar.gz", hash = "sha256:39485f4ab4c9b30a3943cfe99e1a213c4776fb69e8abd68f66b83d5a0b0fdc6d", size = 5945188, upload-time = "2025-10-24T15:50:38.027Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/e0/30/5aed63d5af1c8b02fbd2a8d83e2a6c8455e30504c50dbf08c8b51403d873/orjson-3.11.4-cp310-cp310-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:e3aa2118a3ece0d25489cbe48498de8a5d580e42e8d9979f65bf47900a15aba1", size = 243870, upload-time = "2025-10-24T15:48:28.908Z" }, - { url = "https://files.pythonhosted.org/packages/44/1f/da46563c08bef33c41fd63c660abcd2184b4d2b950c8686317d03b9f5f0c/orjson-3.11.4-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a69ab657a4e6733133a3dca82768f2f8b884043714e8d2b9ba9f52b6efef5c44", size = 130622, upload-time = "2025-10-24T15:48:31.361Z" }, - { url = "https://files.pythonhosted.org/packages/02/bd/b551a05d0090eab0bf8008a13a14edc0f3c3e0236aa6f5b697760dd2817b/orjson-3.11.4-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:3740bffd9816fc0326ddc406098a3a8f387e42223f5f455f2a02a9f834ead80c", size = 129344, upload-time = "2025-10-24T15:48:32.71Z" }, - { url = "https://files.pythonhosted.org/packages/87/6c/9ddd5e609f443b2548c5e7df3c44d0e86df2c68587a0e20c50018cdec535/orjson-3.11.4-cp310-cp310-manylinux_2_17_i686.manylinux2014_i686.whl", hash = 
"sha256:65fd2f5730b1bf7f350c6dc896173d3460d235c4be007af73986d7cd9a2acd23", size = 136633, upload-time = "2025-10-24T15:48:34.128Z" }, - { url = "https://files.pythonhosted.org/packages/95/f2/9f04f2874c625a9fb60f6918c33542320661255323c272e66f7dcce14df2/orjson-3.11.4-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:9fdc3ae730541086158d549c97852e2eea6820665d4faf0f41bf99df41bc11ea", size = 137695, upload-time = "2025-10-24T15:48:35.654Z" }, - { url = "https://files.pythonhosted.org/packages/d2/c2/c7302afcbdfe8a891baae0e2cee091583a30e6fa613e8bdf33b0e9c8a8c7/orjson-3.11.4-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:e10b4d65901da88845516ce9f7f9736f9638d19a1d483b3883dc0182e6e5edba", size = 136879, upload-time = "2025-10-24T15:48:37.483Z" }, - { url = "https://files.pythonhosted.org/packages/c6/3a/b31c8f0182a3e27f48e703f46e61bb769666cd0dac4700a73912d07a1417/orjson-3.11.4-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fb6a03a678085f64b97f9d4a9ae69376ce91a3a9e9b56a82b1580d8e1d501aff", size = 136374, upload-time = "2025-10-24T15:48:38.624Z" }, - { url = "https://files.pythonhosted.org/packages/29/d0/fd9ab96841b090d281c46df566b7f97bc6c8cd9aff3f3ebe99755895c406/orjson-3.11.4-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:2c82e4f0b1c712477317434761fbc28b044c838b6b1240d895607441412371ac", size = 140519, upload-time = "2025-10-24T15:48:39.756Z" }, - { url = "https://files.pythonhosted.org/packages/d6/ce/36eb0f15978bb88e33a3480e1a3fb891caa0f189ba61ce7713e0ccdadabf/orjson-3.11.4-cp310-cp310-musllinux_1_2_armv7l.whl", hash = "sha256:d58c166a18f44cc9e2bad03a327dc2d1a3d2e85b847133cfbafd6bfc6719bd79", size = 406522, upload-time = "2025-10-24T15:48:41.198Z" }, - { url = "https://files.pythonhosted.org/packages/85/11/e8af3161a288f5c6a00c188fc729c7ba193b0cbc07309a1a29c004347c30/orjson-3.11.4-cp310-cp310-musllinux_1_2_i686.whl", hash = 
"sha256:94f206766bf1ea30e1382e4890f763bd1eefddc580e08fec1ccdc20ddd95c827", size = 149790, upload-time = "2025-10-24T15:48:42.664Z" }, - { url = "https://files.pythonhosted.org/packages/ea/96/209d52db0cf1e10ed48d8c194841e383e23c2ced5a2ee766649fe0e32d02/orjson-3.11.4-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:41bf25fb39a34cf8edb4398818523277ee7096689db352036a9e8437f2f3ee6b", size = 140040, upload-time = "2025-10-24T15:48:44.042Z" }, - { url = "https://files.pythonhosted.org/packages/ef/0e/526db1395ccb74c3d59ac1660b9a325017096dc5643086b38f27662b4add/orjson-3.11.4-cp310-cp310-win32.whl", hash = "sha256:fa9627eba4e82f99ca6d29bc967f09aba446ee2b5a1ea728949ede73d313f5d3", size = 135955, upload-time = "2025-10-24T15:48:45.495Z" }, - { url = "https://files.pythonhosted.org/packages/e6/69/18a778c9de3702b19880e73c9866b91cc85f904b885d816ba1ab318b223c/orjson-3.11.4-cp310-cp310-win_amd64.whl", hash = "sha256:23ef7abc7fca96632d8174ac115e668c1e931b8fe4dde586e92a500bf1914dcc", size = 131577, upload-time = "2025-10-24T15:48:46.609Z" }, - { url = "https://files.pythonhosted.org/packages/63/1d/1ea6005fffb56715fd48f632611e163d1604e8316a5bad2288bee9a1c9eb/orjson-3.11.4-cp311-cp311-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:5e59d23cd93ada23ec59a96f215139753fbfe3a4d989549bcb390f8c00370b39", size = 243498, upload-time = "2025-10-24T15:48:48.101Z" }, - { url = "https://files.pythonhosted.org/packages/37/d7/ffed10c7da677f2a9da307d491b9eb1d0125b0307019c4ad3d665fd31f4f/orjson-3.11.4-cp311-cp311-macosx_15_0_arm64.whl", hash = "sha256:5c3aedecfc1beb988c27c79d52ebefab93b6c3921dbec361167e6559aba2d36d", size = 128961, upload-time = "2025-10-24T15:48:49.571Z" }, - { url = "https://files.pythonhosted.org/packages/a2/96/3e4d10a18866d1368f73c8c44b7fe37cc8a15c32f2a7620be3877d4c55a3/orjson-3.11.4-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:da9e5301f1c2caa2a9a4a303480d79c9ad73560b2e7761de742ab39fe59d9175", size = 130321, 
upload-time = "2025-10-24T15:48:50.713Z" }, - { url = "https://files.pythonhosted.org/packages/eb/1f/465f66e93f434f968dd74d5b623eb62c657bdba2332f5a8be9f118bb74c7/orjson-3.11.4-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:8873812c164a90a79f65368f8f96817e59e35d0cc02786a5356f0e2abed78040", size = 129207, upload-time = "2025-10-24T15:48:52.193Z" }, - { url = "https://files.pythonhosted.org/packages/28/43/d1e94837543321c119dff277ae8e348562fe8c0fafbb648ef7cb0c67e521/orjson-3.11.4-cp311-cp311-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:5d7feb0741ebb15204e748f26c9638e6665a5fa93c37a2c73d64f1669b0ddc63", size = 136323, upload-time = "2025-10-24T15:48:54.806Z" }, - { url = "https://files.pythonhosted.org/packages/bf/04/93303776c8890e422a5847dd012b4853cdd88206b8bbd3edc292c90102d1/orjson-3.11.4-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:01ee5487fefee21e6910da4c2ee9eef005bee568a0879834df86f888d2ffbdd9", size = 137440, upload-time = "2025-10-24T15:48:56.326Z" }, - { url = "https://files.pythonhosted.org/packages/1e/ef/75519d039e5ae6b0f34d0336854d55544ba903e21bf56c83adc51cd8bf82/orjson-3.11.4-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3d40d46f348c0321df01507f92b95a377240c4ec31985225a6668f10e2676f9a", size = 136680, upload-time = "2025-10-24T15:48:57.476Z" }, - { url = "https://files.pythonhosted.org/packages/b5/18/bf8581eaae0b941b44efe14fee7b7862c3382fbc9a0842132cfc7cf5ecf4/orjson-3.11.4-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:95713e5fc8af84d8edc75b785d2386f653b63d62b16d681687746734b4dfc0be", size = 136160, upload-time = "2025-10-24T15:48:59.631Z" }, - { url = "https://files.pythonhosted.org/packages/c4/35/a6d582766d351f87fc0a22ad740a641b0a8e6fc47515e8614d2e4790ae10/orjson-3.11.4-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:ad73ede24f9083614d6c4ca9a85fe70e33be7bf047ec586ee2363bc7418fe4d7", size = 140318, upload-time = 
"2025-10-24T15:49:00.834Z" }, - { url = "https://files.pythonhosted.org/packages/76/b3/5a4801803ab2e2e2d703bce1a56540d9f99a9143fbec7bf63d225044fef8/orjson-3.11.4-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:842289889de515421f3f224ef9c1f1efb199a32d76d8d2ca2706fa8afe749549", size = 406330, upload-time = "2025-10-24T15:49:02.327Z" }, - { url = "https://files.pythonhosted.org/packages/80/55/a8f682f64833e3a649f620eafefee175cbfeb9854fc5b710b90c3bca45df/orjson-3.11.4-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:3b2427ed5791619851c52a1261b45c233930977e7de8cf36de05636c708fa905", size = 149580, upload-time = "2025-10-24T15:49:03.517Z" }, - { url = "https://files.pythonhosted.org/packages/ad/e4/c132fa0c67afbb3eb88274fa98df9ac1f631a675e7877037c611805a4413/orjson-3.11.4-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:3c36e524af1d29982e9b190573677ea02781456b2e537d5840e4538a5ec41907", size = 139846, upload-time = "2025-10-24T15:49:04.761Z" }, - { url = "https://files.pythonhosted.org/packages/54/06/dc3491489efd651fef99c5908e13951abd1aead1257c67f16135f95ce209/orjson-3.11.4-cp311-cp311-win32.whl", hash = "sha256:87255b88756eab4a68ec61837ca754e5d10fa8bc47dc57f75cedfeaec358d54c", size = 135781, upload-time = "2025-10-24T15:49:05.969Z" }, - { url = "https://files.pythonhosted.org/packages/79/b7/5e5e8d77bd4ea02a6ac54c42c818afb01dd31961be8a574eb79f1d2cfb1e/orjson-3.11.4-cp311-cp311-win_amd64.whl", hash = "sha256:e2d5d5d798aba9a0e1fede8d853fa899ce2cb930ec0857365f700dffc2c7af6a", size = 131391, upload-time = "2025-10-24T15:49:07.355Z" }, - { url = "https://files.pythonhosted.org/packages/0f/dc/9484127cc1aa213be398ed735f5f270eedcb0c0977303a6f6ddc46b60204/orjson-3.11.4-cp311-cp311-win_arm64.whl", hash = "sha256:6bb6bb41b14c95d4f2702bce9975fda4516f1db48e500102fc4d8119032ff045", size = 126252, upload-time = "2025-10-24T15:49:08.869Z" }, - { url = 
"https://files.pythonhosted.org/packages/63/51/6b556192a04595b93e277a9ff71cd0cc06c21a7df98bcce5963fa0f5e36f/orjson-3.11.4-cp312-cp312-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:d4371de39319d05d3f482f372720b841c841b52f5385bd99c61ed69d55d9ab50", size = 243571, upload-time = "2025-10-24T15:49:10.008Z" }, - { url = "https://files.pythonhosted.org/packages/1c/2c/2602392ddf2601d538ff11848b98621cd465d1a1ceb9db9e8043181f2f7b/orjson-3.11.4-cp312-cp312-macosx_15_0_arm64.whl", hash = "sha256:e41fd3b3cac850eaae78232f37325ed7d7436e11c471246b87b2cd294ec94853", size = 128891, upload-time = "2025-10-24T15:49:11.297Z" }, - { url = "https://files.pythonhosted.org/packages/4e/47/bf85dcf95f7a3a12bf223394a4f849430acd82633848d52def09fa3f46ad/orjson-3.11.4-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:600e0e9ca042878c7fdf189cf1b028fe2c1418cc9195f6cb9824eb6ed99cb938", size = 130137, upload-time = "2025-10-24T15:49:12.544Z" }, - { url = "https://files.pythonhosted.org/packages/b4/4d/a0cb31007f3ab6f1fd2a1b17057c7c349bc2baf8921a85c0180cc7be8011/orjson-3.11.4-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:7bbf9b333f1568ef5da42bc96e18bf30fd7f8d54e9ae066d711056add508e415", size = 129152, upload-time = "2025-10-24T15:49:13.754Z" }, - { url = "https://files.pythonhosted.org/packages/f7/ef/2811def7ce3d8576b19e3929fff8f8f0d44bc5eb2e0fdecb2e6e6cc6c720/orjson-3.11.4-cp312-cp312-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4806363144bb6e7297b8e95870e78d30a649fdc4e23fc84daa80c8ebd366ce44", size = 136834, upload-time = "2025-10-24T15:49:15.307Z" }, - { url = "https://files.pythonhosted.org/packages/00/d4/9aee9e54f1809cec8ed5abd9bc31e8a9631d19460e3b8470145d25140106/orjson-3.11.4-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ad355e8308493f527d41154e9053b86a5be892b3b359a5c6d5d95cda23601cb2", size = 137519, upload-time = "2025-10-24T15:49:16.557Z" }, - { url = 
"https://files.pythonhosted.org/packages/db/ea/67bfdb5465d5679e8ae8d68c11753aaf4f47e3e7264bad66dc2f2249e643/orjson-3.11.4-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c8a7517482667fb9f0ff1b2f16fe5829296ed7a655d04d68cd9711a4d8a4e708", size = 136749, upload-time = "2025-10-24T15:49:17.796Z" }, - { url = "https://files.pythonhosted.org/packages/01/7e/62517dddcfce6d53a39543cd74d0dccfcbdf53967017c58af68822100272/orjson-3.11.4-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:97eb5942c7395a171cbfecc4ef6701fc3c403e762194683772df4c54cfbb2210", size = 136325, upload-time = "2025-10-24T15:49:19.347Z" }, - { url = "https://files.pythonhosted.org/packages/18/ae/40516739f99ab4c7ec3aaa5cc242d341fcb03a45d89edeeaabc5f69cb2cf/orjson-3.11.4-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:149d95d5e018bdd822e3f38c103b1a7c91f88d38a88aada5c4e9b3a73a244241", size = 140204, upload-time = "2025-10-24T15:49:20.545Z" }, - { url = "https://files.pythonhosted.org/packages/82/18/ff5734365623a8916e3a4037fcef1cd1782bfc14cf0992afe7940c5320bf/orjson-3.11.4-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:624f3951181eb46fc47dea3d221554e98784c823e7069edb5dbd0dc826ac909b", size = 406242, upload-time = "2025-10-24T15:49:21.884Z" }, - { url = "https://files.pythonhosted.org/packages/e1/43/96436041f0a0c8c8deca6a05ebeaf529bf1de04839f93ac5e7c479807aec/orjson-3.11.4-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:03bfa548cf35e3f8b3a96c4e8e41f753c686ff3d8e182ce275b1751deddab58c", size = 150013, upload-time = "2025-10-24T15:49:23.185Z" }, - { url = "https://files.pythonhosted.org/packages/1b/48/78302d98423ed8780479a1e682b9aecb869e8404545d999d34fa486e573e/orjson-3.11.4-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:525021896afef44a68148f6ed8a8bf8375553d6066c7f48537657f64823565b9", size = 139951, upload-time = "2025-10-24T15:49:24.428Z" }, - { url = 
"https://files.pythonhosted.org/packages/4a/7b/ad613fdcdaa812f075ec0875143c3d37f8654457d2af17703905425981bf/orjson-3.11.4-cp312-cp312-win32.whl", hash = "sha256:b58430396687ce0f7d9eeb3dd47761ca7d8fda8e9eb92b3077a7a353a75efefa", size = 136049, upload-time = "2025-10-24T15:49:25.973Z" }, - { url = "https://files.pythonhosted.org/packages/b9/3c/9cf47c3ff5f39b8350fb21ba65d789b6a1129d4cbb3033ba36c8a9023520/orjson-3.11.4-cp312-cp312-win_amd64.whl", hash = "sha256:c6dbf422894e1e3c80a177133c0dda260f81428f9de16d61041949f6a2e5c140", size = 131461, upload-time = "2025-10-24T15:49:27.259Z" }, - { url = "https://files.pythonhosted.org/packages/c6/3b/e2425f61e5825dc5b08c2a5a2b3af387eaaca22a12b9c8c01504f8614c36/orjson-3.11.4-cp312-cp312-win_arm64.whl", hash = "sha256:d38d2bc06d6415852224fcc9c0bfa834c25431e466dc319f0edd56cca81aa96e", size = 126167, upload-time = "2025-10-24T15:49:28.511Z" }, - { url = "https://files.pythonhosted.org/packages/23/15/c52aa7112006b0f3d6180386c3a46ae057f932ab3425bc6f6ac50431cca1/orjson-3.11.4-cp313-cp313-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:2d6737d0e616a6e053c8b4acc9eccea6b6cce078533666f32d140e4f85002534", size = 243525, upload-time = "2025-10-24T15:49:29.737Z" }, - { url = "https://files.pythonhosted.org/packages/ec/38/05340734c33b933fd114f161f25a04e651b0c7c33ab95e9416ade5cb44b8/orjson-3.11.4-cp313-cp313-macosx_15_0_arm64.whl", hash = "sha256:afb14052690aa328cc118a8e09f07c651d301a72e44920b887c519b313d892ff", size = 128871, upload-time = "2025-10-24T15:49:31.109Z" }, - { url = "https://files.pythonhosted.org/packages/55/b9/ae8d34899ff0c012039b5a7cb96a389b2476e917733294e498586b45472d/orjson-3.11.4-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:38aa9e65c591febb1b0aed8da4d469eba239d434c218562df179885c94e1a3ad", size = 130055, upload-time = "2025-10-24T15:49:33.382Z" }, - { url = 
"https://files.pythonhosted.org/packages/33/aa/6346dd5073730451bee3681d901e3c337e7ec17342fb79659ec9794fc023/orjson-3.11.4-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:f2cf4dfaf9163b0728d061bebc1e08631875c51cd30bf47cb9e3293bfbd7dcd5", size = 129061, upload-time = "2025-10-24T15:49:34.935Z" }, - { url = "https://files.pythonhosted.org/packages/39/e4/8eea51598f66a6c853c380979912d17ec510e8e66b280d968602e680b942/orjson-3.11.4-cp313-cp313-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:89216ff3dfdde0e4070932e126320a1752c9d9a758d6a32ec54b3b9334991a6a", size = 136541, upload-time = "2025-10-24T15:49:36.923Z" }, - { url = "https://files.pythonhosted.org/packages/9a/47/cb8c654fa9adcc60e99580e17c32b9e633290e6239a99efa6b885aba9dbc/orjson-3.11.4-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:9daa26ca8e97fae0ce8aa5d80606ef8f7914e9b129b6b5df9104266f764ce436", size = 137535, upload-time = "2025-10-24T15:49:38.307Z" }, - { url = "https://files.pythonhosted.org/packages/43/92/04b8cc5c2b729f3437ee013ce14a60ab3d3001465d95c184758f19362f23/orjson-3.11.4-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5c8b2769dc31883c44a9cd126560327767f848eb95f99c36c9932f51090bfce9", size = 136703, upload-time = "2025-10-24T15:49:40.795Z" }, - { url = "https://files.pythonhosted.org/packages/aa/fd/d0733fcb9086b8be4ebcfcda2d0312865d17d0d9884378b7cffb29d0763f/orjson-3.11.4-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1469d254b9884f984026bd9b0fa5bbab477a4bfe558bba6848086f6d43eb5e73", size = 136293, upload-time = "2025-10-24T15:49:42.347Z" }, - { url = "https://files.pythonhosted.org/packages/c2/d7/3c5514e806837c210492d72ae30ccf050ce3f940f45bf085bab272699ef4/orjson-3.11.4-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:68e44722541983614e37117209a194e8c3ad07838ccb3127d96863c95ec7f1e0", size = 140131, upload-time = "2025-10-24T15:49:43.638Z" }, - { url = 
"https://files.pythonhosted.org/packages/9c/dd/ba9d32a53207babf65bd510ac4d0faaa818bd0df9a9c6f472fe7c254f2e3/orjson-3.11.4-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:8e7805fda9672c12be2f22ae124dcd7b03928d6c197544fe12174b86553f3196", size = 406164, upload-time = "2025-10-24T15:49:45.498Z" }, - { url = "https://files.pythonhosted.org/packages/8e/f9/f68ad68f4af7c7bde57cd514eaa2c785e500477a8bc8f834838eb696a685/orjson-3.11.4-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:04b69c14615fb4434ab867bf6f38b2d649f6f300af30a6705397e895f7aec67a", size = 149859, upload-time = "2025-10-24T15:49:46.981Z" }, - { url = "https://files.pythonhosted.org/packages/b6/d2/7f847761d0c26818395b3d6b21fb6bc2305d94612a35b0a30eae65a22728/orjson-3.11.4-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:639c3735b8ae7f970066930e58cf0ed39a852d417c24acd4a25fc0b3da3c39a6", size = 139926, upload-time = "2025-10-24T15:49:48.321Z" }, - { url = "https://files.pythonhosted.org/packages/9f/37/acd14b12dc62db9a0e1d12386271b8661faae270b22492580d5258808975/orjson-3.11.4-cp313-cp313-win32.whl", hash = "sha256:6c13879c0d2964335491463302a6ca5ad98105fc5db3565499dcb80b1b4bd839", size = 136007, upload-time = "2025-10-24T15:49:49.938Z" }, - { url = "https://files.pythonhosted.org/packages/c0/a9/967be009ddf0a1fffd7a67de9c36656b28c763659ef91352acc02cbe364c/orjson-3.11.4-cp313-cp313-win_amd64.whl", hash = "sha256:09bf242a4af98732db9f9a1ec57ca2604848e16f132e3f72edfd3c5c96de009a", size = 131314, upload-time = "2025-10-24T15:49:51.248Z" }, - { url = "https://files.pythonhosted.org/packages/cb/db/399abd6950fbd94ce125cb8cd1a968def95174792e127b0642781e040ed4/orjson-3.11.4-cp313-cp313-win_arm64.whl", hash = "sha256:a85f0adf63319d6c1ba06fb0dbf997fced64a01179cf17939a6caca662bf92de", size = 126152, upload-time = "2025-10-24T15:49:52.922Z" }, - { url = 
"https://files.pythonhosted.org/packages/25/e3/54ff63c093cc1697e758e4fceb53164dd2661a7d1bcd522260ba09f54533/orjson-3.11.4-cp314-cp314-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:42d43a1f552be1a112af0b21c10a5f553983c2a0938d2bbb8ecd8bc9fb572803", size = 243501, upload-time = "2025-10-24T15:49:54.288Z" }, - { url = "https://files.pythonhosted.org/packages/ac/7d/e2d1076ed2e8e0ae9badca65bf7ef22710f93887b29eaa37f09850604e09/orjson-3.11.4-cp314-cp314-macosx_15_0_arm64.whl", hash = "sha256:26a20f3fbc6c7ff2cb8e89c4c5897762c9d88cf37330c6a117312365d6781d54", size = 128862, upload-time = "2025-10-24T15:49:55.961Z" }, - { url = "https://files.pythonhosted.org/packages/9f/37/ca2eb40b90621faddfa9517dfe96e25f5ae4d8057a7c0cdd613c17e07b2c/orjson-3.11.4-cp314-cp314-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6e3f20be9048941c7ffa8fc523ccbd17f82e24df1549d1d1fe9317712d19938e", size = 130047, upload-time = "2025-10-24T15:49:57.406Z" }, - { url = "https://files.pythonhosted.org/packages/c7/62/1021ed35a1f2bad9040f05fa4cc4f9893410df0ba3eaa323ccf899b1c90a/orjson-3.11.4-cp314-cp314-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:aac364c758dc87a52e68e349924d7e4ded348dedff553889e4d9f22f74785316", size = 129073, upload-time = "2025-10-24T15:49:58.782Z" }, - { url = "https://files.pythonhosted.org/packages/e8/3f/f84d966ec2a6fd5f73b1a707e7cd876813422ae4bf9f0145c55c9c6a0f57/orjson-3.11.4-cp314-cp314-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d5c54a6d76e3d741dcc3f2707f8eeb9ba2a791d3adbf18f900219b62942803b1", size = 136597, upload-time = "2025-10-24T15:50:00.12Z" }, - { url = "https://files.pythonhosted.org/packages/32/78/4fa0aeca65ee82bbabb49e055bd03fa4edea33f7c080c5c7b9601661ef72/orjson-3.11.4-cp314-cp314-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f28485bdca8617b79d44627f5fb04336897041dfd9fa66d383a49d09d86798bc", size = 137515, upload-time = "2025-10-24T15:50:01.57Z" }, - { url = 
"https://files.pythonhosted.org/packages/c1/9d/0c102e26e7fde40c4c98470796d050a2ec1953897e2c8ab0cb95b0759fa2/orjson-3.11.4-cp314-cp314-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:bfc2a484cad3585e4ba61985a6062a4c2ed5c7925db6d39f1fa267c9d166487f", size = 136703, upload-time = "2025-10-24T15:50:02.944Z" }, - { url = "https://files.pythonhosted.org/packages/df/ac/2de7188705b4cdfaf0b6c97d2f7849c17d2003232f6e70df98602173f788/orjson-3.11.4-cp314-cp314-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e34dbd508cb91c54f9c9788923daca129fe5b55c5b4eebe713bf5ed3791280cf", size = 136311, upload-time = "2025-10-24T15:50:04.441Z" }, - { url = "https://files.pythonhosted.org/packages/e0/52/847fcd1a98407154e944feeb12e3b4d487a0e264c40191fb44d1269cbaa1/orjson-3.11.4-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:b13c478fa413d4b4ee606ec8e11c3b2e52683a640b006bb586b3041c2ca5f606", size = 140127, upload-time = "2025-10-24T15:50:07.398Z" }, - { url = "https://files.pythonhosted.org/packages/c1/ae/21d208f58bdb847dd4d0d9407e2929862561841baa22bdab7aea10ca088e/orjson-3.11.4-cp314-cp314-musllinux_1_2_armv7l.whl", hash = "sha256:724ca721ecc8a831b319dcd72cfa370cc380db0bf94537f08f7edd0a7d4e1780", size = 406201, upload-time = "2025-10-24T15:50:08.796Z" }, - { url = "https://files.pythonhosted.org/packages/8d/55/0789d6de386c8366059db098a628e2ad8798069e94409b0d8935934cbcb9/orjson-3.11.4-cp314-cp314-musllinux_1_2_i686.whl", hash = "sha256:977c393f2e44845ce1b540e19a786e9643221b3323dae190668a98672d43fb23", size = 149872, upload-time = "2025-10-24T15:50:10.234Z" }, - { url = "https://files.pythonhosted.org/packages/cc/1d/7ff81ea23310e086c17b41d78a72270d9de04481e6113dbe2ac19118f7fb/orjson-3.11.4-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:1e539e382cf46edec157ad66b0b0872a90d829a6b71f17cb633d6c160a223155", size = 139931, upload-time = "2025-10-24T15:50:11.623Z" }, - { url = 
"https://files.pythonhosted.org/packages/77/92/25b886252c50ed64be68c937b562b2f2333b45afe72d53d719e46a565a50/orjson-3.11.4-cp314-cp314-win32.whl", hash = "sha256:d63076d625babab9db5e7836118bdfa086e60f37d8a174194ae720161eb12394", size = 136065, upload-time = "2025-10-24T15:50:13.025Z" }, - { url = "https://files.pythonhosted.org/packages/63/b8/718eecf0bb7e9d64e4956afaafd23db9f04c776d445f59fe94f54bdae8f0/orjson-3.11.4-cp314-cp314-win_amd64.whl", hash = "sha256:0a54d6635fa3aaa438ae32e8570b9f0de36f3f6562c308d2a2a452e8b0592db1", size = 131310, upload-time = "2025-10-24T15:50:14.46Z" }, - { url = "https://files.pythonhosted.org/packages/1a/bf/def5e25d4d8bfce296a9a7c8248109bf58622c21618b590678f945a2c59c/orjson-3.11.4-cp314-cp314-win_arm64.whl", hash = "sha256:78b999999039db3cf58f6d230f524f04f75f129ba3d1ca2ed121f8657e575d3d", size = 126151, upload-time = "2025-10-24T15:50:15.878Z" }, +version = "3.11.8" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/9d/1b/2024d06792d0779f9dbc51531b61c24f76c75b9f4ce05e6f3377a1814cea/orjson-3.11.8.tar.gz", hash = "sha256:96163d9cdc5a202703e9ad1b9ae757d5f0ca62f4fa0cc93d1f27b0e180cc404e", size = 5603832, upload-time = "2026-03-31T16:16:27.878Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/2f/90/5d81f61fe3e4270da80c71442864c091cee3003cc8984c75f413fe742a07/orjson-3.11.8-cp310-cp310-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:e6693ff90018600c72fd18d3d22fa438be26076cd3c823da5f63f7bab28c11cb", size = 229663, upload-time = "2026-03-31T16:14:30.708Z" }, + { url = "https://files.pythonhosted.org/packages/6c/ef/85e06b0eb11de6fb424120fd5788a07035bd4c5e6bb7841ae9972a0526d1/orjson-3.11.8-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:93de06bc920854552493c81f1f729fab7213b7db4b8195355db5fda02c7d1363", size = 132321, upload-time = "2026-03-31T16:14:32.317Z" }, + { url = 
"https://files.pythonhosted.org/packages/86/71/089338ee51b3132f050db0864a7df9bdd5e94c2a03820ab8a91e8f655618/orjson-3.11.8-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:fe0b8c83e0f36247fc9431ce5425a5d95f9b3a689133d494831bdbd6f0bceb13", size = 130658, upload-time = "2026-03-31T16:14:33.935Z" }, + { url = "https://files.pythonhosted.org/packages/10/0d/f39d8802345d0ad65f7fd4374b29b9b59f98656dc30f21ca5c773265b2f0/orjson-3.11.8-cp310-cp310-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:97d823831105c01f6c8029faf297633dbeb30271892bd430e9c24ceae3734744", size = 135708, upload-time = "2026-03-31T16:14:35.224Z" }, + { url = "https://files.pythonhosted.org/packages/ff/b5/40aae576b3473511696dcffea84fde638b2b64774eb4dcb8b2c262729f8a/orjson-3.11.8-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:c60c0423f15abb6cf78f56dff00168a1b582f7a1c23f114036e2bfc697814d5f", size = 147047, upload-time = "2026-03-31T16:14:36.489Z" }, + { url = "https://files.pythonhosted.org/packages/7b/f0/778a84458d1fdaa634b2e572e51ce0b354232f580b2327e1f00a8d88c38c/orjson-3.11.8-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:01928d0476b216ad2201823b0a74000440360cef4fed1912d297b8d84718f277", size = 133072, upload-time = "2026-03-31T16:14:37.715Z" }, + { url = "https://files.pythonhosted.org/packages/bf/d3/1bbf2fc3ffcc4b829ade554b574af68cec898c9b5ad6420a923c75a073d3/orjson-3.11.8-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6a4a639049c44d36a6d1ae0f4a94b271605c745aee5647fa8ffaabcdc01b69a6", size = 133867, upload-time = "2026-03-31T16:14:39.356Z" }, + { url = "https://files.pythonhosted.org/packages/08/94/6413da22edc99a69a8d0c2e83bf42973b8aa94d83ef52a6d39ac85da00bc/orjson-3.11.8-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:3222adff1e1ff0dce93c16146b93063a7793de6c43d52309ae321234cdaf0f4d", size = 142268, upload-time = "2026-03-31T16:14:40.972Z" }, + { url = 
"https://files.pythonhosted.org/packages/4a/5f/aa5dbaa6136d7ba55f5461ac2e885efc6e6349424a428927fd46d68f4396/orjson-3.11.8-cp310-cp310-musllinux_1_2_armv7l.whl", hash = "sha256:3223665349bbfb68da234acd9846955b1a0808cbe5520ff634bf253a4407009b", size = 424008, upload-time = "2026-03-31T16:14:42.637Z" }, + { url = "https://files.pythonhosted.org/packages/fa/aa/2c1962d108c7fe5e27aa03a354b378caf56d8eafdef15fd83dec081ce45a/orjson-3.11.8-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:61c9d357a59465736022d5d9ba06687afb7611dfb581a9d2129b77a6fcf78e59", size = 147942, upload-time = "2026-03-31T16:14:44.256Z" }, + { url = "https://files.pythonhosted.org/packages/47/d1/65f404f4c47eb1b0b4476f03ec838cac0c4aa933920ff81e5dda4dee14e7/orjson-3.11.8-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:58fb9b17b4472c7b1dcf1a54583629e62e23779b2331052f09a9249edf81675b", size = 136640, upload-time = "2026-03-31T16:14:45.884Z" }, + { url = "https://files.pythonhosted.org/packages/90/5f/7b784aea98bdb125a2f2da7c27d6c2d2f6d943d96ef0278bae596d563f85/orjson-3.11.8-cp310-cp310-win32.whl", hash = "sha256:b43dc2a391981d36c42fa57747a49dae793ef1d2e43898b197925b5534abd10a", size = 132066, upload-time = "2026-03-31T16:14:47.397Z" }, + { url = "https://files.pythonhosted.org/packages/92/ec/2e284af8d6c9478df5ef938917743f61d68f4c70d17f1b6e82f7e3b8dba1/orjson-3.11.8-cp310-cp310-win_amd64.whl", hash = "sha256:c98121237fea2f679480765abd566f7713185897f35c9e6c2add7e3a9900eb61", size = 127609, upload-time = "2026-03-31T16:14:48.78Z" }, + { url = "https://files.pythonhosted.org/packages/67/41/5aa7fa3b0f4dc6b47dcafc3cea909299c37e40e9972feabc8b6a74e2730d/orjson-3.11.8-cp311-cp311-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:003646067cc48b7fcab2ae0c562491c9b5d2cbd43f1e5f16d98fd118c5522d34", size = 229229, upload-time = "2026-03-31T16:14:50.424Z" }, + { url = 
"https://files.pythonhosted.org/packages/0a/d7/57e7f2458e0a2c41694f39fc830030a13053a84f837a5b73423dca1f0938/orjson-3.11.8-cp311-cp311-macosx_15_0_arm64.whl", hash = "sha256:ed193ce51d77a3830cad399a529cd4ef029968761f43ddc549e1bc62b40d88f8", size = 128871, upload-time = "2026-03-31T16:14:51.888Z" }, + { url = "https://files.pythonhosted.org/packages/53/4a/e0fdb9430983e6c46e0299559275025075568aad5d21dd606faee3703924/orjson-3.11.8-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f30491bc4f862aa15744b9738517454f1e46e56c972a2be87d70d727d5b2a8f8", size = 132104, upload-time = "2026-03-31T16:14:53.142Z" }, + { url = "https://files.pythonhosted.org/packages/08/4a/2025a60ff3f5c8522060cda46612d9b1efa653de66ed2908591d8d82f22d/orjson-3.11.8-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:6eda5b8b6be91d3f26efb7dc6e5e68ee805bc5617f65a328587b35255f138bf4", size = 130483, upload-time = "2026-03-31T16:14:54.605Z" }, + { url = "https://files.pythonhosted.org/packages/2d/3c/b9cde05bdc7b2385c66014e0620627da638d3d04e4954416ab48c31196c5/orjson-3.11.8-cp311-cp311-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ee8db7bfb6fe03581bbab54d7c4124a6dd6a7f4273a38f7267197890f094675f", size = 135481, upload-time = "2026-03-31T16:14:55.901Z" }, + { url = "https://files.pythonhosted.org/packages/ff/f2/a8238e7734de7cb589fed319857a8025d509c89dc52fdcc88f39c6d03d5a/orjson-3.11.8-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:5d8b5231de76c528a46b57010bbd83fb51e056aa0220a372fd5065e978406f1c", size = 146819, upload-time = "2026-03-31T16:14:57.548Z" }, + { url = "https://files.pythonhosted.org/packages/db/10/dbf1e2a3cafea673b1b4350e371877b759060d6018a998643b7040e5de48/orjson-3.11.8-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:58a4a208a6fbfdb7a7327b8f201c6014f189f721fd55d047cafc4157af1bc62a", size = 132846, upload-time = "2026-03-31T16:14:58.91Z" }, + { url = 
"https://files.pythonhosted.org/packages/f8/fc/55e667ec9c85694038fcff00573d221b085d50777368ee3d77f38668bf3c/orjson-3.11.8-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5f8952d6d2505c003e8f0224ff7858d341fa4e33fef82b91c4ff0ef070f2393c", size = 133580, upload-time = "2026-03-31T16:15:00.519Z" }, + { url = "https://files.pythonhosted.org/packages/7e/a6/c08c589a9aad0cb46c4831d17de212a2b6901f9d976814321ff8e69e8785/orjson-3.11.8-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:0022bb50f90da04b009ce32c512dc1885910daa7cb10b7b0cba4505b16db82a8", size = 142042, upload-time = "2026-03-31T16:15:01.906Z" }, + { url = "https://files.pythonhosted.org/packages/5c/cc/2f78ea241d52b717d2efc38878615fe80425bf2beb6e68c984dde257a766/orjson-3.11.8-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:ff51f9d657d1afb6f410cb435792ce4e1fe427aab23d2fcd727a2876e21d4cb6", size = 423845, upload-time = "2026-03-31T16:15:03.703Z" }, + { url = "https://files.pythonhosted.org/packages/70/07/c17dcf05dd8045457538428a983bf1f1127928df5bf328cb24d2b7cddacb/orjson-3.11.8-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:6dbe9a97bdb4d8d9d5367b52a7c32549bba70b2739c58ef74a6964a6d05ae054", size = 147729, upload-time = "2026-03-31T16:15:05.203Z" }, + { url = "https://files.pythonhosted.org/packages/90/6c/0fb6e8a24e682e0958d71711ae6f39110e4b9cd8cab1357e2a89cb8e1951/orjson-3.11.8-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:a5c370674ebabe16c6ccac33ff80c62bf8a6e59439f5e9d40c1f5ab8fd2215b7", size = 136425, upload-time = "2026-03-31T16:15:07.052Z" }, + { url = "https://files.pythonhosted.org/packages/b2/35/4d3cc3a3d616035beb51b24a09bb872942dc452cf2df0c1d11ab35046d9f/orjson-3.11.8-cp311-cp311-win32.whl", hash = "sha256:0e32f7154299f42ae66f13488963269e5eccb8d588a65bc839ed986919fc9fac", size = 131870, upload-time = "2026-03-31T16:15:08.678Z" }, + { url = 
"https://files.pythonhosted.org/packages/13/26/9fe70f81d16b702f8c3a775e8731b50ad91d22dacd14c7599b60a0941cd1/orjson-3.11.8-cp311-cp311-win_amd64.whl", hash = "sha256:25e0c672a2e32348d2eb33057b41e754091f2835f87222e4675b796b92264f06", size = 127440, upload-time = "2026-03-31T16:15:09.994Z" }, + { url = "https://files.pythonhosted.org/packages/e8/c6/b038339f4145efd2859c1ca53097a52c0bb9cbdd24f947ebe146da1ad067/orjson-3.11.8-cp311-cp311-win_arm64.whl", hash = "sha256:9185589c1f2a944c17e26c9925dcdbc2df061cc4a145395c57f0c51f9b5dbfcd", size = 127399, upload-time = "2026-03-31T16:15:11.412Z" }, + { url = "https://files.pythonhosted.org/packages/01/f6/8d58b32ab32d9215973a1688aebd098252ee8af1766c0e4e36e7831f0295/orjson-3.11.8-cp312-cp312-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:1cd0b77e77c95758f8e1100139844e99f3ccc87e71e6fc8e1c027e55807c549f", size = 229233, upload-time = "2026-03-31T16:15:12.762Z" }, + { url = "https://files.pythonhosted.org/packages/a9/8b/2ffe35e71f6b92622e8ea4607bf33ecf7dfb51b3619dcfabfd36cbe2d0a5/orjson-3.11.8-cp312-cp312-macosx_15_0_arm64.whl", hash = "sha256:6a3d159d5ffa0e3961f353c4b036540996bf8b9697ccc38261c0eac1fd3347a6", size = 128772, upload-time = "2026-03-31T16:15:14.237Z" }, + { url = "https://files.pythonhosted.org/packages/27/d2/1f8682ae50d5c6897a563cb96bc106da8c9cb5b7b6e81a52e4cc086679b9/orjson-3.11.8-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:76070a76e9c5ae661e2d9848f216980d8d533e0f8143e6ed462807b242e3c5e8", size = 131946, upload-time = "2026-03-31T16:15:15.607Z" }, + { url = "https://files.pythonhosted.org/packages/52/4b/5500f76f0eece84226e0689cb48dcde081104c2fa6e2483d17ca13685ffb/orjson-3.11.8-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:54153d21520a71a4c82a0dbb4523e468941d549d221dc173de0f019678cf3813", size = 130368, upload-time = "2026-03-31T16:15:17.066Z" }, + { url = 
"https://files.pythonhosted.org/packages/da/4e/58b927e08fbe9840e6c920d9e299b051ea667463b1f39a56e668669f8508/orjson-3.11.8-cp312-cp312-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:469ac2125611b7c5741a0b3798cd9e5786cbad6345f9f400c77212be89563bec", size = 135540, upload-time = "2026-03-31T16:15:18.404Z" }, + { url = "https://files.pythonhosted.org/packages/56/7c/ba7cb871cba1bcd5cd02ee34f98d894c6cea96353ad87466e5aef2429c60/orjson-3.11.8-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:14778ffd0f6896aa613951a7fbf4690229aa7a543cb2bfbe9f358e08aafa9546", size = 146877, upload-time = "2026-03-31T16:15:19.833Z" }, + { url = "https://files.pythonhosted.org/packages/0b/5d/eb9c25fc1386696c6a342cd361c306452c75e0b55e86ad602dd4827a7fd7/orjson-3.11.8-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:ea56a955056a6d6c550cf18b3348656a9d9a4f02e2d0c02cabf3c73f1055d506", size = 132837, upload-time = "2026-03-31T16:15:21.282Z" }, + { url = "https://files.pythonhosted.org/packages/37/87/5ddeb7fc1fbd9004aeccab08426f34c81a5b4c25c7061281862b015fce2b/orjson-3.11.8-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:53a0f57e59a530d18a142f4d4ba6dfc708dc5fdedce45e98ff06b44930a2a48f", size = 133624, upload-time = "2026-03-31T16:15:22.641Z" }, + { url = "https://files.pythonhosted.org/packages/22/09/90048793db94ee4b2fcec4ac8e5ddb077367637d6650be896b3494b79bb7/orjson-3.11.8-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:9b48e274f8824567d74e2158199e269597edf00823a1b12b63d48462bbf5123e", size = 141904, upload-time = "2026-03-31T16:15:24.435Z" }, + { url = "https://files.pythonhosted.org/packages/c0/cf/eb284847487821a5d415e54149a6449ba9bfc5872ce63ab7be41b8ec401c/orjson-3.11.8-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:3f262401086a3960586af06c054609365e98407151f5ea24a62893a40d80dbbb", size = 423742, upload-time = "2026-03-31T16:15:26.155Z" }, + { url = 
"https://files.pythonhosted.org/packages/44/09/e12423d327071c851c13e76936f144a96adacfc037394dec35ac3fc8d1e8/orjson-3.11.8-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:8e8c6218b614badf8e229b697865df4301afa74b791b6c9ade01d19a9953a942", size = 147806, upload-time = "2026-03-31T16:15:27.909Z" }, + { url = "https://files.pythonhosted.org/packages/b3/6d/37c2589ba864e582ffe7611643314785c6afb1f83c701654ef05daa8fcc7/orjson-3.11.8-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:093d489fa039ddade2db541097dbb484999fcc65fc2b0ff9819141e2ab364f25", size = 136485, upload-time = "2026-03-31T16:15:29.749Z" }, + { url = "https://files.pythonhosted.org/packages/be/c9/135194a02ab76b04ed9a10f68624b7ebd238bbe55548878b11ff15a0f352/orjson-3.11.8-cp312-cp312-win32.whl", hash = "sha256:e0950ed1bcb9893f4293fd5c5a7ee10934fbf82c4101c70be360db23ce24b7d2", size = 131966, upload-time = "2026-03-31T16:15:31.687Z" }, + { url = "https://files.pythonhosted.org/packages/ed/9a/9796f8fbe3cf30ce9cb696748dbb535e5c87be4bf4fe2e9ca498ef1fa8cf/orjson-3.11.8-cp312-cp312-win_amd64.whl", hash = "sha256:3cf17c141617b88ced4536b2135c552490f07799f6ad565948ea07bef0dcb9a6", size = 127441, upload-time = "2026-03-31T16:15:33.333Z" }, + { url = "https://files.pythonhosted.org/packages/cc/47/5aaf54524a7a4a0dd09dd778f3fa65dd2108290615b652e23d944152bc8e/orjson-3.11.8-cp312-cp312-win_arm64.whl", hash = "sha256:48854463b0572cc87dac7d981aa72ed8bf6deedc0511853dc76b8bbd5482d36d", size = 127364, upload-time = "2026-03-31T16:15:34.748Z" }, + { url = "https://files.pythonhosted.org/packages/66/7f/95fba509bb2305fab0073558f1e8c3a2ec4b2afe58ed9fcb7d3b8beafe94/orjson-3.11.8-cp313-cp313-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:3f23426851d98478c8970da5991f84784a76682213cd50eb73a1da56b95239dc", size = 229180, upload-time = "2026-03-31T16:15:36.426Z" }, + { url = 
"https://files.pythonhosted.org/packages/f6/9d/b237215c743ca073697d759b5503abd2cb8a0d7b9c9e21f524bcf176ab66/orjson-3.11.8-cp313-cp313-macosx_15_0_arm64.whl", hash = "sha256:ebaed4cef74a045b83e23537b52ef19a367c7e3f536751e355a2a394f8648559", size = 128754, upload-time = "2026-03-31T16:15:38.049Z" }, + { url = "https://files.pythonhosted.org/packages/42/3d/27d65b6d11e63f133781425f132807aef793ed25075fec686fc8e46dd528/orjson-3.11.8-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:97c8f5d3b62380b70c36ffacb2a356b7c6becec86099b177f73851ba095ef623", size = 131877, upload-time = "2026-03-31T16:15:39.484Z" }, + { url = "https://files.pythonhosted.org/packages/dd/cc/faee30cd8f00421999e40ef0eba7332e3a625ce91a58200a2f52c7fef235/orjson-3.11.8-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:436c4922968a619fb7fef1ccd4b8b3a76c13b67d607073914d675026e911a65c", size = 130361, upload-time = "2026-03-31T16:15:41.274Z" }, + { url = "https://files.pythonhosted.org/packages/5c/bb/a6c55896197f97b6d4b4e7c7fd77e7235517c34f5d6ad5aadd43c54c6d7c/orjson-3.11.8-cp313-cp313-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1ab359aff0436d80bfe8a23b46b5fea69f1e18aaf1760a709b4787f1318b317f", size = 135521, upload-time = "2026-03-31T16:15:42.758Z" }, + { url = "https://files.pythonhosted.org/packages/9c/7c/ca3a3525aa32ff636ebb1778e77e3587b016ab2edb1b618b36ba96f8f2c0/orjson-3.11.8-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f89b6d0b3a8d81e1929d3ab3d92bbc225688bd80a770c49432543928fe09ac55", size = 146862, upload-time = "2026-03-31T16:15:44.341Z" }, + { url = "https://files.pythonhosted.org/packages/3c/0c/18a9d7f18b5edd37344d1fd5be17e94dc652c67826ab749c6e5948a78112/orjson-3.11.8-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:29c009e7a2ca9ad0ed1376ce20dd692146a5d9fe4310848904b6b4fee5c5c137", size = 132847, upload-time = "2026-03-31T16:15:46.368Z" }, + { url = 
"https://files.pythonhosted.org/packages/23/91/7e722f352ad67ca573cee44de2a58fb810d0f4eb4e33276c6a557979fd8a/orjson-3.11.8-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:705b895b781b3e395c067129d8551655642dfe9437273211d5404e87ac752b53", size = 133637, upload-time = "2026-03-31T16:15:48.123Z" }, + { url = "https://files.pythonhosted.org/packages/af/04/32845ce13ac5bd1046ddb02ac9432ba856cc35f6d74dde95864fe0ad5523/orjson-3.11.8-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:88006eda83858a9fdf73985ce3804e885c2befb2f506c9a3723cdeb5a2880e3e", size = 141906, upload-time = "2026-03-31T16:15:49.626Z" }, + { url = "https://files.pythonhosted.org/packages/02/5e/c551387ddf2d7106d9039369862245c85738b828844d13b99ccb8d61fd06/orjson-3.11.8-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:55120759e61309af7fcf9e961c6f6af3dde5921cdb3ee863ef63fd9db126cae6", size = 423722, upload-time = "2026-03-31T16:15:51.176Z" }, + { url = "https://files.pythonhosted.org/packages/00/a3/ecfe62434096f8a794d4976728cb59bcfc4a643977f21c2040545d37eb4c/orjson-3.11.8-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:98bdc6cb889d19bed01de46e67574a2eab61f5cc6b768ed50e8ac68e9d6ffab6", size = 147801, upload-time = "2026-03-31T16:15:52.939Z" }, + { url = "https://files.pythonhosted.org/packages/18/6d/0dce10b9f6643fdc59d99333871a38fa5a769d8e2fc34a18e5d2bfdee900/orjson-3.11.8-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:708c95f925a43ab9f34625e45dcdadf09ec8a6e7b664a938f2f8d5650f6c090b", size = 136460, upload-time = "2026-03-31T16:15:54.431Z" }, + { url = "https://files.pythonhosted.org/packages/01/d6/6dde4f31842d87099238f1f07b459d24edc1a774d20687187443ab044191/orjson-3.11.8-cp313-cp313-win32.whl", hash = "sha256:01c4e5a6695dc09098f2e6468a251bc4671c50922d4d745aff1a0a33a0cf5b8d", size = 131956, upload-time = "2026-03-31T16:15:56.081Z" }, + { url = 
"https://files.pythonhosted.org/packages/c1/f9/4e494a56e013db957fb77186b818b916d4695b8fa2aa612364974160e91b/orjson-3.11.8-cp313-cp313-win_amd64.whl", hash = "sha256:c154a35dd1330707450bb4d4e7dd1f17fa6f42267a40c1e8a1daa5e13719b4b8", size = 127410, upload-time = "2026-03-31T16:15:57.54Z" }, + { url = "https://files.pythonhosted.org/packages/57/7f/803203d00d6edb6e9e7eef421d4e1adbb5ea973e40b3533f3cfd9aeb374e/orjson-3.11.8-cp313-cp313-win_arm64.whl", hash = "sha256:4861bde57f4d253ab041e374f44023460e60e71efaa121f3c5f0ed457c3a701e", size = 127338, upload-time = "2026-03-31T16:15:59.106Z" }, + { url = "https://files.pythonhosted.org/packages/6d/35/b01910c3d6b85dc882442afe5060cbf719c7d1fc85749294beda23d17873/orjson-3.11.8-cp314-cp314-macosx_10_15_x86_64.macosx_11_0_arm64.macosx_10_15_universal2.whl", hash = "sha256:ec795530a73c269a55130498842aaa762e4a939f6ce481a7e986eeaa790e9da4", size = 229171, upload-time = "2026-03-31T16:16:00.651Z" }, + { url = "https://files.pythonhosted.org/packages/c2/56/c9ec97bd11240abef39b9e5d99a15462809c45f677420fd148a6c5e6295e/orjson-3.11.8-cp314-cp314-macosx_15_0_arm64.whl", hash = "sha256:c492a0e011c0f9066e9ceaa896fbc5b068c54d365fea5f3444b697ee01bc8625", size = 128746, upload-time = "2026-03-31T16:16:02.673Z" }, + { url = "https://files.pythonhosted.org/packages/3b/e4/66d4f30a90de45e2f0cbd9623588e8ae71eef7679dbe2ae954ed6d66a41f/orjson-3.11.8-cp314-cp314-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:883206d55b1bd5f5679ad5e6ddd3d1a5e3cac5190482927fdb8c78fb699193b5", size = 131867, upload-time = "2026-03-31T16:16:04.342Z" }, + { url = "https://files.pythonhosted.org/packages/19/30/2a645fc9286b928675e43fa2a3a16fb7b6764aa78cc719dc82141e00f30b/orjson-3.11.8-cp314-cp314-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:5774c1fdcc98b2259800b683b19599c133baeb11d60033e2095fd9d4667b82db", size = 124664, upload-time = "2026-03-31T16:16:05.837Z" }, + { url = 
"https://files.pythonhosted.org/packages/db/44/77b9a86d84a28d52ba3316d77737f6514e17118119ade3f91b639e859029/orjson-3.11.8-cp314-cp314-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:8ac7381c83dd3d4a6347e6635950aa448f54e7b8406a27c7ecb4a37e9f1ae08b", size = 129701, upload-time = "2026-03-31T16:16:07.407Z" }, + { url = "https://files.pythonhosted.org/packages/b3/ea/eff3d9bfe47e9bc6969c9181c58d9f71237f923f9c86a2d2f490cd898c82/orjson-3.11.8-cp314-cp314-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:14439063aebcb92401c11afc68ee4e407258d2752e62d748b6942dad20d2a70d", size = 141202, upload-time = "2026-03-31T16:16:09.48Z" }, + { url = "https://files.pythonhosted.org/packages/52/c8/90d4b4c60c84d62068d0cf9e4d8f0a4e05e76971d133ac0c60d818d4db20/orjson-3.11.8-cp314-cp314-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:fa72e71977bff96567b0f500fc5bfd2fdf915f34052c782a4c6ebbdaa97aa858", size = 127194, upload-time = "2026-03-31T16:16:11.02Z" }, + { url = "https://files.pythonhosted.org/packages/8d/c7/ea9e08d1f0ba981adffb629811148b44774d935171e7b3d780ae43c4c254/orjson-3.11.8-cp314-cp314-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7679bc2f01bb0d219758f1a5f87bb7c8a81c0a186824a393b366876b4948e14f", size = 133639, upload-time = "2026-03-31T16:16:13.434Z" }, + { url = "https://files.pythonhosted.org/packages/6c/8c/ddbbfd6ba59453c8fc7fe1d0e5983895864e264c37481b2a791db635f046/orjson-3.11.8-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:14f7b8fcb35ef403b42fa5ecfa4ed032332a91f3dc7368fbce4184d59e1eae0d", size = 141914, upload-time = "2026-03-31T16:16:14.955Z" }, + { url = "https://files.pythonhosted.org/packages/4e/31/dbfbefec9df060d34ef4962cd0afcb6fa7a9ec65884cb78f04a7859526c3/orjson-3.11.8-cp314-cp314-musllinux_1_2_armv7l.whl", hash = "sha256:c2bdf7b2facc80b5e34f48a2d557727d5c5c57a8a450de122ae81fa26a81c1bc", size = 423800, upload-time = "2026-03-31T16:16:16.594Z" }, + { url = 
"https://files.pythonhosted.org/packages/87/cf/f74e9ae9803d4ab46b163494adba636c6d7ea955af5cc23b8aaa94cfd528/orjson-3.11.8-cp314-cp314-musllinux_1_2_i686.whl", hash = "sha256:ccd7ba1b0605813a0715171d39ec4c314cb97a9c85893c2c5c0c3a3729df38bf", size = 147837, upload-time = "2026-03-31T16:16:18.585Z" }, + { url = "https://files.pythonhosted.org/packages/64/e6/9214f017b5db85e84e68602792f742e5dc5249e963503d1b356bee611e01/orjson-3.11.8-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:cdbc8c9c02463fef4d3c53a9ba3336d05496ec8e1f1c53326a1e4acc11f5c600", size = 136441, upload-time = "2026-03-31T16:16:20.151Z" }, + { url = "https://files.pythonhosted.org/packages/24/dd/3590348818f58f837a75fb969b04cdf187ae197e14d60b5e5a794a38b79d/orjson-3.11.8-cp314-cp314-win32.whl", hash = "sha256:0b57f67710a8cd459e4e54eb96d5f77f3624eba0c661ba19a525807e42eccade", size = 131983, upload-time = "2026-03-31T16:16:21.823Z" }, + { url = "https://files.pythonhosted.org/packages/3f/0f/b6cb692116e05d058f31ceee819c70f097fa9167c82f67fabe7516289abc/orjson-3.11.8-cp314-cp314-win_amd64.whl", hash = "sha256:735e2262363dcbe05c35e3a8869898022af78f89dde9e256924dc02e99fe69ca", size = 127396, upload-time = "2026-03-31T16:16:23.685Z" }, + { url = "https://files.pythonhosted.org/packages/c0/d1/facb5b5051fabb0ef9d26c6544d87ef19a939a9a001198655d0d891062dd/orjson-3.11.8-cp314-cp314-win_arm64.whl", hash = "sha256:6ccdea2c213cf9f3d9490cbd5d427693c870753df41e6cb375bd79bcbafc8817", size = 127330, upload-time = "2026-03-31T16:16:25.496Z" }, +] + +[[package]] +name = "ormsgpack" +version = "1.12.2" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/12/0c/f1761e21486942ab9bb6feaebc610fa074f7c5e496e6962dea5873348077/ormsgpack-1.12.2.tar.gz", hash = "sha256:944a2233640273bee67521795a73cf1e959538e0dfb7ac635505010455e53b33", size = 39031, upload-time = "2026-01-18T20:55:28.023Z" } +wheels = [ + { url = 
"https://files.pythonhosted.org/packages/93/fa/a91f70829ebccf6387c4946e0a1a109f6ba0d6a28d65f628bedfad94b890/ormsgpack-1.12.2-cp310-cp310-macosx_10_12_x86_64.macosx_11_0_arm64.macosx_10_12_universal2.whl", hash = "sha256:c1429217f8f4d7fcb053523bbbac6bed5e981af0b85ba616e6df7cce53c19657", size = 378262, upload-time = "2026-01-18T20:55:22.284Z" }, + { url = "https://files.pythonhosted.org/packages/5f/62/3698a9a0c487252b5c6a91926e5654e79e665708ea61f67a8bdeceb022bf/ormsgpack-1.12.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5f13034dc6c84a6280c6c33db7ac420253852ea233fc3ee27c8875f8dd651163", size = 203034, upload-time = "2026-01-18T20:55:53.324Z" }, + { url = "https://files.pythonhosted.org/packages/66/3a/f716f64edc4aec2744e817660b317e2f9bb8de372338a95a96198efa1ac1/ormsgpack-1.12.2-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:59f5da97000c12bc2d50e988bdc8576b21f6ab4e608489879d35b2c07a8ab51a", size = 210538, upload-time = "2026-01-18T20:55:20.097Z" }, + { url = "https://files.pythonhosted.org/packages/72/30/a436be9ce27d693d4e19fa94900028067133779f09fc45776db3f689c822/ormsgpack-1.12.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9e4459c3f27066beadb2b81ea48a076a417aafffff7df1d3c11c519190ed44f2", size = 212401, upload-time = "2026-01-18T20:55:46.447Z" }, + { url = "https://files.pythonhosted.org/packages/10/c5/cde98300fd33fee84ca71de4751b19aeeca675f0cf3c0ec4b043f40f3b76/ormsgpack-1.12.2-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:7a1c460655d7288407ffa09065e322a7231997c0d62ce914bf3a96ad2dc6dedd", size = 387080, upload-time = "2026-01-18T20:56:00.884Z" }, + { url = "https://files.pythonhosted.org/packages/6a/31/30bf445ef827546747c10889dd254b3d84f92b591300efe4979d792f4c41/ormsgpack-1.12.2-cp310-cp310-musllinux_1_2_armv7l.whl", hash = "sha256:458e4568be13d311ef7d8877275e7ccbe06c0e01b39baaac874caaa0f46d826c", size = 482346, upload-time = "2026-01-18T20:55:39.831Z" }, + { url = 
"https://files.pythonhosted.org/packages/2e/f5/e1745ddf4fa246c921b5ca253636c4c700ff768d78032f79171289159f6e/ormsgpack-1.12.2-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:8cde5eaa6c6cbc8622db71e4a23de56828e3d876aeb6460ffbcb5b8aff91093b", size = 425178, upload-time = "2026-01-18T20:55:27.106Z" }, + { url = "https://files.pythonhosted.org/packages/8d/a2/e6532ed7716aed03dede8df2d0d0d4150710c2122647d94b474147ccd891/ormsgpack-1.12.2-cp310-cp310-win_amd64.whl", hash = "sha256:dc7a33be14c347893edbb1ceda89afbf14c467d593a5ee92c11de4f1666b4d4f", size = 117183, upload-time = "2026-01-18T20:55:55.52Z" }, + { url = "https://files.pythonhosted.org/packages/4b/08/8b68f24b18e69d92238aa8f258218e6dfeacf4381d9d07ab8df303f524a9/ormsgpack-1.12.2-cp311-cp311-macosx_10_12_x86_64.macosx_11_0_arm64.macosx_10_12_universal2.whl", hash = "sha256:bd5f4bf04c37888e864f08e740c5a573c4017f6fd6e99fa944c5c935fabf2dd9", size = 378266, upload-time = "2026-01-18T20:55:59.876Z" }, + { url = "https://files.pythonhosted.org/packages/0d/24/29fc13044ecb7c153523ae0a1972269fcd613650d1fa1a9cec1044c6b666/ormsgpack-1.12.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:34d5b28b3570e9fed9a5a76528fc7230c3c76333bc214798958e58e9b79cc18a", size = 203035, upload-time = "2026-01-18T20:55:30.59Z" }, + { url = "https://files.pythonhosted.org/packages/ad/c2/00169fb25dd8f9213f5e8a549dfb73e4d592009ebc85fbbcd3e1dcac575b/ormsgpack-1.12.2-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:3708693412c28f3538fb5a65da93787b6bbab3484f6bc6e935bfb77a62400ae5", size = 210539, upload-time = "2026-01-18T20:55:48.569Z" }, + { url = "https://files.pythonhosted.org/packages/1b/33/543627f323ff3c73091f51d6a20db28a1a33531af30873ea90c5ac95a9b5/ormsgpack-1.12.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:43013a3f3e2e902e1d05e72c0f1aeb5bedbb8e09240b51e26792a3c89267e181", size = 212401, upload-time = "2026-01-18T20:56:10.101Z" }, + { url = 
"https://files.pythonhosted.org/packages/e8/5d/f70e2c3da414f46186659d24745483757bcc9adccb481a6eb93e2b729301/ormsgpack-1.12.2-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:7c8b1667a72cbba74f0ae7ecf3105a5e01304620ed14528b2cb4320679d2869b", size = 387082, upload-time = "2026-01-18T20:56:12.047Z" }, + { url = "https://files.pythonhosted.org/packages/c0/d6/06e8dc920c7903e051f30934d874d4afccc9bb1c09dcaf0bc03a7de4b343/ormsgpack-1.12.2-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:df6961442140193e517303d0b5d7bc2e20e69a879c2d774316125350c4a76b92", size = 482346, upload-time = "2026-01-18T20:56:05.152Z" }, + { url = "https://files.pythonhosted.org/packages/66/c4/f337ac0905eed9c393ef990c54565cd33644918e0a8031fe48c098c71dbf/ormsgpack-1.12.2-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:c6a4c34ddef109647c769d69be65fa1de7a6022b02ad45546a69b3216573eb4a", size = 425181, upload-time = "2026-01-18T20:55:37.83Z" }, + { url = "https://files.pythonhosted.org/packages/78/29/6d5758fabef3babdf4bbbc453738cc7de9cd3334e4c38dd5737e27b85653/ormsgpack-1.12.2-cp311-cp311-win_amd64.whl", hash = "sha256:73670ed0375ecc303858e3613f407628dd1fca18fe6ac57b7b7ce66cc7bb006c", size = 117182, upload-time = "2026-01-18T20:55:31.472Z" }, + { url = "https://files.pythonhosted.org/packages/c4/57/17a15549233c37e7fd054c48fe9207492e06b026dbd872b826a0b5f833b6/ormsgpack-1.12.2-cp311-cp311-win_arm64.whl", hash = "sha256:c2be829954434e33601ae5da328cccce3266b098927ca7a30246a0baec2ce7bd", size = 111464, upload-time = "2026-01-18T20:55:38.811Z" }, + { url = "https://files.pythonhosted.org/packages/4c/36/16c4b1921c308a92cef3bf6663226ae283395aa0ff6e154f925c32e91ff5/ormsgpack-1.12.2-cp312-cp312-macosx_10_12_x86_64.macosx_11_0_arm64.macosx_10_12_universal2.whl", hash = "sha256:7a29d09b64b9694b588ff2f80e9826bdceb3a2b91523c5beae1fab27d5c940e7", size = 378618, upload-time = "2026-01-18T20:55:50.835Z" }, + { url = 
"https://files.pythonhosted.org/packages/c0/68/468de634079615abf66ed13bb5c34ff71da237213f29294363beeeca5306/ormsgpack-1.12.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0b39e629fd2e1c5b2f46f99778450b59454d1f901bc507963168985e79f09c5d", size = 203186, upload-time = "2026-01-18T20:56:11.163Z" }, + { url = "https://files.pythonhosted.org/packages/73/a9/d756e01961442688b7939bacd87ce13bfad7d26ce24f910f6028178b2cc8/ormsgpack-1.12.2-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:958dcb270d30a7cb633a45ee62b9444433fa571a752d2ca484efdac07480876e", size = 210738, upload-time = "2026-01-18T20:56:09.181Z" }, + { url = "https://files.pythonhosted.org/packages/7b/ba/795b1036888542c9113269a3f5690ab53dd2258c6fb17676ac4bd44fcf94/ormsgpack-1.12.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:58d379d72b6c5e964851c77cfedfb386e474adee4fd39791c2c5d9efb53505cc", size = 212569, upload-time = "2026-01-18T20:56:06.135Z" }, + { url = "https://files.pythonhosted.org/packages/6c/aa/bff73c57497b9e0cba8837c7e4bcab584b1a6dbc91a5dd5526784a5030c8/ormsgpack-1.12.2-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:8463a3fc5f09832e67bdb0e2fda6d518dc4281b133166146a67f54c08496442e", size = 387166, upload-time = "2026-01-18T20:55:36.738Z" }, + { url = "https://files.pythonhosted.org/packages/d3/cf/f8283cba44bcb7b14f97b6274d449db276b3a86589bdb363169b51bc12de/ormsgpack-1.12.2-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:eddffb77eff0bad4e67547d67a130604e7e2dfbb7b0cde0796045be4090f35c6", size = 482498, upload-time = "2026-01-18T20:55:29.626Z" }, + { url = "https://files.pythonhosted.org/packages/05/be/71e37b852d723dfcbe952ad04178c030df60d6b78eba26bfd14c9a40575e/ormsgpack-1.12.2-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:fcd55e5f6ba0dbce624942adf9f152062135f991a0126064889f68eb850de0dd", size = 425518, upload-time = "2026-01-18T20:55:49.556Z" }, + { url = 
"https://files.pythonhosted.org/packages/7a/0c/9803aa883d18c7ef197213cd2cbf73ba76472a11fe100fb7dab2884edf48/ormsgpack-1.12.2-cp312-cp312-win_amd64.whl", hash = "sha256:d024b40828f1dde5654faebd0d824f9cc29ad46891f626272dd5bfd7af2333a4", size = 117462, upload-time = "2026-01-18T20:55:47.726Z" }, + { url = "https://files.pythonhosted.org/packages/c8/9e/029e898298b2cc662f10d7a15652a53e3b525b1e7f07e21fef8536a09bb8/ormsgpack-1.12.2-cp312-cp312-win_arm64.whl", hash = "sha256:da538c542bac7d1c8f3f2a937863dba36f013108ce63e55745941dda4b75dbb6", size = 111559, upload-time = "2026-01-18T20:55:54.273Z" }, + { url = "https://files.pythonhosted.org/packages/eb/29/bb0eba3288c0449efbb013e9c6f58aea79cf5cb9ee1921f8865f04c1a9d7/ormsgpack-1.12.2-cp313-cp313-macosx_10_12_x86_64.macosx_11_0_arm64.macosx_10_12_universal2.whl", hash = "sha256:5ea60cb5f210b1cfbad8c002948d73447508e629ec375acb82910e3efa8ff355", size = 378661, upload-time = "2026-01-18T20:55:57.765Z" }, + { url = "https://files.pythonhosted.org/packages/6e/31/5efa31346affdac489acade2926989e019e8ca98129658a183e3add7af5e/ormsgpack-1.12.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f3601f19afdbea273ed70b06495e5794606a8b690a568d6c996a90d7255e51c1", size = 203194, upload-time = "2026-01-18T20:56:08.252Z" }, + { url = "https://files.pythonhosted.org/packages/eb/56/d0087278beef833187e0167f8527235ebe6f6ffc2a143e9de12a98b1ce87/ormsgpack-1.12.2-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:29a9f17a3dac6054c0dce7925e0f4995c727f7c41859adf9b5572180f640d172", size = 210778, upload-time = "2026-01-18T20:55:17.694Z" }, + { url = "https://files.pythonhosted.org/packages/1c/a2/072343e1413d9443e5a252a8eb591c2d5b1bffbe5e7bfc78c069361b92eb/ormsgpack-1.12.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:39c1bd2092880e413902910388be8715f70b9f15f20779d44e673033a6146f2d", size = 212592, upload-time = "2026-01-18T20:55:32.747Z" }, + { url = 
"https://files.pythonhosted.org/packages/a2/8b/a0da3b98a91d41187a63b02dda14267eefc2a74fcb43cc2701066cf1510e/ormsgpack-1.12.2-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:50b7249244382209877deedeee838aef1542f3d0fc28b8fe71ca9d7e1896a0d7", size = 387164, upload-time = "2026-01-18T20:55:40.853Z" }, + { url = "https://files.pythonhosted.org/packages/19/bb/6d226bc4cf9fc20d8eb1d976d027a3f7c3491e8f08289a2e76abe96a65f3/ormsgpack-1.12.2-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:5af04800d844451cf102a59c74a841324868d3f1625c296a06cc655c542a6685", size = 482516, upload-time = "2026-01-18T20:55:42.033Z" }, + { url = "https://files.pythonhosted.org/packages/fb/f1/bb2c7223398543dedb3dbf8bb93aaa737b387de61c5feaad6f908841b782/ormsgpack-1.12.2-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:cec70477d4371cd524534cd16472d8b9cc187e0e3043a8790545a9a9b296c258", size = 425539, upload-time = "2026-01-18T20:55:24.727Z" }, + { url = "https://files.pythonhosted.org/packages/7b/e8/0fb45f57a2ada1fed374f7494c8cd55e2f88ccd0ab0a669aa3468716bf5f/ormsgpack-1.12.2-cp313-cp313-win_amd64.whl", hash = "sha256:21f4276caca5c03a818041d637e4019bc84f9d6ca8baa5ea03e5cc8bf56140e9", size = 117459, upload-time = "2026-01-18T20:55:56.876Z" }, + { url = "https://files.pythonhosted.org/packages/7a/d4/0cfeea1e960d550a131001a7f38a5132c7ae3ebde4c82af1f364ccc5d904/ormsgpack-1.12.2-cp313-cp313-win_arm64.whl", hash = "sha256:baca4b6773d20a82e36d6fd25f341064244f9f86a13dead95dd7d7f996f51709", size = 111577, upload-time = "2026-01-18T20:55:43.605Z" }, + { url = "https://files.pythonhosted.org/packages/94/16/24d18851334be09c25e87f74307c84950f18c324a4d3c0b41dabdbf19c29/ormsgpack-1.12.2-cp314-cp314-macosx_10_12_x86_64.macosx_11_0_arm64.macosx_10_12_universal2.whl", hash = "sha256:bc68dd5915f4acf66ff2010ee47c8906dc1cf07399b16f4089f8c71733f6e36c", size = 378717, upload-time = "2026-01-18T20:55:26.164Z" }, + { url = 
"https://files.pythonhosted.org/packages/b5/a2/88b9b56f83adae8032ac6a6fa7f080c65b3baf9b6b64fd3d37bd202991d4/ormsgpack-1.12.2-cp314-cp314-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:46d084427b4132553940070ad95107266656cb646ea9da4975f85cb1a6676553", size = 203183, upload-time = "2026-01-18T20:55:18.815Z" }, + { url = "https://files.pythonhosted.org/packages/a9/80/43e4555963bf602e5bdc79cbc8debd8b6d5456c00d2504df9775e74b450b/ormsgpack-1.12.2-cp314-cp314-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:c010da16235806cf1d7bc4c96bf286bfa91c686853395a299b3ddb49499a3e13", size = 210814, upload-time = "2026-01-18T20:55:33.973Z" }, + { url = "https://files.pythonhosted.org/packages/78/e1/7cfbf28de8bca6efe7e525b329c31277d1b64ce08dcba723971c241a9d60/ormsgpack-1.12.2-cp314-cp314-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:18867233df592c997154ff942a6503df274b5ac1765215bceba7a231bea2745d", size = 212634, upload-time = "2026-01-18T20:55:28.634Z" }, + { url = "https://files.pythonhosted.org/packages/95/f8/30ae5716e88d792a4e879debee195653c26ddd3964c968594ddef0a3cc7e/ormsgpack-1.12.2-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:b009049086ddc6b8f80c76b3955df1aa22a5fbd7673c525cd63bf91f23122ede", size = 387139, upload-time = "2026-01-18T20:56:02.013Z" }, + { url = "https://files.pythonhosted.org/packages/dc/81/aee5b18a3e3a0e52f718b37ab4b8af6fae0d9d6a65103036a90c2a8ffb5d/ormsgpack-1.12.2-cp314-cp314-musllinux_1_2_armv7l.whl", hash = "sha256:1dcc17d92b6390d4f18f937cf0b99054824a7815818012ddca925d6e01c2e49e", size = 482578, upload-time = "2026-01-18T20:55:35.117Z" }, + { url = "https://files.pythonhosted.org/packages/bd/17/71c9ba472d5d45f7546317f467a5fc941929cd68fb32796ca3d13dcbaec2/ormsgpack-1.12.2-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:f04b5e896d510b07c0ad733d7fce2d44b260c5e6c402d272128f8941984e4285", size = 425539, upload-time = "2026-01-18T20:56:04.009Z" }, + { url = 
"https://files.pythonhosted.org/packages/2e/a6/ac99cd7fe77e822fed5250ff4b86fa66dd4238937dd178d2299f10b69816/ormsgpack-1.12.2-cp314-cp314-win_amd64.whl", hash = "sha256:ae3aba7eed4ca7cb79fd3436eddd29140f17ea254b91604aa1eb19bfcedb990f", size = 117493, upload-time = "2026-01-18T20:56:07.343Z" }, + { url = "https://files.pythonhosted.org/packages/3a/67/339872846a1ae4592535385a1c1f93614138566d7af094200c9c3b45d1e5/ormsgpack-1.12.2-cp314-cp314-win_arm64.whl", hash = "sha256:118576ea6006893aea811b17429bfc561b4778fad393f5f538c84af70b01260c", size = 111579, upload-time = "2026-01-18T20:55:21.161Z" }, + { url = "https://files.pythonhosted.org/packages/49/c2/6feb972dc87285ad381749d3882d8aecbde9f6ecf908dd717d33d66df095/ormsgpack-1.12.2-cp314-cp314t-macosx_10_12_x86_64.macosx_11_0_arm64.macosx_10_12_universal2.whl", hash = "sha256:7121b3d355d3858781dc40dafe25a32ff8a8242b9d80c692fd548a4b1f7fd3c8", size = 378721, upload-time = "2026-01-18T20:55:52.12Z" }, + { url = "https://files.pythonhosted.org/packages/a3/9a/900a6b9b413e0f8a471cf07830f9cf65939af039a362204b36bd5b581d8b/ormsgpack-1.12.2-cp314-cp314t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4ee766d2e78251b7a63daf1cddfac36a73562d3ddef68cacfb41b2af64698033", size = 203170, upload-time = "2026-01-18T20:55:44.469Z" }, + { url = "https://files.pythonhosted.org/packages/87/4c/27a95466354606b256f24fad464d7c97ab62bce6cc529dd4673e1179b8fb/ormsgpack-1.12.2-cp314-cp314t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:292410a7d23de9b40444636b9b8f1e4e4b814af7f1ef476e44887e52a123f09d", size = 212816, upload-time = "2026-01-18T20:55:23.501Z" }, + { url = "https://files.pythonhosted.org/packages/73/cd/29cee6007bddf7a834e6cd6f536754c0535fcb939d384f0f37a38b1cddb8/ormsgpack-1.12.2-cp314-cp314t-win_amd64.whl", hash = "sha256:837dd316584485b72ef451d08dd3e96c4a11d12e4963aedb40e08f89685d8ec2", size = 117232, upload-time = "2026-01-18T20:55:45.448Z" }, ] [[package]] @@ -2510,7 +2731,7 @@ wheels = [ [[package]] 
name = "temporalio" version = "1.26.0" -source = { registry = "https://pypi.org/simple" } +source = { git = "https://github.com/temporalio/sdk-python.git?branch=main#8a6d0e0f329280df2e9f9f1bc7ab4835ee7a3462" } dependencies = [ { name = "nexus-rpc" }, { name = "protobuf" }, @@ -2518,16 +2739,11 @@ dependencies = [ { name = "types-protobuf" }, { name = "typing-extensions" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/ae/d4/fa21150a225393f87732ed6fef3cc9735d9e751edc6be415fe6e375105c6/temporalio-1.26.0.tar.gz", hash = "sha256:f4bfb35125e6f5e8c7f7ed1277c7354d812c6fac7ed5f8dbd50536cf289aaaa7", size = 2388994, upload-time = "2026-04-15T23:43:00.911Z" } -wheels = [ - { url = "https://files.pythonhosted.org/packages/1e/27/8c421c622d18cc8e034247d5d72b89e6456937344b5bec1de40abef3c085/temporalio-1.26.0-cp310-abi3-macosx_10_12_x86_64.whl", hash = "sha256:5489040c0cf621edeb36984199dd9e4fbd2b3a07d61a4f2a8da1f2cb9820ef26", size = 14221070, upload-time = "2026-04-15T23:42:26.21Z" }, - { url = "https://files.pythonhosted.org/packages/49/7c/d2b691d16ec5db87198c2e08dbfba58e286c096faee15753613a581abdce/temporalio-1.26.0-cp310-abi3-macosx_11_0_arm64.whl", hash = "sha256:b18dd85771509c19ef059a31908bcd4e6130d1f67037c4db519702f3f2ad6d4a", size = 13583991, upload-time = "2026-04-15T23:42:34.357Z" }, - { url = "https://files.pythonhosted.org/packages/05/ca/b8728451320ca9d8bb6e1680b9bd23767118f86d5b8644edf2304d533f1b/temporalio-1.26.0-cp310-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:46187d5f82ca2ae81f35ea5916a76db0e2f067210dc6b1852c3749475721946e", size = 13808036, upload-time = "2026-04-15T23:42:42.757Z" }, - { url = "https://files.pythonhosted.org/packages/cb/54/3113f5e0ac58655790abac64656373e06191b351d74bfb94692e81bd6784/temporalio-1.26.0-cp310-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:03300c3e5237443367ac61bb20bd726c656b3daa50310bdd436599d5bdc7cf97", size = 14336604, upload-time = "2026-04-15T23:42:49.851Z" }, - { url 
= "https://files.pythonhosted.org/packages/fd/9b/c50840a26af3587c0c8d9af04d9976743e22496996dc1a377efc75dcd316/temporalio-1.26.0-cp310-abi3-win_amd64.whl", hash = "sha256:1c4a0d82f0a3796cbf78864c799f8dca0b94cdaec68e7b8b224c859005686ec4", size = 14525849, upload-time = "2026-04-15T23:42:57.589Z" }, -] [package.optional-dependencies] +langgraph = [ + { name = "langgraph" }, +] langsmith = [ { name = "langsmith" }, ] @@ -2584,6 +2800,12 @@ encryption = [ gevent = [ { name = "gevent" }, ] +langgraph = [ + { name = "langchain" }, + { name = "langchain-anthropic" }, + { name = "langgraph" }, + { name = "temporalio", extra = ["langgraph", "langsmith"] }, +] langsmith-tracing = [ { name = "langsmith" }, { name = "openai" }, @@ -2613,7 +2835,7 @@ trio-async = [ ] [package.metadata] -requires-dist = [{ name = "temporalio", specifier = ">=1.26.0,<2" }] +requires-dist = [{ name = "temporalio", git = "https://github.com/temporalio/sdk-python.git?branch=main" }] [package.metadata.requires-dev] bedrock = [{ name = "boto3", specifier = ">=1.34.92,<2" }] @@ -2644,20 +2866,26 @@ encryption = [ { name = "cryptography", specifier = ">=38.0.1,<39" }, ] gevent = [{ name = "gevent", marker = "python_full_version >= '3.8'", specifier = ">=25.4.2" }] +langgraph = [ + { name = "langchain", specifier = ">=0.3.0" }, + { name = "langchain-anthropic", specifier = ">=0.3.0" }, + { name = "langgraph", specifier = ">=1.1.3" }, + { name = "temporalio", extras = ["langgraph", "langsmith"], git = "https://github.com/temporalio/sdk-python.git?branch=main" }, +] langsmith-tracing = [ { name = "langsmith", specifier = ">=0.7.0" }, { name = "openai", specifier = ">=1.4.0" }, - { name = "temporalio", extras = ["pydantic", "langsmith"], specifier = ">=1.26.0" }, + { name = "temporalio", extras = ["pydantic", "langsmith"], git = "https://github.com/temporalio/sdk-python.git?branch=main" }, ] nexus = [{ name = "nexus-rpc", specifier = ">=1.1.0,<2" }] open-telemetry = [ { name = 
"opentelemetry-exporter-otlp-proto-grpc" }, - { name = "temporalio", extras = ["opentelemetry"] }, + { name = "temporalio", extras = ["opentelemetry"], git = "https://github.com/temporalio/sdk-python.git?branch=main" }, ] openai-agents = [ { name = "openai-agents", extras = ["litellm"], specifier = ">=0.14.1" }, { name = "requests", specifier = ">=2.32.0,<3" }, - { name = "temporalio", extras = ["openai-agents", "opentelemetry"], specifier = ">=1.26.0" }, + { name = "temporalio", extras = ["openai-agents", "opentelemetry"], git = "https://github.com/temporalio/sdk-python.git?branch=main" }, ] pydantic-converter = [{ name = "pydantic", specifier = ">=2.10.6,<3" }] sentry = [{ name = "sentry-sdk", specifier = ">=2.13.0" }] @@ -2666,6 +2894,15 @@ trio-async = [ { name = "trio-asyncio", specifier = ">=0.15.0,<0.16" }, ] +[[package]] +name = "tenacity" +version = "9.1.4" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/47/c6/ee486fd809e357697ee8a44d3d69222b344920433d3b6666ccd9b374630c/tenacity-9.1.4.tar.gz", hash = "sha256:adb31d4c263f2bd041081ab33b498309a57c77f9acf2db65aadf0898179cf93a", size = 49413, upload-time = "2026-02-07T10:45:33.841Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/d7/c1/eb8f9debc45d3b7918a32ab756658a0904732f75e555402972246b0b8e71/tenacity-9.1.4-py3-none-any.whl", hash = "sha256:6095a360c919085f28c6527de529e76a06ad89b23659fa881ae0649b867a9d55", size = 28926, upload-time = "2026-02-07T10:45:32.24Z" }, +] + [[package]] name = "tiktoken" version = "0.12.0"