
Quickstart

Two tutorials to get your first Dockrion agent running in under 5 minutes.

Prerequisites:

  • Python 3.11+
  • pip (or uv)
  • Docker (for dockrion build only)

Install the package:
pip install dockrion

See Installation & Extras for optional packages.


Scaffold a new project:
mkdir my-agent && cd my-agent
dockrion init my-agent --handler

This creates a Dockfile.yaml pre-configured for handler mode with framework: custom.

Create app/service.py:

def handle(payload: dict) -> dict:
    query = payload.get("query", "")
    return {"answer": f"Echo: {query}"}
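
Because handler mode wraps a plain function, you can sanity-check it before involving Dockrion at all (the handler is repeated here so the snippet runs standalone):

```python
# Same handler as in app/service.py, repeated so this snippet is self-contained.
def handle(payload: dict) -> dict:
    query = payload.get("query", "")
    return {"answer": f"Echo: {query}"}

print(handle({"query": "hello"}))  # {'answer': 'Echo: hello'}
print(handle({}))                  # {'answer': 'Echo: '}
```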

Update Dockfile.yaml to point to your handler:

version: "1.0"
agent:
  name: my-agent
  handler: app.service:handle
io_schema:
  input:
    type: object
    properties:
      query:
        type: string
    required: [query]
  output:
    type: object
    properties:
      answer:
        type: string
expose:
  rest: true
  streaming: none
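
The io_schema block follows JSON Schema conventions, so `required: [query]` means a request without a `query` field is rejected before your handler runs. A hand-rolled sketch of that check, for illustration only (Dockrion validates against the schema itself):

```python
def check_input(payload: dict) -> list:
    """Illustrative stand-in for the input-schema check implied above."""
    errors = []
    if "query" not in payload:
        errors.append("'query' is a required property")
    elif not isinstance(payload["query"], str):
        errors.append("'query' is not of type 'string'")
    return errors

print(check_input({"query": "hello"}))  # []
print(check_input({}))                  # ["'query' is a required property"]
```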

Validate the configuration:
dockrion validate

Expected output:

✓ Dockfile is valid
Agent: my-agent
Mode: handler (app.service:handle)
Framework: custom

Start the server:
dockrion run

The server starts on http://0.0.0.0:8080. You’ll see uvicorn startup logs.

Send a request:
curl -X POST http://localhost:8080/invoke \
  -H "Content-Type: application/json" \
  -d '{"query": "hello"}'

Response:

{
  "success": true,
  "output": { "answer": "Echo: hello" },
  "metadata": {}
}
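
In a Python client you would parse that JSON body before reading the answer. The values below are copied from the example response; the envelope shape (`success`, `output`, `metadata`) is assumed to be stable:

```python
import json

# The /invoke response body from the example above, as a raw string.
raw = '{"success": true, "output": {"answer": "Echo: hello"}, "metadata": {}}'
resp = json.loads(raw)

if resp["success"]:
    print(resp["output"]["answer"])  # Echo: hello
```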

Open http://localhost:8080/docs in your browser. You’ll see all endpoints, with the /invoke request body shaped by your io_schema.


Install the LangGraph extra and scaffold a project:
pip install "dockrion[langgraph]"
mkdir my-graph-agent && cd my-graph-agent
dockrion init my-graph-agent --framework langgraph

Create app/graph.py:

from typing import TypedDict

from langgraph.graph import StateGraph, END

# StateGraph needs a schema with annotated keys, so use a TypedDict
# rather than a bare dict subclass.
class State(TypedDict, total=False):
    query: str
    answer: str

def create_graph():
    def process(state: State) -> State:
        query = state.get("query", "")
        # Return a partial update; LangGraph merges it into the state.
        return {"answer": f"Processed: {query}"}

    graph = StateGraph(State)
    graph.add_node("process", process)
    graph.set_entry_point("process")
    graph.add_edge("process", END)
    return graph.compile()

Update Dockfile.yaml:

version: "1.0"
agent:
  name: my-graph-agent
  entrypoint: app.graph:create_graph
  framework: langgraph
io_schema:
  input:
    type: object
    properties:
      query:
        type: string
  output:
    type: object
expose:
  rest: true

Validate and run as before:
dockrion validate
dockrion run

Then invoke it:
curl -X POST http://localhost:8080/invoke \
  -H "Content-Type: application/json" \
  -d '{"query": "hello from langgraph"}'

In entrypoint mode, create_graph() is called once at startup. The returned compiled graph’s .invoke() method handles each request. In handler mode, your function is called directly for every request.
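
The two modes can be sketched in plain Python, with hypothetical names and no Dockrion or LangGraph required:

```python
# Handler mode: this function itself is called for every request.
def handle(payload: dict) -> dict:
    return {"answer": f"Echo: {payload.get('query', '')}"}

# Entrypoint mode: a factory runs once at startup and returns an object
# whose .invoke() is then called per request, like a compiled graph.
class CompiledAgent:
    def invoke(self, state: dict) -> dict:
        return {"answer": f"Processed: {state.get('query', '')}"}

def create_agent() -> CompiledAgent:
    return CompiledAgent()

agent = create_agent()                 # startup: called once
print(agent.invoke({"query": "hi"}))   # per request: {'answer': 'Processed: hi'}
print(handle({"query": "hi"}))         # per request: {'answer': 'Echo: hi'}
```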


To invoke your agent without starting an HTTP server:

dockrion test -p '{"query": "quick test"}'

This uses invoke_local() from the SDK: it loads your Dockfile, imports your handler or entrypoint, and calls it directly.
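
The `module:attribute` strings used throughout (e.g. `app.service:handle`) resolve via a standard dynamic import. A sketch of that step, using a stdlib target since `app.service` only exists inside your project (the SDK's actual loader may differ):

```python
import importlib

def load_target(spec: str):
    """Resolve a 'module:attribute' string to the object it names."""
    module_name, _, attr = spec.partition(":")
    module = importlib.import_module(module_name)
    return getattr(module, attr)

# Demonstrated against the stdlib, since app.service is project-local:
dumps = load_target("json:dumps")
print(dumps({"query": "quick test"}))  # {"query": "quick test"}
```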

