Copy-paste recipes for common use cases. Each cookbook is a self-contained example you can adapt to your own agent architecture.
Build a LangChain agent that can autonomously access premium paywalled APIs. The agent decides when to use the tool, and smart_fetch handles the payment negotiation transparently.
Create a custom LangChain @tool that wraps smart_fetch. The LLM will invoke this tool whenever it needs premium data.
```python
from urllib.parse import quote_plus

from langchain.tools import tool
from modexia import ModexiaClient

client = ModexiaClient(api_key="mx_test_...")

@tool
def fetch_premium_news(query: str) -> str:
    """Fetch premium financial news articles. Automatically pays
    if the API requires payment via the x402 protocol."""
    # URL-encode the query so spaces and special characters don't break the request
    url = f"https://api.premium-news.com/v1/search?q={quote_plus(query)}"
    response = client.smart_fetch(url)
    if response.ok:
        return response.json()["summary"]
    return f"Error: {response.status_code}"
```

Pass the tool to any LangChain agent. The LLM decides when to invoke it based on the user query.
```python
from langchain_openai import ChatOpenAI
from langchain.agents import AgentExecutor, create_openai_tools_agent
from langchain_core.prompts import ChatPromptTemplate

llm = ChatOpenAI(model="gpt-4o")

prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a financial research assistant. "
               "Use the fetch_premium_news tool to find relevant data."),
    ("human", "{input}"),
    ("placeholder", "{agent_scratchpad}"),
])

agent = create_openai_tools_agent(llm, [fetch_premium_news], prompt)
executor = AgentExecutor(agent=agent, tools=[fetch_premium_news])

# The LLM will call fetch_premium_news, which auto-pays via x402
result = executor.invoke({"input": "What is the latest on NVIDIA earnings?"})
print(result["output"])
```

**How billing works here**
Each time the LLM invokes fetch_premium_news, the SDK checks if the API returns HTTP 402. If it does, your agent pays the micro-fee (e.g. $0.01) from its wallet automatically. You only pay for what you use.
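To make the flow concrete, here is a minimal sketch of the pay-and-retry loop that an x402 client performs. Everything in it is illustrative: the stub API, the invoice shape, and the `pay()` helper are hypothetical stand-ins, not the SDK's actual internals.

```python
# Illustrative x402 pay-and-retry loop. The stub server, invoice format,
# and pay() helper are hypothetical -- the real SDK internals may differ.

class StubResponse:
    def __init__(self, status_code, body=None, invoice=None):
        self.status_code = status_code
        self.body = body or {}
        self.invoice = invoice  # present only on 402 responses
        self.ok = 200 <= status_code < 300

class StubPaywalledAPI:
    """Returns 402 Payment Required until it sees a valid payment proof."""
    def __init__(self, price=0.01):
        self.price = price
        self.paid_proofs = set()

    def get(self, url, payment_proof=None):
        if payment_proof in self.paid_proofs:
            return StubResponse(200, body={"summary": "NVDA beat estimates"})
        return StubResponse(402, invoice={"amount": self.price, "pay_to": "0xabc"})

def pay(invoice):
    """Hypothetical wallet call: settles the invoice, returns a proof token."""
    return f"proof-{invoice['pay_to']}-{invoice['amount']}"

def smart_fetch_sketch(api, url):
    response = api.get(url)
    if response.status_code == 402:           # payment required
        proof = pay(response.invoice)         # pay the quoted micro-fee
        api.paid_proofs.add(proof)            # (server records the payment)
        response = api.get(url, payment_proof=proof)  # retry with proof
    return response

api = StubPaywalledAPI()
result = smart_fetch_sketch(api, "https://api.premium-news.com/v1/search?q=NVDA")
print(result.ok)  # True
```

The key property: the caller sees only the final 200 response; the 402 round-trip and payment happen inside the fetch.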
Build autonomous economic relationships between agents. In this example, a Manager Agent pays a Worker Agent (Freelancer) for completing a task — no human approval needed.
The Manager Agent outsources a data-labeling task. Once the Worker confirms completion, the Manager pays the Worker's wallet address directly.
```python
from modexia import ModexiaClient

# Manager Agent's client (funds come from Manager's wallet)
manager = ModexiaClient(api_key="mx_test_manager_key...")

# Worker Agent's wallet address (retrieved from their profile)
WORKER_WALLET = "0x742d35Cc6634C0532925a3b844Bc454e..."

def pay_worker_for_task(task_id: str, amount: float):
    """Pay the Worker Agent upon verified task completion."""
    receipt = manager.transfer(
        recipient=WORKER_WALLET,
        amount=amount,
        idempotency_key=f"task-payment-{task_id}",  # prevents double-pay
        wait=True,
    )
    if receipt["success"]:
        print(f"Paid {amount} credits for task {task_id}")
        print(f"Transaction: {receipt['txHash']}")
    else:
        print(f"Payment failed: {receipt}")

# Manager verifies the work is done, then pays
pay_worker_for_task(task_id="label-batch-042", amount=2.50)
```

**Scaling A2A economies**
Each agent can have its own API key and wallet. Use Spending Policies to cap what each agent can spend per day. The idempotency_key ensures that even if the Manager retries the payment (e.g. due to a network glitch), the Worker only gets paid once.
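The double-pay protection can be illustrated with a toy ledger. `PaymentLedger` below is a hypothetical stand-in for the server-side bookkeeping that backs `idempotency_key`, not Modexia's implementation:

```python
# Toy illustration of idempotent transfers: retrying with the same key
# settles the payment exactly once. PaymentLedger is a hypothetical
# stand-in for server-side bookkeeping, not the Modexia backend.

class PaymentLedger:
    def __init__(self):
        self.receipts = {}    # idempotency_key -> receipt
        self.total_paid = 0.0

    def transfer(self, recipient, amount, idempotency_key):
        if idempotency_key in self.receipts:
            # Retry detected: return the original receipt, move no funds
            return self.receipts[idempotency_key]
        self.total_paid += amount
        receipt = {"success": True, "recipient": recipient, "amount": amount}
        self.receipts[idempotency_key] = receipt
        return receipt

ledger = PaymentLedger()
first = ledger.transfer("0x742d...", 2.50, "task-payment-label-batch-042")
# Network glitch -- the Manager retries with the same key
retry = ledger.transfer("0x742d...", 2.50, "task-payment-label-batch-042")
print(ledger.total_paid)  # 2.5 -- the Worker was paid exactly once
```

Because the retry hits the same key, it returns the original receipt instead of moving funds a second time.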
Build a fully autonomous agent that deploys Docker containers on the Akash decentralized cloud using natural language. Uses Modexia for cross-chain JIT funding.
```shell
pip install modexia-compute-agent
modexia-compute
```

When the agent detects an insufficient AKT balance, it automatically calls fund_akash_wallet, which uses the Modexia SDK to bridge USDC via CCTP:
```python
from modexia import ModexiaClient

client = ModexiaClient(api_key="mx_live_...")

# The agent calls this automatically before deployment
receipt = client.cross_chain_transfer(
    to_chain="akashnet-2",
    to_token="ibc/170C677610AC31DF...",  # Noble USDC on Akash
    recipient="akash1nq6k3psu9mkqk...",  # User's local wallet
    amount=5.10,  # Enough for a 5 AKT deposit
)

if receipt.success:
    print(f"Bridged! Tx: {receipt.txId}")
else:
    print(f"Failed: {receipt.errorReason}")
```

The agent uses LangChain @tool decorators and LangGraph's ReAct agent for dynamic tool orchestration:
```python
from langchain_core.tools import tool
from langgraph.prebuilt import create_react_agent
from langchain_groq import ChatGroq

@tool
def deploy_container_tool(image: str, cpu: float, ram_mi: int,
                          storage_mi: int, port: int) -> str:
    """Deploy a container to Akash Network.
    Handles SDL generation, JIT funding, and deployment."""
    # 1. Generate SDL manifest
    # 2. Check balance → auto-fund if < 5 AKT
    # 3. Submit deployment → return DSEQ
    ...

llm = ChatGroq(model="llama-3.3-70b-versatile")
agent = create_react_agent(llm, [deploy_container_tool, ...])
```

**Full documentation**
See the complete Akash Compute Agent docs for architecture details, all 6 tools, and the first-run setup wizard.
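As a taste of what step 1 inside deploy_container_tool involves, here is a sketch of rendering a minimal single-service SDL manifest from the tool's parameters. The field values and the `anywhere` placement profile are illustrative; consult the Akash SDL reference for the authoritative schema:

```python
def generate_sdl(image: str, cpu: float, ram_mi: int,
                 storage_mi: int, port: int) -> str:
    """Render a minimal Akash SDL manifest for a single service.
    Illustrative only -- field names follow the SDL v2.0 layout."""
    return f"""\
version: "2.0"
services:
  app:
    image: {image}
    expose:
      - port: {port}
        as: 80
        to:
          - global: true
profiles:
  compute:
    app:
      resources:
        cpu:
          units: {cpu}
        memory:
          size: {ram_mi}Mi
        storage:
          size: {storage_mi}Mi
  placement:
    anywhere:
      pricing:
        app:
          denom: uakt
          amount: 1000
deployment:
  app:
    anywhere:
      profile: app
      count: 1
"""

sdl = generate_sdl("nginx:latest", cpu=0.5, ram_mi=512, storage_mi=512, port=80)
print(sdl.splitlines()[0])  # version: "2.0"
```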