A full-stack application I built for a commercial beekeeping wholesaler: custom chatbots, an automated ordering pipeline that converts inbound email into typed, reviewable orders, and a semantic-search RAG layer over historical inspections, customer correspondence, and product data.
Live app: forthcoming
Code: forthcoming
Stack: Python · FastAPI · Postgres + pgvector · LangGraph · MCP · Docker
How it’s built
System architecture
A FastAPI service fronts a PostgreSQL store with pgvector for embeddings. Three agent classes coordinate through a custom MCP server: order extraction, customer matching, and SKU resolution. The agent chain is exposed through a chat UI on the same web app.
RAG + semantic search
Write path — ingestion: inbound text (emails, inspection notes, customer reports) is chunked, embedded, and stored in pgvector at write time.

Read path — query: the same embedder processes queries at read time so the vector spaces line up. Retrieval filters by tenant and recency before cosine top-k, with an optional cross-encoder rerank over that top-k. If nothing scores above threshold, the system escalates to a human rep instead of generating an ungrounded answer.
Agent extraction sequence
An inbound email enters the order pipeline. The order-extraction agent calls MCP tools to fuzzy-match the customer, resolve SKUs against the catalog, check regulatory constraints, and emit a typed Order object. Every tool call is logged for review.
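A hedged sketch of that sequence using stdlib fuzzy matching. The tool names, the tiny in-memory catalog, the `Order` shape, and the log format are all assumptions for illustration; the real tools live behind the MCP server and the real log goes to the review UI, not a list.

```python
import difflib
from dataclasses import dataclass

# Stand-in reference data (hypothetical; real data lives in Postgres).
CUSTOMERS = {"CUST-001": "Sierra Foothills Apiaries",
             "CUST-002": "High Desert Honey Co"}
CATALOG = {"deep frame": "SKU-10F-DEEP",
           "medium super": "SKU-MED-SUPER"}


@dataclass
class Order:
    """The typed object the agent must emit -- fields are illustrative."""
    customer_id: str
    sku: str
    quantity: int


tool_log: list[str] = []  # every tool call is recorded for review


def fuzzy_match_customer(name: str) -> str:
    tool_log.append(f"match_customer({name!r})")
    return max(CUSTOMERS, key=lambda cid: difflib.SequenceMatcher(
        None, name.lower(), CUSTOMERS[cid].lower()).ratio())


def resolve_sku(description: str) -> str:
    tool_log.append(f"resolve_sku({description!r})")
    match = difflib.get_close_matches(description.lower(), list(CATALOG),
                                      n=1, cutoff=0.3)
    return CATALOG[match[0]]


def extract_order(sender: str, item: str, qty: int) -> Order:
    """Chain the tool calls and emit the typed result."""
    return Order(fuzzy_match_customer(sender), resolve_sku(item), qty)
```

Because every tool invocation appends to the log before returning, a reviewer can replay exactly which matches the agent made on the way to the `Order` it emitted.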
Order pipeline FSM
Orders move through an explicit lifecycle (extracted, matched, priced, reviewed, synced) with idempotent transitions and stale-state recovery. The FSM is the contract between the agent and the ERP adapter.
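The lifecycle above can be sketched as a transition table plus an idempotence rule. State names mirror the prose; the `advance` function and its error handling are an illustrative reduction of the real FSM, which also handles stale-state recovery and persistence.

```python
from enum import Enum


class State(Enum):
    EXTRACTED = "extracted"
    MATCHED = "matched"
    PRICED = "priced"
    REVIEWED = "reviewed"
    SYNCED = "synced"


# Each state has exactly one legal successor; SYNCED is terminal.
ALLOWED = {
    State.EXTRACTED: State.MATCHED,
    State.MATCHED: State.PRICED,
    State.PRICED: State.REVIEWED,
    State.REVIEWED: State.SYNCED,
}


def advance(current: State, target: State) -> State:
    """Apply a transition. Replaying the current state is a no-op
    (idempotent), anything off the table is rejected loudly."""
    if target is current:
        return current
    if ALLOWED.get(current) is not target:
        raise ValueError(f"illegal transition {current.value} -> {target.value}")
    return target
```

Idempotent transitions are what make retries safe: if the ERP adapter re-delivers a "matched" event for an order that is already matched, nothing changes and nothing errors, while a skip-ahead to "priced" from "extracted" fails fast instead of corrupting state.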
Data model
Customers, sites, products, inspections, orders, and customer reports: relational where they’re relational, vectorized where they’re searched by meaning.