Workflows are graph-based automations that fire when traffic matches a trigger and execute one or more actions in sequence. Build them visually in the Workflows view, store them as JSON, and Hugin evaluates every captured flow against the enabled workflow set in real time.
Backed by boa_engine for inline JavaScript, workflows are powerful enough for multi-step scanning pipelines but light enough to use as quick “flag every 500 response” rules.
🔗Architecture
```
[Trigger Node] ──► [Action 1] ──► [Action 2] ──► ...
                          ↘ [Action 3] (parallel branch)
```
Each workflow is a directed acyclic graph (DAG) of nodes:
- Trigger nodes declare what fires the workflow
- Action nodes declare what happens
- Edges define execution order
Multi-trigger workflows (OR semantics) and parallel action branches are both supported. Cycles are rejected at validation time.
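As a sketch, a two-trigger workflow with a chained action might be stored like this. The top-level fields (`id`, `name`, `workflow_type`, `enabled`, `nodes`, `edges`, `run_count`) match the Storage section; the per-node field names are illustrative, not the exact schema:

```json
{
  "id": "0b6c1a9e-1234-4abc-8def-000000000000",
  "name": "flag-api-server-errors",
  "workflow_type": "passive",
  "enabled": true,
  "nodes": [
    { "id": "t1", "trigger": "status_code", "trigger_value": "5xx" },
    { "id": "t2", "trigger": "path_match", "trigger_value": "^/api/" },
    { "id": "a1", "action": "Flag", "action_value": "server error" },
    { "id": "a2", "action": "SendNotification", "action_value": "5xx on {{flow.host}}" }
  ],
  "edges": [
    { "from": "t1", "to": "a1" },
    { "from": "t2", "to": "a1" },
    { "from": "a1", "to": "a2" }
  ],
  "run_count": 0
}
```

Because the two trigger nodes are ORed, either a 5xx status or an `/api/` path fires the chain.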
🔗Workflow Types
- Passive (default) — auto-evaluated on every captured flow as it arrives. No latency cost to the proxy: workflows run on a background task after the flow is stored.
- Active — manually triggered or scheduled. Useful for periodic scans you don’t want firing on every request.
- Convert — pure transformation pipelines (encode/decode/format chains) without trigger evaluation. Driven by the Decoder view.
🔗Triggers
| Trigger | Fires when |
|---|---|
| `flow_created` | Every captured flow (most general) |
| `flow_updated` | Flow record is mutated (re-tagged, annotated) |
| `status_code` | Response status matches a value or range (e.g. `5xx`, `401`, `4xx`) |
| `host_match` | Request host matches a glob pattern (`*.example.com`, `api.target.com`) |
| `path_match` | Request path matches a regex |
Triggers can be chained with OR semantics — a workflow with two trigger nodes fires if either matches.
🔗Actions
🔗Triage / Storage
- Flag — flag the matching flow with optional color and note
- AddToFindings — auto-create a finding from the flow, with title / severity / CWE supplied as the action value (JSON)
- SendNotification — desktop notification (text from action value, supports `{{template}}` interpolation)
🔗Active operations
- Scan — submit the flow to the active scanner (built-in or specific check IDs in the action value)
- Repeat — re-issue the request through the proxy with optional modifications (full repeater config in the action value)
🔗Custom code
- RunJavaScript — execute inline JavaScript via `boa_engine`. Receives the flow as the `flow` global; can return `true`/`false` to allow further actions, or a modified flow object. No network access — sandboxed pure compute.
- RunShell — execute a shell command. Receives the flow as JSON on stdin; stdout becomes the next node's input. Permission-gated — requires explicit allow in Settings → Custom Code.
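For instance, a RunJavaScript body could gate downstream actions on leaky server errors. This is a sketch, not a shipped script: `flow` is the documented global, while the helper name and the stack-trace heuristic are illustrative:

```javascript
// Illustrative RunJavaScript body: return true to let downstream
// actions (Flag, AddToFindings, ...) run, false to stop the branch.
// Wrapped in a function so it can be exercised with test inputs.
function shouldEscalate(flow) {
  const serverError = flow.status >= 500 && flow.status <= 599;
  // Crude heuristic for error internals leaking into the response body
  const leakyBody = /stack trace|exception|traceback/i.test(flow.body || "");
  return serverError && leakyBody;
}
```

Inside the workflow the script would end with `shouldEscalate(flow);`, so the boolean becomes the node's result.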
🔗Encoding / Transform (10 actions)
Used in the Convert pipeline mode and chained through RunJavaScript outputs:
`Base64Encode` / `Base64Decode`, `UrlEncode` / `UrlDecode`, `HtmlEncode` / `HtmlDecode`, `HexEncode` / `HexDecode`, `JsonPrettyPrint`, `GzipDecompress`
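As a rough JavaScript equivalent of a two-step Convert chain (Base64Decode feeding JsonPrettyPrint) — Node's `Buffer` stands in here for the built-in decoder, purely for illustration:

```javascript
// Sketch: what chaining Base64Decode into JsonPrettyPrint produces.
function base64DecodeThenPrettyPrint(input) {
  const decoded = Buffer.from(input, "base64").toString("utf8"); // Base64Decode
  return JSON.stringify(JSON.parse(decoded), null, 2);           // JsonPrettyPrint
}
```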
🔗Specialised
- AnalyzeTokens — extract a token from this flow (e.g. from a `Set-Cookie` header), capture more samples, and run Sequencer randomness analysis. Action value: `{"token_location":"cookie:session_id","num_tokens":100}`.
- BacTagCorpus (Pro) — tag the triggering flow into the BAC ID corpus as belonging to the identity named in `action_value`. Lets a passive workflow pre-seed the corpus on every flow that matches a trigger (e.g. any 200 response under `/api/users/`). An empty `action_value` tags the corpus without an identity; a named value (e.g. `"admin"`) buckets the IDs into that identity for cross-identity rotation. Only fires for Pro / Trial / Dev-bypass tiers — Community users don't see the action in the builder. See BAC pipeline for how the corpus feeds the active audit.
🔗Inter-Node Data Passing
Action nodes can reference outputs from upstream nodes via `{{template}}` syntax in their `action_value` field:
```json
{
  "trigger": "status_code",
  "trigger_value": "401",
  "actions": [
    {
      "action": "Repeat",
      "action_value": "{\"url\":\"{{flow.url}}\",\"headers\":{\"Authorization\":\"Bearer {{env.admin_token}}\"}}"
    },
    {
      "action": "AddToFindings",
      "action_value": "{\"title\":\"Auth bypass via admin token on {{flow.host}}\",\"severity\":\"high\"}"
    }
  ]
}
```
Available template namespaces:
- `flow.*` — the matching flow (url, host, path, method, status, headers, body)
- `env.*` — variables from the active Environment
- `previous.*` — output of the immediately upstream node
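A minimal sketch of how such `{{namespace.key}}` placeholders could resolve against those namespaces (the real engine may handle nesting and escaping differently):

```javascript
// Hypothetical resolver for {{flow.url}}-style placeholders.
// Unknown references are left untouched rather than erased.
function interpolate(template, ctx) {
  return template.replace(/\{\{(\w+)\.(\w+)\}\}/g, (match, ns, key) => {
    const scope = ctx[ns];
    return scope && scope[key] !== undefined ? String(scope[key]) : match;
  });
}
```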
🔗Visual Builder
The Workflows view ships with a node-based visual builder:
- Drag triggers and actions from the palette onto the canvas
- Drag edges from one node’s output handle to another’s input
- Click any node to edit its config in the right panel
- Validate Workflow button checks the graph
- Run button executes the workflow against captured input
🔗Scheduling
Trigger-based workflows fire on captured flow events. To run a workflow on a cron / interval, submit it as a Scheduler job rather than relying on a built-in scheduler tab in the Workflows view.
🔗Sample Workflows
Use the visual builder to assemble workflows from the trigger + action primitives above. Useful starter patterns to build by hand:
- Auth signals — trigger on `status_code:401|403`, action Flag with a note
- Server-error catcher — trigger on `status_code:5xx`, action AddToFindings with severity Medium
- Token leakage — trigger on `flow_created`, RunJavaScript that searches the response body for `Authorization` / `Bearer` / `token` patterns, and AddToFindings on hits
- CORS misconfig — RunJavaScript that flags responses with `Access-Control-Allow-Origin: *` plus credentials
- JWT `alg:none` — RunJavaScript that decodes JWTs in the Authorization header and AddToFindings when the alg is `none`
Hugin doesn’t currently ship a packaged workflow library — build the patterns you need in the editor and save.
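The JWT `alg:none` pattern, for example, could be sketched as a RunJavaScript body like this. The base64url decoder is hand-rolled because the sandbox is pure-compute; all names here are illustrative:

```javascript
// Illustrative check for the JWT alg:none pattern described above.
const B64URL = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789-_";

// Minimal base64url decoder (JWT segments carry no padding)
function b64urlDecode(s) {
  let bits = 0, buf = 0, out = "";
  for (const ch of s) {
    const v = B64URL.indexOf(ch);
    if (v < 0) continue;                    // skip anything outside the alphabet
    buf = ((buf << 6) | v) & 0xffff;        // keep only the bits we still need
    bits += 6;
    if (bits >= 8) {
      bits -= 8;
      out += String.fromCharCode((buf >> bits) & 0xff);
    }
  }
  return out;
}

function hasAlgNoneJwt(authHeader) {
  // header.payload.signature — only the header segment matters here
  const m = /^Bearer\s+([\w-]+)\.[\w-]*\./.exec(authHeader || "");
  if (!m) return false;
  try {
    const header = JSON.parse(b64urlDecode(m[1]));
    return String(header.alg || "").toLowerCase() === "none";
  } catch (_) {
    return false; // not valid base64url JSON — not a JWT we can judge
  }
}
```

A workflow using it would end the script with `hasAlgNoneJwt(flow.headers["Authorization"]);` (header access shown as a plain map lookup, which is an assumption about the flow shape).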
🔗Execution Order
Actions execute in BFS order starting from the trigger(s). A single node can have any number of outgoing and incoming wires.
- Fan-out (one trigger → many parallel actions) runs every branch.
- Fan-in (many branches → one merged action) runs the converged node exactly once per flow — duplicate executions are deduplicated via a visited set.
```
Trigger ──► Flag ────┐
   │                 ├──► AddToFindings
   └──► Notify ──────┘
```
Both Flag and Notify run, and AddToFindings runs once — not twice.
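A sketch of that traversal (node IDs and edge shape are illustrative, not Hugin's internal types):

```javascript
// BFS from the trigger(s) with a visited set, so fan-in nodes
// execute exactly once per flow, as in the diagram above.
function executionOrder(triggerIds, edges) {
  const visited = new Set(triggerIds);
  const queue = [...triggerIds];
  const order = [];
  while (queue.length > 0) {
    const id = queue.shift();
    order.push(id);
    for (const edge of edges) {
      if (edge.from === id && !visited.has(edge.to)) {
        visited.add(edge.to);
        queue.push(edge.to);
      }
    }
  }
  return order;
}
```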
🔗Run History (Logs Pane)
The toolbar Logs checkbox opens a wide table pane beneath the canvas showing every execution of the selected workflow. Columns:
- Time — `YYYY-MM-DD HH:MM:SS`
- Source — `Passive` (auto-triggered on the flow stream) or `Manual` (one-shot Run)
- Result — `OK` green badge or `Error` red badge
- Triggers — `matched` or `skipped` (a manual run can skip trigger matching and still execute)
- Actions — number of action nodes executed in BFS order
- Duration — wall time in ms
- Flow — 8-char prefix of the flow UUID (click to jump back to Logger)
Runs are stored in SQLite (the `workflow_runs` table), so the history survives restarts. The pane refreshes live after every manual Run.
🔗REST API
Workflows expose five routes on the local Hugin API:
| Route | Purpose |
|---|---|
| `GET /api/workflows` | List all workflows (raw JSON from disk) |
| `GET /api/workflows/{id}` | Fetch one workflow |
| `POST /api/workflows/{id}/validate` | Validate graph structure |
| `POST /api/workflows/{id}/run` | Manual one-shot run against a flow |
| `GET /api/workflows/{id}/runs` | Execution history (newest first) |
Run request body:
{ "flow_id": "c447a083-907c-4c8b-8520-dcb2635697c9" }
Run response:
```json
{
  "workflow_id": "ec54bf71-405f-46e2-9c70-3a98ac985fd1",
  "workflow_name": "delivery-demo",
  "flow_id": "c447a083-907c-4c8b-8520-dcb2635697c9",
  "triggers_matched": true,
  "actions_executed": 3,
  "run_count": 1
}
```
The `/runs` endpoint accepts `?limit=N&offset=N` query params for paging.
🔗Storage
All workflows live in a single file: `~/.hugin/workflows.json`. It's a JSON array of workflow objects with `id`, `name`, `workflow_type`, `enabled`, `nodes[]`, `edges[]`, and `run_count`. Exports and imports are just this file. Run history is kept separately in SQLite (the `workflow_runs` table).
🔗MCP
The workflows MCP tool exposes 10 actions:
`list`, `get`, `create`, `update`, `delete`, `enable`, `disable`, `duplicate`, `run`, `test`
The same operations are also on REST (see above). Scheduling a workflow on a cron / interval is handled by submitting it through the Scheduler, not via a workflow-tool action.
🔗Performance
Passive workflows run in a background task pool independent of the proxy hot path, capped at 16 concurrent evaluations by a semaphore. A misbehaving workflow can’t slow request forwarding. RunShell and RunJavaScript actions have per-execution timeouts (default 30s for shell, 5s for JS). Workflows that exceed the timeout are killed and logged in Events.