An AI workflow is worth building when it sits inside a real operating loop. Someone has to own it. Someone has to use the output. There has to be a repeated action that currently costs time, creates mistakes, blocks a customer, or slows a team down.
If the request is only "can we add AI here?", I slow the work down. The useful question is not whether AI can do something. It is whether the workflow has enough shape to inspect, automate, test, and improve.
The owner and the repeated action
The first thing I ask is who will decide whether this worked. A workflow without an owner becomes a demo. A workflow with an owner has acceptance criteria, tradeoffs, and a way to tell whether the output is useful.
- Who uses the result? A founder, operator, customer, sales team, support team, or internal admin.
- What repeats? Intake, research, classification, metadata cleanup, content generation, reporting, routing, review, or handoff.
- Where does the result go? A product page, Supabase row, project board, email thread, n8n workflow, dashboard, or human review queue.
The existing materials
I want to see what already exists: the spreadsheet, repo, form, database table, old Zap, n8n workflow, support inbox, pricing page, failed prototype, or manual process. The current mess is useful. It shows where the real constraints are.
This is also where bad projects reveal themselves. If there is no current workflow, no examples, no owner, and no definition of done, the work is probably strategy or product discovery, not an implementation sprint.
Where AI actually belongs
Not every automation needs a model. If the task is deterministic, use normal code, rules, or a database query. AI belongs where judgment, messy input, language, media, extraction, summarization, or variation is actually part of the work.
The strongest workflows usually combine boring software with AI in one or two narrow places. The boring parts make the system reliable. The AI part handles the ambiguity.
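As a sketch of that split, consider a support-ticket classifier: cheap deterministic rules catch the unambiguous cases, and only the messy remainder reaches a model. Everything here is illustrative, and `classifyWithModel` is a hypothetical stand-in for whatever model API the project actually uses.

```typescript
// Sketch: deterministic rules first, a model call only for the ambiguous remainder.
// `classifyWithModel` is a hypothetical wrapper around a real model API.

type Category = "billing" | "bug" | "feature_request" | "unknown";

async function classifyWithModel(text: string): Promise<Category> {
  // Placeholder for the one narrow AI step (OpenAI, Anthropic, etc.).
  return "unknown";
}

const RULES: Array<[RegExp, Category]> = [
  [/\b(invoice|refund|charge|payment)\b/i, "billing"],
  [/\b(crash|error|broken|bug)\b/i, "bug"],
  [/\b(feature|request|would be nice)\b/i, "feature_request"],
];

async function classifyTicket(text: string): Promise<Category> {
  // The boring, testable part: rules handle the clear cases for free.
  for (const [pattern, category] of RULES) {
    if (pattern.test(text)) return category;
  }
  // The AI part, kept narrow: only genuinely messy input pays for a model call.
  return classifyWithModel(text);
}
```

The rules are the part you can unit test against real tickets; the model call is the part you watch, sample, and improve.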
The definition of done
Before I take on the build, I want a clear stopping point. Not a perfect product. A usable checkpoint. For example: a working intake form, a metadata pipeline, a review queue, a generated draft with human approval, or a deployed route that can be tested against real inputs.
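For the draft-with-human-approval checkpoint, the gate can be this small. The names below are illustrative, not a real schema:

```typescript
// Sketch of a human-approval checkpoint: nothing ships until a person flips the status.
type DraftStatus = "pending_review" | "approved" | "rejected";

interface Draft {
  id: string;
  content: string;
  status: DraftStatus;
  reviewer?: string;
}

function approve(draft: Draft, reviewer: string): Draft {
  return { ...draft, status: "approved", reviewer };
}

function publishable(draft: Draft): boolean {
  // The AI writes; a human decides. Only approved drafts leave the queue.
  return draft.status === "approved";
}
```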
A good AI workflow brief names the owner, user, repeated action, current materials, tools involved, failure mode, budget, timeline, and definition of done.
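One way to keep that brief honest is to write it down as a structured record before any build starts. The field names below mirror the list above; the shape itself is just a sketch:

```typescript
// Illustrative shape for an AI workflow brief; fields mirror the checklist above.
interface WorkflowBrief {
  owner: string;              // who decides whether this worked
  user: string;               // who consumes the output
  repeatedAction: string;     // e.g. "classify inbound support email"
  currentMaterials: string[]; // spreadsheets, repos, old Zaps, inboxes
  tools: string[];            // e.g. ["Supabase", "n8n"]
  failureMode: string;        // what happens when the AI step is wrong
  budget: string;
  timeline: string;
  definitionOfDone: string;   // the usable checkpoint, not the perfect product
}
```

If a field cannot be filled in, that blank is the conversation to have before writing any code.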
