AI Project Brief for Belgian SMEs: 2026 Method
Why a sloppy AI project brief costs you three times more than a good one
When a Belgian SME asks me for a second opinion on an AI project that's gone off the rails, I almost always trace it back to the same root cause: there was no project brief, or it ran to three quarters of a page. The owner compared two or three quotes at €8,000, €14,000 and €22,000, signed with the cheapest, and discovered six months later that what was delivered barely resembled what they had in mind. The problem is rarely the vendor — it's the absence of a written scope. This article lays out a concrete method, calibrated for the Belgian context in 2026, for writing an AI project brief that is usable, defensible, and that makes vendor quotes comparable. Seven sections, costed examples, and the traps I see in 80% of the projects I review. By the end, you'll know exactly what you're asking for, and you'll filter out unsuitable vendors with two questions.
What an AI project brief should do (and what it isn't)
An AI project brief is not a legal document, nor a frozen technical specification. It is a scoping document that aligns four parties: the owner, the operational teams, the future vendor, and — often forgotten — your accountant or your IT contact. Its job is to turn a fuzzy need ("we'd like to use AI") into an executable, measurable, budgetable order.
In practice, a good AI project brief produces three results. First, it makes quotes comparable: if three vendors read your document and send back three proposals scoped differently, your brief is too vague. Second, it fixes quantified success criteria before you sign, because once the project is delivered it is too late to argue. Third, it explicitly states what you will not do in this phase, because scope creep is the number-one cause of budget overruns in SME AI projects.
For a typical project between €8,000 and €40,000, your brief should run between 6 and 15 pages. Below that, you're buying approximations. Above that, you're paying for over-specification. If you want a top-down view of where to begin, this overview of AI for Belgian SMEs gives the broader context.
Section 1: the business problem, not the technology
This is the most important section, and it is almost always skipped. A proper AI project brief never opens with "we want a chatbot" or "we need an AI agent". It opens with: what business problem are we trying to solve, and what is it costing us today?
A concrete example to make this tangible. Bad framing: "We want to automate customer service with an AI chatbot." Good framing: "Our customer service handles 380 emails per month, of which 220 are repetitive (order tracking, return policy, opening hours). The current first-response time averages 6 hours. We estimate 18 hours per week is spent on these 220 repetitive emails, roughly €38,000 per year of fully-loaded cost. Our goal: bring first-response time below 30 minutes for repetitive questions, free up 60% of customer service team time, and redeploy it on complex, higher-value requests."
The second framing contains four elements no vendor can ignore: volume, current cost, performance target, and the use of freed-up capacity. With this level of precision, an honest vendor can tell you in two hours whether your project is at €6,000 or €25,000, and a dishonest vendor gives themselves away in three questions. For the methodology of measuring the "before" state, see calculating the ROI of an AI project.
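To make the "before" arithmetic in the good framing reproducible, here is a minimal sketch. The email volume and weekly hours come from the example above; the €40/hour fully-loaded rate is an assumption chosen to land near the "roughly €38,000 per year" figure, not a number from the article.

```python
# Back-of-envelope "before" cost for the customer-service example.
# The €40/hour fully-loaded rate is an ASSUMPTION for this sketch.

repetitive_emails_per_month = 220   # from the example framing
hours_per_week = 18                 # time spent on repetitive emails
fully_loaded_rate_eur = 40          # assumed hourly cost, salary + overhead

annual_hours = hours_per_week * 52                    # 936 hours/year
annual_cost = annual_hours * fully_loaded_rate_eur    # €37,440, i.e. "roughly €38,000"
cost_per_repetitive_email = annual_cost / (repetitive_emails_per_month * 12)

print(f"{annual_hours} h/year, €{annual_cost:,}/year, "
      f"€{cost_per_repetitive_email:.2f} per repetitive email")
```

Numbers like the per-email cost are exactly what lets a vendor (and you) sanity-check whether automation at a given price point can pay for itself.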
Section 2: functional scope and quantified success criteria
This section answers the question: what exactly will be delivered, and how will we know it works? List use cases by priority, clearly distinguishing must-have, should-have, and won't-have-in-this-phase scope. That last category is what saves budgets.
For each use case, define three quantified success criteria. Not one, not two: three. For example, for automating CV screening in a 25-person SME: (1) at least 80% of CVs correctly classified against our internal grid, (2) average screening time per application brought down from 12 minutes to 2 minutes, (3) zero high-potential candidates misclassified on a test sample of 50 historical CVs. These three criteria are the basis of acceptance: if any one fails, the project is not delivered.
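The "all three or nothing" acceptance logic can be written down as a simple gate. The measured values below are invented for the sketch; in a real project they come from the test sample agreed in the brief.

```python
# Illustrative acceptance gate for the CV-screening example:
# the project counts as delivered only if ALL three criteria pass.
# Measured values are made up for this sketch.

criteria = [
    # (description, measured value, pass condition)
    ("CVs correctly classified (>= 80%)",            0.83, lambda v: v >= 0.80),
    ("avg screening minutes per application (<= 2)", 1.8,  lambda v: v <= 2.0),
    ("high-potential candidates misclassified (0)",  0,    lambda v: v == 0),
]

delivered = all(check(measured) for _, measured, check in criteria)
print("accepted" if delivered else "not delivered")  # all pass here
```

Writing the gate this explicitly in the brief removes any ambiguity at acceptance time: there is no "mostly works" outcome, only pass or fail per criterion.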
Add a business non-regression clause: for example, "the customer satisfaction score measured by our NPS must not drop by more than 2 points in the 90 days after go-live". This forces the vendor to bake in quality, not just technical performance. To go deeper on translating business goals into technical metrics, see AI data analysis for SME decisions.
Section 3: data, data quality, and GDPR constraints
In AI, data precedes the model. This section describes what you have, in what shape, and what the law lets you do with it. List each source: name of the system (ERP, CRM, accounting tool, SharePoint files, mailboxes), approximate volume, refresh frequency, format (structured, semi-structured, scanned PDFs, audio), and — critically — an honest quality assessment.
The classic mistake is to claim data is clean because it is stored. It almost never is. On a quote-automation project, I once saw a client announce "4,000 historical quotes available", which turned out to be 4,000 unstructured PDFs with no common fields and product labels that varied by sales rep. Acknowledging that problem in the brief saves three weeks of contractual tension down the line.
On the GDPR side, identify the categories of personal data involved, the applicable legal bases, the planned retention period, and the location of processing (EU or non-EU). For 95% of SME projects, the solution must stay in the EU and use models with contractual commitments not to train on your data. The full obligations are covered in AI and GDPR for Belgian SMEs. For the transparency and AI-system classification requirements imposed by the AI Act, the official reference is the European Commission's AI strategy page, and the Belgian Data Protection Authority publishes its current guidelines.
Section 4: target architecture, integrations, technical constraints
This is the section you write with your IT lead or an independent consultant — not with the vendor you're about to put into competition (obvious conflict of interest). Describe the existing IT landscape: which business software is in place, which APIs are available, which cloud you use, which authentication protocols are deployed.
Then state your non-negotiable constraints. A few examples that radically change the quote: "the solution must integrate with our Microsoft 365 via SSO", "no data export to a non-European cloud", "audit logs retained for 5 years", "99.5% availability during business hours". Each of these constraints can double your budget or rule out half the vendors — better to write them down before comparing prices.
State your technical maturity honestly. If your SME has no in-house IT and runs on an external accountant plus a freelancer maintaining the website, that is strategic information. The right AI vendor will then propose a managed setup, not a cloud-native architecture demanding a DevOps team you don't have. For the hidden costs tied to these choices, see AI integration cost for Belgian SMEs.
Section 5: budget, timeline, and validation milestones
Stating a budget cap in an AI project brief is not taboo — it's recommended for SMEs. Without a cap, some vendors calibrate their proposal to what they think you can pay rather than to the actual need. Give an honest range, for example €12,000 to €18,000 for implementation, with a separate annual run cost of €4,000 to €7,000.
On the timeline, break the project into a maximum of three validation milestones over the first six months: (1) detailed scoping and a narrow proof of concept, (2) pilot deployment on a sub-scope, (3) full production rollout. At each milestone, you pay one third of the implementation cost, and you have an explicit right to stop if the previous phase's criteria were not met. This is the most protective clause an SME can insert, and the one that separates serious vendors from the rest: a vendor who refuses this structure is telling you something about their confidence in their own proposal.
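The milestone-and-payment structure above can be sketched as a simple schedule. The €15,000 implementation budget is a hypothetical mid-range figure from the €12,000–€18,000 example in this article, not a recommendation.

```python
# Sketch of the three-milestone payment structure described above,
# using a HYPOTHETICAL €15,000 implementation budget (mid-range example).

implementation_budget = 15_000

milestones = [
    "detailed scoping and a narrow proof of concept",
    "pilot deployment on a sub-scope",
    "full production rollout",
]

payment_per_milestone = implementation_budget / len(milestones)  # €5,000 each
for i, phase in enumerate(milestones, 1):
    print(f"Milestone {i}: {phase} -> pay €{payment_per_milestone:,.0f}, "
          f"with an explicit right to stop before milestone {i + 1}")
```

The point of the structure is not the arithmetic but the stop right: each third is only released once the previous phase's acceptance criteria are met.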
Mention the public funding you intend to mobilise. For many Walloon SMEs, the Wallonia digitalisation grant and certain sector-specific vouchers cover part of the project. Be careful: chèques-entreprises impose strict conditions (notably going through an accredited vendor) — verify against the official list at cheques-entreprises.be before you build the funding plan. Aïves works upstream to scope the project and navigate the funding maze; choosing the accredited execution vendor stays in your hands.
Section 6: vendor selection criteria and scoring grid
This section is often missing, which is a shame: it saves you two weeks of doubt after the quotes arrive. Define the scoring grid you will apply to incoming proposals in advance. A robust grid uses five criteria families, weighted by your priorities.
First, business understanding (25%): does the vendor reframe your problem better than you did, or are they just copying your brief? Then technical solidity of the solution (25%): architecture, model choices, scalability planning, test plan. Then cost transparency (20%): line-by-line quote, or opaque flat fee? Human enablement plan (15%): without a training and change management plan, the project ends up in a drawer — see training your team for AI adoption. Finally, verifiable references (15%): at least two clients you can call, in a comparable sector or company size.
Set a minimum threshold per family (for example 12/20) below which a vendor is eliminated even if they have the best total score. This avoids the mechanical "lowest bidder wins" effect. To go further on sourcing biases in AI procurement, see AI integration mistakes to avoid.
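The grid and the elimination threshold translate directly into a small evaluation function. The weights and the 12/20 floor come from the text above; the two vendor score sets are invented to illustrate the mechanism.

```python
# Sketch of the Section 6 scoring grid: five families scored out of 20,
# weighted 25/25/20/15/15, with a 12/20 elimination threshold per family.
# Vendor scores below are INVENTED for illustration.

WEIGHTS = {
    "business understanding": 0.25,
    "technical solidity":     0.25,
    "cost transparency":      0.20,
    "human enablement":       0.15,
    "verifiable references":  0.15,
}
THRESHOLD = 12  # minimum per family, out of 20

def evaluate(scores: dict) -> tuple:
    """Return (weighted total out of 20, eliminated?)."""
    total = sum(scores[family] * w for family, w in WEIGHTS.items())
    eliminated = any(scores[family] < THRESHOLD for family in WEIGHTS)
    return round(total, 2), eliminated

# Vendor A: high total, but weak enablement plan -> eliminated anyway.
print(evaluate({"business understanding": 18, "technical solidity": 17,
                "cost transparency": 19, "human enablement": 9,
                "verifiable references": 16}))   # (16.3, True)

# Vendor B: lower total, but solid across the board -> stays in the race.
print(evaluate({"business understanding": 15, "technical solidity": 15,
                "cost transparency": 14, "human enablement": 14,
                "verifiable references": 15}))   # (14.65, False)
```

This is exactly the "lowest bidder wins" guard in action: vendor A beats vendor B on the weighted total yet is eliminated by the per-family floor.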
Six recurring traps when writing your first AI project brief
Trap one: confusing the brief with a demo. A brief describes a problem and constraints, not an interface or a specific workflow. If you write "must have a blue button in the top right corner", you're buying a bad version of what you imagine instead of the best possible version.
Trap two: over-specifying the technology. Writing "must use GPT-4" in a brief locks in a decision that should stay open per use case. Prefer "must use a language model of quality comparable to top 2026 models, hosted in the EU".
Trap three: forgetting the exit cost. What happens if you want to leave the solution in 18 months? If the answer is "rewrite the entire integration", you are locked in. Ask for a reversibility plan in the brief.
Trap four: neglecting the human side. An autonomous AI agent project, for example, changes the nature of several roles. Without an upstream conversation with the team, you create resistance that kills adoption. See autonomous AI agents in Belgian SMEs.
Trap five: skipping the pilot. A brief that goes straight to a 200-user rollout without a 10-user pilot phase will be signed by the least serious vendors and refused by the best ones. A six- to eight-week pilot on a narrow team costs 15 to 20% of the total budget but exposes almost every blind spot: data quality, user behaviour, edge cases, training gaps. Skip it and you pay the same lesson during rollout, with higher stakes and more visibility.
Trap six, which I see more often since 2025: underestimating the regulation. The AI Act applies in phases and classifies certain use cases (HR, credit scoring, biometrics) as high risk. A brief that contains no risk classification pushes that analysis onto the vendor, who will logically round up to cover themselves. Do the classification yourself based on Annex III of the text, or have it done by an independent third party, and bake the conclusion into the document. You limit the legal padding on the proposal side.
What comes after the brief
Once your brief is written, the next step is a restricted RFP to a maximum of three vendors, comparative analysis using your grid, and contractual negotiation. The best outcomes I see in SMEs come when the owner invests two to four weeks upfront on scoping, instead of chasing six months of delivery delay later.
If you want to secure that scoping phase without depending on an execution vendor with an interest in steering your choices, that's exactly the role Aïves Consulting plays: helping the SME structure the brief, challenge assumptions, and prepare the vendor competition. You then keep full control over selecting the execution vendor, accredited or otherwise. To start a scoping conversation, contact us or explore our AI consulting and strategy services. And if you first need to situate the overall need, AI for Belgian SMEs gives the big picture.