TWINLADDER

AI Strategy

The AI Training Market Is Broken — Here's What Legal Professionals Actually Need

Academic programs, vendor demos, bar seminars, checkbox compliance — four categories of training, all failing lawyers. We dissect why and show what actually works.

March 17, 2026 · Alex Blumentals, Founder & CEO · 7 min read

Article 4 of the EU AI Act mandates AI literacy. A training industry has scrambled to meet the demand. But most of what is being sold does not work.


The market for AI training in the legal profession is booming. Conferences, certificates, webinars, vendor academies -- the options multiply weekly.

And yet 30% of legal departments still offer no AI training at all. Among those that do, adoption remains stubbornly uneven. Lawyers complete training and remain uncomfortable. They earn certificates and cannot verify a citation.

The problem is not a shortage of training. Almost everything on offer falls into one of four categories -- and all four are fundamentally broken.

The Academic Trap

Universities and technical institutes have launched AI programmes for professionals -- machine learning fundamentals, data science, neural network architectures. Rigorous and credible. Also wildly mismatched for practising lawyers.

These programmes were designed for career changers pivoting into data science. They cost thousands of euros, run for months, and have completion rates that reflect the mismatch. Teaching neural network architecture to a lawyer who needs to verify a citation is like teaching metallurgy to a pilot. Technically adjacent. Practically useless.

The knowledge addresses questions lawyers are not asking while leaving the ones they need answered -- Can I trust this? How do I check it? -- entirely untouched. Curricula built around specific model generations become obsolete within a year. The investment depreciates before the certificate is framed.

The Vendor Demo Disguised as Education

AI tool vendors have stepped in with their own training. Harvey offers onboarding. LexisNexis provides AI modules. Westlaw runs webinars. Free or low-cost, professionally produced, readily available.

Also: sales material wearing an education costume.

Vendor training teaches you to use one product. It does not build transferable judgment about when AI is reliable. A lawyer trained on Harvey cannot necessarily evaluate a Clio AI output.

Worse, vendor training is structurally incentivised to downplay the tool's limitations. The vendor wants adoption. The lawyer needs competent, critical use -- including knowing when not to use the tool. These interests diverge. The result is narrow capability without broad competence.

The Awareness Illusion

Bar associations have begun offering AI awareness programmes -- one- to two-hour seminars on regulatory obligations and ethical considerations. They serve a legitimate purpose: ensuring practitioners know AI obligations exist.

But awareness is not ability.

A lawyer who attends knows AI can hallucinate. She has never practised catching one. She understands confidentiality applies to AI inputs. She has never navigated the practical decisions about what can and cannot be entered into a tool. Aware of her obligations -- not competent to meet them.

The variation across jurisdictions compounds the problem. Some US bars have issued detailed guidance. Across the Baltic states, bar associations have published none. Two hours is enough to create anxiety about AI risks. It is not enough to build the skills that convert anxiety into confident practice.

The Checkbox Factory

The fourth category may be the most insidious. Compliance-focused online modules exist to prove that training occurred -- brief, generic, designed to generate a completion certificate that satisfies an auditor. The training equivalent of reading a safety card on an aeroplane.

These programmes are rarely legal-specific. The same module serves lawyers, accountants, and marketing teams. A practitioner finishes without encountering a single scenario relevant to their work.

The regulatory logic is circular: Article 4 requires literacy, so organisations purchase training that documents literacy, regardless of whether literacy results. The box is ticked. The lawyer is no more competent. This is compliance theatre -- and as enforcement matures, regulators will not be fooled by it.

The Gap That Matters

Strip away the marketing and these four categories share a common failure: none build what lawyers actually need.

Lawyers need practical competence, confidence, legal-specific content, and certifiable outcomes. No current category delivers all of these.

The data confirms the gap is consequential. According to Embroker's 2025 legal industry survey, AI adoption surged from 22% to 80% in segments with proper workflow-focused training. The strongest predictor of adoption is not technical knowledge -- it is comfort and confidence. Organisations with multi-modal programmes (guided practice, scenario work, peer discussion) dramatically outperform single-format alternatives. Workflow-focused programmes consistently beat technical ones in both enrolment and satisfaction.

The pattern is unambiguous. What works is not teaching lawyers about AI. What works is teaching lawyers to work with AI -- in their context, on their problems, with professional responsibilities front and centre.

What the Market Actually Needs

The training that works sits in a space the current market barely serves: more practical than academic programmes, more comprehensive than vendor tutorials, more substantial than awareness seminars, more specific than compliance checkboxes.

Training built around legal workflows, not computer science curricula. Training that builds confidence through guided practice, not anxiety through theoretical complexity. Training that respects the expertise lawyers already possess and extends it into an AI-augmented environment.

As Article 4 enforcement begins and the gap between compliant-on-paper and genuinely competent becomes visible, this difference will define which practitioners thrive. The broken market is an opportunity -- but only for those willing to build something that actually works.


The evidence on comfort-driven adoption is explored in Why Comfort Matters More Than Code. For a broader analysis of the competence challenge, see The Competence Paradox: AI Eliminates the Jobs Where You Learn.