Why faster isn't always better: is AI sacrificing legal acumen for efficiency?
The latest LexisNexis report finds that 82% of lawyers are using or planning to use AI, and that 71% praise its speed. It sounds revolutionary, but as someone who's been both a Magic Circle lawyer and a legal tech innovator, I can't help but wonder if we're asking the right questions.
I'm all for innovation. It's what we do at my company. But this race towards AI adoption, fuelled by the relentless pressure of billable hours and office overheads, has me worried. We're so obsessed with 'how fast' that we've forgotten to ask 'at what cost?'
Everything has a trade-off. It's just how the world works. I walk faster, I burn more energy, I get hungry sooner. I spend a tenner today, I can't buy lunch tomorrow. So when we talk about using Gen AI to work faster, we need to ask: what's the real cost?
Sure, AI can scan through information like no human can, but does that actually make lawyers more efficient? And are there areas of law where we don't want AI to be quicker or a substitute for human cognition and experience?
When I did my MSc in Cyberpsychology, we delved into System 1 (fast) and System 2 (slow) thinking. Gen AI is a System 1 powerhouse, but is that always what we need in legal work? Sometimes we need slow, critical thinking. Sometimes we need emotional intelligence. Law isn't just about speed, it's about judgement, nuance, and understanding the human element.
Think about the concept of 'construction' in legal reasoning, where lawyers apply specific interpretations to words or sentences to achieve desired outcomes. This nuanced manipulation of language is a uniquely human skill that AI simply can't replicate. We need to be smart about what we automate, making sure we're reinvesting that saved time in high-value work and professional development. Right now, AI is best in the hands of lawyers who know their documents inside out. The ones who can spot when an AI output feels off. We need to augment experienced lawyers, not replace core legal skills.
We're in danger of eroding those skills, especially among junior lawyers. Legal reasoning isn't something you can fully automate - it's a skill honed through experience and training. A general AI model might get you 80% of the way there, but that last 20% is crucial. It's where the real legal acumen lies.
Perhaps we need to treat AI tools like we do social media and smartphones for children. We might consider withholding some AI tools from trainees or junior associates until they've built up enough stored knowledge and experience. It sounds patronising, but there's a reason AI engineers talk about 'grounding' a model: you need someone who knows the 'truth' to evaluate the inputs and outputs.
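For readers who haven't met the term, here's a minimal sketch of what that grounding loop can look like. The Python below is purely illustrative - the function and variable names are mine, and the exact-match scoring is a deliberate simplification, not how any real legal AI tool works.

```python
# A minimal, illustrative sketch of 'grounding' in the sense used above:
# checking AI output against a source of truth that a human expert supplies.
# All names and the matching logic here are hypothetical.

def evaluate_against_ground_truth(ai_answers: dict[str, str],
                                  vetted_answers: dict[str, str]) -> float:
    """Return the fraction of AI answers matching an expert's vetted answers."""
    if not vetted_answers:
        raise ValueError("no ground truth supplied - nothing to evaluate against")
    matches = sum(
        1
        for question, vetted in vetted_answers.items()
        if ai_answers.get(question, "").strip().lower() == vetted.strip().lower()
    )
    return matches / len(vetted_answers)

# Only someone with the 'truth' can fill in vetted_answers - which is
# exactly the stored knowledge a junior hasn't built up yet.
ai_answers = {"Is clause 4.2 an unenforceable penalty?": "Yes"}
vetted_answers = {"Is clause 4.2 an unenforceable penalty?": "No"}
print(evaluate_against_ground_truth(ai_answers, vetted_answers))  # prints 0.0
```

The point is structural: the evaluation is only as good as the vetted answers, and only experience can supply those.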
This makes me think about legal education: how do we train the next generation when AI is handling the grunt work they would normally cut their teeth on? We need to pivot fast, focusing on what AI can't do - complex problem-solving, emotional intelligence, ethical decision-making.
There's a risk we're turning lawyers from machine operators into line workers: where they once directed the tools, the tools now direct them. Are we de-skilling our lawyers in the pursuit of efficiency? And does this increase the risk profile of the job? After all, anything that moves faster generally carries more risk.
AI looks like a magic bullet. But maybe we need to pump the brakes a bit. Let lawyers be lawyers. Let them wrestle with documents, learn the hard way, and grow. I'm not anti-AI, far from it. It can augment lawyers' work - and I stress 'augment' - we need tools that help lawyers who know their stuff find it faster, not tools that think for them.
The future of law is about finding the sweet spot between tech efficiency and good old-fashioned legal nous. We need to open up discussions about how much of a human-machine blend we actually want. While we might want to automate human output, we cannot yet automate the human mind - and for law, both are needed.
So let's make sure we're asking the right questions. Our profession – and our clients – deserve nothing less.