How can law firms and legal in-house teams use generative AI and Microsoft Copilot?
There continues to be huge interest in how generative AI can – and probably will – disrupt how we work. Its ability to increase our productivity both now and in the future is both exciting and even slightly scary. At the same time, many organisations and individual teams within businesses are working out how to actually make AI work in the present; one way this is happening is by experimenting with Microsoft Copilot, the brand name for the built-in AI assistant that is threaded through different Microsoft products, including Microsoft 365.
The legal industry has an established record of using technology to streamline work, often by automating some of the more repetitive tasks that are part of day-to-day legal work. There’s a whole series of tech products and providers that help law firms and in-house legal teams drive efficiency and innovate.
The potential for generative AI to support different aspects of legal work is clear, and here at 3chillies we’ve already started to work on a project for a law firm client that leverages the power of AI to provide value for its clients. At the same time, AI comes with a lot of risks which need to be dealt with before it can be trusted to take on work that covers legal processes and related compliance.
Microsoft recently held an interesting webinar about the potential for using “AI in modern legal practice.” This focused on Microsoft Copilot for Microsoft 365 and how it could be used for legal work. While at times the webinar did feel a bit like an extended advert for Microsoft – which is perhaps to be expected – there were also some useful insights in the session. Featuring three Senior Corporate Counsel from Microsoft’s Corporate, External and Legal Affairs (CELA) function (Dervish Tayyip, Nadine Morgan and Tom Wyrwich), the session covered some of the “why”, “what” and “how” around using generative AI and Copilot in a legal function.
Here are some of our takeaways from the webinar.
There is the potential to drive efficiencies for law firms and departments
There are multiple opportunities to use generative AI to supercharge productivity, often through saving time by letting AI do some of the more simple or mundane tasks.
In the webinar, the opportunities for AI and Copilot to transform the work of a law firm or legal department were stressed, not only to drive “unprecedented efficiency gains” but also to support effectiveness. A typical example given was the ability to ask the AI assistant questions about a Teams meeting, such as who should have attended and which questions should have been asked.
Dervish Tayyip explained that the need to optimise and enhance legal processes has been a focus for legal tech for some time but up to now this has been mainly about streamlining and automating what lawyers already do. However, generative AI has the power to do things that we couldn’t do previously.
Tayyip stressed that he thought it was not a viable option to watch from the sidelines and get left behind. There are expectations from leadership that AI is going to disrupt the legal industry; he urged Chief Legal Officers to “control the narrative” around this before consultants come in and say they can use AI to reduce the overhead of the legal function and save costs. Instead, legal teams should show how they can use AI to better redeploy resources and “super power” the legal function, so they can focus on value-add activities.
Microsoft has a framework to support “trustworthy” AI
During the session, the need for strict governance and for using AI in a responsible and trustworthy way was emphasised. Microsoft has a multi-layered approach to this, and while at times it did feel like an extended reminder of Microsoft’s credentials in this area, there were also some useful insights into how to put a multi-layered governance framework around AI, parts of which other firms can emulate.
At the top of this is a set of “responsible AI” principles focusing on:
- Fairness
- Reliability & safety
- Privacy & security
- Inclusiveness
- Transparency
- Accountability
Each of these principles is in turn mapped to individual goals and outcomes. For example, “Reliability & safety” relates to different areas such as appropriate guidance, ongoing feedback and evaluation, and more.
Microsoft also has a dedicated ecosystem with AI oversight roles for the overall Microsoft Board and Executive Leadership. However, most activity is carried out through the Office of Responsible AI and a set of activities around research, policy and engineering.
Microsoft says it is also dedicated to sharing AI learnings through its “Responsible AI principles” as well as other policies and assets, including corporate standards on AI, an AI impact assessment template and guide, transparency notes on Microsoft’s AI technologies, guidelines for human-AI interaction and an annual AI transparency report.
On top of all this there is a further set of privacy commitments for AI services, a set of customer commitments, security safeguards and more.
Trialling and experimentation are important
In the next section of the webinar, Nadine Morgan, another Senior Corporate Counsel from Microsoft’s CELA function, explained how AI and Copilot are actually being used at Microsoft. CELA is an organisation of 2,000 people across 54 countries, so it is a little like a law firm, but working within Microsoft. About a third of CELA are lawyers, but there are also people working in public affairs, a range of subject matter experts, data scientists and others from across different disciplines.
A key takeaway was the importance of legal teams trialling and experimenting with Copilot, not only to understand the technology and become fluent in using it to improve work, but also to be able to give advice to others on its usage.
CELA have been using Microsoft 365 Copilot for about a year now and there is a strong commitment from leadership to adopt it. However, prior to this, the team started to experiment with AI. They created an “AI for CELA Lab” which was an online “crowdsourcing” site where CELA staff could contribute ideas and make suggestions for experimenting with Large Language Models (LLMs). Overall, about 250 ideas were submitted, helping to pave the way for deriving use cases for Copilot.
The legal team at Microsoft uses Copilot in three main areas
At the moment the legal team at Microsoft uses Copilot in three main ways:
- Knowledge management and chatbots: bridging silos and enabling better knowledge sharing with internal and external self-service capabilities. Here Microsoft has created an internal AI-powered self-help tool that answers basic questions so that lawyers can focus more on strategic matters. This is based on an existing repository of basic legal advice, with Copilot running on top of it.
- Contracting and related automation: generating contracts and “improving negotiation intelligence to deliver contracting velocity.” Here, CELA is using Copilot to evaluate contracts and revisions.
- Regulatory: monitoring and understanding the regulatory and policy landscape, helping to improve compliance and reduce costs, and delivering more consistent responses more quickly. Here AI is examining compliance trends and supporting regulatory teams when issuing advice.
Morgan emphasised that Copilot is not a replacement for human judgement, but more like an assistant. Underpinning all these AI services is what Morgan referred to as “CELA’s Data Factory”, effectively a data layer that identifies, ingests and unifies relevant CELA data, with all the necessary governance built in.
A cultural shift is required
One takeaway that we found particularly interesting was the shift in mindset that is needed to take advantage of AI. Morgan stressed that it was not only important to understand AI’s capabilities and limitations – critical context for using AI – but also to embrace the unknown and a spirit of experimentation. People can end up using AI in different ways and adopting that AI-ready mindset is key. Learning from what doesn’t work is also as important as seeing what does work.
Even in CELA within Microsoft, people had been on a bit of a “journey” and there had been active and continuing investment in culture change and in changing how people work. This has been achieved in a number of different ways:
- Communications
- Incentivisation and recognition
- Skilling
- Storytelling
- Strategic external engagement
- Programmatic sponsors and catalysts.
For example, within CELA there are AI champions who bring new use cases back to teams and share their learnings, and within Nadine Morgan’s team there is a short demo at the end of their regular team meeting. Microsoft staff also have specific goals to use and support Copilot, which they are evaluated against.
At a wider scale, at the annual CELA summit, everyone was given a checklist to get them set up with Copilot, including ensuring it was added to everyone’s task bar. There was also a light-hearted session on “Prompting for Lawyers” where everyone was scored on the quality of their prompts.
Microsoft claims Copilot has increased productivity and accuracy
In the final part of the session, as well as a demo and tips for starting out, Tom Wyrwich talked about some of the overall impact of Copilot on Microsoft’s legal operations:
- 32% faster completion of day-to-day legal tasks, including understanding and summarising new laws
- 20% greater accuracy of completed legal tasks, including responding to a Request for Information
- 87% of users felt it made them more productive.
Conclusion
While accepting that some of the information provided by Microsoft in this webinar isn’t necessarily completely objective, this was definitely an interesting glimpse into how generative AI can positively impact legal work.
Some of the takeaways that stood out for us:
- AI is not there to replace human judgement; it’s more about freeing up time and making work easier.
- You need to focus on the governance and guardrails that need to be in place for AI, especially when it supports legal work.
- Change management and a mindset that encourages experimentation are both important to support the cultural shift needed to get the best out of AI.
- Starting now can help build up the knowledge to reap rewards later on – it’s effectively a journey.
If you’d like to discuss how you can use AI to streamline your legal work, then get in touch!