Objection!

25 Mar 2023

[Header image: Phoenix Wright, Pixelated Attorney]

Microsoft 365 Copilot, the recently announced addition of ChatGPT-like Large Language Models to Microsoft’s venerable Office tools (and the somewhat less venerable Teams), is one of the more concrete and at least potentially potent applications of Large Language Models to date. As well as straightforward localised capabilities such as summarisation and image generation, it offers a new way to query and aggregate data from across an entire company. While incredibly useful (if it works as advertised), this could have some interesting unintended consequences.

Google, arguably Microsoft’s biggest rival in both office tools and LLMs, recently ran into trouble for not retaining potentially incriminating messages, highlighting the fact that a company’s electronic records are fair game as evidence in court actions. Traditionally, this has involved one party drawing up a list of documents the other has to produce, and teams of paralegals and junior lawyers poring over them to find relevant details. Companies such as Logikcull already apply AI techniques to automate this process, but what if you already had a model customised for and connected to the company’s data?[1] To look at it another way, could a company’s Copilot be called as a witness?[2]

Obviously, a cloud-provided AI tool is very different from a human being, but the legal fiction of corporate personhood is well established, and it’s not too much of a leap to consider interrogating the tool to be akin to interrogating that fictional person. In the US, corporations have long claimed, and been granted, protections under the First Amendment. Are we about to see them take the Fifth?

There are numerous issues with this framing, and of course it needs a massive I Am Not A Lawyer stamp applied. However, it seems to me that there’s enough wiggle room that someone will try it, and probably before too long. A lot of it, particularly in common law systems such as the US and UK, will depend on how early cases pan out. It’s impossible to predict what will happen, but the impact on everyone from cloud providers to CIOs to white collar criminals could be profound.

This post is an extended version of a post on Mastodon, which kicked off an interesting conversation with Simon Frankau. The header image is based on a screenshot from Phoenix Wright: Ace Attorney by Capcom.

  1. One issue would be the scope of the demand; the documents to be turned over during discovery are tightly defined, and litigants don’t have carte blanche to look at anything they want. Based on the presentation, Microsoft actually seem to be in a pretty good position here, as they separate the linking to existing data (“grounding” in their architecture) from the main LLM, in part to explicitly implement access controls (there’s a rough sketch of the idea after these notes). However, this sort of strict separation isn’t a given, and even if you have it in the first version, constant vigilance is required to avoid compromising it as the architecture evolves. [back]

  2. I’m not suggesting that litigators will literally be cross-examining a laptop in the witness stand (well, probably not), but demanding access to the tool as part of the discovery process seems well within the bounds of possibility. [back]
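
To make the point in footnote 1 a little more concrete, here is a minimal sketch of access control at the grounding layer, written in Python. To be clear, this is not Microsoft’s actual architecture or API; the Document, User, ground and build_prompt names are entirely hypothetical. The idea is simply that retrieval is filtered by the requesting user’s permissions before anything reaches the prompt, so the model never sees material that user couldn’t already open themselves.

```python
from dataclasses import dataclass

@dataclass
class Document:
    doc_id: str
    text: str
    allowed_groups: set[str]   # access-control list attached to the source document

@dataclass
class User:
    name: str
    groups: set[str]

def ground(query: str, user: User, index: list[Document], limit: int = 5) -> list[Document]:
    """Retrieve candidate documents, dropping anything the user can't already see."""
    visible = [d for d in index if d.allowed_groups & user.groups]
    # A real system would use semantic search; naive keyword matching stands in here.
    words = query.lower().split()
    return [d for d in visible if any(w in d.text.lower() for w in words)][:limit]

def build_prompt(query: str, user: User, index: list[Document]) -> str:
    """Assemble the prompt sent to the LLM; it only ever contains permitted documents."""
    context = "\n\n".join(d.text for d in ground(query, user, index))
    return f"Answer using only the context below.\n\nContext:\n{context}\n\nQuestion: {query}"

if __name__ == "__main__":
    index = [
        Document("memo-1", "Q3 sales figures were strong.", {"sales", "execs"}),
        Document("memo-2", "Draft layoff plan for review.", {"execs"}),
    ]
    analyst = User("alice", {"sales"})
    # "memo-2" is never retrieved for alice, so the model can't summarise or leak it.
    print(build_prompt("What were the sales figures?", analyst, index))
```

If that filter sits anywhere other than in front of the model, or is quietly bypassed as the system evolves, the tidy story about narrowly scoped discovery becomes much harder to tell.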
