This seemingly innocuous question has become a pivotal point of discussion in legal tech circles. As artificial intelligence (AI) tools flood the market, promising to revolutionize everything from contract review to legal research, savvy legal professionals are learning to look beyond the sleek interfaces and bold claims.
GPT wrappers, or AI tools built on existing large language models (LLMs), have faced criticism in the market. Many assume these tools rely heavily on foundation models with minimal adaptation, adding limited value as a result.
Wrappers aren't inherently problematic, however. In fact, they can offer quick, cost-effective solutions for certain tasks, and many products that draw upon foundation models actually have far more going on under the hood than you may think.
What’s critical for making informed technology investments is understanding where wrappers end and custom-built AI begins.
GPT Wrappers, Custom-Built LLMs, and Agentic AI
Let’s break down the spectrum of AI solutions, from simple wrappers to deeply customized language models. These tools differ in their underlying technology, customization level, and potential impact on legal workflows. But each plays a unique role in enhancing legal practice.
What Are Wrappers?
Think of a GPT wrapper as a layer of functionality built atop an existing large language model. It's akin to a specialized interface that tailors the underlying AI for specific tasks without fundamentally altering the core technology. GPT wrappers often focus on making AI more accessible or user-friendly, adding functions like contract analysis without deeply customizing or training the AI on the legal field's specific needs.
Domain-Specific LLMs
On the other hand, a custom-built LLM is an AI model that has been extensively fine-tuned or even developed from scratch to meet the unique demands of legal practice. These models are trained on vast amounts of legal-specific data, enabling them to understand and generate legal language, identify relevant case law, or perform legal-specific tasks like drafting documents with higher accuracy and relevance.
Agentic AI
Beyond simple wrappers and custom-built LLMs lies agentic AI, representing a more advanced approach to legal tech solutions. Agentic AI systems can autonomously plan and execute complex, multi-step legal tasks, adapting their strategies as needed. These systems often combine multiple tools and models to achieve higher-level goals with minimal human intervention.
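To make the "plan and execute" idea concrete, here is a minimal, purely illustrative sketch of an agentic loop in Python. The tool names and the hard-coded plan are hypothetical stand-ins, not any vendor's design; in a real agentic system, a language model would generate and revise the plan and would call actual research and drafting tools.

```python
from typing import Callable

def search_case_law(query: str) -> str:
    # Stand-in for a research tool; a real agent would query a legal database.
    return f"[stub] authorities relevant to: {query}"

def draft_clause(instruction: str) -> str:
    # Stand-in for a drafting tool backed by a language model.
    return f"[stub] draft language responsive to: {instruction}"

TOOLS: dict[str, Callable[[str], str]] = {
    "research": search_case_law,
    "draft": draft_clause,
}

def run_agent(goal: str) -> list[str]:
    # In a real agentic system an LLM would produce this plan and revise it
    # after each step; it is hard-coded here to keep the sketch self-contained.
    plan = [("research", goal), ("draft", goal)]
    results = []
    for tool_name, task in plan:
        results.append(TOOLS[tool_name](task))
    return results

if __name__ == "__main__":
    for step in run_agent("indemnification carve-outs in a SaaS agreement"):
        print(step)
```

The defining feature is the loop itself: the system decides which tool to use at each step toward a higher-level goal, rather than answering a single prompt.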
Thick Versus Thin Wrappers
It’s important to know that not all wrappers are equal.
There are thin wrappers and thick wrappers around generative AI models. The main difference between the two is the level of complexity and customization built on top of the core LLM.
Thin Wrapper
A thin wrapper is a minimal layer of abstraction or functionality added on top of a generative AI model. This is what most people think of when they use the term “wrapper” as a criticism. Typically, a thin wrapper might involve:
- Basic API Handling: This includes simple methods for interfacing with an AI model via an API, making calls, and returning results with minimal processing.
- Input/Output Formatting: A thin wrapper might only handle basic formatting of inputs and outputs to align with the model's expectations or to fit the specific use case.
- User Access: A thin wrapper often serves the purpose of making a foundation model more accessible or easier to use without altering its core capabilities or enhancing its performance in a significant way.
Essentially, a thin wrapper is lightweight and straightforward. It serves mainly as a conduit between users and the model. If you think of an LLM as raw ingredients, a basic recipe is like a thin wrapper—it lets you use those ingredients, but with minimal extra support. Many law firms’ proprietary AI chatbots are thin wrappers, since they largely amount to deploying a foundation model via an API. In practice, very few third-party legal technology products are actually thin wrappers around a foundation model.
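As a rough illustration, a thin wrapper can be little more than a prompt template and a single API call. The sketch below uses the OpenAI Python SDK; the function name, prompt, and model choice are illustrative assumptions rather than any particular product's code.

```python
# A minimal sketch of a "thin wrapper": one prompt template, one API call,
# basic error handling, and no domain-specific processing.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def summarize_contract(text: str) -> str:
    try:
        response = client.chat.completions.create(
            model="gpt-4o",
            messages=[
                {"role": "system", "content": "You summarize contracts."},
                {"role": "user", "content": f"Summarize this contract:\n{text}"},
            ],
        )
        return response.choices[0].message.content
    except Exception as exc:  # basic error handling, nothing more
        return f"Request failed: {exc}"
```

Everything of substance here comes from the foundation model; the wrapper only formats the input and returns the output.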
Thick Wrapper
A thick wrapper, on the other hand, adds a more substantial layer of features and functionalities around the model. Sticking with the food analogy, a thick wrapper is more like a meal kit. It arrives with pre-measured ingredients, step-by-step instructions, photos, and sometimes even pre-prepped items. Compared to a basic recipe, a meal kit offers a complete, user-friendly experience that requires little guesswork.
Similarly, a thick wrapper around generative AI offers a more accessible, feature-rich experience, integrating additional tools and guidance. The application layer in a thick wrapper might involve:
- Customized Pre- and Post-Processing: This type of wrapper might include more complex input processing, such as additional validation or contextual data injection, as well as restructuring of the model’s responses based on business rules or user preferences.
- Integration with External Systems: It could integrate with other systems or databases to enrich responses with external data, or perform more advanced decision-making using model outputs.
- User Interface Enhancements: A thick wrapper might also include user interface (UI) elements designed for a specific user persona, interaction mechanisms, or automated workflows that make the model more useful and tailored to a specific context.
- Advanced Business Logic: Thick wrappers can embed proprietary logic, in-depth tuning, filters, and conditional handling based on industry-specific needs or domain knowledge.
In short, a thick wrapper is more elaborate, focusing on significantly enhancing the usability, integration, and customization of the core generative AI model.
If we consider a generative language model like GPT-4, a thin wrapper might involve creating a straightforward API call with basic error handling. A thick wrapper would involve customizing how user queries are interpreted, integrating with domain-specific data sources, or adding advanced conversational capabilities like contextual memory.
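Continuing the same illustration, a thick wrapper surrounds that API call with an application layer. In the hedged sketch below, the clause library, escalation rule, and prompts are hypothetical; the point is the shape of the layer: validate the input, inject domain context from an external source, and apply business rules to the output before it reaches the user.

```python
# A sketch of a "thick wrapper" around the same underlying model call:
# input validation, contextual data injection from a (stubbed) clause
# library, and post-processing that applies a simple business rule.
from openai import OpenAI

client = OpenAI()

# Stand-in for integration with an external system, e.g., a firm's clause bank.
CLAUSE_LIBRARY = {
    "limitation of liability": "Liability is capped at fees paid in the prior 12 months...",
}

def review_clause(clause_type: str, draft_text: str) -> dict:
    # Pre-processing: validate the request and inject domain context.
    key = clause_type.lower()
    if key not in CLAUSE_LIBRARY:
        raise ValueError(f"Unsupported clause type: {clause_type}")
    precedent = CLAUSE_LIBRARY[key]

    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system",
             "content": "You compare a draft clause against a firm precedent "
                        "and flag deviations."},
            {"role": "user",
             "content": f"Precedent:\n{precedent}\n\nDraft:\n{draft_text}"},
        ],
    )
    analysis = response.choices[0].message.content

    # Post-processing: apply a business rule before the result reaches the user.
    needs_escalation = "uncapped" in draft_text.lower()
    return {
        "analysis": analysis,
        "precedent_used": clause_type,
        "needs_partner_review": needs_escalation,
    }
```

The underlying model call is identical to the thin wrapper's; the value is added in the layers around it.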
Legal Tech Wrapper Examples
Most legal technology tools leverage more than one LLM and more than one type of technology. To help you better understand how the above criteria play out in real-world, multi-model scenarios, let’s look at a few examples of both GPT wrappers and other AI solutions in the legal tech industry. These case studies illustrate the practical differences between superficial enhancements and deeply integrated solutions.
David AI by 2nd Chair: A Thin Wrapper
David AI by 2nd Chair is an assistant tool built primarily on GPT-4o that allows users to upload legal documents and query them using generative AI. David AI produces answers from the documents and supports those answers with source links to the specific excerpts from which each response was drawn. The product gives legal users the ability to securely leverage OpenAI’s technology on legal documents and feel comfortable relying on its answers, since they are grounded specifically in the data the user has uploaded.
2nd Chair has done the work to build in an application layer that allows for grounding and explainability, but it has not fine-tuned the model for legal use. We note that David AI now also relies on Anthropic’s models.
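2nd Chair has not published its implementation, but grounding of this kind generally follows a retrieval pattern: split the uploaded documents into excerpts, select the excerpts most relevant to the question, and instruct the model to answer only from those excerpts while citing them. The sketch below is a generic illustration of that pattern, not 2nd Chair's code; the keyword-overlap retrieval is a deliberate simplification of the vector search a real product would use, and the prompt and model choice are assumptions.

```python
# A generic retrieval-grounded Q&A sketch (not any vendor's implementation).
from openai import OpenAI

client = OpenAI()

def top_excerpts(question: str, excerpts: list[str], k: int = 3) -> list[str]:
    # Rank excerpts by crude keyword overlap with the question.
    terms = set(question.lower().split())
    ranked = sorted(excerpts,
                    key=lambda e: len(terms & set(e.lower().split())),
                    reverse=True)
    return ranked[:k]

def grounded_answer(question: str, excerpts: list[str]) -> str:
    sources = top_excerpts(question, excerpts)
    numbered = "\n".join(f"[{i + 1}] {s}" for i, s in enumerate(sources))
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system",
             "content": "Answer only from the numbered excerpts and cite them "
                        "by number. If they do not answer the question, say so."},
            {"role": "user",
             "content": f"Excerpts:\n{numbered}\n\nQuestion: {question}"},
        ],
    )
    return response.choices[0].message.content
```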
CoCounsel by Thomson Reuters: A Sophisticated, Thick Wrapper(ish)
Following its acquisition by Thomson Reuters, Casetext was rebranded under the name of its flagship AI product, CoCounsel, with an expanded focus on AI-driven legal solutions. Casetext was building neural networks well before it deployed generative AI, and the company obtained early access to GPT-4 months before its public launch. It also had a solid database of legal content that allowed for rich legal research.
So, to call CoCounsel a wrapper is harsh.
Nevertheless, the new Thomson Reuters CoCounsel Core is an example of a product leveraging underlying foundation models (in combination with other technology) and integrating additional functionalities tailored specifically to legal practice. It offers features such as redlining and data extraction, which are crucial for handling large volumes of legal documents. Although it does not fundamentally alter the base AI model, CoCounsel represents a far more advanced approach to wrapping, with a thick application layer deployed to strike a balance between ease of use and specialized functionality.
Harvey: Domain-Specific Generative AI and a Custom-Built LLM
At times mistaken for a wrapper, Harvey is more advanced than that, with reasoning engines designed to complete specific legal workflows. Harvey takes a process-oriented approach: interdisciplinary teams of engineers and lawyers break legal workflows down into subtasks, then train language models to perform those subtasks accurately across various legal use cases.
Its BigLaw Bench framework provides examples of real-life billable tasks that Harvey can perform and offers benchmarking of Harvey’s performance and accuracy against commercially available models. Although Harvey uses GPT-4o (as well as many other models), it has unique access to OpenAI models, allowing for fine-tuning tailored to the specific workflows of law firms and legal departments.
This customization involves training models on large volumes of legal documents and processes, enhancing their ability to handle complex legal tasks and produce outputs aligned with legal standards. Harvey’s blend of legal process knowledge and custom models creates a depth of functionality, precision, and reliability for legal professionals that thin wrappers cannot offer.
Harvey is used to draft, summarize, and analyze legal documents, redline contracts, and extract data from large sets of agreements within the specific context of a law firm or legal department. User feedback indicates that Harvey’s ability to “speak” and “reason” like a lawyer is one of the reasons it is well adopted by the firms that have licensed it.
KL3M: A Custom-Built Legal LLM
Very few examples exist of LLMs that have been built from scratch on legal (or any domain-specific) data. Early in 2024, Mike Bommarito and Dan Katz of 273 Ventures started building KL3M (the Kelvin Legal Large Language Model), which is a family of language models built from scratch entirely on legal data.
The difficulty and expense of building such a product cannot be overstated, and smaller, domain-specific models like KL3M are particularly useful for cutting down on hallucinations and supporting certain types of legal work. In training their models, the 273 Ventures team was able to ensure that all of the data used was ‘clean’ and lawfully obtained; in other words, copyright was not infringed and upstream licenses were not violated.
Spellbook Associate: Agentic AI
Spellbook Associate is an example of an AI agent developed to tackle specific legal workflows. Spellbook Associate can plan, execute, and check its work across multiple applications, including Microsoft Word. It’s designed to handle larger-scope assignments like producing complete financing documents from a term sheet or reviewing hundreds of documents for risks and inconsistencies. While not the first legal AI agent, it represents a significant step in bringing agentic AI capabilities to mid-sized law firms.
It’s Complicated
The examples above are illustrative of the differences in the way that various products use underlying technology, but there is additional complexity for buyers to navigate. For example:
- 2nd Chair’s source referencing arguably makes it more than a thin wrapper;
- Harvey offers agentic workflows in addition to its assistant product;
- Spellbook’s assistant product could be referred to as a thick wrapper.
The most important thing for buyers is to ask enough questions to understand how a provider is using generative AI, and whether the application layer is sufficient to meet their requirements.
AI Solutions in Legal Practice: A Necessary Spectrum of Tools
From thin wrappers to custom models, AI tools offer varied benefits for legal professionals, each designed to serve distinct needs. Simple wrappers make foundational AI accessible and secure, while advanced models and agentic AI support complex legal workflows with precision. Trusted vendors provide thoughtful applications across this spectrum, enabling firms to choose tools that fit their goals for specific use cases.
As the range of AI solutions grows, every type—from basic to custom—has its role in moving legal practice forward. What’s critical is understanding what you’re investing in and why.