I don’t love the term AI. It feels more science fiction than actual science. Back in 1968, when someone said AI, it WAS a science fiction term – a computer in space called HAL gone totally rogue.
Yes, there is the scientific pursuit of artificial general intelligence. And AI as a field of research is not new – it has been around for decades and is well funded in very serious scientific circles. However, when people talk about Artificial Intelligence in 2025, they are rarely actually talking about AGI or some other academic pursuit. They are talking about LLMs – and what you can do with them.
If you want an EXTREMELY SPICY take on AI from a year ago, read this: https://ludic.mataroa.blog/blog/i-will-fucking-piledrive-you-if-you-mention-ai-again/
What’s frustrating is that LLMs are ACTUALLY SWEET, but the term AI muddies the waters. The technology is incredible and getting better every day. It’s possible we have reached a plateau or a slowing of progress (maybe the subject for another blog), but EVEN IF THAT IS THE CASE, LLM-tech is still extremely impressive.
At this point I would normally ask whether you have tried any of this new age of LLM tools – chatbots, agents, or something else in your work setting – but I get the feeling a lot of people have. And now that the dust has settled on some of the tools released a couple of years ago, it’s actually becoming a little safer to start experimenting with said products. I know that I have ramped up my activity with them. I probably use ChatGPT’s deep research function 4-6 times per week at this point. And I am experimenting with some other tools. I have even started ~vibe coding~ a REALLY stupid product, just so I can see if someone as technically challenged as me can do something interesting now.
When I talk to my portfolio companies and their CTOs, it has become clear that LLMs and codegen tools have made a significant impact on their tech teams – both the engineers and the tools they use have improved significantly. I haven’t heard of these tools replacing the need for engineers, but the impact feels very tangible and real.
Replacing people is actually an interesting point. That is always the metric people are interested in when it comes to LLM technology. Which makes sense if you call something artificial intelligence: if this thing is intelligent, can I use it to replace the other intelligent workers on my team?
But that generally misses the point about technology (read: software). It is a DEFLATIONARY force, meaning it creates leverage in the market and improves operational efficiency. There is an obscene amount of data to back this up.
But technology doesn’t necessarily replace jobs one-for-one. The image of the cobbler being replaced by the shoe factory sounds tidy, but it didn’t happen in such a linear fashion. Jobs to be done are stickier than we generally imagine. Cobblers don’t make shoes anymore; they repair them.
And that’s the thing about LLMs and LLM-tech: who cares if they replace jobs or not? Which is to say, I deeply care if people are put out of work, but I don’t think “can it replace people?” is a good metric for judging an LLM or LLM-driven tech. That’s almost never been the ROI case for software in the past, so why start now?
Instead, let’s judge LLM-tech (sorry for using this term so much, I am not good at coining terms and I have nothing better) based on the productivity gains. I broadly agree with Aaron Levie about how AI and agents are going to be doing jobs that weren’t being done before.
https://x.com/theallinpod/status/1918512959728287900
And they are going to make jobs that were being done before a lot easier (because that’s what tech always does, whether we want to admit it or not). Will it also create new problems? Hell yeah. But the act of solving those problems is how great businesses will be built in the future, so who cares? Will it create new ethical quandaries for us to try and solve (and fail and solve and fail again)? OF COURSE! But that’s just called being a human. Everything new changes the world and tests our humanity, because our humanity is deeply interlinked with how we view the world.
I was talking to a friend the other day who has been going deeper on AI. He’s a fellow investor. He told me he now uses Deep Research to write the first draft of the industry section of every investment memo he writes. It saves him a ton of time and the output is pretty good. We were talking about what else can get automated.
He works in a profession that frequently deals with attorneys. And if you are unfamiliar, there are a ton of AI legal tools out there. A few of the bigger ones:
- Harvey AI
  - Funding: $506 million across multiple rounds, including a $300M Series D in February 2025
  - Key Differentiator: Provides generative AI solutions tailored for law firms, enabling custom model training on proprietary documents for tasks like contract analysis and regulatory compliance.
- Eve
  - Funding: $47 million Series A in 2024
  - Key Differentiator: Specializes in AI tools for plaintiff law firms, enhancing case management and litigation processes.
- Ivo
  - Funding: $22.2 million total, including a $16M Series A in early 2025
  - Key Differentiator: Provides AI-powered contract review that automates checks against company policies, offering naturalistic redlines akin to experienced attorneys’ work.
- Ironclad
  - Funding: $333 million total, with a $150M Series E in 2022
  - Key Differentiator: Offers a contract lifecycle management platform integrating AI for contract creation, redlining, and analysis.
- Clio
  - Funding: $900 million Series F in 2024
  - Key Differentiator: Provides a comprehensive legal practice management platform incorporating AI for tasks like document automation and client communication.
Most of these tools are used by lawyers; they are not direct experiences for the folks who would be clients of attorneys. But they do automate various parts of the attorney workflow. Cobbled together, could they replace the ENTIRE workflow? Why would I need an attorney if I have a tool to review an NDA, compare 30 contracts at one time in 15 minutes, or tell me what is market-acceptable in a SAFE note in 2025? Today, AI can theoretically do all of these things. In 5 years, there’s a chance it can do a lot more. So why do I need an attorney at all?
Well, anybody who has worked with an attorney will tell you that their real value isn’t in their ability to read something and provide a summary. A lot of attorneys will tell you that their real skill is interpreting a situation and providing specifically tailored advice to their clients in some differentiated way. Maybe they really understand the specific problem area the client is dealing with. Or maybe they have some unique legal insight because of how they practice law. Or maybe they just work insanely hard. I think a lot of those reasons are legitimately true for a lot of (good) firms. At the micro level (firm to firm), this is the reality. And maybe AI can disrupt that?
But at the macro level – why do people REALLY use attorneys? For all of the tactical, practical reasons listed above (I am trying to be brief, not comprehensive!), but also because lawyers are excellent safety blankets. Or bulldogs. Or pain sponges. Whatever term you want to use – lawyers give their clients peace of mind and reduce risk. Peace of mind that the actions the client is about to take won’t blow up in their face. Peace of mind that if something goes awry, you have an attack dog to hunt it down for you. They reduce the risk that you missed something. Or that when a decision was made, it’s NOT ALL ON YOU.
(I got nervous about referring to attorneys as “attack dogs” – it seems a little demeaning. But then I remembered that most of the attorneys I know would take that for how it is meant – a compliment.)
An attorney makes their money on edge cases and on solving problems that don’t fit into a box. More importantly, they get paid because of their availability to do so with all of the historical context they have earned over the years of you being their client.
This is true for some service providers / consultants. Why does McKinsey exist? If you are a big company trying to figure out a problem, you COULD task your employees with finding the solution (and frequently, that is done before or in tandem with a consulting engagement). But what if you are too close to the problem? Or the problem requires you to hire new employees to solve it, and that’s a big commitment? Or the solution your employees come up with is the wrong one and it makes things worse? McKinsey shows up, wraps you in a big hug, and carries you to bed. All is going to be well.
Can AI do that? Can AI be a pain sponge? Can it be a warm blanket and glass of milk before bed?
But AI is radical – so maybe it can solve all of these problems and more. A question we could ask is: are there any service professions whose core value prop from 20 years ago was effectively replaced by AI (or just technology) before the LLM explosion, and that we could use as a case study?
Actually – yes. RIAs (registered investment advisors) were threatened by robo-advisors last decade. The core value prop for many RIAs was: we can take care of you financially and deliver better returns than other providers. I remember – I interned for a couple of them in college in 2012–2013. Then robo-advisors rolled around and a lot of folks wondered: “are wealth managers going to be put out of business?”
The answer is absolutely not. They have actually grown in number as a profession since 2010! The number of SEC-registered investment advisory firms increased from 11,458 in 2009 to 15,441 in 2023 – roughly 35% growth. And this is because if you talk to an RIA today, they will tell you that their value-add is a white-glove level of service and the ability to fulfill a client’s needs because they understand the client better than the client understands themselves. While that may sound somewhat corny, the reality is that’s what a lot of RIAs provide – it’s more than just generating outsized returns. It’s about providing holistic care for a client’s financial health. That’s extremely tough to automate! And there are plenty that do innovative and interesting things above and beyond that.
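(If you want to sanity-check that growth figure yourself, here is a quick back-of-the-envelope sketch in Python. It uses nothing beyond the two firm counts cited above:)

```python
# Back-of-the-envelope check of the RIA growth figure cited above.
firms_2009 = 11_458  # SEC-registered investment advisory firms, 2009
firms_2023 = 15_441  # SEC-registered investment advisory firms, 2023

growth = (firms_2023 - firms_2009) / firms_2009
print(f"Growth, 2009 to 2023: {growth:.1%}")  # ~34.8%, i.e. roughly 35%
```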
ALSO – for most of these service professions, the people running them are smart. And they have to eat. So they will be deeply motivated to find new ways to provide real value to their clients. The free market will find a way.
The platform shift isn’t about AI. It’s about LLMs. These tools are the platform on which other incredible businesses will be built. But, like technology throughout history, it’s not about replacing jobs. It’s about changing the way we work. Making work more productive. And opening up the opportunity for more creative work. Maybe that’s extremely optimistic. But if it goes in the other direction, we are all cooked anyway. So there’s no use worrying about it. That train has left the station.