If AI can do the math, write the emails, and even draft the contracts, why are lawyers and accountants still in business? A lawyer costs $500 an hour; an LLM costs $20 a month. Both provide answers and even some degree of emotional support. So why aren't these professions going anywhere? Because the consequences are different.
I've been wanting to write an article on why it is so damn hard to build an AI business or an AI app that can replace a specialist, but that alone wouldn't help you know what to build. So instead, I decided to tie this piece to the previous article on why most AI startups are bad businesses and talk about how to actually build a sustainable AI product: one that will outlive the hype-inflated VC rounds and one that can actually bring you billions of dollars.
This is for those of you who evaluate AI investments, especially now that we're in the middle of 2026 budget planning, as well as for potential and current founders looking for a problem that is really worth solving. Let's dive in.
Why Specialists Cost So Much: Beyond the Hours
Why does the labor of a lawyer, an accountant, or a tax adviser cost so much? Remember that song that was circulating on social media some time ago? "Exposure doesn't pay the bills. It cost that much cuz it takes me hours."
The ambition to replace high-cost professionals with AI is one of the most significant miscalculations in modern technology entrepreneurship. The fundamental problem lies in conflating the hours with the responsibility. This distinction has devastating economic consequences for companies aiming at the full replacement of a professional.
Deconstructing a Specialist's Work: Admin vs. Responsibility
To understand why, let's break down what a specialist does. Let's use a lawyer as an example, but this applies to accounting, finance, tax advising, and similar high-cost, high-value professions. The work of a specialist can be largely divided into two categories.
Category 1: Admin Tasks
The first category is admin: input data, analyze the data, make a recommendation, spit out a document, write an email. These tasks repeat from client to client, patient to patient. They're usually tasks with very low variability because the specialist has to follow a protocol.
Let's use a hypothetical situation that people often encounter: going through a divorce. When you're going through a divorce, you typically need a family lawyer. So how does your interaction with a lawyer begin? You call them and tell them about your situation. They listen to you and assess your options: whether you have children together, shared property, or a joint business, and how those things can be split. This is the step where you're inputting data, into your lawyer's brain. They analyze it, provide a recommendation based on your situation, and start communicating with your former partner's lawyer.
Chances are your case is fairly common and a relatively experienced family lawyer has seen it before. What this similarity means is that your lawyer is going to repeat the same procedures they always do. If we map this scenario to the cost of hiring an intermediate-level lawyer in Toronto, for example, it'll be around $500 an hour. You tell them about your situation: that's the first hour. They provide a recommendation: that's the second hour. Then they start communicating with your former partner's lawyer: let's say three more hours. So we just spent $2,500 on the initiation of the case. In the interest of time and for the purpose of this problem discovery exercise, let's keep it at $2,500. Please don't come at me with "Yeah, it's $7,000" or "it's $500." Let's just assume it's $2,500.
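If you prefer to see the back-of-the-envelope math in code, here's a minimal sketch of that initiation cost. Every figure in it is the hypothetical assumption from the example above, not a real quote.

```python
# Back-of-the-envelope cost of the case initiation described above.
# All numbers are hypothetical assumptions from the example, not real quotes.
HOURLY_RATE = 500  # USD per hour, intermediate-level family lawyer (assumed)

initiation_hours = {
    "intake call": 1,
    "recommendation": 1,
    "initial correspondence with the other lawyer": 3,
}

total_hours = sum(initiation_hours.values())
total_cost = total_hours * HOURLY_RATE
print(f"Case initiation: {total_hours} hours -> ${total_cost:,}")  # 5 hours -> $2,500
```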
Now, is this easy to automate? Objectively, yes. Everything I've described up to this point is fairly easy to automate for the vast majority of divorce cases. Are there going to be exceptions? Of course. But let's say that for 80% of divorce cases the procedure is straightforward. Creating a way to input data, building an algorithm to provide a recommendation, drafting and sending emails: these are very common use cases for an AI app.
The Economics of Responsibility
The core issue isn't whether AI can technically do it. It's the liability surface area. When professionals charge $500 per hour, clients aren't paying for the time. They're paying for someone to carry legal and financial responsibility when things go wrong. This responsibility gap creates multiple economic pressures that destroy AI business models aimed at professional replacement.
From the point when you enter the negotiation phase onward, what you're paying for is not an hour of their time. Sure, they can charge you billable hours, but it's up to the lawyer how they price their work: by the hour, at a fixed rate, or on a contingency fee. What you are paying for is the responsibility that the lawyer is going to carry and the ultimate outcome they promise to deliver.
When AI founders embark on a journey to replace an expensive professional, even for a narrow use case... they start playing with the risk-shifting mechanism that exposes the professional replacement fallacy. When children, joint property, or spousal support are at issue, the lawyer must anticipate and plan for all kinds of outcomes, calculate each step given the constraints, and, most importantly, sign their name as counsel of record on binding agreements or court filings. If a child custody arrangement does not serve the child's best interests, lawyers face investigations, disciplinary hearings, and possible suspension or even disbarment. Defending against these regulatory and bar actions can cost between $50,000 and $200,000+ in legal fees alone, not counting any client compensation awarded. If a poorly worded support order results in long-term financial harm to the client, the lawyer's insurer typically pays for the damages. The minimum required liability coverage in Ontario is $1 million, and many firms carry between $2 million and $6 million.
In contrast, AI platforms typically cap vendor liability at the annual contract value, often just $10,000 to $100,000 per client or per business. Most AI products in legal tech stipulate that if their software misses an asset or miscalculates support, the refund is limited to the subscription fee, with no coverage of actual losses. AI tools do not provide malpractice insurance for clients. If an error costs the client $500,000, the client bears the risk and the cost. There is no million-dollar coverage, no insurance guarantee, and no regulatory recourse against the AI vendor.
AI platforms will always lose to a real professional at the negotiation stage because they never sign as counsel of record, do not hold regulatory responsibility, and cap economic risk at trivial sums, which leaves the client to absorb the financial, regulatory, and legal responsibility. And responsibility is a dimension that cannot be automated. If AI resolves your divorce case in a way that makes you give up all of your life savings in favor of your former partner, AI won't take responsibility. A lawyer will. If your business strays into the territory of financial fraud and AI hasn't caught it while trying so hard to flatter you, AI won't pay the fine, but an accounting firm will.
The Economic Moat: Politics and Emotional Intelligence
The economics of responsibility extend far beyond liability and malpractice coverage. The really delicate one is politics. The ability to navigate politics in high-stakes, emotionally charged situations is the biggest economic moat that professionals have over AI.
Now let's go back to our divorce example. Apart from real estate, custody, and financial aspects, it is very important to consider all of these things within the context of family politics, cultural sensitivities, and power dynamics within the same family. Experienced family lawyers understand generational hierarchies, cultural taboos, family alliance patterns, religious community pressures, and cultural concepts of honor, things AI can technically learn but cannot reliably predict or weigh within a larger context. The economic consequences of mishandling these dynamics aren't just legal or financial. They include destroyed family relationships and cultural alienation that can span generations.
On top of that, there are power imbalances, subtle intimidation patterns, financial abuse tactics, and emotional manipulation. Lawyers recognize when one spouse uses financial complexity to maintain control or when child custody becomes a weapon. AI, on the other hand, cannot read micro-expressions on the spot or assess informal, undocumented power structures like the highest-paid person's opinion. The cost of missing these signals is continued abuse, financial harm, and psychological damage worth millions in long-term consequences.
This is why the "replace the professional" business model simply collapses: it's trying to replace hours. But what you're paying for are not the hours. It's the responsibility.
How to Build a Billion-Dollar AI Business
So if you're trying to build a sustainable AI-native business based on automating professional services, one with a working product you can actually start testing in the foreseeable future, you want to be looking at the first bucket: admin tasks.
Methodology for Building a Sustainable AI Product
- You immerse yourself 100% in a specialist's life and study their day-to-day routine. You have to know what they do so well that you can basically become their shadow.
- Once you're able to map out their work step by step and draw patterns from the repetitive things they do, pick one step in that entire workflow.
- Pick the step where the client is paying the most and the lawyer, accountant, or other specialist is losing the most time on admin work.
- And then you try and see if you can automate that.
You would be priming your product for high adoption from the very start because the value goes both ways. Your selling point for the specialist is that they'll spend less time on this step, can probably do more work faster, and therefore get more clients. The benefit for the client is that they get results faster, with potential cost savings.
To do this, I suggest that you really narrow down your scope. Narrow is the moat. Automating difficult workflows in legal tech comes with immense business potential, especially when targeting labor-intensive or error-prone tasks considered boring or extremely challenging for AI. Automating even a part of one step, a step that is plagued by regulation and nuance, can bring you a unicorn-value business, because you would be saving law firms, for example, a lot of money and enabling a level of scale that was not possible for them with manual labor alone.
Caveat: Outsourcing Cost Comparison
But there is one caveat, and here is what I recommend keeping in mind when assessing an idea for an AI-native business: do the math on how much it would cost to outsource this work. If a specialist needs help and is choosing between an app and an outsourced resource to delegate to, how does your product compare, and what kind of impact does it create relative to an outsourced human? Because if the process you're automating can be done by a junior specialist, the value of your product plummets: you don't have a moat. Just as lots of GPT wrappers have no moat because Google, Anthropic, or OpenAI can introduce a native product that covers the use case the wrapper is built around, your product has no moat if its limited functionality can be covered by a cheaper human. A quick version of that math is sketched below.
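Here's a minimal sketch of that comparison. Every number in it is a made-up placeholder; the point is only the shape of the check: your product's effective cost per task versus a junior specialist doing the same step.

```python
# Compare the effective per-task cost of an AI product vs. an outsourced junior.
# All figures below are hypothetical placeholders; substitute your own estimates.
product_annual_fee = 15_000     # what you'd charge a firm per year (assumed)
tasks_per_year = 600            # how often the firm runs this step (assumed)

junior_hourly_rate = 60         # outsourced junior specialist (assumed)
junior_hours_per_task = 2       # human time needed per task (assumed)

cost_per_task_product = product_annual_fee / tasks_per_year
cost_per_task_human = junior_hourly_rate * junior_hours_per_task

print(f"Product: ${cost_per_task_product:,.0f} per task")
print(f"Junior specialist: ${cost_per_task_human:,.0f} per task")
if cost_per_task_product >= cost_per_task_human:
    print("A cheaper human can do this step -> weak moat.")
else:
    print("Cheaper than delegating to a human -> at least a cost argument.")
```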
Billion-Dollar AI Business Ideas (Examples)
And now I'm going to throw two billion-dollar business ideas at you: one from the legal space and one from finance. Hopefully they will be a good illustration of what I mean by that one step that needs to be automated. For the finance example, I would like to thank our subscriber Christina, a financial planning and analysis professional based in New York, who was kind enough to help us with a concrete idea from her industry. Thank you very much, Christina.
Example 1: R&D Tax Credit Documentation Automation (Finance)
For context, R&D tax credit documentation is the paperwork and recordkeeping you must provide to the government to prove that your company deserves money back for doing research and development work. Think of it this way: the government wants to encourage companies to innovate and build mind-blowing AI, so it offers a tax credit, basically a discount on your taxes or even cash back, to companies that spend money trying to solve technical problems. But to get the money, you need to prove that you actually did qualifying R&D work. To prove this, companies spend between $50,000 and $150,000 a year on tax consultants, because documenting qualifying activities, tracking employee time, and maintaining audit trails is impossibly complex when done manually.
A simple use case: your software company spends $2 million on engineering salaries building a new bleeding-edge AI feature. The government says, "Great, we'll give you back 6 to 10% of that spending as a tax credit." So you could get between $120,000 and $200,000 back. Now you need to document and prove that your engineers were actually solving technical uncertainties, that they experimented with different approaches, and that the work they did involved real engineering research.
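The credit math itself is simple. Here's a tiny sketch using the figures from the example above; the 6-10% range is this article's illustrative assumption, not tax advice.

```python
# Rough estimate of the R&D tax credit from the example above.
qualifying_spend = 2_000_000                      # engineering salaries on the AI feature
credit_rate_low, credit_rate_high = 0.06, 0.10    # illustrative credit range

low_estimate = qualifying_spend * credit_rate_low
high_estimate = qualifying_spend * credit_rate_high
print(f"Estimated credit: ${low_estimate:,.0f} - ${high_estimate:,.0f}")
# Estimated credit: $120,000 - $200,000
```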
Business Opportunity Estimation:
- Total Addressable Market: 500,000+ companies in North America conducting qualifying R&D activities.
- 200,000 of those companies currently claim R&D credits and they spend $15 billion annually on consulting services.
- 300,000 companies are eligible but don't claim the R&D tax credit because the documentation is too complex.
- The R&D tax credit market was valued at $5.2 billion in 2024 and is projected to grow to $9.1 billion by 2033. The North American market accounts for about 60% of it.
Automation Opportunities (Currently Manual):
- Companies must reconstruct retrospectively which employees worked on qualifying R&D projects and how many hours were spent.
- Companies must correlate payroll costs, cloud computing expenses, materials, and contractor payments to specific R&D projects. This costs companies between $10,000 and $25,000 annually in accounting and consultant time.
- The CRA and the IRS require comprehensive audit trails with proper documentation. This costs companies between $5,000 and $15,000 annually in administrative fees.
Going back to my one-step suggestion: solve one of these and there's your billion-dollar AI business.
Quick ROI Analysis:
- Customer Investment: $15,000 annually for the automation platform.
- Customer Cost Savings: $60,000 annually versus manual consulting.
- Customer ROI: a 4x return with a 3-month payback period (sanity-checked in the sketch below).
- Direct Cost Savings: 80% reduction in consultant fees, 90% reduction in internal time spent on documentation.
- Risk Mitigation: reducing the risk of errors, which can also cost you a lot of money.
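If you want to sanity-check those ROI numbers, here's a minimal sketch using the two inputs above:

```python
# Sanity check of the ROI figures above.
platform_cost = 15_000        # annual cost of the automation platform
consulting_savings = 60_000   # annual savings vs. manual consulting

roi_multiple = consulting_savings / platform_cost            # 4.0x
payback_months = platform_cost / (consulting_savings / 12)   # 3.0 months
print(f"ROI: {roi_multiple:.0f}x, payback: {payback_months:.0f} months")
```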
None of these steps replaces tax advisors. Tax advisors still need to interpret regulations, assess risk tolerance, and sign off on claims, but AI can automate 80% of the repetitive admin documentation work.
Example 2: Automation for Intellectual Property Litigation (Legal)
For this one, I'm going to start with an example. Imagine that Apple sues Google, claiming that Pixel phones infringe Apple's slide-to-unlock patent. To defend themselves, Google's lawyers must analyze the prosecution history of Apple's patent to find weaknesses. In this context, patent prosecution history is the detailed record of every document created during a patent's journey through the U.S. Patent and Trademark Office, from initial application to final grant. This includes all office actions, claim amendments, examiner reviews, and applicant responses that show how the patent evolved and what arguments were made to secure approval.
What Google's lawyers need to discover is whether Apple originally claimed something broader during prosecution and then narrowed it down. What prior art did the examiner cite that forced Apple to amend their claims? Did Apple make any statements during prosecution that limit how broadly the patent can be interpreted? Are there any inconsistencies between what Apple argued during prosecution and what they're claiming in the lawsuit against Google?
Current Costs and Time:
- A typical patent litigation case requires about 25 patents to be analyzed.
- Cost per case: $420,000 just for the prosecution history analysis.
- Required time: 800 attorney hours per major case at $600 per hour.
- Document volume: Insane. 1,200+ pages per case.
Automation Areas:
- Automated Document Processing: Extracting and organizing all office actions, responses, and amendments. Creating chronological timelines and generating claim comparison charts showing evolution from original to final claims.
- Claim Analysis: Parse claim language and identify amendments automatically. Map amendments to examiner rejections through natural language processing.
- Prior Art Correlation: Prior art here covers previously granted patents, scientific journal articles, academic theses, commercial products or public demonstrations, and essentially any information that was accessible to the public before the inventor filed the patent. The automation opportunity: extract all cited prior art and link it to specific rejections, analyze the applicant's distinguishing arguments, and generate prior art comparison charts.
Potential for Automation:
- Time Reduction: 75%
- Cost Reduction: From $480,000 to $120,000 per case.
- Speed: Weeks instead of months to complete the prosecution history analysis (a quick gut check of these numbers is sketched below).
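For a quick gut check on the per-case economics, here's a minimal sketch using the 800 attorney hours at $600 per hour cited above, which lines up with the $480,000 starting point:

```python
# Gut check on the per-case economics above.
attorney_rate = 600          # $ per attorney hour
hours_per_case = 800         # attorney hours on prosecution history analysis
manual_cost = attorney_rate * hours_per_case            # $480,000 per case

time_reduction = 0.75                                    # estimated automation potential
automated_cost = manual_cost * (1 - time_reduction)      # $120,000 per case

print(f"Manual analysis:    ${manual_cost:,}")
print(f"With 75% reduction: ${automated_cost:,.0f}")
```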
Quick Estimate of the Opportunity:
- 5,500 patent cases filed annually in the US.
- $2.3 billion spent annually on just the prosecution history analysis.
- 50,000 companies with large patent portfolios need ongoing analysis.
- $3.75 billion annual portfolio management market.
- Total Addressable Market: $6.1 billion (the rough sum is sketched below).
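And the rough reconciliation of those opportunity figures, nothing fancier than addition and division:

```python
# Rough reconciliation of the opportunity figures above.
cases_per_year = 5_500
litigation_analysis_spend = 2.3e9    # annual spend on prosecution history analysis
portfolio_mgmt_market = 3.75e9       # annual portfolio management market

implied_spend_per_case = litigation_analysis_spend / cases_per_year
tam = litigation_analysis_spend + portfolio_mgmt_market

print(f"Implied spend per case: ${implied_spend_per_case:,.0f}")   # ~$418,000
print(f"Total addressable market: ${tam / 1e9:.2f}B")              # ~$6.05B, i.e. roughly $6.1B
```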
Conclusion and Key Takeaways
The American enterprise world is full of examples like these. The rule of thumb is to go for industries where the work isn't just tasks: any occupation that involves consulting in various capacities, such as negotiation, strategizing, and long-term planning that involves risk. I've said this before and I'll say it again: the best examples are the boring, traditional, classic, regulated industries, finance, healthcare, accounting, legal.
Rules for Building a Profitable AI Product:
- Pick a boring traditional industry. You can either find someone who has spent decades working in it, or you can go to that industry and work in it yourself. Usually, the best founders are the ones who actually tried working in the industry for a while.
- Focus on making expensive professionals more efficient and narrow down your focus.
- Do not underestimate the nuance and regulation that go into their job by thinking you can automate it all just because the industry is text-based. Sure, laws are text-based, and in the legal process everything gets documented. But even before we get to regulations, check your assumptions, for example, the assumptions about how much data is publicly available to train your model. In fact, a close friend of mine does audio transcription for a court, and that industry is not dying. In an age when any video or audio AI tool records, transcribes, and connects to various agentic workflows, the courts are not allowed to use those apps for privacy reasons.
- Automate small, routine workflows, not core accountability, responsibility, or skills that require high emotional intelligence, such as navigating corporate politics.
Hours are easy to automate. Consequences and responsibility aren't. That is why $500 an hour beats $20 a month. To build a profitable product, you don't need to automate the entire job. You need one narrow, high-stakes step that sits beside accountability. Own that step and build a product that solves it end to end. Do that and the margins will follow.
We hope this was helpful. We'll see you next time.