In an Uncertain Era for Lawyers, Consultants, and CPAs, AI Takes Over - San Francisco - 1

From my perspective, about 10 years ago, the professional job market in the U.S. was straightforward.

If you studied hard, built a good academic background, passed the bar exam, obtained your license, and got into a big company, law firm, or Big 4, it felt like you were off to a great start in your career.

Lawyers did research and wrote documents, consultants organized data and created slides, and CPAs matched numbers and organized reports.

Of course, this doesn't mean these tasks were easy. Just getting started required tremendous effort. But once you were in, you could see your future laid out ahead of you.

Firms hired in volume: juniors came in, and over time those juniors became seniors.

This is also why Korean parents were so eager to have their children become lawyers or accountants. There was a belief that once you got into the system, things would run smoothly.

However, now that very track is shaking. The AI wave I see this time is different.

It's not a bubble like the dot-com era; it's a fundamental shift like the mobile shift.

And ironically, the first sector to feel the shock is the very set of professions Korean families are steering their children toward.

Case 1: Kim Min-jun, 3rd Year Associate at a New York Law Firm

This is about someone I met at a Korean gathering. Let's call him Kim Min-jun.

He's a third-year corporate M&A associate at a mid-sized New York law firm, straight out of law school.

He earns a $220K salary and bills 2,000 hours a year, a clear success story by any measure. But what he said was striking.

"I haven't been able to sleep lately. The first-year associates who joined last year have cut their due diligence review time in half since the firm adopted Harvey AI. The partners haven't missed the drop in billable hours. A deal that used to take 5 associates to close now takes 3. So where do the other 2 go?"

This is the reality. Clients now ask, the moment the billing invoice arrives:

"Isn't this something that can be done faster with AI?"

It's an uncomfortable question. It's uncomfortable because it's true.

Last year, a Fortune 500 General Counsel reportedly told an outside law firm point-blank:

"Don't bill for tasks that can be handled by AI."

This means that case law research, contract drafting, and first-pass reviews, the tasks that used to be profitable for law firms, are now expected as free services. And those tasks were the very reason associates existed.

Law firm partners are bewildered too. Their model was to hire many associates to handle research and document work, then bill clients $400 to $600 an hour for that time.

This was the business model of U.S. law firms. But now, the core driving force of that model is disappearing.

Firms like Latham & Watkins, Allen & Overy, and Wilson Sonsini are investing tens of millions of dollars in AI tools.

On the surface, they say it's for "increased efficiency," but the real message is "we will hire fewer juniors."

Case 2: Park Ji-young, 4th Year MBB Consultant

I have a friend who works as a McKinsey consultant in Chicago. Let's call her Park Ji-young.

She has an MBA from Wharton and has been consulting for 4 years. When I met her last year, she was confident that "consulting will become more important in the AI era."

But when I met her again this Thanksgiving, her tone had completely changed.

"Hey, our company also installed ChatGPT Enterprise. That's good, but... clients are now asking their internal analysis teams to use GPT-4.

The decks they create are not much different from what our associates produce. Sometimes they are even better because they know their company data.

Now we have to prove 'why they should pay us' every time."

In the past, it was enough to throw in a few fancy words, create a nice matrix chart, and say, "We need a strategic pivot." 2x2 matrices, Porter's Five Forces, BCG Matrix — the magic tricks of consulting. But now, internal employees are also using AI to conduct market research, create frameworks, and prepare presentations.

The packaging that consultants used to sell is now being mass-produced. It's an open secret in the industry that companies like Walmart and JPMorgan have already created internal AI strategy teams to rapidly reduce their reliance on external consulting.

What Park Ji-young said at the end was striking. "I still have $180K left in my MBA student loans, and I really don't know if I can make partner track in 5 years."

This is coming from a Wharton MBA. Could anyone have imagined a graduate of that school saying this five years ago?

Case 3: Lee Sang-hoon, Senior Manager at a Big 4 CPA Firm

Lee Sang-hoon, a member of a Korean church, has a similar story. The CPA industry is being affected even faster.

"Bro, last year KPMG cut about 5% of its global audit staff. That's how big the impact of AI automation is. Confirmations, sampling, reconciliations: the tasks our first-year staff used to do are all handled by the system now. So what does a first-year do? There's nothing left. They just aren't hiring."

Simple bookkeeping, classification, tax preparation, and document review are being automated at a frightening speed. Of course, the final responsibility and judgment must be made by a person. Signing off on SOX audit results is ultimately the partner's job. But conversely, this means that those who cannot make judgments will increasingly find it hard to have a place. The era of just memorizing GAAP regulations and handling tax filings based on IRC sections is not going to last long. Machines can do those mechanical tasks much better.

One of the scariest things Lee Sang-hoon said was this: "Bro, our company brings in smart kids from KAIST or SKY in Korea and even sponsors their H-1B visas. Each one costs $80K. But GPT now does the work they used to do. A ChatGPT subscription is far cheaper than a visa sponsorship."


The Elegant Nonsense of the Old Guard

What's interesting here is the reaction of the industry seniors, the so-called "old guard." At conferences, they all say the same things. "AI is just a tool." "Human judgment cannot be replaced." "In fact, human value becomes more important in the AI era." All true, in principle. But something doesn't add up: the same people shouting about "AI-powered transformation" on LinkedIn are quietly trimming headcount in their HR systems.

While they claim that human value is important, they are slowly changing to a structure that requires fewer people. It's a dignified restructuring. When layoffs are mentioned, they call it "operational efficiency improvement." A hiring freeze is referred to as "talent optimization." They seem to think that packaging it in English makes it look less brutal, but the bottom line is the same. They are using fewer people.

What's particularly frustrating is that these seniors climbed to their positions by putting in the time when they were juniors. They are now kicking away the ladder they climbed. They say, "Kids these days need to use AI well to finish their work quickly," but they don't mention who gets cut as a result of that speed. They remain silent about who is hidden behind the word efficiency.

So Who Survives?

This is not just a lament about "the Korean kids who studied hard are failing." It's about facing reality. The professional job market in the U.S. is undergoing a paradigm shift. The safe zones that relied solely on academic credentials and licenses are rapidly shrinking.

This doesn't mean that titles like Harvard JD, Wharton MBA, or CPA are becoming worthless. It just means that those titles alone are no longer sufficient.

So who survives? From my years as a senior IT professional in Texas, I've noticed a pattern. The ones who survive are not those who try to outsmart AI, but those who can review, correct, and take responsibility for the outputs AI creates. In other words:

First, people with deep domain expertise. A lawyer who can look at an AI-generated clause and say, "This won't be enforced in a Delaware court." A consultant who can review an AI-built model and point out, "This assumption doesn't fit our industry." An accountant who can look at AI-organized financials and catch, "This revenue recognition violates ASC 606."

Second, those who can truly use AI tools well. Not just typing questions into ChatGPT, but deeply integrating AI into their workflows so that one person does what used to take five. Such individuals deliver an incredible ROI for the company. They won't be the ones let go.

Third, and most importantly, those who can take responsibility. AI does not take responsibility. Ultimately, it is people who do. Therefore, judgment, ethics, client relationships, and dispute resolution — these "soft skills" that were once undervalued are actually the true values that remain until the end. This is not taught in school. It is learned through experience in the field.

So My Thoughts Are

The formula of sending children to prestigious universities by taking out student loans to make them lawyers, accountants, or consultants may no longer work in the Korean community. It's sad but true.

This is happening across the U.S. professional job market. The safe zones of various professions are narrowing.

Ultimately, the question is simple. "Do you have the ability to evaluate drafts created by AI? Can you catch the details that AI misses? Can you take responsibility for that judgment?" If you can answer yes to these questions, you will survive. If not, the market will respond coldly.

For those raising children, I want to say this. More important than sending them to prestigious universities is, "How will you cultivate judgment that cannot be replaced in the AI era?"

This is not something schools teach. It must be nurtured by parents and by oneself.

It's not too late. However, it is clear that time is running out.