AI Era: The Ultimate Deduction of the Token Supply and Demand War

BlockBeats (区块律动)
Guest Columnist
2026-04-28 12:00
This article is about 20,326 words; reading it in full takes roughly 30 minutes.
The most powerful model is becoming a weapon for the few
AI Summary
  • Core Thesis: AI's main driving force is shifting from model capability to Tokenomics: with execution costs plummeting, competition for access to frontier models and token rationing has become a new business barrier, triggering a full-chain value redistribution from demand explosion to supply chain bottlenecks, while the risk of social backlash is accumulating.
  • Key Elements:
    1. Execution costs collapse, AI becomes production capital: SemiAnalysis's annualized spending on Claude Code has reached $7 million, exceeding 25% of its salary expenditure, indicating that AI is transforming from an efficiency tool into core corporate production capital.
    2. The information service industry is the first to be restructured: Non-technical personnel within enterprises, using AI tools, can complete analysis and modeling work that traditionally required a team of 100 people several years to finish, in just a few weeks and at a token cost of thousands of dollars, rapidly breaking down industry barriers.
    3. Tokens become scarce production materials, competition intensifies: The real competition is no longer about "who uses AI," but "who can get access to the most powerful models and higher token rations," which could lead to the concentration of economic resources and usage rights among a few companies with capital and connections.
    4. Demand explosion cascades through the entire supply chain: The surge in token usage triggers a "bullwhip effect," where demand ripples from GPUs to CPUs, memory, PCBs, copper foil, and even semiconductor equipment, causing tight supply and continuous price increases across the entire industrial chain.
    5. The economic value of AI is difficult to capture with traditional GDP metrics: The economic value created by AI exists as "phantom GDP," such as the efficiency gains in decision-making and cascading effects brought by tokens, making it hard for existing economic indicators to accurately measure its true worth.
    6. Social anti-AI sentiment may erupt early: With public concern rising over job displacement, energy consumption, and the concentration of power, Dylan predicts in the interview that large-scale protests against AI could occur within three months. The industry needs to prepare for brand repositioning and demonstrate concrete public value.

Video Title: The Supply and Demand of AI Tokens | Dylan Patel Interview

Video Channel: Invest Like The Best

Translation: Peggy, BlockBeats

Editor's Note: As AI model capabilities continue to leap forward with tools like Claude Code and Cursor being adopted at scale by enterprises, the industry discussion is shifting from "how capable are the models" to "how models enter production." However, as AI coding, automated analysis, and data modeling become the new consensus, a more fundamental question begins to surface: when execution costs are rapidly compressed, what truly becomes scarce — is it manpower, capital, or the right to access frontier models and tokens?

Left: Host Patrick O'Shaughnessy, Right: Dylan Patel

This article is compiled from a conversation between Patrick O'Shaughnessy and Dylan Patel, founder of SemiAnalysis. Dylan has been closely following AI infrastructure, the semiconductor supply chain, and model economics. In this dialogue, starting from his own company's skyrocketing Claude Code spending, he discusses how AI is changing enterprise organization, information services, token demand, the computing supply chain, and social sentiment.

What makes this conversation most noteworthy is not that some model has once again refreshed a benchmark, but that it provides a way to understand the AI economy — viewing AI as a production system that is reallocating execution capacity, organizational efficiency, and industrial profits, rather than just a software tool upgrade.

This conversation can be understood from roughly five perspectives.

First, execution costs are collapsing. In the past, ideas weren't scarce; the real difficulty was turning ideas into products, systems, and deliverable services. Now, Claude Code allows non-technical people to write code, build applications, and perform data analysis. Work that previously required a team's long-term maintenance is now completed by a few people with the help of models. SemiAnalysis's annualized spending on Claude Code has reached $7 million, exceeding a quarter of its payroll. This shows AI is no longer just a productivity tool; it is becoming a new form of production capital for enterprises.

Second, the information services industry will be the first to be rewritten. Dylan's business essentially sells analysis, consulting, and datasets, which is precisely the area most susceptible to commodification by AI. Chip reverse engineering, energy grid modeling, and macroeconomic indicator construction, which might once have required a team's long-term investment, can now be built into viable products by a few people in weeks. This means the pressure AI puts on information service companies isn't "whether it will replace people," but "who can rebuild a competitor's products faster." Companies that don't adopt AI will be commoditized by faster ones, while those that do must keep raising the bar to avoid being overtaken by the next wave of more efficient competitors.

Deeper still, tokens are becoming a new means of production. In the past, when companies bought software subscriptions, the core question was whether the tool was useful. Now, access to frontier models, rate limits, enterprise contracts, and token budgets are starting to directly determine production capacity. A more powerful model doesn't necessarily mean higher cost because smarter tokens might complete higher-value tasks with fewer steps. The real competition is shifting from "who uses AI" to "who can obtain the strongest models and deploy the most expensive tokens on the highest-value scenarios."

This demand will continue to propagate through the entire supply chain. The explosion in token usage will ultimately translate into sustained pressure on GPU, CPU, memory, FPGA, PCB, copper foil, semiconductor equipment, and wafer fab capital expenditure. The "bullwhip effect" mentioned in the article follows this exact logic: a downstream increase in model call demand can be amplified several times over as it travels upstream, resulting in magnified orders, capacity expansion, and price increases. The profit distribution in the AI industry will therefore not remain solely with model companies and NVIDIA but will continue to spill over along the semiconductor and data center supply chain.
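The amplification logic behind the bullwhip effect can be sketched in a few lines. This is a minimal illustrative model, not figures from the interview: it assumes each upstream tier (GPU, memory, PCB, equipment) forward-buys one lead-time period of expected growth on top of realized demand, so a modest rise in downstream token demand compounds tier by tier.

```python
# Minimal sketch of the "bullwhip effect" described above: a rise in
# downstream token demand is amplified at each upstream supply tier
# because every tier orders ahead to cover both current demand and
# expected growth. Tier names and the amplification rule are
# illustrative assumptions, not data from the interview.

def upstream_orders(downstream_growth: float, tiers: int, lead_time: float = 1.0) -> list[float]:
    """Return the effective order growth seen at each successive upstream tier.

    Each tier forward-buys `lead_time` periods of expected growth on top of
    realized demand, so growth compounds as it travels upstream.
    """
    growth = downstream_growth
    seen = []
    for _ in range(tiers):
        # Tier orders for current growth plus anticipated growth over its lead time.
        growth = growth * (1.0 + lead_time)
        seen.append(growth)
    return seen

if __name__ == "__main__":
    # A 20% rise in model-call demand, propagated through four tiers.
    for tier, g in zip(["GPU", "Memory", "PCB", "Equipment"], upstream_orders(0.20, 4)):
        print(f"{tier:>9}: orders up {g:.0%}")
```

Under these toy assumptions a 20% downstream increase shows up as a far larger order swing at the equipment tier, which is the mechanism behind the magnified orders, capacity expansion, and price increases described above.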

Finally, the social backlash against AI may come sooner than expected. When AI truly enters workflows, public concerns about job displacement, energy consumption, data center expansion, and power concentration will rise in tandem. Dylan even predicts large-scale protests against AI within three months. For model companies, continuing to emphasize that "AI will change the world" may not alleviate anxiety but could instead reinforce a sense of loss of control among ordinary people. The AI industry now needs to prove not just technical capability, but how it creates concrete, perceivable public value in the present.

Now, the core question of AI is shifting from "what the model can do" to "who can access the model, how to use it, and who can capture the value it creates." In this sense, the subject of this article is no longer just Claude Code, Anthropic, or any single AI company, but a structural reorganization centered on productivity, capital expenditure, organizational efficiency, and social acceptance.

The following is the original content (edited for readability):

TL;DR

· The core variable of AI is shifting from "can it be done" to "is it worth doing." As execution costs plummet, what truly becomes scarce are high-value ideas that can be amplified by models.

· Claude Code spending reaching 25% of payroll costs is just the beginning. AI is transforming from a software tool into a new form of enterprise production capital.

· Competition for frontier models is no longer just about capability but about the right to access tokens. Whoever can obtain the strongest models earlier and more reliably may build new competitive moats.

· The information services industry will be the first to be restructured by AI because the production cost of data, analysis, and research is rapidly decreasing. Slow companies will be commodified by faster ones.

· Token demand won't slow down because older models become cheaper. Each time a model becomes more powerful, it unlocks new high-value use cases and pushes users towards more expensive frontier models.

· The biggest change brought by AI isn't that people work less, but that a few people can achieve several times the output in the same time. Those who cannot create and capture token value will be locked into a "permanent underclass."

· Computing power shortages are spreading throughout the semiconductor supply chain. From GPU, CPU, and memory to PCB, copper foil, and equipment manufacturers, AI demand has become a price driver for the entire industry chain.

· The economic value of AI is difficult to capture with traditional GDP. The real question isn't just how much money model companies make, but the value of decisions, efficiency gains, and cascading effects generated by tokens — the "Phantom GDP."

Interview Transcript:

Claude Code Has Become the New Workforce

Patrick O'Shaughnessy (Host):

You once told me a fascinating story about the massive change in your team's token usage this year. Can you tell it again? What did it teach you about what's happening in the world?

Dylan Patel (Founder, SemiAnalysis):

Last year, we thought we were heavy AI users. Everyone was using ChatGPT, everyone was using Claude, and I provided the subscriptions they wanted. At that time, the company's spending on this was in the tens of thousands of dollars.

But this year, spending started to skyrocket. The real turning point was around the end of last December, with the release of Opus. This also involved Doug, our president, Douglas Lawler. He was basically leading the charge for non-technical people to use AI for coding. He gradually brought the whole company along. Of course, the engineers were already using it, but from January this year, our spending clearly turned a corner and then exploded rapidly.

We later signed an enterprise contract with Anthropic. Last time I talked to you, our annualized spending was about $5 million; now it's $7 million.

Patrick O'Shaughnessy:

And that was last week's number.

Dylan Patel:

Right, and a huge part of that is just usage volume. The really interesting thing is that people who have never written code before are now using Claude Code, and some can spend thousands of dollars a day. But looking at the company overall, our annual spending on Claude Code is now $7 million, while our total payroll is about $25 million. That means Claude Code spending has already exceeded 25% of payroll.

If this trend continues, by the end of the year it could even exceed 100% of payroll. That's a bit scary. Fortunately, I don't have to choose between "people" and "AI" right now because the company is growing quickly. It's more like: I don't need to hire as fast, but I can spend more on AI, and it works, allowing the company to grow faster.

But I think other companies will eventually have to face this question: If one person using Claude Code can do the work of 5, 10, or even 15 people, what happens next? First, they might need to lay people off; second, the use cases are currently very broad.

For example, we have a reverse engineering lab in Oregon that we've been building for a year and a half. It has high-end equipment like microscopes and scanning electron microscopes. The core purpose of this lab is to reverse engineer chips, extract their architecture, and analyze the materials used in their manufacturing. This is also one of the datasets we sell.

But analyzing this data used to be a very slow process. Now, one person on our team, spending just a few thousand dollars on Claude tokens, built an application. This app can use GPU acceleration, running on our servers at CoreWeave. We just send it a chip image, and it automatically labels the location of each material on the image: here is copper, here is tantalum, here is germanium, here is cobalt. Then you can perform finite element analysis on the entire chip stack structure very quickly, and it's visual, with a complete GUI and dashboard.

This person used to work at Intel. He said that in the past, this would have been a full team's job to build and maintain. Seeing similar things happening across the company is just incredible.

Another example I find particularly interesting is Malcolm. He used to be an economist at a large bank. That bank's economics department probably had 100 to 200 people. What he's built now is astonishing.

He integrated various data sources, including FRED data, employment reports, and datasets from different APIs. We also signed contracts with some data vendors to get API access. Then he pulled all this data in and started running regressions, analyzing the inflationary or deflationary impacts of different economic changes.

The Bureau of Labor Statistics has a full set of task classifications, about 2000 tasks. Malcolm used AI to assess which of these tasks can currently be done by AI and which cannot, scoring them according to a rubric. The results showed that about 3% of tasks can now be done by AI.

So he created an indicator to measure what can be done by AI and the deflationary impact when those things are done by AI. Output might increase, but because costs drop so drastically, GDP might theoretically contract. He calls this "Phantom GDP."

Based on this concept, he built a whole analysis and established a new language model benchmark with about 2000 evals.
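The indicator Dylan describes can be sketched as rubric-scoring a task list and totaling the value that drops out of measured spending. Everything below is an invented placeholder (the tasks, the rubric threshold, the cost ratios), not Malcolm's actual methodology or BLS data; it only shows the shape of a "Phantom GDP" calculation.

```python
# Hedged sketch of a "Phantom GDP"-style indicator: score each task
# against a rubric of whether AI can do it today, then estimate how much
# measured output value "disappears" when those tasks are done at a
# fraction of the human cost. Tasks, scores, and ratios are illustrative.

from dataclasses import dataclass

@dataclass
class Task:
    name: str
    annual_value_usd: float   # value of human output for this task
    rubric_score: int         # 0-5, how fully AI can perform it today
    ai_cost_ratio: float      # AI cost as a fraction of human cost

def phantom_gdp(tasks: list[Task], threshold: int = 4) -> tuple[float, float]:
    """Return (share of tasks AI can do, value removed from measured GDP)."""
    doable = [t for t in tasks if t.rubric_score >= threshold]
    share = len(doable) / len(tasks)
    # Output still happens, but measured spend collapses toward the AI cost.
    vanished = sum(t.annual_value_usd * (1.0 - t.ai_cost_ratio) for t in doable)
    return share, vanished

tasks = [
    Task("data entry", 50_000, 5, 0.02),
    Task("draft market summary", 80_000, 4, 0.05),
    Task("client negotiation", 120_000, 1, 0.50),
]
share, vanished = phantom_gdp(tasks)
print(f"AI-doable share: {share:.0%}, phantom GDP: ${vanished:,.0f}")
```

The point of the construction is the same paradox Dylan raises: output rises while the spending that GDP actually counts shrinks, so the gain is invisible to the headline metric.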

Patrick O'Shaughnessy:

He did all this alone?

Dylan Patel:

Yes, all by himself. He told me, "Dude, this would have taken a team of 200 economists a year to do before." He's completely immersed in Claude, saying everything has changed.

Patrick O'Shaughnessy:

As a business owner, how do you understand this? You went from almost no spending to it approaching 25% of payroll and continuing to rise. At what point do you think: "Wait, should I hit the brakes? Should I control spending? Maybe we don't always need the newest frontier model released today, like Opus 4.7; maybe we can switch to a cheaper one?"

Dylan Patel:

Ultimately, I'm in the information business. We sell analysis, do consulting, and create datasets. I see no reason why these things won't be fully commoditized at a fairly rapid pace.

If I don't continuously improve, I'll fall behind. Take our first data product: more people are already doing similar things. We can still sell it because we keep making it better and more detailed. But the way we did it in 2023 isn't significantly different from how others are doing it now. If I don't keep raising the standard, I'll be commoditized. If I don't move fast enough, I'll lose my edge.

So the question is: Yes, AI will commoditize many things, just as it is commoditizing software. But those who act fast enough, manage customer relationships, provide excellent service, and continuously improve won't shrink; they'll grow faster. The incompetent, those who do nothing, will lose.

So it's a bit of a survival problem: If I don't adopt AI, someone else will, and they will beat me.

Another simple example is in the energy sector. We've had a few energy analysts for the past year trying to build an energy model. It's very complex, and the market for energy data services is about $900 million, so it's clearly a huge market I want to enter. But despite a year of effort from our team, we hadn't really broken into the energy data service business.

Then, "Claude Code psychosis" hit. We have a guy named Jeremy who handles data center energy and industrial business. He started using Claude Code, and things suddenly changed. In three weeks, he spent a lot of money, about $6,000 a day, which was indeed crazy. But he scraped every power plant in the US, every transmission line above a certain voltage level, built a map of the entire US grid from various public data sources, and integrated a lot of demand-side data.

We turned it into a dashboard for viewing and analyzing power shortages and surpluses across US micro-regions, with a lot of detail. This was built in a few weeks.

Later, we showed it to some clients who already bought our data center dataset, including energy traders. They said, "Wow, how long did this take? It's pretty good, better than Company X." We then found out that Company X had 100 people working on this for ten years.

Of course, our product isn't as complete or robust as theirs yet, but in some aspects, it's already better. So I'm now commoditizing these energy data service companies. But conversely, if I don't run faster, who will come to commoditize me?

So, from a business owner's perspective, the problem isn't "Am I spending a lot of money?" Yes, I am. But the question is: What is this money getting me? Is it generating more revenue? If the answer is yes, then the money is worth it.

Patrick O'Shaughnessy:

Do you worry that eventually, the people who control capital and invest it – the ones who often hire you for what you do – will say, "We have our own analysts, they're smart too, why don't we just do it ourselves?" If this becomes so easy, at what point does it all flow back inside the investment firms? They are the ones who can get the most leverage from this data and insight.

Dylan Patel:

First, any information service business is fundamentally like this: The value I get from a piece of information is clearly less than the value the client gets.

If I sell you information for $1, you are willing to pay that $1 because you know this information helps you make a decision that earns you more than $1. You have an arbitrage opportunity. You make more money from me than I make selling the information.

Investment funds themselves certainly have their own information service capabilities. Especially firms like Jane Street and Citadel, they are very granular and deep with data. Yet these institutions still buy our data, continue to buy it, and our relationship with them is growing.

I think there's an "it factor" here. We move faster, are more agile, have a smaller team, and are focused on a very specific niche: AI infrastructure and the massive changes it's causing, including AI, the token economy, and the whole ecosystem. We can see the direction earlier and build things faster.

So, investment professionals will certainly try to do some of what we do internally. But more often, they buy our data and build on top of it. For them, buying our data and building on top is usually cheaper than building from scratch. Eventually, someone will definitely try to do it themselves.

Tokens Become the New Means of Production

Patrick O'Shaughnessy:

I feel like every time I talk to you, I end up at the same question: the supply and demand of tokens. It's the most interesting thing in the world to me right now. What new understanding of the demand side have your own experiences given you? After feeling this so viscerally yourself, has your judgment on token demand changed?

Dylan Patel:

If we step back and look at the macro picture, Anthropic's ARR might have grown from $9 billion to around $35-40 billion. By the time this episode airs, it might be $40-45 billion.

But their compute power hasn't grown at the same rate. If you run the numbers, and assume they haven't reduced compute for R&D — which they clearly haven't, since they're releasing new models like Metis, Opus 4, Opus 4.7 — it means one thing: Even if all their new compute goes to inference, their gross margin floor is around 72%.
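The margin-floor reasoning here is simple arithmetic and worth making explicit: if you conservatively attribute *all* compute spending to serving revenue (none to R&D), the implied gross margin is a worst case. The dollar figures below are illustrative assumptions chosen to reproduce the ~72% floor mentioned, not disclosed numbers.

```python
# Back-of-envelope check of the gross-margin-floor logic: treating every
# compute dollar as cost of revenue gives a floor, since any compute that
# actually funds R&D lowers the true cost of revenue and raises the margin.
# The figures are illustrative assumptions, not company disclosures.

def gross_margin_floor(arr: float, total_compute_cost: float) -> float:
    """Worst case: every compute dollar is attributed to serving revenue."""
    return (arr - total_compute_cost) / arr

arr = 40e9        # ~$40B annualized revenue, per the interview
compute = 11.2e9  # assumed total compute spend (illustrative)
print(f"gross margin floor: {gross_margin_floor(arr, compute):.0%}")
```

Any compute diverted to R&D only raises the actual margin above this floor, which is exactly the direction of the next paragraph's argument.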

In reality, some new compute likely goes to R&D, so their actual gross margin could be higher than 72%. Remember, earlier this year, someone leaked part of their fundraising document showing a gross margin
