Author: Carbon Chain Value
The AI+Crypto trend does seem to be unfolding rapidly, just not in the way most people imagined. Instead of the two converging, AI has gone on the offensive: it hit the traditional capital markets first, then the crypto market.
On January 27, downloads of DeepSeek, the emerging Chinese AI large model, surpassed ChatGPT for the first time, topping the US App Store chart and drawing intense attention and coverage from the global technology and investment communities as well as the media.

Behind this event lies not only a reminder that the future pattern of Sino-US technological development could be rewritten, but also a wave of short-term panic in the US capital markets. Under its influence, Nvidia fell 5.3%, ARM fell 5.5%, Broadcom fell 4.9%, TSMC fell 4.5%, and Micron, AMD, and Intel all posted corresponding declines. Nasdaq 100 futures dropped about 400 points, on track for the largest single-day fall since December 18. According to incomplete statistics, the US stock market was expected to shed more than $1 trillion in market value in Monday's trading, an amount equal to roughly one-third of the entire crypto market's capitalization.
The crypto market, which closely tracks US equities, also fell sharply in the wake of DeepSeek. Bitcoin dropped below $100,500, down 4.48% over 24 hours, and ETH fell below $3,200, down 3.83% over 24 hours. Many people were left scratching their heads, wondering why the crypto market plunged so quickly; it may also be related to diminished expectations of Federal Reserve rate cuts or other macro factors.
So where did the market panic come from? DeepSeek did not grow up on heavy capital and huge stockpiles of graphics cards the way OpenAI, Meta, or even Google did. OpenAI was founded 10 years ago, has 4,500 employees, and has raised $6.6 billion to date. Meta is spending $60 billion to build an artificial-intelligence data center nearly the size of Manhattan. By contrast, DeepSeek was founded less than two years ago, has 200 employees, and its development cost was under $10 million; it did not burn piles of money stacking up Nvidia GPUs.
Some people can't help but ask: how are these giants supposed to compete with DeepSeek?
What DeepSeek breaks is not only the cost advantage at the capital and technology level, but also the conventional wisdom and assumptions people held before.
Dropbox's vice president of product remarked on X that DeepSeek is a classic disruption story: incumbents optimize existing processes, while disruptors rethink the fundamental approach. DeepSeek asked, what if we just did this smarter instead of throwing more hardware at it?
Bear in mind that training a top artificial-intelligence model is extremely expensive today. Companies such as OpenAI and Anthropic spend more than $100 million on compute alone and need large data centers packed with thousands of $40,000 GPUs, much as you would need an entire power plant to run a factory.
Then DeepSeek showed up and said, "What if you could do this for $5 million?" And they didn't just say it; they did it. Their models match or beat GPT-4 and Claude on many tasks. How? They rethought everything from scratch. Traditional AI is like writing every number with 32 decimal places; DeepSeek asked, "What if we only use 8? It's still accurate enough!" That alone cuts memory requirements by 75%.
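The analogy corresponds to storing model weights in 8-bit formats instead of 32-bit floats. A back-of-the-envelope sketch of where the 75% figure comes from (the parameter count below is purely illustrative, not DeepSeek's actual architecture):

```python
# Rough memory footprint of model weights at different precisions.
# The 7B parameter count is a hypothetical example, not DeepSeek's real size.
num_params = 7_000_000_000

bytes_fp32 = num_params * 4   # 32-bit floats: 4 bytes per weight
bytes_8bit = num_params * 1   # 8-bit weights: 1 byte per weight

print(f"FP32 weights : {bytes_fp32 / 1e9:.0f} GB")
print(f"8-bit weights: {bytes_8bit / 1e9:.0f} GB")
print(f"Memory saved : {1 - bytes_8bit / bytes_fp32:.0%}")  # -> 75%
```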
The Dropbox VP called the results astounding: training costs dropped from $100 million to $5 million, the GPUs required fell from 100,000 to 2,000, API costs fell by 95%, and the models can run on gaming GPUs rather than data-center hardware. More importantly, they are open source. It isn't magic, just extraordinarily clever engineering.
Others say DeepSeek completely overturns the traditional views held in the field of artificial intelligence:
China only does closed source/proprietary technology.
Silicon Valley is the center of global AI development and has a huge lead.
OpenAI has an unparalleled moat.
You need to spend billions or even tens of billions of dollars to develop a SOTA model.
The value of the model will continue to accumulate (the "fat model" hypothesis).
The scaling hypothesis: model performance rises predictably with the resources poured into training (compute, data, GPUs); a generic formulation is sketched below.
All of these traditional views have been shaken, even if they have not been completely overturned overnight.
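For reference, the scaling hypothesis is usually written as a power law rather than a straight line. One generic form, following Kaplan et al.'s neural scaling laws (included here only as an illustration, not as a claim made in the article), is:

```latex
L(C) \approx \left(\frac{C_c}{C}\right)^{\alpha_C}
```

where L is test loss, C is training compute, and C_c and \alpha_C are empirically fitted constants. What DeepSeek calls into question is not the functional form so much as the assumption that spending more on C is the only way to drive L down.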
In its briefing, Archerman Capital, a well-known American equity investment institution, commented on DeepSeek. First, DeepSeek represents a victory of open source over closed source, and its contribution to the community will quickly translate into prosperity for the entire open-source ecosystem. The firm believes that open-source forces, Meta included, will develop open-source models further on this basis; open source is a case of everyone gathering firewood so the flames rise higher.
Second, OpenAI's "scale brings miracles" approach looks a bit simplistic and crude for now, but it cannot be ruled out that a new qualitative leap will appear once scale crosses some threshold, and the gap between closed and open source could widen again; that is hard to say. Judging from the past 70 years of AI's development, computing power has been crucial, and it may well remain so in the future.
Third, DeepSeek makes open-source models as good as closed-source ones, and more efficient, so the need to pay for OpenAI's API shrinks. Private deployment and independent fine-tuning will give downstream applications far more room to develop. Over the next one to two years, we are very likely to see a richer supply of inference chips and a more prosperous LLM application ecosystem.
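In practice, private deployment simply means running an open-weight checkpoint on your own hardware. A minimal sketch using the Hugging Face transformers library (the checkpoint name is an assumed example of an open DeepSeek model; substitute whatever open-weight model and hardware you actually have):

```python
# Minimal private-deployment sketch: load an open-weight chat model locally.
# The checkpoint name below is an assumed example, not a recommendation.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "deepseek-ai/deepseek-llm-7b-chat"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, device_map="auto")

prompt = "Summarize why cheaper inference can grow total compute demand."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Once the weights live on your own machines, fine-tuning on proprietary data follows the same pattern, which is exactly the room for downstream applications the briefing describes.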
Finally, the demand for computing power will not decrease. The Jevons paradox holds that improvements in steam-engine efficiency during the First Industrial Revolution actually increased total coal consumption. It is similar to the shift from early brick phones to the popularization of Nokia handsets: phones spread because they were cheap, and because they spread, total consumption in the market rose.
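A toy illustration of the same logic applied to AI compute, with purely hypothetical numbers: if unit cost falls 95% but usage grows fifty-fold, total spending still rises.

```python
# Toy Jevons-paradox arithmetic: cheaper inference can raise total compute spend
# when demand grows faster than unit cost falls. All figures are hypothetical.
cost_per_million_tokens_before = 10.0   # dollars
cost_per_million_tokens_after = 0.5     # 95% cheaper
usage_before = 1_000                    # million tokens per day
usage_after = 50_000                    # demand explodes once it is cheap

spend_before = cost_per_million_tokens_before * usage_before
spend_after = cost_per_million_tokens_after * usage_after

print(f"Spend before: ${spend_before:,.0f}/day")   # $10,000/day
print(f"Spend after : ${spend_after:,.0f}/day")    # $25,000/day, despite 95% cheaper units
```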