Last week, semiconductor stocks like Nvidia (NASDAQ: NVDA), Advanced Micro Devices (NASDAQ: AMD), and Micron Technology (NASDAQ: MU) dove on news that a Chinese start-up called DeepSeek had figured out how to train artificial intelligence (AI) models for a fraction of the cost of its American peers.
Investors were concerned that DeepSeek's innovative approach would trigger a drop in demand for graphics processing units (GPUs) and other data center components, which are key to the development of AI. However, those concerns could be overblown.
Meta Platforms (NASDAQ: META) is a huge buyer of AI chips from Nvidia and AMD. On Jan. 29, CEO Mark Zuckerberg made a series of comments that should be music to the ears of investors who own AI hardware stocks.
Image source: Getty Images.
Successful Chinese hedge fund High-Flyer has been using AI to build trading algorithms for years. It founded DeepSeek as a separate entity in 2023 to capitalize on the success of other AI research companies, which were rapidly soaring in value.
Last week's stock market panic was triggered by DeepSeek's V3 large language model (LLM), which matches the performance of the latest GPT-4o models from America's leading AI start-up, OpenAI, across several benchmarks. That wouldn't be concerning on its own, except DeepSeek claims to have spent just $5.6 million training V3, whereas OpenAI has burned through over $20 billion since 2015 to reach its current stage.
To make matters more worrisome, DeepSeek doesn't have access to the latest data center GPUs from Nvidia, because the U.S. government banned them from being sold to Chinese companies. That means the start-up has to use older generations like the H100 and the slower H800, indicating that it's possible to train flagship AI models without the best hardware.
To compensate for the lack of computing performance, DeepSeek innovated on the software side by developing more efficient algorithms and data input methods. It also adopted a technique called distillation, which involves using a large, successful model to train its own smaller models. This dramatically speeds up the training process and requires far less computing capacity.
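For readers curious what distillation actually looks like in practice, here is a minimal, illustrative sketch in PyTorch. It is not DeepSeek's code; the toy teacher and student models, temperature value, and synthetic input batch are assumptions chosen purely to show the core idea of a small model learning from a larger model's soft outputs.

```python
# Minimal knowledge-distillation sketch (illustrative only; not DeepSeek's actual method).
# A small "student" model learns to mimic the soft predictions of a larger "teacher".
import torch
import torch.nn as nn
import torch.nn.functional as F

teacher = nn.Linear(128, 10)   # stand-in for a large pre-trained model
student = nn.Linear(128, 10)   # smaller model we want to train cheaply
optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)
temperature = 2.0              # softens the teacher's probability distribution

x = torch.randn(32, 128)       # a batch of synthetic inputs

with torch.no_grad():
    teacher_logits = teacher(x)            # teacher predictions, no gradients needed

student_logits = student(x)
# KL divergence between the softened distributions is the classic distillation loss.
loss = F.kl_div(
    F.log_softmax(student_logits / temperature, dim=-1),
    F.softmax(teacher_logits / temperature, dim=-1),
    reduction="batchmean",
) * temperature ** 2

optimizer.zero_grad()
loss.backward()
optimizer.step()
```

The appeal for a compute-constrained lab is visible even in this toy version: the expensive teacher is only run in inference mode, while all of the gradient work happens on the much smaller student.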
Investors are concerned that if other AI companies adopt DeepSeek's approach, they won't need to buy as many GPUs from Nvidia or AMD. That would also sap demand for Micron's industry-leading data center memory solutions.
Nvidia's GPUs are the most popular in the world for developing AI models. The company's fiscal year ended on Jan. 31, and based on management's guidance, its revenue likely more than doubled to a record $128.6 billion (the official results will be released on Feb. 26). If recent quarters are anything to go by, about 88% of that revenue will have come from its data center segment thanks to GPU sales.
That incredible growth is the reason Nvidia has added $2.5 trillion to its market capitalization over the past two years. If demand for its chips slowed, much of that value would likely evaporate.
AMD has become a worthy competitor to Nvidia in the data center. The company plans to launch its new MI350 GPU later this year, which is expected to rival Nvidia's latest Blackwell chips, the current gold standard for processing AI workloads.
But AMD is also the leading supplier of AI chips for personal computers, which could become a major growth segment in the future. As LLMs become cheaper and more efficient, they will eventually be able to run on smaller chips inside computers and devices, reducing reliance on external data centers.
Finally, Micron is often overlooked as an AI chip company, but it plays a vital role in the industry. Its HBM3E (high-bandwidth memory) for the data center is best in class in terms of capacity and energy efficiency, which is why Nvidia uses it inside its latest Blackwell GPUs. Memory stores information in a ready state so the GPU can access it instantly when needed, and since AI workloads are so data-intensive, it's an important piece of the hardware puzzle.
Image source: Getty Images.
Meta Platforms spent $39.2 billion on chips and data center infrastructure during 2024, and it plans to spend as much as $65 billion this year. Those investments help the company advance its Llama LLMs, which are the most popular open-source models in the world, with 600 million downloads. Llama 4 is due to launch this year, and CEO Mark Zuckerberg believes it could be the most advanced in the industry, outperforming even the best closed-source models.
On Jan. 29, Meta hosted a conference call with analysts to discuss its fourth quarter of 2024. When Zuckerberg was asked about the potential impact of DeepSeek, he said it's probably too early to determine what it means for capital investments in chips and data centers. However, he said that even if it results in lower capacity requirements for AI training workloads, that doesn't mean companies will need fewer chips.
Instead, he believes capacity could shift away from training and toward inference, which is the process by which AI models take in user inputs and form responses. Many developers are moving away from training models on endless amounts of data and focusing on "reasoning" capabilities instead. This is referred to as test-time scaling, and it involves the model taking extra time to "think" before producing an output, which results in higher-quality responses.
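To make that idea concrete, here is a heavily simplified, hypothetical sketch of test-time scaling. The stand-in generation and scoring functions are assumptions for illustration only; a real system would call an actual LLM and a verifier or reward model rather than random placeholders.

```python
# Illustrative sketch of test-time scaling (not Meta's or DeepSeek's implementation).
# The idea: spend extra compute at inference time by sampling several candidate
# reasoning attempts and keeping the best-scoring one, instead of answering once.
import random

def generate_attempt(prompt: str) -> str:
    """Stand-in for one model call that produces a reasoning chain plus an answer."""
    return f"candidate-answer-{random.randint(0, 9)} for: {prompt}"

def score(attempt: str) -> float:
    """Stand-in for a verifier that rates how good an attempt looks."""
    return random.random()

def answer_with_test_time_compute(prompt: str, num_attempts: int = 8) -> str:
    # More attempts means more inference compute, which typically buys better answers.
    attempts = [generate_attempt(prompt) for _ in range(num_attempts)]
    return max(attempts, key=score)

print(answer_with_test_time_compute("What is 17 * 24?"))
```

The investing-relevant point is in the loop: every extra attempt is another pass through the model, so shifting spending from training to inference still consumes plenty of GPU and memory capacity.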
Reasoning requires more computing power, so Zuckerberg thinks companies will still need the best data center infrastructure to maintain an advantage over the competition. Plus, most AI software products haven't achieved mainstream adoption yet, and Zuckerberg recognizes that additional data center capacity will also be needed to serve more users over time.
Therefore, while it's difficult to put exact numbers on how DeepSeek's innovations will reshape demand for chips, Zuckerberg's comments suggest there is no reason for Nvidia, AMD, and Micron investors to panic. In fact, there is even a bullish case for those stocks over the long term.
Have you ever felt like you missed the boat on buying the most successful stocks? Then you'll want to hear this.
On rare occasions, our expert team of analysts issues a "Double Down" stock recommendation for companies that they think are about to pop. If you're worried you've already missed your chance to invest, now is the best time to buy before it's too late. And the numbers speak for themselves:
Nvidia: if you invested $1,000 when we doubled down in 2009, you'd have $307,661!*
Apple: if you invested $1,000 when we doubled down in 2008, you'd have $44,088!*
Netflix: if you invested $1,000 when we doubled down in 2004, you'd have $536,525!*
We're currently issuing "Double Down" alerts for three incredible companies, and there may not be another chance like this anytime soon.
Randi Zuckerberg, a former director of market development and spokeswoman for Facebook and sister to Meta Platforms CEO Mark Zuckerberg, is a member of The Motley Fool's board of directors. Anthony Di Pizio has no position in any of the stocks mentioned. The Motley Fool has positions in and recommends Advanced Micro Devices, Meta Platforms, and Nvidia. The Motley Fool has a disclosure policy.