DeepSeek | An Opportunity To Buy Nvidia?

It's a free-for-all, so we'll have a crack at it

“Necessity is the mother of invention”



Imagine how amazing it would be if we could go an entire day without hearing, speaking, or reading the following words: AI, Crypto, Musk, and Trump. Blissful, isn’t it? Some of us go through the motions of caring about markets, even though our interest faded years ago. It's all a simulation.

What Everyone Wants to Know: Should I Buy Nvidia Shares After Today’s 15% Drop?

Let’s work through this step by step to address that question.

Key Considerations:

1. Misconceptions and Noise in the Market

Going through the market commentary, it’s clear there are fundamental misconceptions about AI among investors, with an over-fixation on specific notions and financial metrics. Today feels like a free-for-all. One guest on Fox News managed to jump from U.S.-China AI rivalries to quantum computing and Bitcoin, suggesting that the “Satoshi (SHA-256) encryption” used in Bitcoin is on the verge of being cracked by the Chinese. Easy there, tiger!

Someone needs to tell him that SHA-256 (“SHA” is pronounced “shah,” not “shark”) is a hashing algorithm published by the NSA, not an encryption scheme, and that reversing it by brute force is beyond any supercomputer in existence. Bitcoin isn’t described as a “blockchain” for nothing; its security rests on far more than a single hash function. Of course, there’s always the conspiracy theory: maybe NSA-designed standards are impossibly hard for everyone else to crack, except for insiders with access to a “secret algorithm” that reduces them to a Caesar cipher.
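To put the “cracked by supercomputers” claim in perspective, here is a minimal Python sketch of what “breaking” SHA-256 would actually entail. The input string is made up for illustration, and the guessing rate is an assumption:

```python
# Hashing is one-way: computing a digest is instantaneous, but finding an input
# that produces a chosen digest means searching a 2**256 space by brute force.
import hashlib

message = b"hypothetical bitcoin block header"   # made-up input, not real block data
digest = hashlib.sha256(message).hexdigest()
print(digest)                                    # forward direction: instantaneous

# Reverse direction: no method meaningfully better than guessing is publicly known.
# Even at a trillion guesses per second, exhausting the space would take roughly:
years = 2**256 / 1e12 / (3600 * 24 * 365)
print(f"~{years:.1e} years")                     # astronomically longer than the age of the universe
```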

Predictably, the guest recommended quantum penny stocks from his own portfolio. We digress intentionally here—it ties into a point we’ll make later. The former spooks might already have hacked it.

2. Nvidia Price Movements: Often a Mystery

Most of the time, we have no clear idea why Nvidia’s stock moves, except around earnings results. Marc Andreessen’s recent tweet about DeepSeek likely breathed new life into a story that had been circulating in the developer community for a month but that Wall Street had missed. Still, the idea that the proliferation of yet another language model is inherently bad for Nvidia lacks a clear basis.

If anything, AI startup runways are extended as operating costs decrease, enabling new viable projects and boosting aggregate demand for GPUs. However, there’s persistent confusion about Nvidia’s exclusivity in AI hardware, which we’ve covered in prior updates.
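A rough, purely hypothetical illustration of that runway argument; every number below is invented for the example:

```python
# Hypothetical illustration: cheaper training/inference extends a startup's runway.
# All figures are invented for the example, not estimates for any real firm.
cash = 10_000_000                   # dollars in the bank
fixed_burn = 300_000                # monthly payroll, office, etc.
compute_burn_before = 400_000       # monthly GPU/API spend at old prices
compute_burn_after = 400_000 / 10   # same workload if compute gets ~10x cheaper

runway_before = cash / (fixed_burn + compute_burn_before)
runway_after = cash / (fixed_burn + compute_burn_after)
print(f"runway: {runway_before:.1f} -> {runway_after:.1f} months")
# ~14 -> ~29 months: more projects clear the viability bar, which is the channel
# through which cheaper models can *add* to aggregate GPU demand.
```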

3. Nvidia’s Long-Term Outlook: Cisco or Apple?

A recurring concern for Nvidia is whether it will evolve into a Cisco (hardware commoditization) or an Apple (consumer ecosystem domination). Historically, the most value has been captured at the application layer, while hardware often becomes commoditized.

Right now, Nvidia captures the lion’s share of value created in AI. However, there are parallels to the dot-com bubble, where significant upfront investments in infrastructure—such as self-hosted websites—eventually became burdensome and redundant. As discussed with John Sung Kim on a call in December, the AI space has become uninvestable for many, even those with Bay Area connections. For average investors, Nvidia and the hyperscalers (Big Tech) are the only AI game in town.

4. Human vs. Market Capital Metric

Here’s a metric we occasionally like to track:

[Chart: the human vs. market capital metric referenced above]


A friend/member—who won Gold and Silver medals at the International Physics Olympiad—began his career as a senior engineer at Nvidia in the late 90s. He later seized a unique opportunity at Google, where he led the team that developed its Tensor Processing Unit (TPU). A couple of years ago, he transitioned to venture capital at a prominent firm, focusing on AI microprocessor and cloud infrastructure startups. Had he stayed at Nvidia, he might now be investing his own money rather than managing external capital. Still, his journey underscores the growing competition and innovation in AI hardware.

The key takeaway: Nvidia GPUs are not the only option for developing large language models (LLMs), though they continue to dominate AI applications like image and video generation. It’s worth remembering that Nvidia got its start by creating graphics cards for video gamers.

5. Other Smart AI Plays

If we had to highlight a couple of old-school tech companies with smart AI strategies, Adobe and Disney (Pixar) come to mind. However, most consumer perceptions—and even fund manager opinions—are shaped by AI chatbot interfaces. Preferences often boil down to which models people happen to have access to and the parameter settings used for their narrow use cases, which can be misleading.

Other AI-adjacent sector plays exist, but we’ve covered them before and won’t revisit them now.

6. DeepSeek and LLM Efficiency

Here’s a video (slightly technical) that explains the innovative steps DeepSeek took to train its LLMs, which differ from OpenAI’s approach. DeepSeek V3 and R1 were trained on Nvidia GPUs, just like the leading U.S. models, but at a small fraction of the cost, and on export-compliant H800s rather than top-spec H100s (more on this in the Ben Thompson post below).
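As a rough sanity check on that claim, here is the back-of-envelope behind the widely quoted ~$6 million figure, using the numbers DeepSeek itself reports for the final V3 training run. These are company-reported figures, not independently verified, and they exclude prior research and ablation runs:

```python
# Back-of-envelope on the reported V3 training cost (company-reported figures).
gpu_hours = 2.788e6   # H800 GPU-hours reported for the final V3 training run
rate = 2.0            # $ per GPU-hour, the rental rate assumed in DeepSeek's report
print(f"${gpu_hours * rate / 1e6:.2f}M")   # ≈ $5.58M
```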

For instance, the “1-Bit LLM” paper we highlighted last year showcases a clever computational approach that hasn’t received much attention. It is far cheaper to run than full-precision commercial models and good enough for many use cases.
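For the curious, here is a minimal sketch of the core trick in that line of work: quantizing weights to the ternary set {-1, 0, +1} with an “absmean” scale, so that matrix multiplications reduce to additions and subtractions. It is a toy illustration of the quantization step only; the actual method also quantizes activations and trains with the quantizer in the loop.

```python
# Toy sketch of absmean ternary ("1.58-bit") weight quantization.
import numpy as np

def ternarize(w: np.ndarray, eps: float = 1e-8):
    """Scale by the mean absolute weight, then round and clip to {-1, 0, +1}."""
    scale = np.mean(np.abs(w)) + eps
    w_t = np.clip(np.round(w / scale), -1, 1)
    return w_t, scale

rng = np.random.default_rng(0)
W = rng.normal(size=(4, 8))     # toy weight matrix
x = rng.normal(size=8)          # toy activation vector

W_t, scale = ternarize(W)
approx = scale * (W_t @ x)      # in real kernels this needs no multiplications
print(np.round(W @ x, 2))       # full-precision result
print(np.round(approx, 2))      # ternary approximation (crude here; real models train with the quantizer in the loop)
```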

This line of work also offers insight into how machine learning could be used to attack encrypted communications more efficiently than brute force alone.

In short, DeepSeek’s success can largely be attributed to optimizing OpenAI’s implementation of the Transformer architecture from Google’s original 2017 paper, “Attention Is All You Need.” However, advances in LLM algorithms and training methods shouldn’t be taken as an indicator of future demand for Nvidia microprocessors.
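For reference, the primitive everyone in this story is optimizing is the scaled dot-product attention from that 2017 paper. Below is a minimal NumPy sketch, stripped of the multi-head projections, masking, KV caching, and memory tricks that production models (DeepSeek’s included) layer on top:

```python
# Minimal scaled dot-product attention, the core operation behind every LLM above.
import numpy as np

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)   # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def attention(Q, K, V):
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)          # how strongly each query attends to each key
    return softmax(scores) @ V               # weighted mix of value vectors

rng = np.random.default_rng(1)
Q, K, V = (rng.normal(size=(5, 16)) for _ in range(3))
print(attention(Q, K, V).shape)              # (5, 16)
```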

Conclusion: Should You Buy Nvidia Shares?

Taken together, if you believe that Nvidia’s price last Friday was “fair,” it might make sense to start adding to long positions now. However, given the current market confusion, we think better entry points could emerge in the next few days as investors digest the fundamentals and regain confidence. There is also likely to be additional news related to DeepSeek in the coming days.

Looking ahead, investors should anticipate and factor in unexpected stories about rapid advancements or breakthroughs in AI, as they could significantly impact market sentiment and valuations.

The only confident assertions we can make are: i) deep learning will be the prevailing framework in AI, and ii) China will finish in the top 2 overall at the next International Math Olympiad.

Ben Thompson Blog Post

Here is a very useful blog post just published by Ben Thompson.

He explains how it was both possible and plausible for DeepSeek to train their foundation model (v3) at such a significantly lower cost compared to U.S. competitors, likening it more to a “Huawei moment” than a “Sputnik moment.”

Contrary to reports that DeepSeek violated the U.S. export ban by using high-spec Nvidia H100 GPUs, Thompson believes its innovation was likely driven by the necessity of optimizing model training on lower-spec H800 GPUs, which are not subject to the restrictions.

In his view, the biggest relative winners among U.S. tech firms are Meta and Apple, while the development appears less advantageous for Google and its TPUs. His outlook for Nvidia aligns closely with ours. That said, he views Nvidia almost entirely through the lens of LLMs, as many others do.

Then again, Thompson isn’t a financial analyst, but he is sure as hell more useful today than many who claim to be. Let the Wall Street narrative on this one be set by Thompson’s blog post. When push comes to shove, we know who the punters trust.


Updates

Okay, please, no more social media posts about DeepSeek by random people comparing it to OpenAI or other LLM vendors. It completely misses the point. Thankfully, we’ve yet to receive the “China trash talk” variety.

It seems incomprehensible to some that a startup full of industrious people could spend *only* $6 million training a model, rather than blowing it all on beanbags and marketing. We charge a small fraction of what most so-called independent research firms do, yet when the soft brown stuff hits the fan, their customers come flocking to us for sanctuary. Price, after all, is an illusion.

We recommend reading Ben Thompson’s post in its entirety. It’s long, but not as long as Harry Potter and far more efficient than scrolling through hundreds of social media posts. The post was curated by a real AI developer with a background in computer science.

Additionally, our conclusions on Nvidia stock come with the caveat that they anchor on last Friday’s close as a reference point. We haven’t expressed a view on Nvidia’s share price, or that of any other U.S. Big Tech stock, since August of last year. See:

Nvidia | What Industry Insiders Are Saying

Tech that's difficult to explain or understand



Here's an extract from the piece:

Here’s Nvidia CEO Jensen Huang speaking with Bloomberg TV after his company’s Q2 earnings call. You can skip past the Wall Street fluffers’ commentary to 15:50, where Huang summarizes the earnings call in 12 minutes. Huang opens by saying, “The fact that I was so clear and it wasn’t clear enough kinda tripped me up.” That’s a polite way of saying he was surprised by how simply he needed to pitch the story and by where market expectations are anchored.



Narrating an emergent or deep-tech investment thesis can be challenging for business leaders. As a result, they often focus on themes or business metrics that are easier to communicate and simpler for investors to understand. We noticed Rubrik faces a similar issue—yet its stock has risen 2.5x since its IPO last May. The market will soon create an investment thesis for them. Rubrik should benefit from both cheaper API token calls and self-hosted models.

Since the launch of ChatGPT, explaining NLP to the average person has become much easier and more intuitive. However, successful companies often fall victim to their own popular narratives over time.

It might make sense for Nvidia to gradually shift its story away from LLMs and inference models toward other exciting projects—perhaps in the realms of imagery and video. For such a pivot to succeed, though, a widely adopted mass-consumer application in that category would likely need to come online first.

It's time to shift the AI narrative to keep the punters involved.

Let's get back onto the regular schedule.

P.S. I've still not had a chance myself to check out DeepSeek. But another LLM tool is not what we're after right now, even if it's cheap.

