Critical Components: More than Just a Line Item on the BOM
Dependencies are a real thing, whether in hardware or software. As AI and software increasingly intersect with the real world, the tech community is becoming more aware of the importance of hardware dependencies, supply chains, and bills of materials. Unfortunately, these factors can limit both the speed of innovation and the scalability of emerging technologies.
Scarcity in the supply chain is a pressing issue, as demonstrated by the current competition for NVIDIA GPUs. Scarcity not only constrains availability but also raises monopolistic concerns: relying on a single supplier for critical components poses significant risk, underscoring the need for diverse supply chains and deliberate risk management strategies.
Interestingly, these challenges can also serve as competitive advantages. Startups that can secure access to scarce, specialized components can attract top-tier talent and potentially outperform their competitors.
Finally, the technological landscape is constantly evolving. While some technologies, such as LiDAR, are being phased out in favour of more efficient camera-based computer vision, others, such as GPUs, are becoming indispensable. This underscores the importance of adaptability and foresight in managing both hardware and software dependencies.
How Far Would You Go for a GPU?
There were great discussions in a recent episode of the “Hard Fork” podcast by Kevin Roose and Casey Newton.
Data is often dubbed the new gold or oil, but GPUs are the indispensable pickaxes and shovels of this AI rush. The lengths to which people go to acquire GPUs are nothing short of astonishing. In some cases, entire venture deals hinge on access to these crucial compute resources. In a fascinating twist, startups are luring top-tier talent by touting their GPU inventory as a unique perk, not just standard infrastructure.
This development highlights our dependency on a monopoly provider and underscores the vital role of procurement, supply chain, and hardware development, even in sectors as seemingly intangible as software. As robotics and distributed intelligence systems take off, we should question whether supply can keep pace with skyrocketing demand for high-end compute power over the next decade.
Bosch Bids Bye-Bye to LiDAR
I've long advised my portfolio companies and mentees that LiDAR is not the future of autonomous driving and robotics. Leaders like Tesla have shown that computer vision models can effectively replace LiDAR using simple, cheap cameras. These camera-based systems are not just capable but also more cost-effective, lighter, and less power-hungry than LiDAR rigs.
Emerging technologies from companies like Airy3D in Montreal allow us to enjoy the benefits of LiDAR using simple single-camera solutions.
Sidewalk Strolls, Jetsons, and Level 4 Autonomy
Ironically, while major players are divesting from LiDAR, some startups are pushing the envelope of Level 4 autonomy using old-school active sensors like LiDAR and ultrasonics.
At the core of these advancements is NVIDIA, whose platforms and GPUs are embedded everywhere, from mobile systems to cloud server farms. While I love the NVIDIA Jetson platform for its use in edge AI, my anxiety grows as we face a supply chain crunch.
The growing fields of mobile, humanoid, and service robotics are all vying for these precious chips. Companies need to recognize that these components aren't just another line item on the bill of materials: they're critical to innovation and commercial success.