Nvidia's AI Monopoly
AMD announced two new Radeon GPUs today. If their presentation slides are trustworthy, they handily defeat their Nvidia counterparts in videogame frames-per-second-per-dollar. The cards are also physically smaller and draw much less power than the competition…
…but Nvidia still gets markedly better performance in basically everything besides rasterized gaming.
AMD's refusal to compete with Nvidia outside of gaming frustrates me. It's clear they have the ability and expertise to at least attempt it. Radeon even already has some level of hardware acceleration for AI, video encoding, and ray tracing. But as I understand it, AMD's developer tooling for this hardware is feeble compared to Nvidia's CUDA ecosystem.[1]
I have some hope that Intel will compete with Team Green in the future. But that won't happen for another few years at least; Intel is too busy catching up on the decades of driver development experience Nvidia has accumulated.
An Intel Arc engineer recently told me that drivers are the bottleneck: driving the available silicon to its full potential is difficult. Six months ago, they said, Arc was nearly 50 times slower than it is today due to driver inefficiencies, and there's still another 5–15% to be gained from future driver updates. That would place the A770 roughly on par with an RTX 3070 (its price competitor) in several benchmarks.
Arc shows promise—the A770 and A750 are a solid first salvo. But promise alone does not a fast GPU make. Intel has a real opportunity to disrupt the market in the future. For now, though, it seems non-gamer GPU customers will again be Nvidia's hostages for the next 18 months.
As a machine learning engineer, I am rooting for Team Blue. I dream of true competition arising in the AI compute world. But until then, RTX 40-series prices are still sky-high, and those are consumer cards! A100 DGX server units go for $200,000, and there aren't even reliable price estimates yet for their upcoming H100-toting successors! Team Green has no real competition in AI, and consumers (and, by extension, scientific advancement in AI) are hurting for it.
Enjoy it while it lasts, Jensen.
[1] I am not a GPU engineer and cannot confirm this firsthand, but the general sentiment I've seen online from those who are backs this up.