Nvidia Dominates GPU Shipments With 94% Share (tomshardware.com)

An anonymous reader shares a report: The total number of GPUs sold for the second quarter of 2025 hit 11.6 million units, while desktop PC CPUs went up to 21.7 million units, according to a Jon Peddie Research report. This is a 27% increase in graphics card shipments and a 21.6% jump in CPU shipments from the last quarter, which is a change from the usual drop in deliveries we've seen in recent years.

"AIB prices dropped for midrange and entry-level, while high-end AIB prices increased, and most retail suppliers ran out of stock. This is very unusual for the second quarter," said Jon Peddie Research president Dr. Jon Peddie. "We think it is a continuation of higher prices expected due to the tariffs and buyers trying to get ahead of that."

As for the three major GPU manufacturers, Nvidia still has the lead, taking in 94% of the market -- an increase of 2.1% over the previous quarter -- while AMD is at a distant second place with 6%. This is still a much better position than Intel, though, whose market share is so small it did not even register on the chart.


Comments:
  • Still not going to buy from those scammers.

    • Re:Yes, so? (Score:4, Interesting)

      by serviscope_minor ( 664417 ) on Thursday September 04, 2025 @04:40PM (#65639422) Journal

      If you need to do ML training, you either need to buy from those scammers (not really sure how they are scamming) or you have a wild uphill ride, because AMD are completely fucking insane and couldn't find their arse with two hands and a map.

      Here's how you start with Nvidia:

      Get a GPU. Literally doesn't matter how: you can wander into CEX with a bunch of cash, provision a B200 on your cloud provider of choice, or anything in between. Apt install the drivers, pip install PyTorch, and it works.
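
      A rough sanity check for that path (just a sketch; it assumes a CUDA-enabled PyTorch wheel from pip and a working Nvidia driver, and the sizes are arbitrary):

      import torch

      # Sketch: confirm PyTorch can see the GPU and run a kernel on it.
      if torch.cuda.is_available():
          print("GPU:", torch.cuda.get_device_name(0))
          a = torch.randn(1024, 1024, device="cuda")
          b = torch.randn(1024, 1024, device="cuda")
          print("matmul checksum:", (a @ b).sum().item())  # tiny smoke test
      else:
          print("No CUDA device visible; check the driver install.")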

      For AMD:

      Well, first check the compatibility lists. Does this particular product make the list? Who knows? Will it work tomorrow? Lol. Can you get one that does work??? Good luck with that. Fight through that, and then comes the Docker setup. Once it's finally working, you find AMD are chasing the long tail of running CUDA code that has been hand-optimised for a competitor's different architecture, so it kind of sucks even when it does work.

      I swear that with 10 decent engineers targeting PyTorch and a couple of other frameworks across all their cards, they could start making inroads on ML.

      But they are insane.

    • Care to elaborate?

      If it's just the price that's the scam, everyone can mostly agree. But if the scam is more than just the price - what's the problem?

      As merely an end user, primarily for gaming, their products have been reliable for the four to six years I keep each model in service. Well, except for the original Riva TNT that bricked itself the first time I ever tried a firmware update in my life. $99 gone, and that still bugs me.
      • I'm not a chip expert, but it seems being optimized for gaming and optimized for AI should be different enough to split chip models.

        • by ceoyoyo ( 59147 )

          Vibe product development is next I guess?

          Game graphics are linear algebra, specifically lots of parallel affine transforms, which require a bunch of matrix multiplies.

          AI, by which you mean deep learning using neural networks, is linear algebra, specifically lots of parallel affine transforms, which require a bunch of matrix multiplies. Oh, and a (very) occasional clamp to zero operation.
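
          To make the overlap concrete, here's a rough sketch in PyTorch (shapes and names are made up for illustration): a graphics-style affine transform and a fully connected layer are the same matrix multiply, the layer just adds the clamp.

          import torch

          # Graphics: transform a batch of 3D vertices with a rotation plus a translation.
          points = torch.randn(10000, 3)
          rotation = torch.eye(3)                      # placeholder rotation matrix
          translation = torch.tensor([1.0, 0.0, 0.0])
          transformed = points @ rotation.T + translation

          # Deep learning: one fully connected layer is the same affine transform,
          # followed by the occasional clamp to zero (ReLU).
          weights = torch.randn(3, 128)
          bias = torch.zeros(128)
          activations = torch.clamp(points @ weights + bias, min=0)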

          Anyway, Nvidia does split the product lines. There are gaming GPUs, which work fine for training deep learning models, and

        • I'm not a chip expert, but it seems being optimized for gaming and optimized for AI should be different enough to split chip models.

          Let's see.
          One of them relies heavily on raw parallel compute power.
          The other...relies heavily on raw parallel compute power.

          While those chips may have been originally designed for 3D gaming computation, the nice thing about CUDA is that it is generic enough that you can easily leverage that compute power in other applications as well.

          The not-so-nice reality about CUDA is that Nvidia deliberately does not offer backward compatibility with old CUDA versions in their next GPU generations, so you have to regular

  • by PubJeezy ( 10299395 ) on Thursday September 04, 2025 @02:03PM (#65639054)
    Except none of them are actually being utilized in any productive way. They're a scam company and they're simply side-selling their product to other corporate customers. Taxpayers are the ones actually paying for this unproductive crap.

    It just came out that almost half of Nvidia's sales are going to two customers building data centers. These data centers are being built with massive amounts of tax breaks and govt subsidies. So the govt gives Microsoft a bunch of free money to buy crap. They spend it on GPUs from Nvidia. They hook them up, create scarcity in the energy markets, and drive up the cost of living. And then they're reimbursed for their spending or allowed to use it to offset their tax burden. At the end of this false supply chain is a pile of federal money being pushed through a private corporation while American consumers are frozen out of the market.

    Nvidia and its relationship to the rest of the market is no longer plausible. It's clearly part of a cartel. Microsoft, Apple and Facebook have been deliberately inflating this company via false scarcity and aggressive marketing. This is just another example of circular revenue chains meeting govt subsidies.

    Even if you take these scammers at their word, the reason for all of these GPUs in their data centers is so they can...checks notes...run chatbots? WTF? That's not remotely plausible. These products are not popular at scale, and corporate oligarchs who are personally invested in the companies selling these products are forcing unproductive adoption on their companies in order to boost their personal investments. That's all we're seeing.

    It is a big club and we're not in it. Nvidia, Apple, Microsoft, Google and Meta are bad actors. They are colluding with each other to manufacture metrics and falsify demand via circular revenue chains. They're all actively committing crimes against the American people for the sake of stock prices and energy market manipulation. It's disgusting, and anyone promoting these fake products is to be ignored.
  • by sodul ( 833177 ) on Thursday September 04, 2025 @02:06PM (#65639064) Homepage

    I'm a bit baffled that there is so little market share for other players. AMD is only at 6% and Intel is insignificant? It's like saying Toyota has 94% market share because their cars are so much better than everyone else. Is this all for the AI bubble, where Nvidia has the better software frameworks - akin to the roads being built for Toyotas only, so you need to buy a Toyota if you want to go anywhere? What about the video games market, which is the original market for Nvidia? I would assume that AMD has more than a tiny percentage, if only in consoles, but maybe that's not counted here. AFAIK the PS5 uses an AMD GPU, while the Switch I and II use Nvidia.

    My understanding is that AMD and Intel GPUs give you better bang for your buck than Nvidia, but Nvidia just dominates in raw performance. To go back to a car analogy, not everyone buys a hypercar; most folks want a reasonably priced vehicle that's good enough for the job. I guess in that case that's the integrated graphics you find in most Intel, Apple and phone chips.

    • The vast majority of those cards are not being used as video cards. AMD and Intel still make those.
      • That report was about consumer grade GPUs.

        Sure, some 5900s and 5800s might be used in local inference tasks, but AI data centers are stocked with a completely different set of GPU configurations (H100s or H200s, for example) that nothing in the article suggests are included.

      • The reported figures in the 3rd paragraph are from the Steam Hardware Survey. It is looking almost exclusively at GPUs which are used to run Steam games - i.e. video cards. NVIDIA is absolutely dominating gaming video cards. AMD is selling fuck all and Intel is a laughing stock.

    • It's written for proprietary Nvidia APIs. Nvidia has so much money that they will send engineers to you to provide millions of dollars' worth of free support. Combine that with the huge lead in adoption for those proprietary APIs, and AMD has never been able to get a leg up.

      That's on the server side, mostly machine learning, LLMs and whatnot. The same goes for workstations, where all the software support is Nvidia because they can afford to give away tens of millions if not hundreds of millions of free su
    • It's like saying Toyota has 94% market share because their cars are so much better than everyone else.

      Unlike the car industry, there are very few useful metrics to judge a GPU on other than specific performance in a limited set of applications. If all that mattered for cars was top speed, and that was what was relevant to consumers, there would absolutely be consolidation towards a single dominant player.

    • It is very common for a market like this to become a monopoly. Especially when there is an essential API to interact with hardware. Software is expensive to create and the creators of software do not wish to write to more than one API whenever possible - two APIs mean two rounds of testing and odd bugs that are only seen on one system and not another (possibly even harder in the open source world where most programmers will only have one of the systems and so will primarily write the software and test with

      • By that logic we would not have Windows/Linux/macOS today. Same for PlayStation, Switch, Xbox (kinda dying) or Steam Deck.

        On the other hand, we've seen the monopoly twice with the browser wars: first IE, now Chrome.

        We did have a long run where Intel was just crushing the CPU market, and the fall is pretty hard on them now. The same could happen to Nvidia.

        • How long did it take to get Linux to a state where it was widely used? Both Linux and macOS are still minorities and are ignored by many software companies. Windows still has a very large desktop market.

          You are right. Nvidia can also fall. There are transitions. One dominant player gives way to another. But it is pretty rare. Linux taking over the server market is one such example, Chrome overtaking IE is another. The timescale of these major changes seems to be around 10-20 years. Whether you like it or not, Nv

  • by 2TecTom ( 311314 ) on Thursday September 04, 2025 @02:30PM (#65639130) Homepage Journal

    There's little need for newer Nvidia GPUs in competitive gaming anymore; we all run with our graphics settings tuned to performance mode anyways. All that eye candy doesn't help. Ray tracing is still basically unusable in many games as well. The price of the new cards is prohibitive and exploitative, and the market is hostile to average consumers. My solution is to boycott the high end and look for deals in the midrange and the used marketplace; for instance, just look at RTX 3090 prices. That a four year old card still costs more than, and outperforms, most 50-series cards is a crime. Not to mention the lack-of-VRAM scandal. There's no point in even buying a 50-series card until versions with adequate VRAM are released. Nvidia is just toying with us. I'd much rather support AMD and Intel anyways; we need real competition for a functional free market, so I vote with my wallet.

    • 5070 seems pretty good if you can find one for a reasonable sale price, which seems to be happening occasionally as 5 series sales have slowed. But overall, the 5 series is a bust. They don't offer enough performance improvement over 4 series to make it worth switching from one to the other, so they are having a hard time selling it even to compulsive upgraders. Obviously they don't really care as the vast majority of their income comes from data centers.

      I think the PC sweet spot for most people today is go

      • by 2TecTom ( 311314 )

        The main benefit of the 5070 is that it's readily available; however, I'd wait for the Ti or the Super with more VRAM, as the real bottleneck is often not the speed but the card's memory capacity. So yeah, the 50 series isn't much of an improvement over the 40 series. I'm recommending AMD both for CPUs and GPUs at this point, as the price versus performance is at least acceptable, and Intel GPUs for entry-level systems. Mostly I've shifted to the used market whenever possible. There a lot of used hardware out t

    • we all run with our graphics settings tuned to performance mode anyways

      I'm glad you speak for the entire gaming market. It's good to have such an authority figure here on Slashdot. I hope you have some references to show you were the appointed spokesperson for gamers, because if not you risk coming across as a gaslighting arsehole.

      I don't run in performance mode, I target best visual quality with a playable framerate. I guess today I found out I'm not a "gamer" like the rest of "you all".

    • This is true. The reason one is incentivized to obtain a new graphics card is that Nvidia marketed the hell out of ray tracing. Game graphics was a solved problem before ray tracing made it not-solved again.
  • by laughingskeptic ( 1004414 ) on Thursday September 04, 2025 @03:36PM (#65639318)
    NVIDIA introduced CUDA (Compute Unified Device Architecture) in 2007 as a parallel computing platform and API that allowed developers to harness the power of GPUs for general-purpose computing. CUDA revolutionized computing by enabling massive parallelism, making GPUs ideal for scientific simulations, cryptography, machine learning, AI, and more. It provided accessible libraries and tools for developers in C, C++, Python, Fortran, and other languages. CUDA quickly became the backbone of GPU computing in academia and industry as a result of NVIDIA's open-source efforts.
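
    As a rough illustration of how little code the Python route takes (a sketch using the third-party Numba package, not NVIDIA's own tooling; names and sizes are made up):

    import numpy as np
    from numba import cuda

    # Each GPU thread adds one pair of elements.
    @cuda.jit
    def add_kernel(a, b, out):
        i = cuda.grid(1)                  # global thread index
        if i < out.size:
            out[i] = a[i] + b[i]

    n = 1_000_000
    a = np.arange(n, dtype=np.float32)
    b = 2 * a
    out = np.empty_like(a)

    threads = 256
    blocks = (n + threads - 1) // threads
    add_kernel[blocks, threads](a, b, out)   # Numba copies the host arrays to and from the device
    print(out[:5])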

    18 years later, Intel has neither jumped on the CUDA bandwagon nor produced a compelling competing library. Intel is institutionally incapable of creating something like CUDA because it is too hard to tie future revenues to an effort like this.

    NVidia bought into John Nickolls' vision. Intel completely missed this boat.
    • What do you mean "open-source"? If CUDA were open-source, you could run it on AMD GPUs or any other reasonable hardware. Instead, Nvidia provides libraries that only run on their own hardware.
      • The interface library is OSS. The parts where the work is done are closed. But those parts are Nvidia-specific, so they wouldn't be much help to someone trying to implement CUDA for another type of GPU.

        ROCm is fully OSS, but it doesn't support whole families of AMD GPUs. So it's doubtful that AMD would have support across their line even if CUDA were fully OSS, since they can't even manage to provide support across their line for their in-house-developed CUDA competitor.

        Before we can take AMD seriously i

    • by bartle ( 447377 )

      18 years later, Intel has neither jumped on the CUDA bandwagon nor produced a compelling competing library. Intel is institutionally incapable of creating something like CUDA because it is too hard to tie future revenues to an effort like this.

      Moreover, for all that Intel loves promoting their "AI" processors, it doesn't seem like they've really got a usable API that open-source AI projects can take advantage of. At least, the AI projects I've looked at seem to support Nvidia and not much else.

    • Err, CUDA is not even remotely open source. In fact, its closed-source nature is why OpenCL was created.

  • I've seen good reviews of Arc cards (integrated and dedicated), but Intel seems to be going slow with them?
  • by NovusPeregrine ( 10150543 ) on Friday September 05, 2025 @01:09AM (#65640234)
    That puts them well within the range where they can be called a monopoly and broken up, so that maybe their card manufacturing will be put back into the hands of people who aren't dickheads artificially inflating their value.
