Intel Is Suddenly Very Concerned With 'Real-World' Benchmarking (extremetech.com) 72
Dputiger writes: Intel is concerned that many of the benchmarks used in CPU reviews today are not properly capturing overall performance. In the process of raising these concerns, however, the company is drawing a false dichotomy between real-world and synthetic benchmarks that doesn't really exist. Whether a test is synthetic or not is ultimately less important than whether it accurately measures performance and produces results that can be generalized to a larger suite of applications.
Let me guess (Score:5, Insightful)
How much you want to bet the benchmarks that are "inaccurate" are those Intel has been getting beat on, while the "real world" ones are tests that Intel expects it will do best on?
Re: (Score:3)
No! They'd never do that!
When you can't beat them, do your best to discredit them with FUD!
Re: (Score:2)
Apparently it stripped out my 'sarcasm' tags... and there is no edit button. Lovely.
Re: (Score:3)
As frustrating as it is, the ability to edit wouldn't work well in these parts.
Re: (Score:2)
Re: (Score:2)
I think you are looking for the <. I drop mine all the time.
Re: (Score:2)
That or I just wasn't paying attention...
Re:Let me guess (Score:4, Interesting)
To be honest, AMD seems to be obsessed with CineBench benchmarks, as their software seems to be well optimized to use as many processor cores as possible.
That said, how often do you use CineBench in real life? I don't use it at all. I'd imagine that 95% of computer users are in the same boat as I am. Personally, I prefer to see a more meaningful benchmark like how fast it can unzip a huge file or crunch numbers in a large Excel spreadsheet.
Re: (Score:1)
how often do you use CineBench in real life? I don't use it at all.
That's the problem.
I rarely unzip a "huge" file or crunch numbers in a huge Excel spreadsheet. Everyone has different applications, which makes it very difficult to come up with a "real world" benchmark.
Re: (Score:2)
I hear you... the most intensive application I use on my PC is building Docker containers. That said, that seems to tax the SSD more than it taxes the CPU, and it's not something that your typical end user would be doing.
Re: (Score:2)
Re: (Score:2)
Meaning, a .zip file can only be compressed/uncompressed so fast before the HDD/SSD hits its data read/write limit.
Which is presumably why people are testing LZMA instead of deflate nowadays?
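As a rough illustration of that point, here is a minimal Python sketch (not from any commenter; buffer size and presets are arbitrary) that times deflate against LZMA on an in-memory buffer, taking the drive out of the equation entirely:

```python
import lzma
import os
import time
import zlib

# ~32 MB of mixed-entropy data held in RAM, so the HDD/SSD never becomes the bottleneck.
data = os.urandom(8 * 1024 * 1024) + b"A" * (24 * 1024 * 1024)

for name, compress in [("deflate (zlib)", lambda d: zlib.compress(d, 6)),
                       ("LZMA (xz)", lambda d: lzma.compress(d, preset=6))]:
    start = time.perf_counter()
    out = compress(data)
    elapsed = time.perf_counter() - start
    print(f"{name:15s} -> {len(out) / 1e6:6.1f} MB in {elapsed:6.2f} s")
```

On most machines the LZMA pass typically takes far longer for the same input, which is what makes it a CPU benchmark rather than a disk benchmark.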
Re:Let me guess (Score:4, Insightful)
To be honest, AMD seems to be obsessed with CineBench benchmarks, as their software seems to be well optimized to use as many processor cores as possible.
CineBench is an excellent indicator of how a processor will perform under a heavy load. This is important because Intel and AMD are competing for money and the real money is in the high-end server market where server workloads max out every possible processor.
Profit margins on desktop CPUs are razor thin (to keep out new competitors) and thus quite irrelevant when it comes to making money.
Re: (Score:3)
While true, CineBench is one of those "embarrassingly parallel" workloads that benefit from just having more cores. Not all high-end server workloads perform like that, however. For example, H264 encoding is only parallelizable in pass-2. Even various Photoshop filters don't scale with cores the way CineBench does.
So while I wouldn't discount it as "useless", it's certainly an outlier, not the norm for HPC workloads.
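For readers unfamiliar with the term, here is a toy Python sketch of an "embarrassingly parallel" workload: independent chunks of pure CPU work with no shared state, which is roughly why a tile renderer like Cinebench keeps scaling as cores are added (the function and chunk sizes are made up purely for illustration):

```python
import math
import time
from multiprocessing import Pool

def busy_chunk(n):
    # Pure CPU work; no communication with any other chunk.
    return sum(math.sqrt(i) for i in range(n))

if __name__ == "__main__":
    chunks = [2_000_000] * 16
    for workers in (1, 2, 4, 8):
        start = time.perf_counter()
        with Pool(workers) as pool:
            pool.map(busy_chunk, chunks)
        print(f"{workers} worker(s): {time.perf_counter() - start:.2f} s")
```

Workloads with serial phases or shared state (the H264 pass example above, many Photoshop filters) stop scaling long before this kind of loop does.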
Re:Let me guess (Score:5, Informative)
For example, H264 encoding is only parallelizable in pass-2.
Of course it isn't. You just run as many jobs in parallel as required to utilize the machine fully.
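That throughput-over-latency approach looks something like the following sketch, assuming ffmpeg with libx264 is installed and that in0.mp4 through in3.mp4 are hypothetical input files (none of this is from the commenter):

```python
import subprocess
from concurrent.futures import ThreadPoolExecutor

def encode(src):
    dst = src.replace(".mp4", ".x264.mp4")
    # Each ffmpeg process is an independent job; the OS scheduler spreads them across cores.
    cmd = ["ffmpeg", "-y", "-i", src, "-c:v", "libx264", "-preset", "medium", dst]
    return subprocess.run(cmd, capture_output=True).returncode

inputs = [f"in{i}.mp4" for i in range(4)]
with ThreadPoolExecutor(max_workers=4) as pool:
    print(list(pool.map(encode, inputs)))
```

The individual encodes finish no faster, but the machine stays saturated and total throughput scales with core count.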
Re: (Score:3)
If my code was maxing out the server I would be really worried. I get on the optimization bus if we start clocking 50%. Max out is for peaks, major events or DDOS attacks.
Re:Let me guess (Score:4, Interesting)
If my code was maxing out the server I would be really worried. I get on the optimization bus if we start clocking 50%. Max out is for peaks, major events or DDOS attacks.
Then you're not in HPC, that's for sure. I'm a developer for an HPC job scheduler. If, say, an EDA customer doesn't see their HPC cluster running at close to full utilization all the time, they start asking questions and implying that the job scheduler isn't capable of scaling up and keeping their cluster fully utilized. (Sometimes the CPUs aren't maxed out but memory is. Sometimes it's a tool license allocation issue.)
Re: (Score:3)
Think of Netflix transcoding farms... which is what a measurable chunk of Amazon's data centers are. Unused CPU cycles are unsold CPU cycles to Amazon and unused CPU cycles simply don't exist from Netflix's perspective since they only pay for what they use.
Even if you are doing all in-house work, virtualization should have gotten you to stop thinking "% CPU" a long time ago. If my stuff consumes 50% of all of the cores, the other 59 guests on the same host are going to have a bad day.
Re: (Score:2)
If you spent $x and you are only utilizing 50% of the CPU capacity, and assuming that the CPU capability is linear, then you spent twice as much money as you needed to spend.
The objective is to ensure that all resources are 100% consumed at the same time. If they are not then you (a) spent money on something you did not need to spend money on; and, (b) failed to spend money where it was required to be spent.
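The arithmetic behind that claim, as a small hedged sketch with made-up prices:

```python
def cost_per_used_core_hour(server_cost_per_hour, cores, utilization):
    # Illustrative only: a $2.00/hour, 64-core machine as the hypothetical baseline.
    return server_cost_per_hour / (cores * utilization)

for u in (1.0, 0.5, 0.25):
    print(f"{u:.0%} utilization -> ${cost_per_used_core_hour(2.00, 64, u):.4f} per used core-hour")
```

At 50% average utilization, every useful core-hour costs twice as much as it would on a fully loaded box, which is the point above.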
Re:Let me guess (Score:5, Interesting)
Yes, there is a point here: AMD is hyping up a particular niche that matters less in the general market.
In the other benchmarks, Intel and AMD are more roughly equivalent in performance.
Still some problems for Intel:
1) AMD is pricing *much* lower than Intel, so the 'budget' choice is just as good for most things and *MUCH* better in other things.
2) With a broader market of higher-core-count systems out there, some developers may invest in taking advantage of the extra cores.
3) Real-world use often involves running applications concurrently and chaotically, so even applications that aren't multi-core friendly won't get bogged down by other applications running alongside them.
Intel was too distracted and too used to not having pressure from AMD.
AMD's success has been highly correlated with the arrival and departure of Jim Keller. He joined AMD the first time and AMD produced something much more compelling than Intel had at the time (K7/K8). His departure marked AMD trailing Intel. He returned and had a great deal of responsibility for the architecture leading to today. Unfortunately, he has left AMD again and last year ended up at Intel. Time will tell whether AMD has a more robust engineering team without Keller this time around.
Re: (Score:2)
1) AMD is pricing *much* lower than Intel, so the 'budget' choice is just as good for most things and *MUCH* better in other things.
That is what switched me to AMD when I got my new computer last year. AMD was significantly cheaper than an Intel of comparable performance. It was a no-brainer.
Re: (Score:3)
Intel was too distracted and too used to not having pressure from AMD.
Hardly. Intel was distracted by the low power processor market (used in phones), and on either an engineering or leadership level, Intel utterly failed to make a mark in that hardware niche. One can't claim Intel wasn't focusing on the wrong market niche.
Intel's disastrous loss of ground against AMD is all Intel's fault. They had the bad luck (besides inferior engineering) of cutting (security) corners on their SMT features, and near "simultaneously" did not anticipate that their CPU design would have prob
Re: (Score:2)
One can't claim Intel wasn't focusing on the wrong market niche.
While true, I didn't say they were distracted by the wrong things, just too distracted to stay ahead of AMD in their core market. Of course the fact they were unable to pull it off makes it a distraction in hindsight. However, at the time I called the efforts as futile, that Android on Intel was about as desirable as Windows on ARM, that the software+hardware ecosystems had been established and was skeptical of Intel's or Microsoft's ability to get their desired results in their respective efforts. Intel
Re: (Score:2)
1) AMD is pricing *much* lower than Intel, so the 'budget' choice is just as good for most things and *MUCH* better in other things.
This is one of the main reasons I've always been rooting for AMD. Your money goes further. I don't game on PC, so I don't need every clock cycle. Just like on the Intel side, I'm not the guy buying the top-of-the-line i9 that costs $1000. I try to find that balance between speed/power and cost, and find the sweet spot. Because AMD is cheaper on all levels, that sweet spot is broader.
3) Real world often involves running applications concurrently and chaotically. So even for applications that aren't so multi-core friendly, they won't get bogged down by extra applications.
I think this is where a lot of people miss the real-world application part. Gamers want single-core speed. Everything e
Re: (Score:1)
1) AMD is pricing *much* lower than Intel, so the 'budget' choice is just as good for most things and *MUCH* better in other things.
That might currently be the biggest advantage for AMD, but also the one Intel can most easily remove by lowering their own prices.
Right now on the desktop, Intel only has a relevant advantage in gaming and that only with their high end models. As soon as you go to cheaper models, you will find some AMD CPU for a similar price that will do as well in games and better at most applications.
BTW, I think for most office applications and media consumption it is irrelevant if you buy Intel or AMD. Either will do t
Re: (Score:2)
If you are unzipping huge files or running a large Excel spreadsheet, you don't need a high-end Ryzen, i9, Xeon, Threadripper, etc. These benchmarks are for high-end processors. That is why they run things like Cinebench and other compute-intensive applications. I mostly run chromatography simulation software; it parallelizes well, so something like Cinebench is a good indication of how my software will also perform, while something like Excel or unzipping a file is not.
Re: (Score:2)
Re: (Score:3)
Re: (Score:3)
Byte magazine used to make Apple Macs look bad compared to PCs by enabling SANE when compiling the benchmarks on Macs. SANE is Standard Apple Numeric Environment, a highly situational library that took floating point processor output and made sure the last few digits were identical to the Apple II.
So, yes. It isn't just the hardware companies that sometimes try to game benchmarks.
Re: (Score:2)
I particularly enjoyed how they presented slides at the same event [extremetech.com] using synthetic benchmarks to compare their CPUs generation to generation as well as to ARM CPUs.
Re: (Score:2)
Don't forget that Intel has been caught having its "optimizations team" (or whatever they call the group that goes to major software vendors to help them optimize their software for Intel CPUs) sabotage AMD CPU performance with those Intel-specific optimizations.
That's one of the most likely reasons why Photoshop still performs like complete crap on AMD hardware compared to Intel's, for example.
Company advocates in its own best interests.. (Score:3)
Re: (Score:2)
The irony is that any of this gets treated as "news" by the mainstream media. It's all about infotainment, not discerning and uncovering real news.
Comment removed (Score:4, Insightful)
Re: (Score:2)
Intel is losing, and they're upset
When you can't keep your multi-million-a-year job because you're losing, wouldn't you be upset? Good going; you like making redundant comments the same way the news media treats this issue as news. I've got some news for you: your direct experience still doesn't mean jackshit to Intel or to statistically minded people, even after Intel adjusts and returns to making billions a year.
Re: (Score:3)
Intel, tell us: how big a performance hit did we take because of the Spectre/Meltdown patches?
Ummm zero? Because why on earth would you patch out something that has no impact on you other than performance? Though you mention servers so you probably actually have a legitimate use case to patch out Spectre/Meltdown.
The reality though is Intel is losing and it has nothing to do with Spectre / Meltdown. It just makes more financial sense to go AMD, and if you need a multicore system then it makes more than just financial sense.
Re: (Score:3)
Ummm zero? Because why on earth would you patch out something that has no impact on you other than performance? Though you mention servers so you probably actually have a legitimate use case to patch out Spectre/Meltdown.
The reality though is Intel is losing and it has nothing to do with Spectre / Meltdown. It just makes more financial sense to go AMD, and if you need a multicore system then it makes more than just financial sense.
It's more that you have to prevent it from being patched out and avoid any new Intel processor that has it patched in microcode.
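For anyone benchmarking this themselves, here is a small Linux-specific sketch (my own assumption, not something from the thread) for checking which mitigations are currently active; booting with the kernel parameter mitigations=off disables most of the software-side ones, while microcode-level fixes are a separate matter:

```python
from pathlib import Path

# Present on Linux kernels that report mitigation status via sysfs.
vuln_dir = Path("/sys/devices/system/cpu/vulnerabilities")
if vuln_dir.is_dir():
    for entry in sorted(vuln_dir.iterdir()):
        print(f"{entry.name:28s} {entry.read_text().strip()}")
else:
    print("This kernel does not expose vulnerability/mitigation status in sysfs.")
```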
no claro (Score:2)
Isn't "results that can be generalized to a larger suite of applications" the same as "real world" performance?
Dear Intel (Score:5, Insightful)
The only benchmark that matters to me is actual performance in my use case. Any other measure of performance is unimportant to me. You can market all you want. You can cry every day. You can do the PR song and dance non-stop. In the end none of that will matter to me. I'll purchase the best product for my use case and you will never be able to convince me to do otherwise. Same applies to you too, AMD.
Re:Dear Intel (Score:5, Insightful)
Re: Dear Intel (Score:1)
Re: (Score:2)
Really? I just went to the configuration page for a Dell PowerEdge R840 server. It lists 34 CPU options, and it would cost about half a million dollars to buy four of each. That's one server model from one vendor. Just making a simple decision like "is a four-socket server right for me" is a $50,000 experiment. Add to that "few fast cores or many slower cores" and we're pushing a quarter million. "AMD or Intel?": half a million. A lot of really large companies can afford to do this, but most cannot.
Add licensi
Re: (Score:1)
respectfully yours, $evil_corp
Too Bad (Score:5, Insightful)
Re: (Score:2)
>"The problem for Intel here is that typical usage by the vast majority of computer users don't stress modern CPUs "
+1
This is exactly why I was able to easily go 8 years using a Phenom II processor in my home computer (Linux). It just worked fine for most use cases. The thing that made the difference of a lifetime? Switching to an SSD. THAT was the game-changer, and it made my then 4-year-old Phenom II system sing; all thoughts of updating anything else went away and stayed away for 4 more years! And
Duh! (Score:2)
Re: (Score:1)
Re: (Score:3)
The same can be said about EPA ratings.
Tell that to Volkswagen...
Re: (Score:2)
This is really wonderful to see (Score:5, Interesting)
And so we (AMD) did what Intel also did; we looked at a variety of real world applications such as Winstone (scripted MS Office tasks, produced by magazine publisher Ziff Davis), canned scripts for Quake / Unreal engines, and anything else that painted the products we had in a generally competitive light. To be sure, there were independent benchmarks being run by the likes of TomsHardware.com, and the conclusions reached by those independent actors could be in agreement or disagreement with the work of my technical marketing group.
Now the proverbial shoe is on the other foot, and AMD is finally stomping Intel not just on price-performance metrics, but in raw performance as well. We see Intel going back to the playbook they used in the old K6 v Pentium days where FUD ruled the day. The difference now is that AMD has built a marvelous ecosystem with system board manufacturers, and Intel is likely relatively powerless to tell such vendors that they stand to lose early access to later design specifications if they play nice with AMD today. There were lawsuits about this, and Intel lost in court. AMD is in good fighting shape now, and Intel is apparently going back to deception as a means of getting consumers to vote *against* a superior product with their wallets rather than convincing customers that they (Intel) have a superior product.
What's old is new again...
Re: (Score:2)
and AMD is finally stomping Intel not just on price-performance metrics, but in raw performance as well.
Where? Where money is not an issue, what AMD CPU would I choose to use over an Intel CPU, for what use case? (And you were a freaking manager of a benchmarking team...)
No, AMD only wins on "value", $/IPC. And the irony is the one use case they should have been able to displace Intel was server virtualization, but businesses have rational reasons not to cut over to AMD en masse. (How are they going to get the CPU/motherboard allocation they need, and how can they be sure 4 years later that there won't be
Re: (Score:2)
and AMD is finally stomping Intel not just on price-performance metrics, but in raw performance as well.
Where? Where money is not an issue, what AMD CPU would I choose to use over an Intel CPU, for what use case?
If money is not an issue, have a custom chip designed.
Money is always an issue.
Re: (Score:2)
Why is your question relevant? As someone who manages many a VMware environment: while typical vMotion won't work across architectures, it is still trivially easy to find a reboot window to do an offline vMotion between AMD and Intel. You aren't locking yourself into AMD, so any company using that rationalization as an excuse is just an Intel fanboy. AMD does not win just on value. The irony of your statement is that even when I have the budget for a Xeon Platinum processor, I can almost never actually get it. EPYC
Just goes to show everything changes (Score:2)
I am sure in time it will turn again.
Advantages of incumbency (Score:2)
Currently not even close ... (Score:5, Informative)
Today, you can put together an entire 128-hardware-thread AMD server with 1TB RAM for $15k ... in a one socket system! With Intel, the 1-socket systems mostly max out at 64GB RAM, and you have to go all the way up to an ultra-expensive 4 socket solution to achieve 128-hardware-threads and 1TB RAM, although soon it will only require 2 sockets.
Intel is an amazing company, but they really dropped the ball over the last 5 years.
Easy... (Score:5, Informative)
Re: (Score:2)
That's because any of the 'real world' programs that happened to be compiled using Intel's own C compiler get an artificial boost. They've gotten sued over this in the past, because their compiler only used MMX extensions and such if it detected the 'genuine intel' processor flag, and did NOT just check whether the CPU said it could do MMX, effectively forcing AMD processors to run things with one arm tied behind their back and reducing their real-world performance.
Intel still does that with their compiler; it ignores feature flags and uses the processor ID instead, for exactly the reason you give.
The only result of the lawsuit is that Intel has to admit in their documentation that they do this. There is no requirement that they support, or even not deliberately cripple, the performance of processors made by other companies.
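The distinction being described, sketched with Linux's /proc/cpuinfo (a simplification I'm adding, not Intel's actual dispatcher): the right question is whether a feature flag such as avx2 is advertised, not whether the vendor string happens to read GenuineIntel:

```python
def cpu_info():
    vendor, flags = "", set()
    with open("/proc/cpuinfo") as f:
        for line in f:
            if line.startswith("vendor_id"):
                vendor = line.split(":", 1)[1].strip()
            elif line.startswith("flags"):
                flags = set(line.split(":", 1)[1].split())
    return vendor, flags

vendor, flags = cpu_info()
print("vendor:          ", vendor)
print("AVX2 advertised: ", "avx2" in flags)           # the basis a fair dispatcher would use
print("is GenuineIntel: ", vendor == "GenuineIntel")  # the basis the compiler's runtime reportedly uses
```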
A SPEC member complaining about benchmarks? (Score:3)
Intel has been a member of SPEC [spec.org] for decades; a charter member, IIRC. So they've had direct input into, and access to, industry-standard benchmarks for all of those decades. So where's the beef? Sounds like whining to me.
Intel really needs to worry about Cuisinart (Score:2)
Obligatory Dilbert comic (Score:5, Insightful)
Imagine that.... (Score:1)
Intel is advertising for AMD (Score:2)
If I were an AMD sales rep, I'd grab those "AMD is not better than us in average desktops, they're better than us in high-performance workstations and servers!" statements and go straight to the major high-performance workstation and server vendors and show them these statements.
"Look, intel themselves are telling everyone that we're better than them in the stuff you're selling. You seem to be selling a lot of intel hardware in spite of these statements. Let's talk about upgrading your products in accordance w
It's the operating system, stupid. (Score:2)
Real-world doesn't mean writing bare-metal computation programs. It means everyday use which depends on the operating system. The one most people use is pretty godawful. Don't pretend that you're ready to use on startup until you are really ready for ME to use it. No, as a matter of fact, your annoying system tasks are no more important than what I need to do.
Then, of course, there is the problem that exists between chair and keyboard. That will kill performance right there.
You know what they say about opinions (Score:1)
Whether a test is synthetic or not is ultimately less important than whether it accurately measures performance and produces results that can be generalized to a larger suite of applications.
Says you.
How is this best put... (Score:1)