
Intel Discloses Three More Chip Flaws (reuters.com)

Intel on Tuesday disclosed three more possible flaws in some of its microprocessors that can be exploited to gain access to certain data from computer memory. From a report: Its commonly used Core and Xeon processors were among the products affected, the company said. "We are not aware of reports that any of these methods have been used in real-world exploits, but this further underscores the need for everyone to adhere to security best practices," the company said in a blog post. Intel also released updates to address the issue and said the new updates, coupled with those released earlier in the year, will reduce the risk for users, including personal computer clients and data centres. In January, the company came under scrutiny after security researchers disclosed flaws that they said could let hackers steal sensitive information from nearly every modern computing device containing chips from Intel, Advanced Micro Devices and ARM.
This discussion has been archived. No new comments can be posted.

• Intel really needs to start cutting prices to keep up with AMD.

    And on the high-end desktop line, every CPU needs to max out its PCIe lanes. Going as low as 16 is just a joke there.

    • What does this have to do with the article, except that Intel is in both?? You IDIOT, you buffoon.

      • by Anonymous Coward

        What does this have to do with the article, except that Intel is in both?? You IDIOT, you buffoon.

        It has everything to do with a consumer faced with a choice between Intel and AMD.

        Intel processors are disproportionately susceptible to security problems.

        16 lanes is woefully inadequate. A single graphics card by itself would consume all of them.

        Lack of ECC support on Intel's consumer parts.

        AMD is a no-brainer at this point.

        • Does AMD support ECC? Or any other manufacturer? I thought the prevailing view was that it is far better to suffer the occasional catastrophic crash or data corruption, rather than pay a few dollars more for reliable RAM.

    • by Anonymous Coward
  I know that a lot of people want that to happen, whether they favour Intel or AMD. If Intel were to cut their prices, AMD would probably follow suit and most consumers would be better off.
  But does Intel actually need to do that, from their own perspective?
  If you look at their market share, the majority of people still appear to go for Intel despite the higher prices.
  If my overpriced shit sold that well, why would I reduce my prices? The same logic seems to apply to nVidia graphics cards.
  Things may cha…
    • Another thing observed in the wild is the lack of i11 n-core chipsets and CPUs. I think Intel could definitely show some urgency in all these underpowered tablet and phone solutions.
    • by HiThere ( 15173 )

      Nonsense. All they need to do is ensure that all stories in the press blame all CPU chips equally, even when that isn't true.

  • ... at least nobody is bidding for "exclusives" on the firmware patches.

  • by Anonymous Coward on Tuesday August 14, 2018 @03:25PM (#57126004)

    The Reuters article quotes Intel's blog post: "...this further underscores the need for everyone to adhere to security best practices."

    The first best practice would be not buying Intel chips. Glad there's an alternative.

    • Or ARM. Or AMD. Really, with advice like that, perhaps you should just not use a computer.
  • by drinkypoo ( 153816 ) <drink@hyperlogos.org> on Tuesday August 14, 2018 @03:28PM (#57126024) Homepage Journal

    No doubt Intel found out that someone else was going to disclose these flaws, so they got out ahead of it. They're pulling a Rudy here: trying to beat the scandal, but then creating a new one with their attempt to deflect responsibility onto someone else:

    "We are not aware of reports that any of these methods have been used in real-world exploits, but this further underscores the need for everyone to adhere to security best practices,"

    Yeah, Intel. Everyone. Including the folks who have done the worst job of adhering to security best practices... Intel. You guys deferred security checks until after speculative loads to gain a performance advantage over AMD, and now you're trying to deflect attention from that by suggesting that security is someone else's responsibility. But the CPU is the heart of the machine, and you're responsible for deliberately compromising its security for a business advantage.
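    For illustration, here is a minimal C sketch of the transient-execution pattern behind Meltdown-class flaws, where the permission check on a load resolves only after the value has been forwarded to dependent instructions. It is a simulation, not an exploit: fake_kernel_byte is an ordinary user-space variable standing in for privileged memory, so the program runs (and leaks nothing) on any machine; a real attack also needs fault suppression and cache-timing measurement on affected hardware.

        #include <stdint.h>
        #include <stdio.h>

        /* Simulated "secret": an ordinary user-space byte standing in for
           privileged memory. A real Meltdown attack dereferences an actual
           kernel address and relies on the CPU forwarding the value to
           dependent instructions before the permission check faults. */
        static uint8_t fake_kernel_byte = 42;

        /* Probe array: one 4 KiB stride per possible byte value. The value
           of the transient load selects which stride gets pulled into cache. */
        static uint8_t probe[256 * 4096];

        int main(void) {
            /* Step 1 (transient on affected CPUs): read a byte the program
               should not be allowed to see. Simulated here by a plain load. */
            uint8_t secret = fake_kernel_byte;

            /* Step 2 (transient): touch a probe line indexed by the secret,
               leaving a cache footprint that survives the squashed load. */
            volatile uint8_t sink = probe[secret * 4096];
            (void)sink;

            /* Step 3 (architectural): time accesses to all 256 probe strides;
               the fast one reveals the secret. Omitted here; see the timing
               sketch further down the thread. */
            printf("simulated secret byte: %u\n", secret);
            return 0;
        }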

    • by Anonymous Coward on Tuesday August 14, 2018 @03:39PM (#57126096)

      Today's Wired article [wired.com] says the details of the Foreshadow attack [foreshadowattack.eu] will be presented tomorrow. Somebody is coordinating all this.
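      Foreshadow, like Meltdown and Spectre before it, recovers data through cache timing. Below is a minimal sketch of the flush+reload measurement primitive these attacks share, written for GCC/Clang on x86 (it assumes the _mm_clflush and __rdtscp intrinsics are available); it only demonstrates that a flushed line is measurably slower to read than a cached one, which is the covert channel.

          #include <stdint.h>
          #include <stdio.h>
          #include <x86intrin.h>  /* _mm_clflush, _mm_mfence, __rdtscp */

          static uint8_t line[64];

          /* Cycle-count one access to p, fenced with rdtscp. */
          static uint64_t time_access(volatile uint8_t *p) {
              unsigned aux;
              uint64_t t0 = __rdtscp(&aux);
              (void)*p;
              uint64_t t1 = __rdtscp(&aux);
              return t1 - t0;
          }

          int main(void) {
              /* Evict the line, then time a cold (memory-latency) access. */
              _mm_clflush((void *)line);
              _mm_mfence();
              uint64_t cold = time_access(line);

              /* The line is now cached, so the second access should be fast.
                 This fast/slow gap is what the exploits read a secret from. */
              uint64_t warm = time_access(line);

              printf("cold: %llu cycles, warm: %llu cycles\n",
                     (unsigned long long)cold, (unsigned long long)warm);
              return 0;
          }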

    • Re: (Score:3, Insightful)

      by thegarbz ( 1787294 )

      Yeah, Intel. Everyone. Including the folks who have done the worst job of adhering to security best practices... Intel.

      Wow, hyperbole much? I've yet to see an Intel flaw expose millions of online accounts, spread credit card and Social Security numbers around, or bring down an industry through crippling bugs that were actually exploited.

      Perspective, man. You desperately need some.

      • by AmiMoJo ( 196126 )

        That's only because the response to it was relatively well coordinated and patches became available fairly quickly after the slightly premature announcement.

        The damage done is quite real and measurable, though. A 60% performance loss for some server tasks means a massive cost increase or degradation of service.

        I'd love to see some stats on how many people sued Intel over this. They paid for my new workstation, but I'd love to know how many more they covered.

        • That's only because the response to it was relatively well coordinated and patches became available fairly quickly after the slightly premature announcement.

          HOLY FUCKING SHIT. No man. No. Not remotely. Not even slightly. Masses of machines are still not patched. Holes are still everywhere. Daily security issues are still being discovered.

          There was NOTHING AT ALL good to say about the response to this from any party at all: not from Intel, not from MS, not from the open source community. It was a textbook example of a horrid clusterfuck of a response.

          The damage done is quite real and measurable, though. A 60% performance loss for some server tasks means a massive cost increase or degradation of service.

          Again, hyperbole. The actual performance loss for many server tasks is about equal to the performance gains that have occurred than…

  • https://img.purch.com/amd2-png... [purch.com]
    "Robust h/w and s/w ecosystem"
    "Robust h/w"
    "Robust"

    Intel was too cocky about their "robust" ecosystem.
    This is not just a backfire... this is a two-year hardware and software security-breach spree.

  • Intel Down, AMD Up (Score:4, Informative)

    by The New Guy 2.0 ( 3497907 ) on Tuesday August 14, 2018 @03:46PM (#57126152)

    Intel seems to be having problems again, while AMD is rolling out 2nd Gen Ryzen Threadrippers this week. AMD's got the high-end processor market all to itself, while Intel is revealing that they were never as good as they advertised.

    Intel could have had a monopoly if they hadn't made the Pentium math bug. Computers are supposed to be "perfect" at computations, but the Intel bug threw some court cases in the wrong direction. I'm not sure they can be trusted anymore.

    Now AMD is rolling out processor changes that were discussed here on Slashdot years ago, and they're ahead in the speed races and core counts. (Intel maxes out at about 6; the new Threadrippers offer 32 hyperthreaded cores that present as 64 logical processors.)

    Intel better go back to the drawing boards... they're behind in a game they used to always win.

    • Intel better go back to the drawing boards... they're behind in a game they used to always win.

      Used to almost always win. The Athlon absolutely pounded Intel's chips when it came out, with superior processing power and power consumption. And let's not forget that Intel was forced to implement the amd64 ISA to maintain compatibility with AMD's superior processors again. Intel's primary advantage all along has been volume; what's changed is that now that's their only advantage.

    • by Anonymous Coward


      Intel could have had a monopoly if they hadn't made the Pentium math bug.

      What? That was in 1994; consumers might have read about it in a newspaper, and not cared. It was basically a non-event, except among some scientists and mathematicians who rely on accuracy.

      Computers are supposed to be "perfect" at computations

      No, computers are supposed to be very, very good at computations and rarely make mistakes. They were never supposed to be perfect. Bits get flipped by cosmic rays, rowhammer, or ju…

    • Intel could have had a monopoly

      I don't understand. Are you saying the only thing that prevented a monopoly was the math bug? Or, better still, are you suggesting that Intel hasn't been in an absurdly dominant position over the past decade?

        AMD was almost out of business in the late 1990s... they were paying Intel for design patent rights, and couldn't compete on price or quality, until Intel stumbled with the Pentium math error. It was expected that AMD would merge with Intel, who would have basically shut the company down completely. Digital Equipment Corp. was trying to break into the Windows game, but never got off the ground, and their processor line was shut down with the Compaq/HP mergers.

        Mr. Ryzen was with AMD in the early 2000s, and laid…

          Yeah, but I think you have a causality issue there. AMD was on life support before the Pentium bug, and they were on life support long after it. The only thing that gave them a boost was Intel's seemingly cyclic phase of resting on its laurels long enough to let competitors actually do some innovation. Even after the FDIV bug in the previous generation, the P6 dominated the industry. It wasn't until Netburst that Intel gave AMD a chance to claim back some market share.

    • I was there Gandalf (Score:5, Informative)

      by epine ( 68316 ) on Tuesday August 14, 2018 @06:42PM (#57127202)

      Intel could have had a monopoly if they hadn't made the Pentium math bug.

      Computers are supposed to be "perfect" at computations, but the Intel bug threw some court cases in the wrong direction. I'm not sure they can be trusted anymore.

      Good lord, you can't be serious. The road to silicon nirvana is paved with errata sheets. (And always has been.)

      Furthermore, the division bug is a terrible example to bolster your cause, because the algorithm was correct in the first place, and the implementation of the algorithm in digital logic was correct in the first place, and then they dropped a very small stitch in the transfer to silicon layout. Had the stitch been any larger, they would have easily caught it during silicon validation. Hint: on randomized inputs, the bug is only triggered about once in 9 billion cases.
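      For the curious, the classic FDIV smoke test fits in a few lines of C. This is a sketch using one of the published failing operand pairs; the volatiles are there on the assumption that your compiler would otherwise fold the whole expression at compile time.

          #include <stdio.h>

          int main(void) {
              /* One of the published FDIV-triggering operand pairs. The
                 volatiles keep the compiler from doing the division itself. */
              volatile double x = 4195835.0;
              volatile double y = 3145727.0;

              /* Correct FPU: exactly 0. Flawed Pentium: famously 256. */
              double residue = x - (x / y) * y;
              printf("FDIV residue: %g (%s)\n",
                     residue, residue == 0.0 ? "FPU looks fine" : "FDIV bug!");
              return 0;
          }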

      Achieving 100% test coverage for all 3.1 million transistors is non-trivial, especially given the processing power available in 1990, three years before the Pentium was first released (what with cheap-ass PC memory costing $60,000/GB in 1990 dollars; double that for server-grade ECC).

      The only shitty thing Intel did in this chapter was try to sweep it under the rug after the horse had bolted the barn.

      And the truth of this is that back then, not a lot of software used the FP unit. Most people had previously saved a few bucks by purchasing the 486SX castrato, which lacked the hardware floating point unit altogether, and most development shops pretty much assumed this was the de facto situation on the ground, so integer math was almost always preferred.

      It really was true that 90% of the people purchasing these chips were at low risk of any real consequence (the two-frame bump in the night right as you're closing in for the money shot in Falcon 3.0 possibly excepted). Falcon 3.0 was legendary for actually using the hardware floating point unit to compute a (mildly degraded) military-calibre flight model back in the 486 era, when nothing else did. The accurate inertial momentum effects when rolling hard simply blew everyone's mind. It was so good, you almost felt it through your feet (if you had been wise enough to invest in the 486DX).

      Poof! VERTIGO! VERTIGO! The conspicuous fourth wall universally present in every kinetic 3-space simulator up until then suddenly vanished without a trace.

      There was just no way to point this recall at only those who needed it. (Proof of a previous 486DX purchase order would have been a not-bad fence; it's hard to believe that if you had previously purchased the 486SX you suddenly gave a shit now, though wankers are gonna wank.)

      So it's either pay to recall 9 processors causing a problem for every 1 processor that really needs to be replaced (at an enormous, globally unproductive expense), or panic and do a fatally stupid PR snow job. Intel picked door #2.

      "Daddy, daddy, where does CO2 come from?"

      "Well, son, it comes from flushing $500 million worth of almost perfectly good CPUs down the crapper practically unused, and then baking up a fresh set."

      Guess what? I'm old as fuck, and still sharp as a tack. So if your asbestos underpants are in any kind of mild disrepair, I'd stick to spinning mythical stories about the 1970s or the 1960s, if I were you.

      (Hint: I was already reading the 8008 data sheet to pass the time in my grade eight literature classroom. I would have had to mow my weekends to smithereens to actually own one at the price back in the day; not the very first version from 1972, but right around the time they came up with a simplified version reducing the number of mandatory voltage supplies from -12, +12, +5 to just +5. So even the mid-seventies are not quite free and clear for mythical reconstruction, wherever my lawn is found.)

      • The 11/17 division bug was in nearly every Intel Pentium processor on the Syracuse University campus in 1999, and showed up in my statistics textbook as well. Intel had to replace every chip it sold during that era.

        Computer/calculator math is perfect in nearly every true implementation these days. Network transmission errors have been eliminated by error correction. The problems of the 1970s are no longer a factor; the new set of problems is mainly heat generation in computers, time consumed, and compress…

      • Good lord, you can't be serious. The road to silicon nirvana is paved with errata sheets. (And always has been.)

        I think you might want to wake up, smell the coffee, and work out what this 'errata sheet' actually means and why it's happening.

      • Oh wise Wizard - LOVED your post. And when, at the end, you had Sherman set the WAYBAC machine to 1972... my mind wandered back to my APL and Fortran classes. Saruman, out!
    • Intel seems to be having problems again, while AMD is rolling out 2nd Gen Ryzen Threadrippers this week. AMD's got the high-end processor market all to itself, while Intel is revealing that they were never as good as they advertised.

      Intel could have had a monopoly if they hadn't made the Pentium math bug. Computers are supposed to be "perfect" at computations, but the Intel bug threw some court cases in the wrong direction. I'm not sure they can be trusted anymore.

      Now AMD is rolling out processor changes that were discussed here on Slashdot years ago, and they're ahead in the speed races and core counts. (Intel maxes out at about 6; the new Threadrippers offer 32 hyperthreaded cores that present as 64 logical processors.)

      Intel better go back to the drawing boards... they're behind in a game they used to always win.

      To gain access to the CPU, you need access to the VM that boots on that CPU. And if you have that, then what is the fuss about?
      At home, or with my small business server, I don't give a shit about the security flaw. I don't run a bank, and frankly, I do most of my financial transactions via my cellphone. Why are we not concentrating on reality, and on whether someone next to me can read my cellphone's contents?

      For the security breach, you would need to be running software that somehow got installed and is…

  • by SeaFox ( 739806 ) on Tuesday August 14, 2018 @04:29PM (#57126474)

    The lack of disclosed vulnerabilities does not mean vulnerabilities do not exist.
    To think "no news is good news" is not that far from "Security through Obscurity".

    • Re: (Score:2, Interesting)

      by Anonymous Coward

      The lack of disclosed vulnerabilities does not mean vulnerabilities do not exist. To think "no news is good news" is not that far from "Security through Obscurity".

      I'll take "potentially has problems" over "definitely has problems"

      And with hindsight being 20/20:

      I've been on AMD for the past 10 years. When the Meltdown flaw was revealed I was not affected by it.

      It makes sense to continue using AMD, especially with Intel announcing even more flaws.

    • by AmiMoJo ( 196126 )

      What we do know is that, because AMD handles these kinds of situations correctly, they are not affected by most of these bugs. So the probability of there being similar flaws in AMD processors is much lower, even if we can't say that it is zero.

  • Nobody can get it fucking right. I give up thinking anything will get any better.
    • by Anonymous Coward

      Security is not a top priority.

      It takes too long to do right, sacrifices performance, and is not a selling point.

      So, I agree with you.

  • by Anonymous Coward on Tuesday August 14, 2018 @06:34PM (#57127184)

    A brief history...

    Intel followed the very successful Pentium 3 design with Netburst, a radical new architecture that used a VERY long pipeline in the chase for an (eventual) 10GHz clock. It was terrible, but Intel paid outlets at the time, like Slashdot, to promote it as the second coming of chr-st.

    Meanwhile AMD was using its newly acquired team of CPU architects to build the world's first 64-bit-compatible x86 chip, and the world's first true dual-core x64 chip. And it was fantastic.

    No matter how many lies Slashdot et al. were paid to tell about Netburst, its hopelessness was obvious from day one (who would have guessed an ultra-long pipeline stunk for this type of application). So after a few generations, Intel went back to the Pentium 3 design, crossed it with AMD's best patents (legal because of a cross-patent agreement between Intel and AMD), and made the Core 2, which continues today as the improved 'Core' architecture in Intel's Skylake etc.

    What we did not know at the time was that Intel removed hardware memory access tests that a multi-core and/or multi-threaded architecture sharing memory resources must use. These tests are supposed to take the form of a "lock and key", where a thread has a 'key' (ID number) that must be tested against a 'lock' for any shared memory access. No lock and key means MUCH faster memory access and higher clocks/lower power; curiously, EXACTLY the benefits seen over AMD until the release of AMD's Zen (but even then Intel keeps the clock advantage).

    Yes, today's Intel parts get 5GHz at best while AMD's Zen+ is at 4.3GHz, because of that 'illegal' (in computer science terms) Intel CHEATING. And that cheating is why Intel suffers from the terrible unstoppable exploits that Zen does not.

    Buy Intel and you are buying broken-by-design. Buy AMD's Ryzen and you are getting best-of-class, unless that buggy 0.7GHz really matters to you.

    Today Intel compounds its cheating by buying the review methodology used to benchmark AMD products. AMD just launched a 32-core, 64-thread processor, and Intel paid the usual suspects to benchmark it only with programs known to use 8 cores or fewer. Whereas you or I would then run FOUR instances of the benchmark at the same time to actually stress the 32 cores, not one of the review sites even attempted this.

    Actually, the Linux reviews were different, since so many key Linux apps scale to any number of threads. They, of course, showed AMD's new Threadripper to be a monster. But the bought-and-paid-for Windows 10 review sites all 'wondered' who would want a 32-core part, given that "no Windows user ever does more than one thing at a time on their computer". This is Intel's dirty money in play.

    PS: I use the AMD 8-core 1700 in Windows. It is jaw-droppingly awesome. Unlike Intel, you can just have everything working at the same time (I came from Intel systems where one heavy app means you must close down other heavy apps first). Every bad word currently said about AMD is financially sponsored by Intel's gigantic PR fund.

  • by Anonymous Coward

    ..."but this further underscores the need for everyone to adhere to security best practices,"

    I.e. Don't use Intel

  • Comment removed based on user account deletion
    • I do not recall seeing this many security problems cropping up over the last 30 years when it came to processors. Is this new or is Intel now having to deal with all the corners they have been cutting to gain an advantage?

      Most likely a combination of the two. With cloud computing being all the rage, and with more sophisticated OS security (at least for mainstream desktop use), researchers and government agencies have started to focus more on exploiting issues in hardware, whether in the physical design or in the firmware that runs directly from flash.

  • Until then, *shrug*. These vulnerabilities are coming too fast, with too little context to understand how they will impact security operations. I see a flood of articles crowing about the dangers of these issues, yet honestly, I haven't seen much real-world impact. Maybe it's because I don't interact with desktop users or run untrusted JavaScript, I dunno. However, I just wish every security advisory had a nutritional information section where they had to admit "No, we still can't figure out how to make this…
  • This is just more planned obsolescence PR--another nudge to go buy new chips. I'm still not buying new chips.
