
Researchers Discover Seven New Meltdown and Spectre Attacks (zdnet.com) 98

A team of nine academics today revealed seven new CPU attacks. The seven impact AMD, ARM, and Intel CPUs to varying degrees. From a report: Two of the seven new attacks are variations of the Meltdown attack, while the other five are variations of the original Spectre attack -- two well-known attacks that were revealed at the start of the year and found to impact CPU models going back to 1995. Researchers say they discovered the seven new CPU attacks while performing "a sound and extensible systematization of transient execution attacks" -- a catch-all term the research team used to describe attacks on the various internal mechanisms that a CPU uses to process data, such as the speculative execution process, the CPU's internal caches, and other internal execution stages. The research team says it has successfully demonstrated all seven attacks with proof-of-concept code. Experiments to confirm six other Meltdown attacks did not succeed, according to a graph published by the researchers. Update: In a statement to Slashdot, an Intel spokesperson said, "the vulnerabilities documented in this paper can be fully addressed by applying existing mitigation techniques for Spectre and Meltdown, including those previously documented here, and elsewhere by other chipmakers. Protecting customers continues to be a critical priority for us and we are thankful to the teams at Graz University of Technology, imec-DistriNet, KU Leuven, & the College of William and Mary for their ongoing research."
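The "transient execution" pattern behind Spectre-style attacks can be sketched with a toy simulation. This is an illustrative model only, not real attack code: on real hardware the attacker measures cache timing, whereas here the "cache" is just a Python set recording which values a speculative access would have loaded; all names are invented for the example.

```python
SECRET = "K"                      # byte the attacker should not be able to read
array1 = [1, 2, 3, 4]             # legitimately accessible, in-bounds data
probe_cache = set()               # stands in for CPU cache state

def speculative_read(index):
    """Model a CPU speculatively executing past a bounds check."""
    # Architecturally, the out-of-bounds read is rolled back, but the
    # cache line touched during speculation is not.
    if index < len(array1):
        value = array1[index]
    else:
        value = ord(SECRET)       # speculative out-of-bounds read
    probe_cache.add(value)        # side effect survives the rollback

def recover_leaked_byte():
    """Attacker probes which 'cache line' ended up loaded."""
    for guess in range(256):
        if guess in probe_cache and guess not in array1:
            return chr(guess)
    return None

speculative_read(100)             # out of bounds, "speculatively" executed
print(recover_leaked_byte())      # -> K
```

The key point the model captures is that the leak flows through microarchitectural state (the cache), not through any architecturally visible result, which is why the fixes are mitigations rather than ordinary bug patches.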
This discussion has been archived. No new comments can be posted.

  • 2018... (Score:2, Funny)

    by Anonymous Coward

    the year of the k6 processor.

  • I hereby claim copyright and trademark privileges to the above work. All rights reserved. Please enquire directly for permission or use licensing.

    "a sound and extensible systematization of transient execution attacks"

    • Trademark doesn't even protect a "work." If it was doing work, we know it wasn't a mark.

      Maybe apply for a design patent next time.

  • by fahrbot-bot ( 874524 ) on Wednesday November 14, 2018 @01:23PM (#57643792)
    Researchers discover that computers are only 100% secure while powered down and still in the box.
    Further investigation is needed to determine how this affects productivity.
    • Re:News Flash (Score:4, Insightful)

      by Anonymous Coward on Wednesday November 14, 2018 @01:51PM (#57643972)

      Do you have proof that a powered down computer in the box is actually secure? After all, they have batteries and some components are active.

    • Researchers discover that computers are only 100% secure while powered down and still in the box.

      Further investigation is needed to determine how this affects productivity.

      Breaking News: Computers can be broken into using a simple screwdriver, even when powered down, leaving hard-drives exposed to hacking attempts...

      News at 11....

    • If this affected all computers equally, you'd even have a point; as it is you're just trying to be misleading while sounding smaht.

  • will be lost in vulnerability workarounds :-/
    • by DontBeAMoran ( 4843879 ) on Wednesday November 14, 2018 @02:36PM (#57644278)

      For the majority of users, we could be doing fine with computers from 1998 if the operating systems, applications and the Web had not suffered so much bloat, especially from the overuse of multiple JavaScript libraries because web monkeys are too lazy to write their own five-line functions in JavaScript.

      The only regular users who need so much computing power are gamers, where security is not exactly critical.

      Then there is an extreme minority of users and datacenters who need both security and computing power, but those are specialized users and should move to a different architecture.

      • by Anonymous Coward

        I doubt the problem is laziness. It's incompetence.

        Programmers used to be incredibly skilled, not least because the machines where they cut their teeth were insanely limited, which led to intimate knowledge of how a computer actually worked, assembler, etc. They were working _really_ close to the hardware, because that was the only way to get anything useful out of the machine, which was horrifically expensive.

        Since the 90's however, the exploding capabilities of hardware have allowed companies to consisten

        • > And even if they do know, they are not given time to fix their crappy code,

          Nail. Head.

          "There is never time to do it right but there is always time to do it over."

          • by Zmobie ( 2478450 ) on Wednesday November 14, 2018 @04:52PM (#57645256)

            This is more the problem. I've known both flavors of software devs, code monkey and genuine talent, and while obviously you want the latter, they both face the same issues. The number one issue for code optimization is time constraints. Until the performance of the code actually becomes a problem and there is significant monetized benefit to improving the underlying architecture/design/implementation, no one in management is really going to care whether the dev got the algorithm to run in O(n) or O(log n). The questions they ask are simply: does it work, is the customer ok with how it works, and how fast can you give it to us/them to buy/sell?
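The O(n) vs O(log n) point above can be made concrete with a small sketch (function names are invented for the example): for a sorted input of a million items, a linear scan does up to a million comparisons while a binary search does about twenty.

```python
import bisect

data = list(range(1_000_000))     # sorted input

def linear_find(xs, target):      # O(n): checks items one by one
    for i, x in enumerate(xs):
        if x == target:
            return i
    return -1

def binary_find(xs, target):      # O(log n): halves the range each step
    i = bisect.bisect_left(xs, target)
    return i if i < len(xs) and xs[i] == target else -1

# Both agree on the answer; only the work done differs.
assert linear_find(data, 999_999) == binary_find(data, 999_999) == 999_999
```

Management rarely sees the difference until n gets large enough that the O(n) version shows up in a latency or cost report, which is the commenter's point.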

            I remember one piece of code I managed to write on a project that I was actually extremely proud to have written. It was a completely proprietary need, so it had to be done in house with no libraries and was a very core piece of the service I was creating. I was able to do O(1) insertions, keep it self sorting for easy traversals, and perform O(1) lookups while it cleaned itself up. It even had a very minimal spatial complexity because I managed to do some wonderful pointer magic (in .NET at that) with a few different data structures. Once I completed the service I was actually satisfied with the implementation and that part of the code didn't have to be touched again (ever from what I was told by a few other project leads years later). You know what the project manager said? He didn't care at all and was actually upset that I didn't finish it early so I could work on the other developer's pieces because they were going slower than he wanted (I was even on time with the delivery and still got that response). Upper management sided with him and to avoid further friction moved me to a different project and threw a code monkey onto their team that spat out hot garbage...
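The structure the poster describes (O(1) insertions and lookups with an ordered traversal) is proprietary and not shown, but a hedged sketch of the general idea, combining a hash map with a doubly linked list, is below; this is the same pattern behind Java's LinkedHashMap, not the poster's actual code, and all class and method names are invented.

```python
class Node:
    __slots__ = ("key", "value", "prev", "next")
    def __init__(self, key, value):
        self.key, self.value = key, value
        self.prev = self.next = None

class LinkedMap:
    """Hash map + doubly linked list: O(1) insert/lookup, ordered traversal."""
    def __init__(self):
        self._index = {}              # key -> Node, gives O(1) lookup
        self._head = self._tail = None

    def insert(self, key, value):     # O(1): append node, record in index
        node = Node(key, value)
        if self._tail is None:
            self._head = self._tail = node
        else:
            node.prev, self._tail.next = self._tail, node
            self._tail = node
        self._index[key] = node

    def lookup(self, key):            # O(1) via the hash map
        return self._index[key].value

    def traverse(self):               # yields keys in insertion order
        node = self._head
        while node is not None:
            yield node.key
            node = node.next

m = LinkedMap()
for k in ("a", "b", "c"):
    m.insert(k, ord(k))
print(list(m.traverse()), m.lookup("b"))   # -> ['a', 'b', 'c'] 98
```

Keeping the list *sorted* by key (rather than insertion-ordered) would cost more than O(1) per insert in general, so the sketch shows only the insertion-ordered variant.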

            I think if you talk to most good developers they probably have a similar story if they have been in the industry for more than 5 years or so. Scope creep and the insatiable desire for new tangible features are the biggest enemies to efficient and optimized code these days. We probably have just as many genuinely talented developers in the industry as ever imho.

            • Wow! Your story is disheartening but mirrors what I've seen. I'm starting to hate the corporate culture more and more. No one has time to answer questions, everyone is too busy trying to get their stuff done, etc.

              Smart programmers, like you, get shat on and then companies wonder why they have a hard time finding good people! I swear clueless management has to be responsible for 49% of the problems, with the other 49% being code monkeys.

      • Comment removed based on user account deletion
        • I guess it's a difference of experience, I grew up with computers and started on a Tandy Color Computer 2 with a tape cassette reader and a CPU that ran at less than 4MHz with 64KB of RAM.

          From my point of view, I see 600MHz CPUs with 128MB of RAM as more than enough to do basic office work. It used to be enough; then bloat crept in around the era you're thinking of as the golden era of computing.

          Without bloat, a single computer with a quad-core CPU and 4GB of RAM would be the server doing the workloa

  • Maybe... (Score:5, Interesting)

    by jd ( 1658 ) <imipakNO@SPAMyahoo.com> on Wednesday November 14, 2018 @01:27PM (#57643822) Homepage Journal

    ...This wasn't the best way to improve performance. There are other approaches, or modifications to existing ones.

    Does anyone know if Itanium 3 was affected? If not, Intel might want to revisit it, as there's bound to be commercial interest in fast, secure processors. (Because it was a ground-up redesign, it would have been free of defects from mainstream processors.)

    I'm guessing the UltraSPARC/T3 is safe, for similar reasons. Totally different internal architecture.

    • Anyone know how this affects really strange x86 CPUs like the Transmeta Crusoe, or VIA stuff? I'm also curious about MIPS, HPPA, Alpha, VAX, and POWER.
    • Re:Maybe... (Score:4, Interesting)

      by thegarbz ( 1787294 ) on Wednesday November 14, 2018 @03:25PM (#57644656)

      ...This wasn't the best way to improve performance.

      Maybe, given the incredibly low threat posed by side-channel attacks (they require not only letting someone run code on your computer, but also giving them the opportunity to characterise that computer and learn how to actually achieve something with a speculative execution attack), it was a GREAT way to improve performance.

      We are nearly 1 year in, and there have been no nefarious exploits utilising this, despite the fact that, for the most part, perfectly patching these holes is almost impossible. Remember that when you think of trade-offs.

  • by shaitand ( 626655 ) on Wednesday November 14, 2018 @01:34PM (#57643854) Journal
    https://zdnet1.cbsistatic.com/hub/i/2018/11/14/15e46793-eebf-46b5-8fbd-23896b34a1ae/9641c5228c53fbde1d8778dd94ae5832/new-meltdown-attacks.png

    Not that quantity of vulnerabilities is everything, but Intel and ARM are in serious relative trouble... again. How many of their performance and power advantages over the last several years have been substantially due to taking shortcuts on secure design? AMD may be even further in the lead than we've realized.
    • How many of their performance and power advantages over the last several years have been substantially due to taking shortcuts on secure design?

      Probably none, given the incredible difficulty of doing anything useful with such an attack without already having unsupervised access to a computer. By none I mean they didn't take any shortcuts and instead made what looks like a reasonable performance trade-off.

      Or are you taking a shortcut right now reading this? I suggest if you're worried about Spec Execution attacks you start with the low hanging fruit and take an axe to your modem. You'd be insane to have a connection to the internet if you're w

      • Actually, I think this is a great thing; the people who are likely most concerned are those who are trying to lock down information from those who have unsupervised access, such as an employer or the copyright/gaming cartels with DRM.
    • Not that quantity of vulnerabilities is everything, but Intel and ARM are in serious relative trouble... again. How many of their performance and power advantages over the last several years have been substantially due to taking shortcuts on secure design? AMD may be even further in the lead than we've realized.

      Basically none of the ARM advances over the past several years would be rolled back by this, because only a tiny portion of their portfolio is even vulnerable at all, and those are the newer chips that are in few products. The effect on ARM has to do with promised offerings in the future, not the offerings in the past whose advantages have driven their adoption in the marketplace.

    • by Kuruk ( 631552 )
      Notice AMD is called out so often but.
  • Oh no, not again.

    • Many people have speculated that if we knew exactly why the bowl of petunias had thought that we would know a lot more about the nature of central processing units.

  • How do you like your clouds now? Do you even know all APTs that now have your keys?
    • by kiviQr ( 3443687 )
      Keep in mind that a cloud provider can fix it for everyone. A much better option than people having racks in their closets and never patching them.
      • If you have a rack in your closet and have to protect it from yourself, you have worse problems.

        This should only be a concern if you have a rack in somebody else's closet, or somebody else's rack in your closet.

  • Speed....Security...Cheap...Pick only two, can't have it all!!!
    • I'll have you know that the ATmega328P has all three!

      I mean... 16MHz is fast, right?

      • 16 MHz is your crappy Arduino board; you can't blame the ATmega328 for that.

        And stop saying P at the end. The ones you buy have -P at the end, which stands for PDIP, but the actual 328P, with the P as part of the processor name, is exactly the same as the 328; it just uses less power on standby. So you don't mention the P part when it's true; you only mention the P when you're confused about the part numbers.

        The ATmega328 supports up to 20 MHz using an external oscillator. It actually works at over 30 MHz. But out o

  • For a second I was really curious what SPECTRE was up to and what James Bond was going to do about it.

  • And fixing them will introduce more attack vectors. What a man can make, a man can break. That is why I don't think quantum communication and encryption are actually unbreakable.
