Intel Upgrades

Next Gen Intel CPUs Move To Yet Another Socket (254 comments)

mr_sifter writes "According to a leaked roadmap, next year we'll be saying hello to LGA1155. The socket is one pin different from the current LGA1156 socket that Core i3, i5, and some i7 CPUs use. Sandy Bridge CPUs will be built on the current 32nm, second-generation high-k metal gate manufacturing process, and all LGA1155 CPUs will have integrated graphics built into the core instead of on a separate chip. This is an upgrade over current Clarkdale CPUs, in which the IGP, PCI Express controller and memory controller are manufactured on the older 45nm process as a separate die (but still slapped together in the same package). This should improve performance, as all the controllers will be in one die, like existing LGA1366 CPUs."
  • Well, it's one louder...err faster, isn't it?

  • by Lord Lode ( 1290856 ) on Wednesday April 21, 2010 @10:22AM (#31922624)

    I can see that integrated graphics in a CPU can be handy for some applications, like low-power mobile stuff and such.

    But for a desktop PC, isn't this a disadvantage? If you're using a proper graphics card, couldn't that space in the CPU be used for better things than a redundant graphics circuit?

    • ...couldn't that space in the CPU be used for better things than a redundant graphics circuit?

      At first I read that as "retarded graphics circuit". Still made perfect sense...

    • by TheLink ( 130905 )
      Basically they've run out of ideas on how to use those billions of transistors to make things faster or better.

      It's either:
      1) Another CPU core
      2) Yet more cache.

      And now GPUs...

      Too bad Intel can't make great GPUs.
      • Basically they've run out of ideas on how to use those billions of transistors to make things faster or better.

        How about a specialized CPU? Lots and lots and lots of weak single-threaded cores with their local non-shared memory, all running their own small program and connected to a very fast bus, allowing them to pass messages to each other. It would be ideal for many emerging applications, such as image recognition and AI in general.

        The thing is, a general-purpose serial CPU is already as fast as it's ev
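The many-weak-cores design the comment above describes is essentially a message-passing architecture. A rough, purely illustrative sketch in plain Python (nothing Intel has announced; the worker logic and message format are invented for the example): each "core" is a process with private local memory that communicates only by passing messages over queues.

# Hypothetical illustration of the "many weak cores passing messages" idea:
# each worker owns private (non-shared) state and communicates only by
# sending small messages over queues, never by sharing memory.
from multiprocessing import Process, Queue

def worker(worker_id, inbox, outbox):
    local_memory = {}                      # private, non-shared state
    while True:
        msg = inbox.get()                  # block until a message arrives
        if msg is None:                    # sentinel: shut down
            break
        key, value = msg
        local_memory[key] = value          # compute on local data only
        outbox.put((worker_id, key, value * 2))

if __name__ == "__main__":
    inboxes = [Queue() for _ in range(4)]  # one "core" per inbox
    results = Queue()
    procs = [Process(target=worker, args=(i, q, results))
             for i, q in enumerate(inboxes)]
    for p in procs:
        p.start()
    for i, q in enumerate(inboxes):        # scatter work by message passing
        q.put(("pixel", i))
        q.put(None)
    for _ in range(4):
        print(results.get())               # gather replies
    for p in procs:
        p.join()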

    • Not if you convince the "proper graphics card" crowd to see it all as a CPU integrated into their graphics card.

      I don't think it'd be very hard right now to convince an Alienware buyer to buy a computer that's essentially a graphics card with all the rest integrated around it. Except, maybe, the hard drive. And even there you could argue "it has an SSD for you to install one or two games at a time. You can buy a standard HD for the rest."

      The only thing to leave outside would have to be the mouse (some elite pro killer ra

    • Too bad it's the same GMA crap. AMD has a better onboard chip and plans to get it into the CPU, plus letting it boost an add-in ATI card as well. What will the Intel graphics do, just shut down when a better card is installed?

    • Heck, I remember "integrated graphics" the first time round. It was called "using the CPU to do graphics", and it was good enough for us to render buggy whips in 2D, sometimes even 2.5D.

      Also, what's with kids these days playing their hippety-hop music way too loud using integrated chips rather than a good old ISA SoundBlaster 16?

    • by Targon ( 17348 )

      Low end systems become even cheaper to produce when the chipset on the motherboard does not need to include graphics support. Also, if your add-in video card fails, you can always run off integrated until you can replace it. You are right about a 'proper' video card being a better choice overall, but if you look at those $400 to $500 computer towers being sold all over the place, not a single one has a dedicated video card.

      Now, AMD is moving forward with their Fusion project, which will add a GPU to s

    • But for a desktop PC, isn't this a disadvantage? If you're using a proper graphics card, couldn't that space in the CPU be used for better things than a redundant graphics circuit?

      Don't look at the PC enthusiast/gamer market. Look at the desktop PC for basic business use. Cost is much more king there, as long as performance is acceptable. You gotta cut a lot of costs if you want to be able to slap down a whole PC for less than $200.

      I wouldn't be surprised if in a couple more generations we're looking at "system on a chip" designs. No northbridge, southbridge, video controller, etc... Just a central chip on a board with power and interface leads.

    • Not sure if it's possible, but I'm guessing that if one added a graphics card, then the processing power of the graphics portion of the CPU could be used for other things. Granted, I wouldn't expect CUDA type performance, but I'd think a few new instructions that allowed programmers to specifically target unused graphics units for processing SIMD instructions would be welcome. Same thinking goes for the AMD chips. Basically an either-or choice: all-in-one chip, or increased computational power... which are
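      No dedicated CPU instructions for this exist, but the closest thing available today to what the comment above suggests is treating the otherwise-idle integrated GPU as an OpenCL compute device while the discrete card drives the display. A hedged sketch using pyopencl (assumptions: pyopencl is installed, the OpenCL driver actually exposes the integrated GPU, and picking it by vendor string is only a heuristic):

# Sketch only: offload a simple data-parallel (SIMD-style) job to an
# integrated GPU via OpenCL while a discrete card handles display.
# Assumes pyopencl and a driver that exposes the integrated GPU;
# selecting it by vendor string is a guess and varies by system.
import numpy as np
import pyopencl as cl

devs = [d for p in cl.get_platforms() for d in p.get_devices()]
gpus = [d for d in devs if d.type & cl.device_type.GPU]
if not gpus:
    raise RuntimeError("no OpenCL GPU device exposed by the drivers")
# Prefer a device that looks like an integrated part (heuristic only).
dev = next((d for d in gpus if "Intel" in d.vendor), gpus[0])

ctx = cl.Context(devices=[dev])
queue = cl.CommandQueue(ctx)

src = """
__kernel void scale(__global const float *a, __global float *out) {
    int i = get_global_id(0);
    out[i] = a[i] * 2.0f;   /* trivial data-parallel work */
}
"""
prg = cl.Program(ctx, src).build()

a = np.arange(1024, dtype=np.float32)
mf = cl.mem_flags
a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

prg.scale(queue, a.shape, None, a_buf, out_buf)  # launch one work-item per element
out = np.empty_like(a)
cl.enqueue_copy(queue, out, out_buf)
print(out[:4])  # [0. 2. 4. 6.]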

    • by tlhIngan ( 30335 )

      I can see that integrated graphics in a CPU can be handy for some applications, like low-power mobile stuff and such.

      But for a desktop PC, isn't this a disadvantage? If you're using a proper graphics card, couldn't that space in the CPU be used for better things than a redundant graphics circuit?

      Except, most PCs don't have proper graphics cards. It's why the top video card manufacturer is not nVidia or AMD, it's Intel. Yes, Intel sells the most graphics chipsets.

      And graphics is one of the last high-speed de

    • I can see that integrated graphics in a CPU can be handy for some applications, like low-power mobile stuff and such.

      But for a desktop PC, isn't this a disadvantage? If you're using a proper graphics card, couldn't that space in the CPU be used for better things than a redundant graphics circuit?

      One could make the same argument about motherboards right now. A lot of them come with onboard graphics that takes up space on the board better used for SATA ports or some such, and yet people still buy them and stick video cards on them.

      I've seen a lot of systems. If it has a video card, odds are better than 50/50 it's also got onboard video.

    • by LWATCDR ( 28044 )

      Simple.
      There is a HUGE segment of the population that doesn't need any more graphics capability than what even the crappy Intel integrated graphics offer.
      The current offerings are much better.
      Here is the maximum graphics requirement for about 80% of all Windows PCs:
      Will it play back 1080p video?
      And that is the maximum they require.

      A lot of people never play any video game that is more graphically intensive than Plants vs Zombies.
      A lot of people never watch any video better than what is on YouTube.

      Think all the

  • All LGA1155 CPUs will have integrated graphics built into the core

    Will the new integrated GPU have performance even on par with a Wii's GPU, or is it the same GMA (i.e. "Graphics My Ass") that's been built into Intel boards for years?

    • If it's the graphics chip from the i5-661, then it's competitive with AMD's IGP (AMD might have better drivers, though).

    • It will probably be junk like usual. If they released onboard graphics on par with something like a 9800 GT, it would crush NVidia and AMD/ATI, as there probably isn't enough of a market above that to keep them operating.

      Then there will be Federal investigations and anti-trust lawsuits... they just don't need that kind of trouble.

  • I have an i3 CPU. Given the pricing, I don't expect great things from the integrated graphics, but it's certainly been capable for light to medium gaming, and as an office desktop (we're standardizing on it at work), it's fantastic. If you want to run Crysis or Dragon Age, go buy a $150 gaming card. Otherwise, as an integrated graphics package, it's all I need and much better than I'm accustomed to.

  • And yet,... (Score:5, Informative)

    by Pojut ( 1027544 ) on Wednesday April 21, 2010 @10:33AM (#31922814) Homepage

    ...the AM2+/AM3 socket on my AMD board continues to be useful for new AMD CPUs literally years after I originally purchased it.

    • by Kjella ( 173770 )

      Same socket, but can it run all the newer processors? That at least happened to me with a Shuttle I had that I thought about upgrading - for various reasons the board couldn't, even with a BIOS upgrade. And there always seemed to be some sort of shift like AGP to PCIe, PATA to SATA, DDR2 to DDR3, USB 1.0 to 2.0, or various other good reasons to upgrade anyway. Expansion cards are just silly expensive compared to motherboards, I'm guessing due to volume.

      To take one example, any decent mobo today c

      • by Pojut ( 1027544 )

        Yup, it can do it all...the ONLY thing I can't do is run DDR3 (it has four DDR2 slots), but other than that I can take care of all the new stuff (exceptin' USB 3.0 and SATA 6Gb/s, of course...but not much on the market can do that yet either)

          • Almost every board on the market does USB3 and SATA 6Gb/s except at the lowest end. The chipsets don't support it, but motherboard manufacturers put additional chips on the motherboards for both:

          http://giga-byte.ca/Products/Motherboard/Products_Overview.aspx?ProductID=3284 [giga-byte.ca]

            I fully agree however that a 3 year old AMD motherboard with a new CPU gives you just about the same experience as a brand new system, as long as the motherboard OEM provides ongoing support through BIOS updates. I'm a loyal Gigabyte customer for t

    • Re:And yet,... (Score:4, Informative)

      by Jazz-Masta ( 240659 ) on Wednesday April 21, 2010 @12:11PM (#31924600)

      ...the AM2+/AM3 socket on my AMD board continues to be useful for new AMD CPUs literally years after I originally purchased it.

      Intel had a long run with the Socket 775 boards, and AMD pulled this stunt back with their Socket 939 to AM2 upgrade. AM2 is a 940 pin socket.

      I do agree AMD did something right with their AM2, AM2+, and AM3 sockets being interchangeable for many CPUs. It's just that some of the more interesting features get disabled when running an AM3 CPU on an AM2 socket.

  • A win for AMD (Score:5, Insightful)

    by Albanach ( 527650 ) on Wednesday April 21, 2010 @10:37AM (#31922894) Homepage

    I can't understand why they would force another socket design on customers. I am using a four year old motherboard and recently replaced my AMD CPU with a current model. It was a drop in replacement. Sure I could get some benefits from a newer MB, but I can make the upgrade at a time of my choosing. I can spread the cost, get the big boost from the CPU now and get a smaller boost from a new MB in a year's time.

    Board manufacturers have to spend money implementing the new socket. Retailers are stuck with old stock that no-one wants because a new socket is around the corner.

    It raises prices and hurts the end user. Why are we still seeing this behavior?

    • It raises prices and hurts the end user. Why are we still seeing this behavior?

      I think you answered your own question in the first three words of the question....

    • It raises prices and hurts the end user. Why are we still seeing this behavior?

      It raises prices and helps Intel.

    • Uh, perhaps because renegades like me and thee - heck, we're probably filthy hackers, and we may even have links to organised crime - who upgrade our systems are an insignificantly small market, and Intel are happy to cede it to AMD in order to squeeze more profit out of the other 98% of their customers?
    • Re:A win for AMD (Score:4, Insightful)

      by PhrstBrn ( 751463 ) on Wednesday April 21, 2010 @11:12AM (#31923522)

      Because Intel sells motherboards and chipsets too. They don't want to sell you just a new processor, they want to sell you a new processor and a motherboard.

      If Intel thought they could make more money by keeping their stuff backwards compatible, they would, but I'm sure the bean counters figured the amount of sales lost to AMD would be less than the profits they could make by forcing you to buy new motherboards too, and I would tend to agree with that.

      I don't like it, I don't think it's good for consumers, but it makes sense from Intel's perspective.

    • Board manufacturers get to push a new board model for people who want to upgrade the CPU.

      I upgraded a CPU once. The CPU required a new motherboard. The new motherboard required new RAM and new gfx card. And the new components combined required a new PSU.

      Pure business.

    • Intel is riding high right now and thinks that everyone will fall right in step no matter what Intel does. They're getting greedy, pure and simple, and it's about to bite them in the ass.
      I was going to write something completely different after this, and then I read the last line of the article.
      "Oh, one last thing: one of our sources states LGA2011 will launch with quad- and six-core CPUs (with Hyper-Threading, so eight and 12 execution units) although another source has stated eight-core CPUs are also on
    • The design of a CPU includes the way it interfaces to the motherboard. If you make a new CPU on the same interface (bus), you don't get full performance. And you can't optimize power either. And it buys you very little to not pair the two up. Very few people upgrade their CPU, they usually buy a CPU with the motherboard and don't change it until they get a new motherboard.

      And heck, few people even buy their own motherboard anyway! People who build their own systems don't realize how few people do so now. It

  • by Hatta ( 162192 ) on Wednesday April 21, 2010 @10:49AM (#31923084) Journal

    How about you design the next socket with twice as many pins as you think you'll need? Then we won't run out and have to buy a whole new motherboard when we just want a faster CPU.

    • Re: (Score:3, Funny)

      by GungaDan ( 195739 )

      The new one has one FEWER pin than the current socket. So obviously next time they should either design one with a single removable pin, or no pins at all.

  • I'm sure that combining the two biggest heat sources in a computer on the same die is a very well thought move. Especially for mobile versions. Yay.

  • A large part of the performance gain in new generation processors is actually the combination of the processor and chipset. The Core i5, Core i7, etc. processors did away with a separate memory controller chip -- that itself has been a huge power and speed advantage. Without upgrading the stuff supporting the chip, you don't get much benefit from an upgrade.

  • 16 PCI-E lanes is too low when the chipset lacks USB 3.0 and other things like SATA 3.0; the new buses force motherboard makers to use switches and other stuff to fit in video + SATA 3.0 + USB 3.0, or to cut the video card down to x8.

  • by obarthelemy ( 160321 ) on Wednesday April 21, 2010 @11:32AM (#31923884)

    I've never really upgraded CPUs. By the time my CPU is outdated (2-3 years), my motherboard usually is, too: newer RAM (SDR -> DDR -> DDR2 -> DDR3), faster HD interfaces (PATA -> SATA -> SATA2 -> SATA3) and others (USB -> USB2 -> USB3; PCI -> PCIe -> PCIe2), bigger/faster HDs... In the end, I usually rotate entire PCs; they go My Main PC -> My Backup PC -> My parents / Niece.

    My gripe with Intel is more about the price of their MBs, especially compared to AMD's. The cheapest AMD MB with an AMD IGP is listed at 54 euros at my favorite retailer (Asus AM2+, not AM3, but performance is broadly the same), while Intel's cheapest MB is 84 euros (Gigabyte). Their low-end CPUs are also kinda expensive. And their IGPs also still kinda suck, even for playing video, and definitely for even light gaming.

    The interesting thing these days is smaller size. Mini-ITX mainboards are becoming common; there are cheapish ones with AM2/AM3 or 1156 sockets, good cases (Silverstone...), huge HDs. Unless you really need a graphics card, you can build a very small and quiet PC.

  • Title should have been "Intel wants to sell you motherboards and shit, along with their new CPU line" ...

    I've really gotten tired of this old trick.