Next Gen Intel CPUs Move To Yet Another Socket
mr_sifter writes "According to a leaked roadmap, next year we'll be saying hello to LGA1155. The socket is one pin different from the current LGA1156 socket that Core i3, i5, and some i7 CPUs use. Sandy Bridge CPUs will be based on the current 32nm, second-generation high-k metal gate manufacturing process. All LGA1155 CPUs will have integrated graphics built into the core instead of a separate chip. This is an upgrade from current Clarkdale CPUs, where the IGP, PCI Express controller and memory controller are manufactured on the older 45nm process on a separate die (though still slapped together in the same package). This should improve performance, as all the controllers will be in one die, like existing LGA1366 CPUs."
This socket goes to 1155 (Score:2)
Well, it's one louder...err faster, isn't it?
Re: (Score:2)
Nigel Tufnel: Look at this pin... still has the old tag on, never even used it.
Marty DiBergi: [points his finger] You've never used...?
Nigel Tufnel: Don't touch it!
Marty DiBergi: Well, I wasn't going to touch it, I was just pointing at it.
Nigel Tufnel: Well... don't point! It can't be used.
Marty DiBergi: Don't point, okay. Can I look at it?
Nigel Tufnel: No, no. That's it, you've seen enough of that one.
Integrated graphics in the CPU? (Score:4, Interesting)
I can see that integrated graphics in a CPU can be handy for some applications, like low-power mobile stuff and such.
But for a desktop PC, isn't this a disadvantage? If you're using a proper graphics card, couldn't that space in the CPU be used for better things than a redundant graphics circuit?
Re: (Score:2)
...couldn't that space in the CPU be used for better things than a redundant graphics circuit?
At first I read that as "retarded graphics circuit". Still made perfect sense...
Re: (Score:2)
It's either:
1) Another CPU core
2) Yet more cache.
And now GPUs...
Too bad Intel can't make great GPUs.
Re: (Score:2)
How about a specialized CPU? Lots and lots and lots of weak single-threaded cores with their local non-shared memory, all running their own small program and connected to a very fast bus, allowing them to pass messages to each other. It would be ideal for many emerging applications, such as image recognition and AI in general.
The thing is, a general-purpose serial CPU is already as fast as it's ev
Re: (Score:2)
Re: (Score:2)
The 3dfx Voodoo card was released less than 1.5 decades ago. Game console generations are getting longer and longer. My crystal ball says few people will be buying standalone graphics cards in 5 years.
Re: (Score:2)
Not if you convince the "proper graphics card" crowd to see it all as a CPU integrated into their graphics card.
I don't think it'd be very hard right now to convince an Alienware buyer to buy a computer that's essentially a graphics card with all the rest integrated around it. Except, maybe, the hard drive. And even there you could argue "it has an SSD for you to install one or two games at a time. You can buy a standard HD for the rest."
The only thing to leave outside would have to be the mouse (some elite pro killer ra
Too bad it's the same GMA crap when AMD has a bette (Score:2)
Too bad it's the same GMA crap, when AMD has a better onboard chip and plans to work on getting it into the CPU, plus letting it boost an add-in ATI card as well. What will the Intel graphics do, just shut down when a better card is installed?
Re: (Score:3, Interesting)
http://www.anandtech.com/show/2972/the-rest-of-clarkdale-intel-s-pentium-g6950-core-i5-650-660-670-reviewed/2 [anandtech.com]
The i5-661 (with the fastest on-package graphics) is performance-competitive with AMD's latest integrated graphics. The slower on-package GPUs from Intel are behind, but not by much. Nothing Intel can't solve in its next processor (especially as AMD did not increase its IGP performance).
Re: (Score:2)
Also, what's with kids these days playing their hippety-hop music way too loud using integrated chips rather than a good old ISA SoundBlaster 16?
Re: (Score:2)
Low end systems become even cheaper to produce when the chipset on the motherboard does not need to include graphics support. Also, if your add-in video card fails, you can always run off integrated until you can replace it. You are right about a 'proper' video card being a better choice overall, but if you look at those $400 to $500 computer towers being sold all over the place, not a single one has a dedicated video card.
Now, AMD is moving forward with their Fusion project, which will add a GPU to s
Re: (Score:2)
But for a desktop PC, isn't this a disadvantage? If you're using a proper graphics card, couldn't that space in the CPU be used for better things than a redundant graphics circuit?
Don't look at the PC enthusiast/gamer market. Look at the desktop PC for basic business use. Cost is much more king there, as long as performance is acceptable. You gotta cut a lot of costs if you want to be able to slap down a whole PC for less than $200.
I wouldn't be surprised if in a couple more generations we're looking at 'system on a chip' designs. No northbridge, southbridge, video controller, etc... Just a central chip on a board with power and interface leads.
Re: (Score:2)
Not sure if it's possible, but I'm guessing that if one added a graphics card, then the processing power of the graphics portion of the CPU could be used for other things. Granted, I wouldn't expect CUDA type performance, but I'd think a few new instructions that allowed programmers to specifically target unused graphics units for processing SIMD instructions would be welcome. Same thinking goes for the AMD chips. Basically an either-or choice: all-in-one chip, or increased computational power... which are
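Something along these lines can already be sketched through the GPU-compute-API route rather than new CPU instructions. Purely as a hypothetical illustration (my own sketch, not anything the parent or Intel describes, and it assumes the integrated GPU actually exposes an OpenCL device, which today's Intel IGPs don't), a program could enumerate GPUs and hand spare work to the integrated one while a discrete card drives the display:

    /* Hypothetical sketch: list GPUs and flag the integrated one (which
     * shares host RAM) as a candidate for spare compute work. */
    #include <CL/cl.h>
    #include <stdio.h>

    int main(void) {
        cl_platform_id platforms[8];
        cl_uint num_platforms = 0;
        clGetPlatformIDs(8, platforms, &num_platforms);

        for (cl_uint p = 0; p < num_platforms; p++) {
            cl_device_id devs[8];
            cl_uint n = 0;
            if (clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_GPU, 8, devs, &n) != CL_SUCCESS)
                continue;
            for (cl_uint d = 0; d < n; d++) {
                char name[256];
                cl_bool unified = CL_FALSE; /* integrated GPUs typically share host memory */
                clGetDeviceInfo(devs[d], CL_DEVICE_NAME, sizeof(name), name, NULL);
                clGetDeviceInfo(devs[d], CL_DEVICE_HOST_UNIFIED_MEMORY,
                                sizeof(unified), &unified, NULL);
                printf("%s: %s\n", name,
                       unified ? "integrated - use for spare compute"
                               : "discrete - leave it to graphics");
            }
        }
        return 0;
    }

Build against an installed OpenCL runtime (e.g. gcc pick_gpu.c -lOpenCL); the filename and the "spare compute" policy are just placeholders for the idea.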
Re: (Score:2)
Except, most PCs don't have proper graphics cards. It's why the top video card manufacturer is not nVidia or AMD, it's Intel. Yes, Intel sells the most graphics chipsets.
And graphics is one of the last high-speed de
Re: (Score:2)
I can see that integrated graphics in a CPU can be handy for some applications, like low-power mobile stuff and such.
But for a desktop PC, isn't this a disadvantage? If you're using a proper graphics card, couldn't that space in the CPU be used for better things than a redundant graphics circuit?
One could make the same argument about motherboards right now. A lot of them come with onboard graphics that takes up space on the board better used for SATA ports or some such, and yet people still buy them and stick video cards on them.
I've seen a lot of systems. If it has a video card, odds are better than 50/50 it's also got onboard video.
Re: (Score:2)
Simple.
There is a HUGE segment of the population that doesn't need any more graphics capability than what even the crappy Intel integrated graphics offer.
The current offerings are much better.
Here is the maximum graphics requirement for about 80% of all Windows PCs:
Will it play back 1080p video?
And that is the maximum they require.
A lot of people never play any video game that is more graphically intensive than Plants vs Zombies.
A lot of people never watch any video better than what is on YouTube.
Think all the
Re: (Score:3, Interesting)
Um, no. Cache is very important, especially with 64-bit code. In fact, x86 is a terribly die-area-inefficient architecture; we'd be a lot better off with a modern RISC, opening up space for more cache.
Re:Integrated graphics in the CPU? (Score:4, Informative)
Your point would have been valid 10 years ago, but the die area used for the CISC instruction decoder on a modern x86 processor is negligible. In fact, the x86 instruction set is more compact than a pure RISC CPU's, so you can fit more instructions into the instruction cache (ARM processors have a THUMB mode with more compact 16-bit instructions because of this).
Re:Integrated graphics in the CPU? (Score:5, Informative)
The key is modern RISC, not RISC. x86 is horribly inefficient. I'm not talking about the instruction decoder, I'm talking about the instruction semantics. x86 was never designed for today's high-performance CPUs, and the result is that the instruction set basically allows the programmer to do anything they want, even if it goes against modern CPU design optimizations. This forces the CPU to devote a large amount of die area to workaround logic that detects the thousands of possible dirty tricks that a programmer might use which are allowed by the ISA. For example, every modern RISC requires that the programmer issue cache flush instructions when modifying executable code. This is common sense. x86 doesn't, which means there needs to be a large blob of logic checking for whether the data you just touched happens to be inside your code cache too. The fact that on x86 you can e.g. use one instruction to modify the next instruction in the pipeline is just so ridiculously horribly wrong it's not even funny. There are similar screw-ups related to e.g. the page tables. I can't even begin to imagine the pains that x86 CPU engineers have to go through.
You can make an x86 chip reasonably small and very slow, or very large and very fast. x86 doesn't let you have it both ways to any reasonable degree.
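To make the cache-flush point concrete, here's roughly what patching code at runtime looks like when the programmer has to do the flush explicitly. This is my own sketch (it assumes the buffer was mapped executable, e.g. via mmap with PROT_EXEC) using GCC's __builtin___clear_cache; on x86 that call is essentially free because the hardware snoops the instruction cache for you, which is exactly the die area the parent is talking about:

    #include <string.h>

    /* Overwrite an executable buffer with new machine code and run it. */
    void patch_and_run(unsigned char *code, const unsigned char *new_bytes, size_t len)
    {
        memcpy(code, new_bytes, len);                 /* modify executable memory     */
        __builtin___clear_cache((char *)code,         /* mandatory on ARM, PowerPC... */
                                (char *)code + len);  /* ...effectively a no-op on x86 */
        ((void (*)(void))code)();                     /* jump into the patched code   */
    }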
Re: (Score:2)
The neat thing about the x86 architecture is that it has forced the chip designers to be really clever. E.g. the register limitations have forced them to find ways to make level 1 cache really fast; you'll be hard pressed to find non-x86 chips with faster level 1 cache. Similarly, the system call latency is fantastic. Most importantly, the (quite) strong memory ordering provided by x86 means that x86 is pretty much unmatched when it comes to inter-CPU communication. Look at the hoops e.g. PA-RISC goes through
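To make the ordering point concrete, here's a minimal flag-protected hand-off in C11-style atomics (my own sketch, not from the parent): on x86 the release store compiles to a plain store because the strong ordering already guarantees the two writes become visible in order, while a weakly ordered ISA has to emit a real barrier at that point.

    #include <stdatomic.h>

    int payload;          /* the data being handed between CPUs */
    atomic_int ready;     /* the flag that publishes it         */

    void producer(void)
    {
        payload = 42;
        atomic_store_explicit(&ready, 1, memory_order_release); /* plain MOV on x86 */
    }

    int consumer(void)
    {
        while (!atomic_load_explicit(&ready, memory_order_acquire))
            ;             /* spin until the flag is set */
        return payload;   /* guaranteed to observe 42   */
    }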
Re: (Score:2)
You can make an x86 chip reasonably small and very slow, or very large and very fast. x86 doesn't let you have it both ways to any reasonable degree.
Isn't that what ARM is for? Besides I want a Very Fast Machine with a huge heatsink.
Re: (Score:2)
RISC typically needs more RAM than CISC (and it seems less than 10% of the die area is devoted to x86 instruction decoding, at least in high-performance processors), so you'd trade the die space gained for more cache against the need for more main memory.
Re: (Score:2)
I love how everyone jumped so quickly on the instruction decoding bandwagon. Of course instruction decoding is cheap these days, even for x86. The problem isn't decoding, it's the huge amount of dirty things that instructions can potentially do after being decoded. Things that go against modern high-performance CPU design principles.
Re: (Score:3, Interesting)
In fact, x86 is a terribly die-area-inefficient architecture; we'd be a lot better off with a modern RISC, opening up space for more cache.
Is this ignoring the fact that most of Intel's chips for many years have basically been RISC processors with an x86 translation unit?
Re: (Score:2)
Is this ignoring the fact that most of Intel's chips for many years have basically been RISC processors with an x86 translation unit?
This doesn't really make sense. ALL CISC processors are pretty much RISC processors with a translation front end. This has been true since the 8086 and (especially) the 68000, back when RISC hadn't even been invented. The whole point of RISC is that it was discovered that you can live without that front end. Look up microcode.
The original 8086 was a bit RISC-like in that some instructions were in hard-coded logic and didn't go through the microcode layer. Modern x86 is less RISC-like, because all instructions need to go t
Real-time high-def geometry rendering (Score:2)
Who needs a proper graphics card these days?
Only people who need real-time high-def geometry rendering.
More people will need this than you might think. Let's look at each piece of your claim:
Re: (Score:2)
I'm using a desktop that I recently built with a Core i3-530, and the built-in graphics are quite acceptable, even at 1600x900 (the monitor was free so I won't complain about the odd resolution). The only place they suffer is in high-performance areas like games. The IGP is designed to process Full HD video.
Re: (Score:3, Insightful)
More people will need this than you might think. Let's look at each piece of your claim:
I think that the issue here is where you place the line on a 'proper' graphics card.
By that I mean that today even integrated video cards are easily able to keep up with GUIs, play even Blu-ray movies, etc...
I'm not sure SVG/Canvas rasterization will really bog down modern integrated graphics engines. Or if it doesn't support it, it'll fall back to the CPU, and assuming you're not doing anything too CPU intensive at that moment, it won't matter. You don't need a 5870 to run Office or IE.
Re: (Score:2)
I would be shocked if the GPU integrated into Intel's next-gen CPU doesn't blow away what's in the XBox 360, which after all is a medium-high end card from 2005. And yet, you'll notice there seems to be little push for next-gen consoles beyond the XBox 360 and PS3. Integrated video will eventually be good enough for most applications including games, the question is not i
Re: (Score:2)
Don't forget that modern operating systems are able to store the contents of windows in graphics memory, leaving more room in system RAM for other items. This also speeds up the loading of less recently viewed windows (since their contents stay cached rather than being thrown away).
Re: (Score:2)
I hate to break it to you, but a huge portion of PC users DON'T play anything more intensive than Farmville on their systems - if they game at all.
Even for myself - I do play games on my PC, but only on 1 of them. I've got 5 systems (Windows desktop, Linux desktop, Linux laptop, Mac desktop, and Windows desktop at work) and ONLY my Windows desktop at home ever sees any gaming. In the other 4 I really don't care what chip is in them because Chromium, Visual Studio, Safari, etc simply don't need it.
The simp
Re: (Score:2)
I hate to break it to you, but a huge portion of PC users DON'T play anything more intensive than Farmville on their systems - if they game at all.
Ya know... I've seen Cafe World bring a dual-core with 4 gigs of RAM to its knees. 60% CPU usage, with half a gig of RAM in use for Firefox alone... with no other apps running, nor even any other browser windows/tabs open.
Hell, it's sluggish and choppy on the quad-core 2.8 in my living room. Farmville is a little better on most days, but still.
Browser-based Flash Game != low-powered app. Might not be graphics intensive, admittedly... but Zynga really needs a head check when it comes to resource usage.
Re: (Score:2)
Not all integrated graphics are made the same. Intel integrated are utter shit.
I haven't tried nvidia integrated graphics.
But the ATI 3300 HD series of integrated chips?
It is as good as a top of the line GPU from about 3 or 4 years ago: 128MB integrated memory, 32 stream processors, decent clock speed. There are a handful of high end games it won't do well with, but it will meet the needs of most people, including those playing TF2.
What GMA stands for (Score:2, Funny)
All LGA1155 CPUs will have integrated graphics built into the core
Will the new integrated GPU have performance even on par with a Wii's GPU, or is it the same GMA (i.e. "Graphics My Ass") that's been built into Intel boards for years?
Re: (Score:2)
If it's the graphics chip from the i5-661, then it's competitive with AMD's IGP (AMD might have better drivers though).
Re: (Score:2)
It will probably be junk like usual. If they released onboard graphics on par with something like a 9800 GT, it would crush NVidia and AMD/ATI, as there probably isn't enough of a market above that to keep them operating.
Then there will be Federal investigations and anti-trust lawsuits... they just don't need that kind of trouble.
Re: (Score:2)
I have an i3 cpu. Given the pricing, I don't expect great things from the integrated graphics, but it's certainly been capable for light to medium gaming, and as an office desktop (we're standardizing on it at work), it's fantastic. If you want to run Crysis or Dragon Age, go buy a $150 gaming card. Otherwise, as an integrated graphics package, it's all I need and much better than I'm accustomed to.
And yet,... (Score:5, Informative)
...the AM2+/AM3 socket on my AMD board continues to be useful for new AMD CPUs literally years after I originally purchased it.
Re: (Score:2)
Same socket, but can it run all the newer processors? That at least happened to me with a Shuttle I had that I thought about upgrading - for various reasons the board couldn't, even with a BIOS upgrade. And there always seemed to be some sort of shift like AGP to PCIe, PATA to SATA, DDR2 to DDR3, USB 1.0 to 2.0 or various other good reasons to upgrade anyway. Expansion cards are just silly expensive compared to motherboards, I'm guessing due to volume.
To take one example, any decent mobo today c
Re: (Score:2)
Yup, it can do it all...the ONLY thing it can't do is run DDR3 (it has four DDR2 slots), but other than that I can take care of all the new stuff (exceptin' USB 3.0 and SATA 6Gb/s, of course...but not much on the market can do that yet either)
Re: (Score:2)
Almost every board on the market does USB 3.0 and SATA 6Gb/s except at the lowest end. The chipsets don't support them, but motherboard manufacturers put additional chips on the motherboards for both:
http://giga-byte.ca/Products/Motherboard/Products_Overview.aspx?ProductID=3284 [giga-byte.ca]
I fully agree however that a 3 year old AMD motherboard with a new CPU gives you just about the same experience as a brand new system, as long as the motherboard OEM provides ongoing support through BIOS updates. I'm a loyal Gigabyte customer for t
Re:And yet,... (Score:4, Informative)
...the AM2+/AM3 socket on my AMD board continues to be useful for new AMD CPUs literally years after I originally purchased it.
Intel had a long run with the Socket 775 boards, and AMD pulled this stunt back with their Socket 939 to AM2 upgrade. AM2 is a 940 pin socket.
I do agree AMD did something right with their AM2, AM2+, AM3 sockets being interchangeable for many CPUs. Just some of the more interesting features get disabled when running an AM3 CPU in an AM2 socket.
A win for AMD (Score:5, Insightful)
I can't understand why they would force another socket design on customers. I am using a four year old motherboard and recently replaced my AMD CPU with a current model. It was a drop in replacement. Sure I could get some benefits from a newer MB, but I can make the upgrade at a time of my choosing. I can spread the cost, get the big boost from the CPU now and get a smaller boost from a new MB in a year's time.
Board manufacturers have to spend money implementing the new socket. Retailers are stuck with old stock that no-one wants because a new socket is around the corner.
It raises prices and hurts the end user. Why are we still seeing this behavior?
Re: (Score:2)
I think you answered your own question in the first three words of the question....
Re: (Score:2)
It raises prices and hurts the end user. Why are we still seeing this behavior?
It raises prices and helps Intel.
Re: (Score:2)
Re:A win for AMD (Score:4, Insightful)
Because Intel sells motherboards and chipsets too. They don't want to sell you just a new processor, they want to sell you a new processor and a motherboard.
If Intel thought they could make more money by keeping their stuff backwards compatible, they would, but I'm sure the bean counters figured the amount of sales lost to AMD would be less than the profits they could make by forcing you to buy new motherboards too, and I would tend to agree with that.
I don't like it, I don't think it's good for consumers, but it makes sense from Intel's perspective.
Re: (Score:2)
Board manufacturers get to push a new board model for people who want to upgrade the CPU.
I upgraded a CPU once. The CPU required a new motherboard. The new motherboard required new RAM and new gfx card. And the new components combined required a new PSU.
Pure business.
Re: (Score:2)
I was going to write something completely different after this, and then I read the last line of the article.
"Oh, one last thing: one of our sources states LGA2011 will launch with quad-and six-core CPUs (with Hyper-Threading so eight and 12 execution units) although another source has stated eight-core CPUs are also on
because design includes the interface (Score:2)
The design of a CPU includes the way it interfaces to the motherboard. If you make a new CPU on the same interface (bus), you don't get full performance. And you can't optimize power either. And it buys you very little to not pair the two up. Very few people upgrade their CPU, they usually buy a CPU with the motherboard and don't change it until they get a new motherboard.
And heck, few people even buy their own motherboard anyway! People who build their own systems don't realize how few people do so now. It
One freaking pin?! (Score:4, Funny)
How about you design the next socket with twice as many pins as you think you'll need? Then we won't run out and have to buy a whole new motherboard when we just want a faster CPU.
Re: (Score:3, Funny)
The new one has one FEWER pin than the current socket. So obviously next time they should either design one with a single removable pin, or no pins at all.
This simplifies cooling design so much! (Score:2)
I'm sure that combining the two biggest heat sources in a computer on the same die is a very well thought out move. Especially for mobile versions. Yay.
Re: (Score:3, Interesting)
The processor is only one part of performance (Score:3, Interesting)
A large part of the performance gain in new generation processors is actually the combination of the processor and chipset. The Core i5, Core i7, etc. processors did away with a separate memory controller -- that itself has been a huge power and speed advantage. Without upgrading the stuff supporting the chip, you don't get much benefit from an upgrade.
16 PCI-E lanes is too low when the chipset lacks USB 3 (Score:2)
16 PCI-E lanes is too low when the chipset lacks USB 3.0, and other things like SATA 3.0 and other new buses force MB makers to use switches and other stuff to fit in video + SATA 3.0 + USB 3.0, or to cut the video card down to x8.
Changing sockets sounds bad, but (Score:3, Interesting)
I've never really upgraded CPUs. By the time my CPU is outdated (2-3 years), my motherboard usually is, too: newer RAM (SDR -> DDR -> DDR2 -> DDR3), faster HD interfaces (PATA -> SATA -> SATA2 -> SATA3) and others (USB -> USB2 -> USB3; PCI -> PCIE -> PCIE2), bigger/faster HDs... In the end, I usually rotate entire PCs; they go My Main PC -> My Backup PC -> My parents/niece.
My gripe with Intel is more about the price of their MBs, especially compared to AMD's. The cheapest AMD MB with an AMD IGP is listed at 54 euros at my favorite retailer (Asus AM2+, not AM3, but performance is broadly the same), while Intel's cheapest MB is 84 euros (Gigabyte). Their low-end CPUs are also kinda expensive. And their IGPs also still kinda suck, even for playing video, and definitely for even light gaming.
The interesting thing these days is smaller size. Mini-ITX mainboards are becoming common, there are cheapish ones with AM2/3 or 1156 sockets, good cases (Silverstone...), huge HDs. Unless you really need a graphics card, you can build a very small and quiet PC.
Wrong title (Score:2)
Title should have been "Intel wants to sell you motherboards and shit, along with their new CPU line" ...
I really got tired of this old trick.
Re:Sigh (Score:4, Interesting)
There's always AMD's Fusion on the horizon. If they can execute well on that they have a chance to do what they did with the Athlon. Intel has yet to demonstrate that they actually have GPU tech that can compete with nVidia and ATI in this space. I really hope they do, Intel has had too long at the top of the market and they're getting all monopolistic again.
Re: (Score:2, Interesting)
As in, I hope AMD can execute, not I hope Intel have tech that can compete with nVidia and ATI. The former would lead to better competition, the latter would give the monopolist more power.
That'll teach me to not preview.
Re: (Score:3, Insightful)
Gah! I meant "that'll teach me to preview".
Someone pass me a mallet. My head seems to need a little percussive maintenance.
Re: (Score:2)
Someone help me here. Although I understand the basic need for new sockets sometimes, I don't understand the drive to implement them so often and frequently. If there truly is only a single pin difference, wouldn't it make sense to at least attempt to design the chip to fit existing sockets? It also seems like it would speed adoption of a new processor if it is socket-compatible with existing motherboards. This is the piece that confuses me. It seems like any time a new socket is required, it's bad for busi
Re: (Score:2)
I don't understand the drive to implement them so often and frequently.
The cynic in me says "Yeah, it's a money grab; new CPU = new socket, and the user shells out for new hardware. Intel wins big until everyone else reverse engineers the whole setup and sells clones."
But then the rationalist in me knows that, socket aside, the new CPU is going to require a new chipset, especially if the package contains a GPU as well. So it's kind of moot. Plus, anything they can do to remove pins is generally a good thing. Generally. One less pin sounds like they've made something more efficient.
Re: (Score:2, Insightful)
I really hope that AMD gets back on top and can compete with Intel on the top-level CPUs again. I am tired of the Intel fanboys crapping all over AMD for the last few years, and really the industry NEEDS AMD to get back on top and help drive the price of these Intel chips down. The price gap is so huge between AMD and Intel that it makes building a top of the line Intel machine very daunting for us working-class enthusiasts and system builders.
Thankfully AMD's new hexacores will work in AM3 sockets so a mo
Re: (Score:2)
Re: (Score:2)
I don't think the GP is upset at *Intel* in this regard; I think it's more a perfectly realistic consumer complaint: "I wish there were more competition in this space because that would be better for me as the consumer." AMD dropped the ball pretty badly after a very strong run with earlier Athalons. It'd be great to see them get back into the game and really help push things along again.
Re: (Score:2)
AMD dropped the ball pretty badly after a very strong run with earlier Athlons. It'd be great to see them get back into the game
(corrected one product name...)
That's not so simple. How much of "AMD dropping the ball" was because of illegal, anticompetitive practices by Intel? Practices which, essentially, robbed AMD of money needed for aggressive R&D and fab expansion.
Re: (Score:3, Informative)
Intel, through illegal practices, prevented AMD from benefiting fully from their lead with the K7 and early K8 Athlons. The illegally rerouted money weakened AMD's R&D and fabs, while strengthening Intel's at the same time.
Re: (Score:3, Insightful)
So basically AMD's failures are always Intel's fault and not their own, right?
Re: (Score:3, Insightful)
At least they're trying?
Trying is not enough. It's 2010, and AMD bought ATI almost 4 years ago (1 [arstechnica.com]), so there are no excuses. I would be glad to buy AMD+ATI integrated graphics instead of Intel, but it is a no-no until the drivers for Linux reach their Windows counterparts performance-wise, and of course, I will not buy anything from AMD+ATI until then, not before. I buy products based on facts, not promises (I already made a mistake 3 years ago buying an AMD/ATI integrated graphics chip, still today without a proper driver for Linux WTF
Re:Figures... (Score:4, Interesting)
You upgrade the CPU/Motherboard/RAM. Big whoop.
You would need a new motherboard regardless of whether they changed the socket or not. You would also need new RAM, since the new platform requires RAM with lower operating voltages.
They probably did this so you don't try to plug in the new CPU on your old motherboard thinking it was a straight upgrade when it requires different circuitry.
Re: (Score:2)
One would think somebody at Intel would have noticed by now that it's good to write motherboard specs with bigger headroom for lower voltages...
Of course, they simply don't want to; Intel chipsets bring in quite a lot of money, too.
Re: (Score:2)
That's to be expected if, for many systems, sensible upgrades are blocked. Anyway, if the group willing to upgrade is so small...why does Intel block it?
Re: (Score:2)
Because it's not worth the extra engineering effort to cater to the 3 people on the planet who would base a purchasing decision on it.
Re: (Score:2)
What about the extra engineering effort to block upgrades, eh?
Re: (Score:2)
I can agree it might be an oversight...which doesn't speak very well of Intel if AMD usually manages to do it. Heck, I've seen a very late i865 ASRock motherboard with Core 2 Duo support. And the latest Intel CPUs (as well as those upcoming in 2011) basically use just PCI Express and some interface to output video...
Yes, very few people want and expect to mess around with the insides...so why does Intel, seemingly, make an effort to outright block such a possibility?
Re: (Score:2, Insightful)
Re: (Score:2)
Not without a new chipset.
I don't get why socket compatibility matters when you need a new chipset anyways.
I believe the new processors are using PCI Express 3.0 and require more lanes/copper as well.
Re:Figures... (Score:4, Informative)
Only because Intel chooses to obsolete old chipsets (or, more precisely, arbitrarily changes bus specs on new motherboards - I've seen an ASRock one for C2D with i865). AMD somehow manages to keep the latest versions of their CPU interconnect backwards compatible...do you really want to say Intel isn't capable of doing so? (especially since Intel simply uses PCI Express for those chips, which is explicitly backwards compatible)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
But doesn't get the performance or power savings.
It would be cheaper for Intel to not change the socket.
Re: (Score:2)
Ummm, why? You can upgrade the CPU on an AM2/AM2+ motherboard with at most a flash of the BIOS. And the AM2/+ CPUs are typically backwards compatible (an AM2+ CPU will run on an AM2 board, but with reduced functionality). So that AM2 board you purchased 4 years ago is still compatible with the latest processors (but not with DDR3). Given AMD's track record with sockets, I'd be surprised if the AM3 gets "phased out" within the next 5 ye
Also AMD HT is in all CPUs unlike Intel that only (Score:2)
Also, AMD has HT in all CPUs, unlike Intel, which only has theirs in high-end CPUs.
So Intel's low-end CPUs are stuck with few PCI-E lanes, to the point where USB 3.0 can get in the way of x16 video cards and makes some boards use PCI-E switches. And it forces Apple to use Core 2 in their 13" laptop just to get good video without needing to add a full video chip + chipset.
Intel also uses this to lock out NVidia. They should put their new bus in the i3/i5/i7 (low end) and not crap GMA video + 16 PCI-E lanes.
This is why from day 1 a
Re: (Score:3, Insightful)
There are different things to consider. On the AMD side of things, which everyone is using for comparison, you can often drop a new CPU into pretty much any AM2+ or AM3 motherboard with just a firmware update. You don't need to replace the RAM or motherboard, and you get the benefits of the new CPU. Going to a new MEMORY type would require a new motherboard, but with all of the new AMD processors, they support BOTH DDR2 and DDR3 memory.
There really is no good excuse for needing an all new chipset fo
Re: (Score:2)
With my current AM3 socket, I can upgrade to a 6-core AMD chip with just a BIOS update. Why can't Intel do that?
Re: (Score:2)
I can do that with my 1366 socket too.
Re: (Score:2)
With my current AM3 socket, I can upgrade to a 6-core AMD chip with just a BIOS update. Why can't Intel do that?
They could. But since only a tiny fraction of people ever upgrade CPUs, there's no reason to cripple your CPUs with support for old chipsets when you can just release a new one; every current AMD CPU has to support DDR2 RAM as well as DDR3, for example, and there's some evidence that requirement is significantly affecting AMD's memory performance with DDR3.
It would be different if AMD had better CPUs than Intel, but since Intel's are the fastest right now you can either buy the fastest CPU with the appropri
Re:Figures... (Score:5, Informative)
Re: (Score:2)
But upgrading CPUs has become much more attractive lately - you can go, say, from a cheap single-core (AMD still has some single-core Semprons; plus the single-core Athlon64 AM2 was quite popular for some time) in the original, cheap machine to a now-also-cheap quad-core, getting a huge boost for very little money (you might also upgrade memory while DDR2 is still cheap).
Of course Intel simply wants you to buy more; chipsets are also quite lucrative after all (maybe pointing out it's a horrible waste would work with current
Re: (Score:2)
But you were likely to buy it 3 years ago, in the form of an AM2 Athlon64. Now, and still for quite some time to come, you can slap in a quad-core. Or if buying some cheap CPU now, you would still be able to upgrade to a significantly faster one later on...
You can also donate just the old CPU, btw...somebody will need it (old sticks of RAM? You just buy a reasonable amount in two sticks at first and have two free slots for later expansion...yes, you have to take note to get a motherboard with 4 RAM slots, but that won
Re: (Score:2)
I just upgraded a Northwood 2.8 to a Core 2 Duo 3.0. I used the same graphics card (8800 GT) in the new machine.
Night and day difference.
Re: (Score:2)
Power bill dropped that much?
It probably did... it certainly runs a *lot* cooler. I'll have to see if it is actually noticeable on the next electric bill.
Mostly I was amazed at how much higher the framerate is on the 8800GT. The same video card went from ~15 FPS with medium settings on Northwood to ~60 FPS on high settings on Core2.
Re: (Score:2)
Frankly, I'd rather slap on a separate GFX card altogether, than waste transistors in my main processor for physics and pixel processing.
Or in a desktop PC, you could have a GMA running one monitor and a GeForce running the other.
Re: (Score:2)
Re: (Score:2)
If your sound card blows, you could replace it. If your mainboard blows, you could replace it. Why replace everything when one goes kaput?
Just as discrete CDROM controllers (ISA) went the way of the dodo, just like IDE controller boards did the same, just like the network cards got integrated...
Re: (Score:2)
Damnit, I just upgraded my old Athlon 64 3500+ with a nice new Core i5 750 as well. £320 for processor, mobo and 4GB memory. Good job I was hoping for it to last for a while, cos it sure looks like Intel don't want me to just upgrade my processor when it gets to be lacking.