23 Comments
Silver Bullet 126 - Tuesday, September 11, 2012 - link
Hopefully we'll see this CPU in tablets soon after Ivy Bridge... maybe even skip IVB and put this in? SC2 on a tablet :)

Kevin G - Tuesday, September 11, 2012 - link
That's pretty much the plan. There is a limited run of 10W Ivy Bridge parts due by the end of this year for tablets, so don't expect any high-volume product using them. Rather, they're mainly for prototyping 10W Haswell products for 2013/early 2014. Still gotta wait.

Though if you think about it, the real tablet chip isn't going to be Haswell but the 14 nm follow-up, Rockwell. Power consumption will go down slightly, but the main benefit will be more IO integration in the SoC. Who needs a chipset?
jaydee - Tuesday, September 11, 2012 - link
Has anyone isolated CPU idle power from the rest of the system? The Core i7-3770K ran at 75W idle (whole system) according to the AT review of Ivy Bridge; how much of that do we suppose was the processor?

Khato - Tuesday, September 11, 2012 - link
The real question is what exactly Intel means by "20x lower platform idle power". Does the 'platform' simply include processor and chipset? The more it includes, the more impressive the number becomes.

A5 - Tuesday, September 11, 2012 - link
It wouldn't shock me if a lot of those gains are in the GPU portion of the chip. When you combine the effort they've put into the GPU + a mature 22nm process, a 20x reduction in idle power for the die seems plausible.

A5 - Tuesday, September 11, 2012 - link
To answer the actual question, I doubt that number is taking anything into account that is off of the CPU package. Total system idle would probably drop by 5W or so.

tuxRoller - Tuesday, September 11, 2012 - link
If that number is for a laptop, that would be amazing.

yankeeDDL - Wednesday, September 12, 2012 - link
I am afraid that this number is mostly marketing smoke. Of course, a large reduction in power consumption is always welcome, and it is certainly an engineering achievement to get a 20x reduction on a platform already as efficient as Sandy/Ivy Bridge.

However, lowering the power consumption from 1W to 0.05W will likely have very limited tangible effect, if any, given that other components will easily dominate the consumption at that point.
And yes, this is an indication of the CPU+GPU consumption: it says little about the consumption at the system level.
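A quick back-of-the-envelope check of that point; the idle budgets below are assumed, illustrative numbers, not figures from Intel:

```c
/* Sketch: why a 20x CPU idle reduction barely moves total system idle.
 * All wattages below are assumed, illustrative numbers. */
#include <stdio.h>

int main(void) {
    double rest_of_system  = 29.0;  /* board, RAM, drives, PSU losses... */
    double cpu_idle_before = 1.0;   /* assumed CPU package idle */
    double cpu_idle_after  = cpu_idle_before / 20.0;  /* the claimed 20x */

    double before = rest_of_system + cpu_idle_before; /* 30.00 W */
    double after  = rest_of_system + cpu_idle_after;  /* 29.05 W */
    printf("system idle: %.2f W -> %.2f W (%.1f%% saved)\n",
           before, after, 100.0 * (before - after) / before);
    return 0;
}
```

With those assumptions the system-level saving is about 3% — exactly the "other components dominate" effect described above.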
Metaluna - Tuesday, September 11, 2012 - link
From what I've read, idle power for the last couple of generations has been pretty low, like 30W or less. Hopefully Intel has learned from its experience with early Atoms that CPU idle power doesn't mean much when the northbridge and other components are still pigs. As idle power drops towards zero, these other components will become even more important.

Darkstone - Tuesday, September 11, 2012 - link
'Pretty low', 'like 30W or less'? You're pretty far off. Take a look at this Dutch blog:
http://ssj3gohan.tweakblogs.net/blog/8217/fluffy2-...
Ivy Bridge desktop = 1.56W
Arrandale desktop = <3W
The Arrandale version is also available in English:
http://ssj3gohan.tweakblogs.net/blog/6112/85w-core...
If you want a power-efficient computer, buy a motherboard with fewer features or remove your graphics card. The processor doesn't really matter.
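Incidentally, for anyone wanting to isolate CPU package power on their own machine (jaydee's question above), one option on a reasonably recent Linux kernel is Intel's RAPL energy counters. A minimal sketch, assuming the intel_rapl powercap driver is loaded and the sysfs path below exists on your system:

```c
/* Minimal sketch: derive CPU package idle power from RAPL energy counters.
 * Assumes Linux with the intel_rapl powercap driver (Sandy Bridge or
 * later); the sysfs path may differ on some systems. */
#include <stdio.h>
#include <unistd.h>

static long long read_energy_uj(void) {
    long long uj = -1;
    FILE *f = fopen("/sys/class/powercap/intel-rapl:0/energy_uj", "r");
    if (f) { fscanf(f, "%lld", &uj); fclose(f); }
    return uj;  /* cumulative package energy in microjoules */
}

int main(void) {
    long long e0 = read_energy_uj();
    sleep(10);  /* leave the system idle for the measurement window */
    long long e1 = read_energy_uj();
    if (e0 >= 0 && e1 >= 0)
        printf("package idle power: %.2f W\n", (e1 - e0) / 10.0 / 1e6);
    return 0;
}
```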
csroc - Tuesday, September 11, 2012 - link
Hmm, it's coming up on the five-year mark, time for me to build a new workstation. I'm not in any rush, and every time I read news I flip-flop on whether I should just do it this winter or wait for Haswell. There's always the next chip to wait for, but my plan was to have a new system built by next summer-ish regardless, and it wouldn't happen any sooner than towards the end of this one.

Increased GPU performance isn't too important to me. Any gaming I do would still run off a dedicated GPU.

Lower power consumption, particularly at idle, is nice, but if it's just the CPU, as others are already wondering, how much of a dent is that really going to make? I leave my system on all the time; every watt is nice, but the rest may already be hungrier than Ivy Bridge at idle.

Decisions, decisions.
csroc - Tuesday, September 11, 2012 - link
"wouldn't happen any sooner than towards the end of this one."Changed what I was saying but didn't fix that. Supposed to say "towards the end of this year"
Magichands8 - Tuesday, September 11, 2012 - link
Well, if you don't care much for graphics performance and you are looking for a workstation, then I'm not sure Haswell has anything interesting to offer, since graphics and low power consumption are what Haswell is all about. I was thinking of waiting until 2013 for Intel to release performance six-core Haswell parts, but from what I've read that just isn't going to happen. The kiddies have to be able to watch 4K video on their postage-stamp smartphone screens, and that seems to be what Intel is most concerned with these days. Haswell could be really interesting for long-battery-life productivity work on a laptop, though. If I were building a system today for performance I'd be looking at maybe multiple AMD Opterons. From what I've read, Intel is refusing to even comment on Haswell's CPU performance. To me that's a bad sign and implies that it's not so hot. I wouldn't be surprised if it were just a couple of percent better than Ivy Bridge.

csroc - Tuesday, September 11, 2012 - link
That's the overall impression I've been getting as well. Most of the information I've been reading hasn't really warmed me up on Haswell. Since I commented I have been rethinking things, and on the whole I'm not that convinced Haswell would be too important. It could be great for mobile, but on a desktop it just doesn't sound like a big step.

Wonder if there will be an Ivy Bridge-E.
azazel1024 - Tuesday, September 11, 2012 - link
It may also include the chipset. I don't recall what Cougar Point was running, but I think it was 65nm. It certainly is not even 32nm, and I know Panther Point is the same process node as Cougar Point, so it is possible that Shark Bay (that is Haswell's new chipset, right?) might actually be on 22nm and jump a couple of process nodes. Chipsets usually seem to lag 1-2 node generations behind the processors.

So that could be where the bulk of the power savings is, though I am sure there is a lot of processor power savings as well.
For a desktop, very little impact. In a notebook that is in S1 (i.e., possibly idle, but not asleep), other motherboard stuff, drive idle/sleep power consumption, WiFi adapter draw, display power, DRAM, etc. are already a fair amount larger than the chipset/CPU in aggregate at idle. Now, sure, that might still get you an extra 10-15% battery life. What it will do is provide a good boost in tablets, where there is less stuff connected already and likely a smaller/lower-power display. It also gives you a huge boost for this "connected" sleep mode where most everything else is powered down. In that case, compared to trying to do it with a modern processor, it could be the difference between a 2-3W "connected sleep" power draw and a couple of hundred milliwatts (maybe 200-400mW) — or 20-30 hours versus a week. This is of course compared to current-day S3, which probably draws around 150-300mW to keep the RAM in low-power refresh mode and basically everything else off.
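Those figures check out on a quick calculation; the 50Wh battery here is an assumed, illustrative capacity:

```c
/* Sanity check of the connected-standby estimates above.
 * The 50Wh battery capacity is an assumed figure. */
#include <stdio.h>

int main(void) {
    double battery_wh = 50.0;   /* assumed notebook/tablet battery */
    double modern_w   = 2.5;    /* midpoint of the 2-3W estimate */
    double haswell_w  = 0.3;    /* midpoint of the 200-400mW estimate */

    printf("connected sleep, current CPU: %.0f hours\n",
           battery_wh / modern_w);                  /* ~20 hours */
    printf("connected sleep, Haswell:     %.0f hours (~%.0f days)\n",
           battery_wh / haswell_w,
           battery_wh / haswell_w / 24.0);          /* ~167 hours, ~7 days */
    return 0;
}
```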
Frankly, Haswell is sounding pretty impressive, both for x86 tablets and for notebooks. I am sure it won't hurt for desktops either, but in that kind of power envelope the improved GPU is superfluous for most users (though it might just make a mean HTPC for me), and most don't care about the lower power draw.
dishayu - Tuesday, September 11, 2012 - link
All thanks to the on-die VR and little tweaks here and there?

Wolfpup - Tuesday, September 11, 2012 - link
Okay, so the power thing is cool, really, even for desktops. The video is cool maybe for tablets... maybe.

But for desktops and notebooks it continues to piss me off that they're blowing hundreds of millions of transistors on their stupid video that could be used towards a more powerful CPU... or heck, I'd sooner they just chopped that part off and took the profit!

Ugh, what AMD is doing is actually okayish at the low end, but for a mid-range system? There's just no excuse for this integrated stuff...
Magichands8 - Tuesday, September 11, 2012 - link
Agreed, it's very disappointing to see Intel essentially halt any development on the performance side. There's no way I'd build a system today without a standalone GPU, and as it stands I can't even think about building a system with Ivy Bridge without feeling like I'm being cheated by being forced to pay for the useless half of the chip.

Although honestly, I don't even totally blame Intel for all of that. I understand that Intel has to adapt to the market and that they are being forced to compete with ARM on the mobile end whether they like it or not. What I don't quite understand is why the market is moving in that direction. Heavy-duty graphics power makes sense to me for large-screen displays for watching HD content at home or multi-LCD gaming setups, but doesn't make any sense to me at all for mobile devices. Honestly, I don't even own a smartphone or tablet, but aren't current chips already powerful enough to handle what the average user wants to do with them? How much smoother can Jelly Bean get? Where exactly does Intel think it's going to go with its processors after they build one that can run Battlefield 6 and Windows 9 Ultra Super Duper Edition on my clamshell cell phone?
DanNeely - Tuesday, September 11, 2012 - link
While they're hyping the IGP and low power since geeks with desktops and discrete GPUs are a shrinking minority of the market, take a look at slides 11-13: they've doubled theoretical FP throughput with new AVX2 instructions (will need app recompiles at a minimum to benefit), doubled cache bandwidth, along with misc improvements to branch prediction, a larger OOO buffer, and reduced latencies for virtualization.
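For context, this is the kind of inner loop those AVX2/FMA units are aimed at. A minimal sketch — the function name and build flags are illustrative (compile with something like gcc -O2 -mavx2 -mfma):

```c
/* y[i] += a * x[i] over n floats, 8 lanes per iteration.
 * Assumes n is a multiple of 8; FMA does the multiply and the add
 * in one instruction on Haswell, hence the doubled FP throughput. */
#include <immintrin.h>

void saxpy_avx2(float *y, const float *x, float a, int n) {
    __m256 va = _mm256_set1_ps(a);            /* broadcast a to 8 lanes */
    for (int i = 0; i < n; i += 8) {
        __m256 vx = _mm256_loadu_ps(x + i);
        __m256 vy = _mm256_loadu_ps(y + i);
        vy = _mm256_fmadd_ps(va, vx, vy);     /* vy = va*vx + vy */
        _mm256_storeu_ps(y + i, vy);
    }
}
```

Code built without those flags won't emit FMA at all, which is the "recompile to benefit" point.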
aicom - Wednesday, September 12, 2012 - link

Let's put this in perspective. Haswell DOES increase CPU performance per clock. They've doubled the L2 bandwidth by allowing L2 loads once per clock instead of once per two clocks. They've even added a whole new ALU pipeline! The Intel Core microarchitectures have been 3-issue on the ALU side since the original Core 2 Duo. With Haswell, Intel will be able to dispatch 4 ALU operations per clock and 8 total micro-ops (compared to 3 and 6 in Ivy Bridge). This is certainly a major improvement compared to the normal "make caches bigger" strategy.
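A small illustration of why that fourth ALU port can matter; a sketch, not a benchmark:

```c
/* Four independent accumulators form four independent dependency
 * chains, so a wider ALU can in principle retire these adds in
 * parallel instead of serializing on one running sum. */
long sum4(const long *a, int n) {   /* assumes n is a multiple of 4 */
    long s0 = 0, s1 = 0, s2 = 0, s3 = 0;
    for (int i = 0; i < n; i += 4) {
        s0 += a[i];
        s1 += a[i + 1];
        s2 += a[i + 2];
        s3 += a[i + 3];
    }
    return s0 + s1 + s2 + s3;
}
```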
lilmoe - Tuesday, September 11, 2012 - link

Why should they? AMD failed to push Intel forward in the performance race... they won't even be trying anymore. Blame them, not Intel. Why should Intel focus more on something they've already won by a large margin?

Cortex-A15 and 64-bit ARMv8 are around the corner; they're a huge threat for Intel in the mobile space. Heck, they're literally "begging" carriers to launch Medfield Android smartphones, and trying to push x86 everywhere. Intel was taken by surprise in the mobile space just like Microsoft was. Haswell makes sense bigtime, and NOT focusing on performance, but on power efficiency, makes even MORE sense.

I just don't see how anyone is expecting Intel NOT to focus on those sides in their next platform... But to ease your anger, Intel has designed Haswell to be modular and scalable as hell. It can go from ultra-low-power mobile to ultra-high performance. If AMD (somehow) manages to raise the bar in performance, Intel will be ready to beat whatever AMD has to offer by another wide(r) margin. Nothing is motivating Intel in the desktop space ATM.
tygrus - Tuesday, September 11, 2012 - link
They have doubled the AVX FP, increased ALU, branch, load/store, TLB, cache bandwidth, etc., and it's 8 ops/cycle vs 6 ops/cycle.

Clock-for-clock it looks like Haswell could do a minimum of 30% more per cycle: 20% to 80% higher benchmark scores, 30% better for the average user (assuming a 10% average clock speed reduction). Judging by the power budget they might have to reduce the normal clock speed. They will increase the turbo boost to be similar to the current max boost (3.8 to 4GHz), but the nominal clock will be 3.1GHz instead of 3.4GHz. They will have to re-layout the execution units to fit additional resources and to spread the heat from critical areas. Additional metal layers will be added for routing signals and power. More local power-gated regions will reduce local and global power usage and allow other regions to run hotter than before. There's a big difference in power consumption between running 2 simple threads doing 2 IPC each and 2 higher-IPC threads doing 4 IPC each. Expect at least one temperature sensor for each core, GPU block, MC, etc., and fast throttling to stop an area of <5mm^2 of silicon from going beyond its design limit even though the CPU case temp and TDP are below max.
Pixelpusher6 - Wednesday, September 12, 2012 - link
I'm not surprised that Intel isn't really focusing on performance with Haswell, considering their push to mobile and no threatening competition from AMD. It seems like Intel is just playing it safe and increasing performance just enough for it to be considered an upgrade over Ivy Bridge. Looking at the slides, it seems like all they are doing with Haswell is exactly what they did from Nehalem -> Sandy Bridge: doubling up and adding some new features / instructions. The performance difference between Nehalem and Sandy Bridge wasn't that huge, and some of it had to do with the process shrink, so I'm expecting the same again. It's hard to say right now without benchmarks, but it doesn't seem like a very compelling upgrade from my i5-2500K.

I'm not sure I agree with this new philosophy that chip makers seem to be embracing of one chip to cover all the market segments, i.e. mobile, tablet, notebook, desktop, server, etc. I do understand that there are huge cost savings to be had by adopting this model, but I think it is a tradeoff. The processing needs of enterprise are much different than some teenager checking Facebook. I think these different market segments developed for a reason, and to lump all their needs together as "one size fits all" might be a little short-sighted. If AMD management were smart they would see Intel's focus on mobile as an opportunity; unfortunately AMD has the mobile bug too, and it seems like they want to undercut Intel on sub-$300 laptops. I'm not entirely sold that mobile is the future, and like someone else mentioned, Intel is already late to the game. They don't really have a product competitive with ARM's offerings, so they've sort of slotted into a niche of powerful tablets / ultrabooks in the mobile space for now.

Very soon ARM and Intel will cross paths as ARM strives to increase performance and Intel strives for high power efficiency. With Intel so focused on competing with ARM in the mobile space, it seems like they are not as concerned with advancing the high end. This is where AMD might be able to catch them off guard. Like I mentioned before, I'm not completely sold on the idea that mobile is the future of computing. It's entirely possible that mobile hardware is just a stop-gap until everything moves server-side. There are still some kinks to be ironed out before tablets, smartphones, etc. can be replaced by thin clients, but it can't be ruled out as a possibility. If this is the direction the market goes, then enterprise will take center stage. If I were AMD I would be throwing R&D into designing a great server processor first and foremost, and then fill out the rest of their product line based on that. I'm not sure the Bulldozer / Piledriver architecture is gonna get them there; if they can't fix the problems with it they should just cut their losses and start over, or even have divergent products for different markets.