I suspect that the big loss in performance at high detail settings compared to the 750M may be related to the L4 eDRAM running short rather than a driver issue. As for AA, Intel has never had good performance with filters; do they even support hardware 2x AA yet?
Yeah, doesn't AA hammer bandwidth? The eDRAM helps performance, but its bandwidth is still quite low compared to what the other cards are paired with, even in best-case scenarios.
I don't think it's just that. Compared to the competition like Trinity's iGPU and the GT 650M, the texture fillrate is rather low. That impacts performance not only in texture-bound scenarios with settings cranked up, but in anti-aliasing as well. The fillrate of the top-of-the-line Iris Pro 5200 is about equal to Trinity, while the version in the iMac falls short. The GT 650M is 40% better than the top-of-the-line Iris Pro and over 55% better than the iMac version.
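Back-of-the-envelope, those gaps line up with simple texel-rate math; the sampler counts and clocks below are my own assumptions rather than figures from the review, so treat this as a sketch:

```sh
# Texture fillrate ~= texels/clock * GPU clock (all inputs assumed, not measured)
awk 'BEGIN {
  trinity = 24 * 0.80   # HD 7660D: 24 TMUs @ ~800 MHz
  iris    = 16 * 1.30   # Iris Pro 5200 (i7-4770R): ~16 texels/clk @ ~1.3 GHz peak
  imac    = 16 * 1.15   # i5-4570R in this iMac: lower peak GPU clock
  gt650m  = 32 * 0.90   # GT 650M: 32 TMUs @ ~900 MHz
  printf "GTexel/s: Trinity %.1f, Iris Pro %.1f, iMac %.1f, 650M %.1f\n",
         trinity, iris, imac, gt650m
  printf "650M vs. top Iris Pro: +%.0f%%, vs. the iMac part: +%.0f%%\n",
         (gt650m / iris - 1) * 100, (gt650m / imac - 1) * 100
}'
```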
Intel's AA implementation also leaves something to be desired. Hopefully Broadwell improves on this.
This seems counterintuitive. It's acting as a CPU+GPU shared cache, correct? Intel architectures are relatively cache-bandwidth starved, and you'd think that 128MB of L4 would help keep the lower levels filled.
Or that the working set of most benchmarks (if not most apps) is captured with a 4 or 6MB cache? Caching's basically irrelevant for data that is streamed through.
Before that happens Apple will likely need to get their stand-alone Apple display "high-res". I don't expect it to go 2x like every other one of their displays; instead I suspect it will be 4K, which is exactly 1.5x the resolution of the current 27" display. Note that Apple mentioned 4K many times when previewing the Mac Pro.
Also, the most common size for quality 4K panels appears to be 31.5", so I wouldn't be surprised to see it move to that size. When the iMacs get updated I think each would then most likely use a slightly larger display panel as well.
Why does Apple have to go to exactly 4K? We all understand the point, and the value, of going to 2x resolution. The only value in going to exactly 4K is cheaper screens (but cheaper screens mean crappy-looking screens, so Apple doesn't care).
Apple is pretty locked in to the current screen sizes and 16:9 aspect ratio by the ID, and I can only imagine they will stick with the status quo for at least one more generation in order to recoup some of their obviously considerable design costs there.
Since Apple sells at best a couple million iMacs of each form factor in a year’s time, they kinda have to source panels for which there are other interested customers—we’re not even close to iPhone or iPad numbers here. Thus I’d reckon we’ll see whatever panels they intend to use in future generations in the wild before those updates happen. As solipsism points out, the speculation that there will be a new ATD with a 31.5”, 3840x2160 panel released alongside the new Mac Pro makes total sense because other vendors are already shipping similar displays.
I listed the size and resolution of previous LCD iMacs, as well as possible higher resolutions at 21.5” and 27”. Configurations that truly qualify as "Retina" are highlighted in green, and it looks as though pixel doubling will be Apple’s strategy when they make that move. I also highlighted configurations that require two DP 1.1a links or a DP 1.2 link in yellow, and those that demand four DP 1.1a links or two DP 1.2 links in red, for both standard CVT and CVT with reduced blanking. Considering Apple has yet to ship any display that requires more than a single DP 1.1a link, the fact that all of the Retina options at 27" are in the red is probably reason enough that such a device doesn't exist yet.
I also included the ASUS PQ321Q 31.5" 3840x2160 display, and the Retina MacBook Pros as points of comparison to illustrate the pricing issues that Retina iMacs would face. While there are affordable GPU options that could drive these displays and still maintain a reasonable degree of UI smoothness, the panels themselves either don't exist or would be prohibitively expensive for an iMac.
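To make the link math concrete, here's a rough sketch assuming 24 bpp and approximate CVT-RB blanking (the 160/60-pixel blanking figures are simplifications, so the numbers are ballpark only):

```sh
# Usable pixel bandwidth: 4-lane DP 1.1a (HBR) ~8.64 Gbps, DP 1.2 (HBR2) ~17.28 Gbps
awk 'BEGIN {
  bpp = 24; dp11a = 8.64; dp12 = 17.28
  n = split("2560 1440  3840 2160  5120 2880", m)
  for (i = 1; i <= n; i += 2) {
    w = m[i]; h = m[i+1]
    gbps = (w + 160) * (h + 60) * 60 * bpp / 1e9
    printf "%dx%d@60: %5.1f Gbps -> %.1fx DP 1.1a, %.1fx DP 1.2\n",
           w, h, gbps, gbps / dp11a, gbps / dp12
  }
}'
```

Anything over 1.0x in the DP 1.2 column is what lands a configuration in the red.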
Or what your chart tells us is that these devices will be early adopters of the mythical (but on its way) DisplayPort 1.3? Isn't it obvious that part of the slew of technologies to arrive when 4K hits the mainstream (as opposed to its current "we expect you to pay handsomely for something that is painful to use" phase) will be an updated DisplayPort spec?
Unlike HDMI 1.4, DisplayPort 1.2 can handle 4K just fine. I'd imagine DP 1.3 should take us to 8K.
What baffles me is that every Mac Apple has shipped thus far with Thunderbolt and either a discrete GPU or Haswell has been DP 1.2 capable, but the ports are limited to DP 1.1a by the Thunderbolt controller. So even though Intel is supposedly shipping Redwood Ridge which has a DP 1.2 redriver, and Falcon Ridge which fully supports DP 1.2, we seem to be getting three generations of Macs where only the Mac Pros can actually output a DP 1.2 signal.
Furthermore, I don't know of any panels out there that actually support eDP HBR2 signaling (introduced in the eDP 1.2 specification in May 2010, clarified in eDP 1.3 in February 2011, and still going strong in eDP 1.4 as of January this year). The current crop of 4K displays appear to be driven by converting a DisplayPort 1.2 HBR2 signal that uses MST to treat the display as 2 separate regions into a ridiculously wide 8 channel LVDS signal. Basically, for now, driving a display at more than 2880x1800 seems to require multiple outputs from the GPU.
And to answer your question about why 4K, the problem is really more to do with creating a panel that has a pixel pitch somewhere in the no man's land between 150 and 190 PPI. Apple does a lot of work to make scaling decent even with straight-up pixel doubling, but the in-between pixel densities would be really tricky, and probably not huge sellers in the Windows market. Apple needs help with volume in this case; they can't go it alone and expect anything short of ludicrously expensive panels.
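The pixel-pitch point is easy to check; this is just geometry for the panel sizes that have come up in this thread:

```sh
# PPI = sqrt(width^2 + height^2) / diagonal in inches
awk 'BEGIN {
  printf "21.5in 1920x1080: %.0f PPI\n", sqrt(1920^2 + 1080^2) / 21.5
  printf "27in   2560x1440: %.0f PPI\n", sqrt(2560^2 + 1440^2) / 27
  printf "27in   3840x2160: %.0f PPI  (the 150-190 PPI no mans land)\n", sqrt(3840^2 + 2160^2) / 27
  printf "27in   5120x2880: %.0f PPI  (a true 2x Retina 27in panel)\n", sqrt(5120^2 + 2880^2) / 27
}'
```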
My bad. I had in mind the fancier forms of 4K like 10 bits (just possible) and 12 bit (not possible) at 60Hz, or 8bit at 120Hz; not your basic 8 bits at 60Hz. I should have filled in my reasoning.
Interesting to see where Apple wants their starting point to be.
Obviously they don't want that much choice in their iMacs; still waiting to see what they do with their Mac Minis too.
Can't help but wonder if you can actually separate the glass and LCD though; has anybody tried? What technique and adhesive do LG/Apple use here? Has anybody tried to run another LCD panel from the newer iMacs (2012/13)? Older panels might actually be drivable, though it would be a pain to rebuild a new iMac for them.
Everything I have read says it's one integrated unit with the glass and LCD together. I am also interested and will keep looking, but it does not look possible so far.
It's sold as one unit, but obviously it's bonded with an adhesive, and depending on how, it might not be possible to separate the two. The new machine should maybe be able to drive older iMac displays, though. But those are a few mm thicker.
As it's gapless, the adhesive covers the whole panel, which probably ruins it anyway, or at least makes it hard to clean since you can't really use (strong) solvents on the LCD. But it's fully doable on phones, which also bond their screens with adhesive. Would be interesting to know if that kind of adhesive can be loosened by heat, though. Plus it would probably be harder than with a small screen here.
"I’ve heard some developers complain about this in the past, partly blaming it on a lack of lower level API access as OS X doesn’t support DirectX and must use OpenGL instead."
Wait, doesn't OpenGL get you *closer* to the hardware than DirectX??
There is also the issue that the driver sits farther up the stack than it does in Windows. Mach being a hybrid microkernel and all. I'm sure they can close the gap with a lot of work, but it doesn't erase the fact that Mac OS's basic structure is not designed to provide direct hardware access.
"Storage & Fusion Drive – By default all of the iMacs come with a 3.5” mechanical hard drive." The physical harddrive size is differentiated between the 21.5" and the 27". I suspect that you already knew this but it slipped.
My opinion: Many games look pretty good at medium and it is not clear that the compromise in frame rate is worth it to push for higher settings vs running at medium with higher resolution.
Regarding the Iris Pro gaming results - the jump from 13x7 medium (where Iris seems to have plenty of power to spare) up to 16x9 high/very high - this is a change of two variables: both resolution and settings.
Can you rerun some of the gaming tests at 16x9 medium (or maybe native panel resolution - 19x10 medium where 13x7 results were close to 60fps) to see what that shows?
One other question: is there any ability to OC the IGP in the BIOS? Typically Intel GPUs have a fair bit of OC headroom and it would be interesting to see: 1) what kind of performance/playability can be extracted, and 2) to what extent the IGP clock frequency is the limiter on Iris Pro vs. CPU, memory bandwidth, etc.
It is crazy how cool these things run. I have a late 2012 27" i7 iMac and there is no part of it that is close to hot. The fan dumps hot air through a single hole which is very cleverly hidden below the hinge of the base so you don't even feel that unless you really look for it. Even when I've run the thing at max performance, having eight Mathematica kernels running in parallel for fifteen min or so, there's no obvious heating and no obvious fan noise.
The difference compared with even a model as recent as the 2007 iMac (my last one) is night and day. Part of it is the much more efficient CPUs; part is also the much more efficient display.
All this points to the possibility of a 15" Retina MacBook Pro using only integrated graphics, which means comparable performance with even longer battery life.
Probably only the low-end rMBP will do that this year, but next year, if the 6200 or whatever the Broadwell graphics end up being called improves enough, that could very likely happen.
I hope they go the other way with it: GT3e for the 13" duo (especially for the Retina, that would be a boon), and keep higher-end discrete graphics in the larger models.
You forgot to mention that the Fusion Drive is not supported by Windows (if you are using Boot Camp): "If you partition the drive, the new partition will not be part of the logical-volume group that Fusion Drive uses, so it will not benefit from the speed of the SSD" http://reviews.cnet.com/8301-13727_7-57549766-263/...
Man, that Iris looks a lot worse in reality than I expected it to be. You really have to crank down the resolution and details to get some decent gaming performance. Then again, this is the base iMac; it's got compromise written all over it.
Looks like my 2010 Mac Pro with the 3.33GHz hex-core is still going strong. Finally going to add an SSD to it, and now I'm just waiting to see what AMD's R9 series GPUs are like.
It is impressive to see how well these iMacs are doing these days though. :)
The difference from the 55W MacBook Iris Pro is so big in some tests that a 10% lower GPU frequency can't explain it. Anand once again failed to give us readers proper system info. You have to learn that "8GB DDR3-1600" is not enough, because it can be 2x4GB in dual-channel or 1x8GB in single-channel.
Typo in the last paragraph of page 4: 'Doing *to* brings the price of the entry level 21.5-inch iMac up to $1499...'
Interesting to see another look at Iris Pro. The current generation continues to leave me a bit disappointed; I had such high hopes for it. Here's hoping that Intel makes some significant strides in the next generation (i.e. significantly more than a 10-15% improvement).
I suspect Broadwell will improve things once again, but Intel seems to be consistently one generation behind what we actually want for that generation.
As soon as Broadwell comes out I'm sure we'll all be on the "wait for the actual new architecture" boat, such is technology :P
But if Broadwell packs twice the EUs and the eDRAM bandwidth to feed it, that would be quite nice on the GPU side. I just hope they can improve the CPU side more.
I'm again baffled by this. Who in their right mind would pay $1300 for something with a low-end i5 and an iGPU? Those specs belong in a $500 laptop. I really can't wrap my head around why anyone would do this. Think of all the hardware you could get for that money! You'd be looking at an i5-4670K + GTX 770.
The problem is that you are looking at this from a purely technical, spec-based point of view.
It is true that you can build a more powerful computer for less, some may argue far less. Still, these calculations usually fail to take into account the design of the chassis, the build/material quality, and the value of the overall design of the iMac/Mac Pro/whatever other Apple product you can think of.
One must consider that the average person does not wish to build their own computer and is not interested in a specification check list.
Most people want a fast, reliable and user friendly computer and are willing to pay a fair price for this experience. The entry level 21" iMac is a perfect fit for the large majority of the market.
Yes, you need to look at it from an iDouche point of view.
Besides, you can't put an intrinsic value on having an iFaeces Pro on your desk so you can look cool and smug, and feel good when you see all the Apple commercials on TV.
I guess if you think people who spend $100 on jeans are "idiots" or "ignorant", then you are correct. But that doesn't mean designer jeans manufacturers stop making them. In fact, they are highly profitable.
They are also not idiots.
The #1 reason people buy designer jeans is because THEY LOOK BETTER. They all do the same thing .... cover your crotch. But there's value in style .... if you don't care about it, that doesn't mean other people who value it are "ignorant" or "idiots".
That's a lot of idiots running around the earth ..... have you thought that maybe it's the other way around?
I guess I'm too rational to prioritize computer design above price/performance. If I want a PC for general office tasks, spending more than €400 seems like a waste of money. If I were to spend 1.3K on a PC, I'd prefer it to be good at its purpose. I don't look at the PC, but at what it displays on the screen.
They have become pretty inexpensive nowadays. A 21" Dell UltraSharp IPS screen is only 40-50 USD more expensive than the cheapest TN panel. Still, I don't think people are being fair when complaining about the cost of the iMac. It's quiet, it looks good, and it holds great resale value. A lot of people don't think twice about dropping $500 a month on their car, so why not spend a few extra quid on a computer? Back in the day a computer was expensive; now it's so cheap it's almost silly.
Not that I'd buy a base model 21.5-inch iMac, but I think most consumers couldn't care less about their CPU and GPU specs and are looking more at the overall package. For most office desk jockeys, school computer labs, customer use / internet kiosks, parents, etc., the iMac probably makes more sense than a custom-built overclocked gaming rig.
It also depends on how much you value what you get in said package. The i5-4570R is essentially an i5-4430S with Iris Pro 5200, so it's likely a $240-270 part. Then there's the rather well calibrated 1920x1080 IPS panel which would run another $150 or so. The keyboard and mouse that are included have a combined retail price of $138. Then there's the 3x3:3 802.11ac Wi-Fi plus Bluetooth 4.0, Broadcom GbE NIC and SDXC UHS-I card reader, Thunderbolt, 720p camera, dual microphones, built-in amp and stereo speakers, plus whatever value you put on OS X and the bundled iLife suite.
Someone who wants everything built in and doesn't want it to sound or look like a piece of crap. Add up everything that is included and it's not that expensive, especially when you consider the CPU is quad-core (almost all laptop i5s are dual-core), plus 802.11ac, dual mics, webcam, speakers, Bluetooth, SDXC, Thunderbolt, a wireless keyboard and trackpad, and a calibrated IPS monitor; and the best part is no noise and only one cable, with no power brick. Try to do that with any DIY build and tell me how much it is.
IPS screens aren't cheap, nor are AIO designs. Dell, HP, and Lenovo can't get lower prices when they try to compete in specs or display quality either.
A $500 laptop isn't going to have a good display, keyboard, or trackpad either, and resale value is going to be nothing. Comparing a trash laptop to something with much higher quality components that holds value over time doesn't make sense.
This argument is getting old and inaccurate. People don't always want laptops. Additionally, bottom of the barrel hardware has a very short life span. Sometimes, it is nice to have a well integrated AIO without cables running everywhere. Sure, not everyone wants a computer that will last 3+ years and that is perfectly fine.
But, just for fun, here's some pricing for NewEgg parts I believe would be of similar quality to an iMac. Yes, you can find cheaper cases/PSUs/etc., but an iMac isn't built using the cheapest components. Likewise, feel free to buy a Dell! But to call people who buy an iMac ignorant/out of their minds/etc. just shows a lack of insight into the situation and market pricing.
250 i5 CPU w/ Iris Pro (good luck finding it! NewEgg's uses HD 4600)
150 mobo
75 case
75 PSU
235 21.5" IPS LCD
65 HDD
50 webcam w/ mic
75 RAM
140 Windows
65 wireless kb/mouse
35 speakers
?? shipping/tax
?? time to build
----------------
1215 Total + extra
What you did there is play a game designed to make Apple look good. I made a similar list with parts from Newegg. It was only $1150 but included a Shuttle case, a 27" IPS display from Dell, a 240GB Intel SSD and 16GB of RAM. And Windows 8 Retail, obviously. A similar config but with a 21" display from Apple was $1870. The 27" iMac with a similar configuration was almost $2400, i.e. twice as expensive as the PC.
Are you really equating an inflated price with higher quality parts? Quick, tell me the difference between Kingston and Samsung RAM! How do you know his PSU isn't a Seasonic? What exactly is the brand of HDD Apple uses? I hear this all the time about quality parts when everything Apple uses is Samsung or Foxconn. Yes, Foxconn... the epitome of quality OEMs!
You are adding a lot of money for having an AIO. He is using a Dell 27" IPS; that should clue you in to the fact that it is an UltraSharp, which is calibrated. He's also using an SSD, so again, tell me how a hybrid drive beats out an SSD.
Lastly, are you 12 or something? Your name-calling is immature.
I would (if I didn't already own a 2011 27" iMac), which I found to be worth every cent.
It makes perfect financial sense when you realise that you are not so much buying a computer, as you are getting an integrated solution that's ready to use right out of the box. I find that Macs come with excellent functionality without me having to spend much time setting it up.
For starters, it comes preloaded with the excellent iLife suite, a great PDF annotation reader (Preview), a stable OS that continues to run smoothly two years on without me needing to do anything to maintain it, entry to the Apple ecosystem, and access to fairly inexpensive Mac software.
Then, there is the good aftercare service. Once, my iMac developed screen issues. I made a call to the service centre. 20 minutes later, an appointment was made. They came down to my house the next day, brought it back for servicing, and delivered it back to my house 2 days later. Problem solved with minimal effort or stress on my part.
So in the end, the extra that I am paying is well worth it for the promise of a seamless and hassle-free computing experience. Which I value more than if I were to simply build a desktop using parts sourced individually (and then having to deal with all the troubleshooting on my own subsequently).
Mac OS X is the main reason I buy macs. I have a feeling that you are right about most mac users having no clue what they're buying, but I could go on for hours about the advantages of OS X (geek-wise). No, hackintoshes are not an option if you want any kind of reliability, so don't even go there. If you force me to, I will explain my reasoning in depth to practically anandtech-levels, but I'm not in the mood right now. Perhaps another time! :)
Have they announced any dual core Iris Pro parts? I know the original SKU list just had them in the quads. I still assume the 13" rMBP will get the 28W HD 5100 (hopefully in base configuration, but there will probably be a lower spec i5 below that).
It shouldn't have to be dual-core to be in the 13" Pros, though. Intel has quad-core parts at the same TDP as the duals currently in it. A quad core with GT3e would make it an extremely tempting package for me.
I think anything but GT3e in the 13" MacBook Pro is going to be a disappointment at this point. How their 13" "pro" machine has gone this long with sub-par integrated graphics is mind-boggling. The move to a Retina display really emphasized the weakness.
I'd also venture a guess that cost is the real barrier, as opposed to TDP.
Perhaps. Yeah, the 13" has been disappointing to me, I love the form factor, but hate the standard screen resolution, and the HD4000 is really stretched on the Retina. If it stays a dual core, I don't see a whole lot of appeal over the Macbook Air 13" either. To earn that pro name, it really should be a quad with higher end integrated graphics.
"This is really no fault of Apple’s, but rather a frustrating side effect of Intel’s SKU segmentation strategy."
So I take it I'm not the only one infuriated by the fact that Intel hasn't made Hyper-Threading standard on all of its CPUs.
I remember reading, on this site, that HT adds some insignificant amount of die area, like 5% or so, but is capable of adding up to 50% performance (in theory). If that's the case, the ONLY reason not to include it on EVERY CPU is to nickel-and-dime your customers. Except it should really be "$100" your customers, since only the i7s have HT.
Isn't the physical capability of HT already on ALL CPUs? It just needs to be turned on in firmware, right?
With the exception of (IIRC) dual- vs. quad-core dies and GT2 vs. GT3 graphics, almost everything that differs between CPUs within a generation is either binning or deliberately disabling components when too few dies with a non-functional segment are available to fill the lower bin.
Intel could differentiate its product line without doing any segment disabling on the dies; but it would require several times as many different die designs which would require higher prices due to having to do several times as much validation. Instead we get features en/disabled with fuses or microcode because the cost of the 'wasted' die area is cheaper than the costs associated with validating additional die configurations.
Actually the GT2 is just a die-harvested GT3. Intel only has 2- and 4-core versions, and Crystalwell is an add-on die, so there are essentially only two base dies, at least for consumers.
I do agree about the hyper-threading, there is really no need to disable it. It's not like it really matters in consumer applications anyway.
Note that the reduction in LLC (CPU L3) on Iris Pro may be because some of the LLC is used to hold tag data for the 128MB of eDRAM. Mainstream Intel CPUs have 2MB of LLC per CPU core, so the die has 8MB of LLC natively. The i7-4770R has all 8MB enabled, but 2MB is used as eDRAM tag RAM, leaving 6MB for the CPU/GPU to use directly as cache (which is how it is reported on the spec sheet). The i5s generally have 6MB natively (for die recovery and/or segmentation reasons), and if 2MB of that is used for eDRAM tag RAM, that leaves 4MB for direct cache usage.
Given that you get 128MB of eDRAM in exchange for the 2MB LLC consumed as tag ram, seems like a fair trade.
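The 2MB figure also passes a quick sanity check if you assume 64-byte lines and roughly a byte of tag/state per line; both of those are my assumptions, since Intel hasn't published the Crystalwell tag layout:

```sh
echo $(( 128 * 1024 * 1024 / 64 )) "eDRAM lines to track"             # 2097152
echo $(( 128 * 1024 * 1024 / 64 / 1024 / 1024 )) "MB of tag storage"  # ~2 MB carved out of the LLC
```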
HT adds a pretty consistent 25% performance boost across an extremely wide variety of benchmarks. 50% is an unrealistic value.
And, for the love of god, please stop with this faux-naive "I do not understand why Intel does ..." crap. If you do understand the reason, you are wasting everyone's time with your lament. If you don't understand the reason, go read a fscking book. Price discrimination (and the consequences thereof INCLUDING lower prices at the low end) are hardly deep secret mysteries.
(And the same holds for the "Why oh why do Apple charge so much for RAM upgrades or flash upgrades" crowd. You're welcome to say that you do not believe the extra cost is worth the extra value to YOU --- but don't pretend there's some deep unresolved mystery here that only you have the wit to notice and bring to our attention; AND don't pretend that your particular cost/benefit tradeoff represents the entire world.
And heck, let's be equal opportunity here --- the Windows crowd have their own version of this particular fool, telling us how unfair it is that Windows Super Premium Plus Live Home edition is priced at $30 more than Windows Ultra Extra Pro Family edition.
I imagine there are the equivalent versions of these people complaining about how unfair Amazon S3 pricing is, or the cost of extra Google storage. Always with this same "I do not understand why these companies behave exactly like economic theory predicts; and they try to make a profit in the bargain" idiocy.)
Wow, the gaming performance gap between OSX and Windows hasn't narrowed at all. I had hoped, two major OS releases after the Snow Leopard article, it would have gotten better.
The charts show the Iris Pro take a pretty hefty hit any time you increase quality settings. HOWEVER, you're also increasing resolution. I'd be interested to see what happens when you increase resolution but leave detail settings at low-med.
In other words, is the bottleneck the processing power of the GPU (I think it is) or the memory bandwidth? I suspect we could run Mass Effect or something similar at 1080p with medium settings.
"OS X doesn’t seem to acknowledge Crystalwell’s presence, but it’s definitely there and operational (you can tell by looking at the GPU performance results)."
I bet OS X does but not in the GUI. Type the following in terminal:
sysctl -a hw.
There should be a line about the CPU's full cache hierarchy, among other cache information.
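To narrow it down to just the cache sizes (a sketch; I haven't run this on a Crystalwell machine, so I can't say whether an L4/eDRAM entry actually shows up):

```sh
# On the i5-4570R I'd expect hw.l3cachesize to report the post-tag 4MB figure,
# but that's a guess on my part.
sysctl hw.l1icachesize hw.l1dcachesize hw.l2cachesize hw.l3cachesize
```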
It's amusing to me that the 27" iMac is smaller and thinner than my Dell 27" monitor (the U2711), even though the iMac is a full computer while my U2711 is just a monitor.
The iMacs have a great display, and Target Display Mode is a cool feature. I would really like to see them expand it to non-Thunderbolt/Apple devices. I would love to play my 360 without needing a second display.
Why don't reviewers test Dota 2? I know it's not a particularly intensive game, but lots of people play it (often 10x the number of players of the second game on Steam) and lots of people would like to know how it performs, especially when you up the resolution. And lots of people only play Dota 2, so it's not particularly easy for them to judge its performance from Metro/TR results at lower resolutions.
I know you guys wanted to get a review out, but I think you will see a jump in performance with the upcoming OS X release next month. I bet there will be better optimized drivers and support for OpenCL 1.2 and OpenGL 4.0.
Well, in the US, it's only a $200 adder. And it's a rather performant PCIe based SSD. Have you priced out an alternative that's as fast or faster for less money? Don't get me wrong, Apple maintains a 36% gross margin which is probably considerably higher than, say, Newegg's, but what were you really expecting for this type of CTO option?
A Samsung 840 will read at about half the speed of the new PCIe SSD in the 2013 iMac; the new drive will be at least twice as quick. These aren't 2.5" SATA3 drives, they're PCIe. Big difference. Small price to pay for the upgrade. For most consumer workflows you'll never know you don't have a 1.15TB SSD in there. They. Flat. Fly. It's a bargain. That said, I'm with you on the RAM: $100 for 16GB seems fair and sets you up nicely for three to five years, while still being able to sell it for 50% after half a decade :-)
Putting a hard drive in, even at the base level, should be considered a crime against computing in this day and age... a 256GB SSD should be stock. For a company that pushes quality and doesn't concern itself as much with price, Apple should have ditched the HDD this generation.
256GB is not enough storage for a large proportion of computer users; a more realistic and useful answer would have been for Apple to make the 1TB Fusion Drive the default option.
Personally I prefer the design of the '09 models. Sure, they don't look as "pretty" as these ones (subjective), but I feel that on the iMacs, Apple's "thin is good" mentality is annoying.
The "bulge" at the back looks ugly. Also, because it's so thin, everything is moved to the back, including the SD card slot, which I feel is better placed on the side of the computer. It's a pain having to reach around the computer (especially on the 27-inch). It also needs more USB ports.
I feel that Apple has gone for looks and forgotten function on these iMacs.
I'm not sure the graphics comparison to the DDR3 version of the GT 750M is fair. The DDR3 cripples the graphics performance, and while I don't know what the sales volume numbers are, my impression is that most 750Ms that ship do so with GDDR5. Also, anybody who games and cares about GPU performance, and is the least bit informed, would make sure to get the GDDR5 version. And it's not like the GDDR5 version is expensive: the Lenovo Y500s that occasionally go on sale around $830 have 2GB of GDDR5.
So all this demonstrates is that Iris Pro graphics can compete with bandwidth crippled discrete graphics. But if you don't care about gaming, you shouldn't care about whether the integrated graphics are competitive with discrete GPUs. And if you do care about gaming, this performance is still not good enough. The positive tone of this review doesn't seem justified.
Nice review, and interesting to see that with Crystalwell there is an integrated GPU that does have acceptable performance. It will be interesting to see it in a 13" rMBP. What would also be interesting is a performance comparison of the new discrete GPUs: whether and by how much they improve on the 2012 models. GPU performance is still a bottleneck for iMacs I think (i.e. gaming). M.
Yes --- if you're willing to be daring. Essentially you'd need to boot off a third drive (or the network) then use diskutil cs commands to create an LVG then an LV tying the two drives together. There are instructions on the web giving details.
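For the curious, the rough shape of it looks like this; a sketch only, since it wipes both drives, and the device identifiers are examples (check diskutil list for the real ones):

```sh
diskutil list                                    # find the SSD and HDD identifiers
diskutil cs create "Fusion" disk0 disk1          # build the logical volume group
diskutil cs list                                 # note the LVG UUID it prints
diskutil cs createVolume <LVG-UUID> jhfs+ "Macintosh HD" 100%
```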
On a different point, I don't think Anand is correct in saying that Fusion works better than other hybrid solutions because it tracks blocks. I think the real answer is that it does a MUCH smarter job of tracking file "temperature". OSX has, since about 10.3, tracked file temperature (which is essentially a combination of how large the file is and how often it's accessed). This was done back then (and is still done) to move the hottest files to a small "hot files" area at the start of a disk for the obvious performance reasons. Details here: http://osxbook.com/book/bonus/misc/optimizations/#...
My guess is that Fusion essentially hooks into this mechanism, and just redefines the constants controlling how large the hot file area is to have it cover all of the SSD (minus of course the area for file system metadata, the area that is reserved for fast writes, and so on).
I can't think of any realistic situation (within pure OS X) where tracking by blocks rather than files is useful, and it would require a whole new way of looking at the problem. I think the obvious way to test this would be to look at the behavior of VM images which, I assume, as a whole don't count as hot because they are very large, but which do have hot blocks inside them. If you look at I/O when, say, starting up a VM, do you see all the I/O coming from the HD, or do you see it all come from the SSD, with HD accesses coming later once the VM is booted and we're pulling in less frequently accessed blocks?
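One way to actually watch that (again a sketch; the disk identifiers are examples and will differ on a Fusion Drive machine):

```sh
# In one terminal, watch per-device throughput once a second...
iostat -d -w 1 disk0 disk1
# ...then boot the VM in another window and note whether the early reads hit
# the SSD device (blocks inside the image were promoted) or the hard disk.
```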
What a silly review and comparison. Apple vs. Apple; conclusion: Moore's law still works.
CPU performance: it looks like Apple did no work here beyond swapping in the new Intel CPU, so why an Apple-vs.-Apple CPU comparison? Is there no comparison against other Intel CPUs? Just look at them; what do you expect?
And over and over... Apple iMac vs. Apple iMacs, who wins? Apple, of course.
Final words... "Apple continues to have the strongest Mac lineup of its history." Does anyone else have Mac lineups? Of course not, since it's a monopoly, so it's unnecessary to state the obvious.
Apple's Haswell?? You write the article as if Apple had anything to do with Haswell's design or manufacture... Maybe Intel has something to say about that...
And here is what really matters: the iMac's industrial design is beautiful.
No complaints about the small, wrist-breaking keyboard or anything else; when Apple fights against Apple, the winner will always be Apple.
I'd say the Apple vs. Apple comparisons are valid because of the limited all-in-one market. It's hard to find something comparable; a lot of iMac alternatives are either budget models or use significantly different components (see the ASUS Transformer AIO or Lenovo IdeaCentre A730 as examples). The company practically dominates the category, at least in North America and Europe.
g1011999 - Monday, October 7, 2013 - link
Finally. I check anandtech several times recently for Iris Pro based iMac 21" review.malcolmcraft - Thursday, October 9, 2014 - link
It's nice, I agree. But for a full-size work station I'd not recommend Mac. /Malcolm from http://www.consumertop.com/best-desktop-guide/Shivansps - Monday, October 7, 2013 - link
I suspecting that the big loss in performance on high details compared to 750M may be related to L4 eDRAM running short than driver issue, as AA, Intel never had good performance with filters, they support hardware x2 AA yet?tipoo - Monday, October 7, 2013 - link
Yeah, doesn't AA hammer bandwidth? The eDRAM helps performance, but it's still quite low compared to what the other cards are paired with, even in best case scenarios.IntelUser2000 - Tuesday, November 12, 2013 - link
I don't think its just that. Compared to the competition like the Trinity's iGPU and the GT 650M, the texture fill rate is rather low. That impacts performance not only in texture bound scenarios with settings cranked up but anti-aliasing as well. The fillrate of the top of the line Iris Pro 5200 is about equal to Trinity while the version in the iMac would fall short. The GT 650M is 40% better than the top of the line Iris Pro and over 55% better than iMac version.There's also something to be desired about Intel's AA implementation. Hopefully Broadwell improves on this.
IanCutress - Monday, October 7, 2013 - link
Interestingly we see Crystalwell not have any effect on CPU benchmarks, although we can probe latency as seen before.willis936 - Monday, October 7, 2013 - link
This seems counter intuitive. It's acting as a CPU+GPU shared cache correct? Intel architectures are relatively cache bandwidth starved and you'd think that 128MB of L4 would help keep the lower levels filled.Flunk - Monday, October 7, 2013 - link
Perhaps it means that the assumption that Intel architectures are relatively cache bandwidth starved is faulty.name99 - Monday, October 7, 2013 - link
Or that the working set of most benchmarks (if not most apps) is captured with a 4 or 6MB cache?Caching's basically irrelevant for data that is streamed through.
tipoo - Thursday, October 10, 2013 - link
The L4 is pretty low bandwidth for a cache though.elian123 - Monday, October 7, 2013 - link
Anand, could you perhaps indicate when you would expect higher-res iMac displays (as well as pc displays in general, not only all-in-ones)?solipsism - Monday, October 7, 2013 - link
Before that happens Apple will likely need to get their stand-alone Apple display "high-res". I don't expect it to go 2x like avery other one of their display; instead I would suspect it to be 4K, which is exactly 1.5x over the current 27" display size. Note that Apple mentioned 4K many times when previewing the Mac Pro.Also, the most common size for quality 4K panels appears to be 31.5" so I would't be surprised to see it move to that size. When the iMacs are to get updated I think each would then most likely use a slightly larger display panel.
mavere - Monday, October 7, 2013 - link
~75% of the stock desktop wallpapers in OSX 10.9 are at 5120x2880.It's probably the biggest nudge-nudge-wink-wink Apple has ever given for unannounced products.
name99 - Monday, October 7, 2013 - link
Why does Apple have to go to exactly 4K? We all understand the point, and the value, of going to 2x resolution. The only value in going to exactly 4K is cheaper screens (but cheaper screens means crappy lousy looking screens, so Apple doesn't care).jasonelmore - Monday, October 7, 2013 - link
4k is 16:9 ratio, to do a 16:10 right, they would have to do 5krepoman27 - Monday, October 7, 2013 - link
iMacs have been 16:9 since 2009, and 3840x2400 (4K 16:10) panels have been produced in the past and work just fine.repoman27 - Monday, October 7, 2013 - link
Apple is pretty locked in to the current screen sizes and 16:9 aspect ratio by the ID, and I can only imagine they will stick with the status quo for at least one more generation in order to recoup some of their obviously considerable design costs there.Since Apple sells at best a couple million iMacs of each form factor in a year’s time, they kinda have to source panels for which there are other interested customers—we’re not even close to iPhone or iPad numbers here. Thus I’d reckon we’ll see whatever panels they intend to use in future generations in the wild before those updates happen. As solipsism points out, the speculation that there will be a new ATD with a 31.5”, 3840x2160 panel released alongside the new Mac Pro makes total sense because other vendors are already shipping similar displays.
I actually made a chart to illustrate why a Retina iMac was unlikely anytime soon: http://i.imgur.com/CfYO008.png
I listed the size and resolution of previous LCD iMacs, as well as possible higher resolutions at 21.5” and 27”. Configurations that truly qualify as "Retina" are highlighted in green, and it looks as though pixel doubling will be Apple’s strategy when they make that move. I also highlighted configurations that require two DP 1.1a links or a DP 1.2 link in yellow, and those that demand four DP 1.1a links or two DP 1.2 links in red for both standard CVT and CVT with reduced blanking. Considering Apple has yet to ship any display that requires more than a single DP 1.1a link, and all of the Retina options at 27" are in the red is probably reason alone that such a device doesn't exist yet.
I also included the ASUS PQ321Q 31.5" 3840x2160 display, and the Retina MacBook Pros as points of comparison to illustrate the pricing issues that Retina iMacs would face. While there are affordable GPU options that could drive these displays and still maintain a reasonable degree of UI smoothness, the panels themselves either don't exist or would be prohibitively expensive for an iMac.
name99 - Monday, October 7, 2013 - link
OR what you chart tells us is that these devices will be early adopters of the mythical (but on its way) DisplayPort 1.3?Isn't it obvious that part of the slew of technologies to arrive when 4K hits the mainstream (as opposed to its current "we expect you to pay handsomely for something that is painful to use" phase will be an updated DisplayPort spec?
repoman27 - Monday, October 7, 2013 - link
Unlike HDMI 1.4, DisplayPort 1.2 can handle 4K just fine. I'd imagine DP 1.3 should take us to 8K.What baffles me is that every Mac Apple has shipped thus far with Thunderbolt and either a discrete GPU or Haswell has been DP 1.2 capable, but the ports are limited to DP 1.1a by the Thunderbolt controller. So even though Intel is supposedly shipping Redwood Ridge which has a DP 1.2 redriver, and Falcon Ridge which fully supports DP 1.2, we seem to be getting three generations of Macs where only the Mac Pros can actually output a DP 1.2 signal.
Furthermore, I don't know of any panels out there that actually support eDP HBR2 signaling (introduced in the eDP 1.2 specification in May 2010, clarified in eDP 1.3 in February 2011, and still going strong in eDP 1.4 as of January this year). The current crop of 4K displays appear to be driven by converting a DisplayPort 1.2 HBR2 signal that uses MST to treat the display as 2 separate regions into a ridiculously wide 8 channel LVDS signal. Basically, for now, driving a display at more than 2880x1800 seems to require multiple outputs from the GPU.
And to answer your question about why 4K, the problem is really more to do with creating a panel that has a pixel pitch somewhere in the no man's land between 150 and 190 PPI. Apple does a lot of work to make scaling decent even with straight up pixel doubling, but the in-between pixel densities would be really tricky, and probably not huge sellers in the Windows market. Apple needs help with volume in this case, they can't go it alone and expect anything short of ludicrously expensive.
name99 - Tuesday, October 8, 2013 - link
My bad. I had in mind the fancier forms of 4K like 10 bits (just possible) and 12 bit (not possible) at 60Hz, or 8bit at 120Hz; not your basic 8 bits at 60Hz. I should have filled in my reasoning.elian123 - Monday, October 7, 2013 - link
Also wondering when (and where) the 4770R will ever turn up.elian123 - Monday, October 7, 2013 - link
Well, one thing Google shows me is that Gigabyte has shown (though not announced) a Brix with 4770R: http://blog.laptopmag.com/gigabyte-brix-iris-graph... and http://www.pcgameshardware.de/CPU-Hardware-154106/...mavere - Monday, October 7, 2013 - link
I think it'd make for a pretty sweet Mac Mini.Penti - Monday, October 7, 2013 - link
Interesting to see were Apple wants their starting point to be.Obviously they don't want that much choice in their iMacs, still waiting to see what they do with their mac mini's too.
Can't help but wonder if you can actually separate the glass and LCD though, has anybody tried? What technique and adhesive does LG/Apple use here? Has anybody tried to run another LCD-panel from the newer iMacs (2012/13)? Older panels might actually be drivable, would be a paint to rebuild a new iMac for those though.
Dennis Travis - Monday, October 7, 2013 - link
Everything I have read says it's one integrated unit with the Glass/LCD together. I am also interesting and will keep looking but it does not look possible so far.Dennis Travis - Monday, October 7, 2013 - link
Oops Interested! :D GrinPenti - Monday, October 7, 2013 - link
It's sold as one unit, but obviously it's bonded with an adhesive, depending on how it might not be possible to separate the two. It should maybe be able to drive older iMac displays though. But those are a few mm thicker.Dennis Travis - Monday, October 7, 2013 - link
Knowing Apple you would probably crack the glass if you tried.Penti - Monday, October 7, 2013 - link
As it's gapless the adhesive covers the whole panel which probably ruins it anyway, or will be hard to clean as you can't really use (strong) solvents on the LCD, but it's fully doable on phones which also adheres the screens with adhesive. Would be interesting to know if that kind of adhesive can be loosened by heat though. Plus it would probably be harder then a small screen here.pdffs - Monday, October 7, 2013 - link
"I’ve heard some developers complain about this in the past, partly blaming it on a lack of lower level API access as OS X doesn’t support DirectX and must use OpenGL instead."Wait, doesn't OpenGL get you *closer* to the hardware than DirectX??
A5 - Monday, October 7, 2013 - link
Not really, no. They're at about the same level of abstraction since they both sit above the driver stack.Flunk - Monday, October 7, 2013 - link
There is also the issue that the driver sits farther up the stack than it does in Windows. Mach being a hybrid microkernel and all. I'm sure they can close the gap with a lot of work, but it doesn't erase the fact that Mac OS's basic structure is not designed to provide direct hardware access.overzealot - Monday, October 7, 2013 - link
The Windows kernel is also a hybrid microkernel.bestham - Monday, October 7, 2013 - link
"Storage & Fusion Drive – By default all of the iMacs come with a 3.5” mechanical hard drive." The physical harddrive size is differentiated between the 21.5" and the 27". I suspect that you already knew this but it slipped.Anand Lal Shimpi - Monday, October 7, 2013 - link
Thank you, fixed :)rootheday3 - Monday, October 7, 2013 - link
My opinion: Many games look pretty good at medium and it is not clear that the compromise in frame rate is worth it to push for higher settings vs running at medium with higher resolution.Regarding the Iris Pro gaming results - the jump from 13x7 medium (where Iris seems to have plenty of power to spare) up to 16x9 high/very high - this is a change of two variables: both resolution and settings.
Can you rerun some of the gaming tests at 16x9 medium (or maybe native panel resolution - 19x10 medium where 13x7 results were close to 60fps) to see what that shows?
rootheday3 - Monday, October 7, 2013 - link
one other question - is there any ability to OC the IGP in the bios? Typically Intel gpus have a fair bit of OC headroom and it would be interesting to see:1) what kind of performance/playability can be extracted
2) to what extent the igp clock frequency is the limiter on Iris Pro vs cpu, memory bandwidth, etc.
jeffkibuule - Monday, October 7, 2013 - link
This is a Mac! There is no bios, never has been.DanNeely - Monday, October 7, 2013 - link
Everyone who isn't an annoying pedant conflates bios and uefi in general usage.repoman27 - Monday, October 7, 2013 - link
And everyone who isn't an annoying pedant would also realize that jeffkibuule meant, "There is no [user accessible BIOS/EFI/UEFI], never has been."BiggieShady - Monday, October 7, 2013 - link
Nice iris vs. geforce gaming comparison. I would love to see temperature and fan speed comparison.name99 - Monday, October 7, 2013 - link
It is crazy how cool these things run. I have a late 2012 27" i7 iMac and there is no part of it that is close to hot. The fan dumps hot air through a single hole which is very cleverly hidden below the hinge of the base so you don't even feel that unless you really look for it.Even when I've run the thing at max performance, having eight Mathematica kernels running in parallel for fifteen min or so, there's no obvious heating and no obvious fan noise.
The difference with even as recent as the 2007 iMac (my last model) is night and day. Part is the much more efficient CPUs, part is also the much more efficient display.
odaiwai - Monday, October 7, 2013 - link
How do you see the actual CPU current speed on an intel Mac? I've been looking for a utility to do that for ages.abazigal - Monday, October 7, 2013 - link
All this points to the possibility of a 15" retina macbook pro using only integrated graphics, which means comparable performance, with even longer battery life.Fun times ahead!
dylan522p - Monday, October 7, 2013 - link
Probably only low end MBPr will do that this year, but next year if 6200 or whatever the Broadwell graphics are improved enough then very likely that could happen.tipoo - Thursday, October 10, 2013 - link
I hope they go the other way with it. GT3e for the 13" duo (especially for the retina, that would be a boon),and keep higher end discreet graphics in the larger models.lefty2 - Monday, October 7, 2013 - link
You forgot to mention that the fusion drive is not supported by Windows (if you are using bootcamp)"If you partition the drive, the new partition will not be part of the logical-volume group that Fusion Drive uses, so it will not benefit from the speed of the SSD"
http://reviews.cnet.com/8301-13727_7-57549766-263/...
ananduser - Monday, October 7, 2013 - link
Man that Iris looks a lot worse in reality than I expected it to be. You really have to crank down the resolution and details to get some decent gaming performance. Then again this is the base imac, it's got compromise written all over it.CharonPDX - Monday, October 7, 2013 - link
Correction: You state "By default all of the iMacs come with a 3.5” mechanical hard drive."iFixIt has shown that the 21.5" iMacs now come with 2.5" mechanical hard drives by default, not 3.5".
AlValentyn - Monday, October 7, 2013 - link
Looks like my 2010 Mac Pro with 3.33 Hex core is still going strong. Finally going to add an SSD to it, and now I'm just waiting to see what AMD's R9 series GPU's are like.It is impressive to see how well these iMacs are doing these days though. :)
mikk - Monday, October 7, 2013 - link
The difference to the 55W Macbook Iris Pro is so big in some tests that we can't explain it with a 10% lower GPU frequency. Anand once again failed to give us readers proper system infos. You have to learn that 8GB DDR3-1600 is not enough because it can be 2x4 GB in dualchannel or 1x8GB in singlechannel.Anand Lal Shimpi - Monday, October 7, 2013 - link
All modern Macs ship in dual-channel mode.It's not just GPU frequency but turbo residency, which is lower on the 4570R for some reason.
thunng8 - Monday, October 7, 2013 - link
There is no 55W Macbook Iris Pro.It was an Intel supplied development board - not even in laptop form factor.
coolhardware - Monday, October 7, 2013 - link
Typo on last paragaph of page 4:'Doing *to* brings the price of the entry level 21.5-inch iMac up to $1499...'
Interesting to see another look at Iris Pro. The current generation continues to leave me a bit disappointed, I had such high hopes for it. Here's hoping that Intel makes some significant strides in the next generation (i.e. signigicantly more than 10-15% improvement)
Anand Lal Shimpi - Monday, October 7, 2013 - link
Edited :)I suspect Broadwell will improve things once again, but Intel seems to be consistently one generation behind what we actually want for that generation.
tipoo - Thursday, October 10, 2013 - link
As soon as Broadwell comes out I'm sure we'll all be on the "wait for the actual new architecture" boat, such is technology :PBut if Broadwell packs twice the EUs and the eDRAM bandwidth to feed it, that would be quite nice on the GPU side. I just hope they can improve the CPU side more.
farhadd - Monday, October 7, 2013 - link
The high end 27" imac is a 775M.kwrzesien - Monday, October 7, 2013 - link
Anand, typo in the specs chart on the first page.The top 27" model graphics should be "NVIDIA GeForce GTX 775M (2GB GDDR5)". 775M instead of 755M.
Anand Lal Shimpi - Monday, October 7, 2013 - link
Edited, thank you!squirrelboy - Monday, October 7, 2013 - link
i'm again baffled by this. who in their right minds would pay 1,3K for something with a low-end i5 and an iGPU? those specs belong in a $500 laptop. i really can't wrap my head around why anyone would do this. think of all the hardware you could get for that money! you'd be looking at an i5-4670k + gtx770saarek - Monday, October 7, 2013 - link
The problem is that you are looking at this from a technical spec based point of view.It is true that you can build a more powerful computer for less, some may argue far less. Still, these calculations usually fail to take account of the design of the chassis, build/material quality and value of the overall design of the iMac/MacPro/Whatever other Apple product you can think of.
One must consider that the average person does not wish to build their own computer and is not interested in a specification check list.
Most people want a fast, reliable and user friendly computer and are willing to pay a fair price for this experience. The entry level 21" iMac is a perfect fit for the large majority of the market.
DukeN - Monday, October 7, 2013 - link
Yes, you need to look at it from an iDouche point of view.Besides, you can't put an intrinsic value on having an iFaeces Pro on your desk so you can look cool and smug, and feel good when you see all the Apple commercials on TV.
nerd1 - Monday, October 7, 2013 - link
So your logic is "average' person is ignorant. It's sad the apple's sales proves that..web2dot0 - Monday, October 7, 2013 - link
I guess if you think people who spends $100 for jeans are "idiots", "ignorant". You are correct. But that doesn't mean that designer jeans manufacturer stops making them. In fact, they are highly profitable.They are also not idiots.
The #1 reason why people why designer jeans is because THEY LOOK BETTER. They all do the same thing .... cover your crotch. But there's value in style .... if you don't care of it, it doesn't mean that other people who value it are "ignorant" or "idiot".
That's alot of idiots running around the earth ..... have you thought maybe it's the other way around?
squirrelboy - Monday, October 7, 2013 - link
i guess i'm too rational to prioritize computer design above price/performance. if i want a pc for general office tasks, spending more than €400 seems like a waste of money. If i were to spend 1,3K on a pc, i'd prefer it to be good at it's purpose. i dont look at the pc, but at what it displays on the screen.web2dot0 - Saturday, October 12, 2013 - link
Your ignorant comment tells me that you walk around like a caveman. Why buy shampoo? Make sure own. Why why toothpaste? Make your own.Why go out to eat? Stay home, it's cheaper. It's all about what goes into your mouth right?
Idiot
foolio5 - Monday, October 14, 2013 - link
Your analogies are poor.tipoo - Monday, October 7, 2013 - link
The 21 inch IPS display costs something too.nerd1 - Monday, October 7, 2013 - link
21" 1080p display are dirt cheap nowadays.Dug - Monday, October 7, 2013 - link
Not a calibrated IPSCalista - Tuesday, October 8, 2013 - link
They have become pretty inexpensive nowadays. A Dell 21" Dell Ultrasharp IPS-screen is only 40-50 USD more expensive than the cheapest TN-panel. Still don't think people are fair when complaining about the cost of the iMac. It's quiet, it looks good, it holds a great resale value. A lot of people would think twice about dropping $500 a month of their car, why not spend a few extra quid on a computer? Back in the days a computer was expensive, now it's so cheap it's almost silly.repoman27 - Monday, October 7, 2013 - link
Not that I'd buy a base model 21.5-inch iMac, but I think most consumers could care less about their CPU and GPU specs and are looking more at the overall package. For most office desk jockeys, school computer labs, customer use / internet kiosks, parents, etc., the iMac probably makes more sense than a custom built overclocked gaming rig.It also depends on how much you value what you get in said package. The i5-4570R is essentially an i5-4430S with Iris Pro 5200, so it's likely a $240-270 part. Then there's the rather well calibrated 1920x1080 IPS panel which would run another $150 or so. The keyboard and mouse that are included have a combined retail price of $138. Then there's the 3x3:3 802.11ac Wi-Fi plus Bluetooth 4.0, Broadcom GbE NIC and SDXC UHS-I card reader, Thunderbolt, 720p camera, dual microphones, built-in amp and stereo speakers, plus whatever value you put on OS X and the bundled iLife suite.
Flunk - Monday, October 7, 2013 - link
Screen, form-factor, OS. Performance isn't the number one concern for many people.Dug - Monday, October 7, 2013 - link
Someone that wants everything built in that doesn't sound or look like a piece of crap.Add up everything that is included and it's not that expensive. Especially when you consider the cpu is quad core (almost all laptop i5's are dual core), 802.11ac, dual mics, webcam, speakers, bluetooth, sdxc, thunderbolt, wireless keyboard and trackpad, a calibrated IPS monitor, and the best part is no noise and only one cable with no power brick. Try and do that with any diy build and tell me how much it is.
KoolAidMan1 - Monday, October 7, 2013 - link
IPS screens aren't cheap, nor are AIO designs. Dell, HP, and Lenovo can't get lower prices when they try to compete in specs or display quality either.A $500 laptop isn't going to have a good display, keyboard, or trackpad either, and resale value is going to be nothing. Comparing a trash laptop to something with much higher quality components that holds value over time doesn't make sense.
robsparko - Tuesday, October 8, 2013 - link
This argument is getting old and inaccurate. People don't always want laptops. Additionally, bottom of the barrel hardware has a very short life span. Sometimes, it is nice to have a well integrated AIO without cables running everywhere. Sure, not everyone wants a computer that will last 3+ years and that is perfectly fine.But, just for fun, here's some pricing from NewEgg parts I believe would be of similar quality to an iMac. Yes, you can find cheaper cases/PSU/etc, but an iMac isn't built using the cheapest components. Likewise, feel free to buy a Dell! But, to call people who buy an iMac ignorant/ out of the mind/ etc is just proving a lack of insight into the situation and market pricing.
$250 - i5 CPU w/ Iris Pro (good luck finding it! Newegg's closest uses HD 4600)
$150 - motherboard
$75 - case
$75 - PSU
$235 - 21.5" IPS LCD
$65 - HDD
$50 - webcam w/ mic
$75 - RAM
$140 - Windows
$65 - wireless keyboard/mouse
$35 - speakers
?? - shipping/tax
?? - time to build
----------------
$1,215 total + extras
Calista - Tuesday, October 8, 2013 - link
What you did is play a game designed to make Apple look good. I made a similar list with parts from Newegg. It was only $1,150 but included a Shuttle case, a 27" IPS display from Dell, a 240GB Intel SSD and 16GB of RAM. And Windows 8 retail, obviously. A similar config but with a 21" display from Apple was $1,870. The 27" iMac with a similar configuration was almost $2,400, i.e. twice as expensive as the PC.
web2dot0 - Saturday, October 12, 2013 - link
Let me guess, your parts are the same quality as Apple's? :)
The case isn't worth anything, a calibrated monitor isn't worth anything, AIO isn't worth anything, the Fusion Drive isn't worth anything... That's cool.
I can play that game too, jackass.
foolio5 - Monday, October 14, 2013 - link
Are you really equating an inflated price with higher quality parts? Quick, tell me the difference between Kingston and Samsung RAM! How do you know his PSU isn't a Seasonic? What brand of HDD does Apple use, exactly? I hear this all the time about quality parts, when everything Apple uses is Samsung or Foxconn. Yes, Foxconn... the epitome of quality OEMs!
You are adding a lot of money for it being an AIO. He is using a Dell 27" IPS; that should clue you in that it's an UltraSharp, which is calibrated. He's also using an SSD, so again, tell me how a hybrid beats out an SSD.
Lastly, are you 12 or something? The name-calling is immature.
abazigal - Tuesday, October 8, 2013 - link
I would (if I didn't already own a 2011 27" iMac), which I found to be worth every cent. It makes perfect financial sense when you realise that you are not so much buying a computer as getting an integrated solution that's ready to use right out of the box. I find that Macs come with excellent functionality without me having to spend much time setting them up.
Already, it comes preloaded with the excellent iLife suite, a great PDF annotation reader (Preview), a stable OS that continues to run smoothly 2 years on without me needing to do anything to maintain it, entry to the Apple ecosystem, and access to fairly inexpensive Mac software.
Then there is the good aftercare service. Once, my iMac developed screen issues. I made a call to the service centre. 20 minutes later, an appointment was made. They came to my house the next day, brought it back for servicing, and delivered it back to my house 2 days later. Problem solved with minimal effort or stress on my part.
So in the end, the extra that I am paying is well worth it for the promise of a seamless and hassle-free computing experience, which I value more than simply building a desktop from individually sourced parts (and then dealing with all the troubleshooting on my own afterwards).
Res1233 - Wednesday, October 30, 2013 - link
Mac OS X is the main reason I buy Macs. I have a feeling that you are right about most Mac users having no clue what they're buying, but I could go on for hours about the advantages of OS X (geek-wise). No, Hackintoshes are not an option if you want any kind of reliability, so don't even go there. If you force me to, I will explain my reasoning in depth, practically to AnandTech levels, but I'm not in the mood right now. Perhaps another time! :)
tipoo - Monday, October 7, 2013 - link
What are the chances of the dual-core 13" Pro getting Iris Pro 5200? I'd really love that.
Bob Todd - Monday, October 7, 2013 - link
Have they announced any dual-core Iris Pro parts? I know the original SKU list just had them in the quads. I still assume the 13" rMBP will get the 28W HD 5100 (hopefully in the base configuration, but there will probably be a lower-spec i5 below that).
Flunk - Monday, October 7, 2013 - link
It's uncertain; a new part could be announced at any time. Intel has even made variants specifically for Apple before.
tipoo - Monday, October 7, 2013 - link
It shouldn't have to be dual-core to be in the 13" Pro, though. Intel has quads in the same TDP as the current duals in it. A quad-core with GT3e would make it an extremely tempting package for me.
Sm0kes - Tuesday, October 8, 2013 - link
I think anything but GT3e in the 13" MacBook Pro is going to be a disappointment at this point. How their 13" "Pro" machine has gone this long with sub-par integrated graphics is mind-boggling. The move to a Retina display really emphasized the weakness.
I'd also venture a guess that cost is the real barrier, as opposed to TDP.
tipoo - Thursday, October 10, 2013 - link
Perhaps. Yeah, the 13" has been disappointing to me: I love the form factor, but hate the standard screen resolution, and the HD 4000 is really stretched on the Retina model. If it stays dual-core, I don't see a whole lot of appeal over the 13" MacBook Air either. To earn that Pro name, it really should be a quad with higher-end integrated graphics.
Hrel - Monday, October 7, 2013 - link
"This is really no fault of Apple’s, but rather a frustrating side effect of Intel’s SKU segmentation strategy."So I take it I'm not the only one infuriated by the fact that Intel hasn't made Hyperthreading standard on all of it's CPU's.
I remember reading, on this site, that HT adds some insignificant amount of die area, like 5% or something, but is capable of adding up to 50% performance (in theory). If that's the case, the ONLY reason not to include it on EVERY CPU is to nickel-and-dime your customers. Except it should really be "$100" your customers, since only the i7s have HT.
Isn't the physical capability for HT already on ALL CPUs? It just needs to be turned on in firmware, right?
DanNeely - Monday, October 7, 2013 - link
With the exception of (IIRC) dual- vs. quad-core dies and GT2 vs. GT3 graphics, almost everything that differs between CPUs in a generation is either binning or the disabling of components when too few dies with a non-functional segment are available to fill the lower bin.
Intel could differentiate its product line without doing any segment disabling on the dies, but it would require several times as many different die designs, which would mean higher prices due to having to do several times as much validation. Instead we get features enabled/disabled with fuses or microcode, because the cost of the 'wasted' die area is cheaper than the cost of validating additional die configurations.
Flunk - Monday, October 7, 2013 - link
Actually, the GT2 is just a die-harvested GT3. Intel only has 2- and 4-core versions, and Crystalwell is an add-on die, so there are essentially only two base dies, at least for consumers.
I do agree about the Hyper-Threading; there is really no need to disable it. It's not like it really matters in consumer applications anyway.
rootheday3 - Monday, October 7, 2013 - link
I don't think this is true. See the die shots here: http://wccftech.com/haswell-die-configurations-int...
I count 8 different die configurations.
Note that the reduction in LLC (CPU L3) on Iris Pro may be because some of the LLC is used to hold tag data for the 128MB of eDRAM. Mainstream Intel CPUs have 2MB of LLC per CPU core, so the die has 8MB of LLC natively. The i7-4770R has all 8MB enabled, but 2MB goes to eDRAM tag RAM, leaving 6MB for the CPU/GPU to use directly as cache (which is how it is reported on the spec sheet). The i5s generally have 6MB natively (for die-recovery and/or segmentation reasons), and if 2MB is used for eDRAM tag RAM, that leaves 4MB for direct cache usage.
Given that you get 128MB of eDRAM in exchange for the 2MB of LLC consumed as tag RAM, it seems like a fair trade.
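Back-of-envelope, assuming the eDRAM is managed in 64-byte lines (the exact tag organization isn't public, so treat this as illustrative arithmetic only), you can run the numbers in a shell:
# 128MB of eDRAM split into 64-byte lines gives the number of lines to track
echo $(( 128 * 1024 * 1024 / 64 ))    # 2097152 lines
# 2MB of tag storage spread across those lines works out to 8 bits per line
echo $(( 2 * 1024 * 1024 * 8 / 2097152 ))    # 8 bits of tag/state per 64B line
Eight bits per 64-byte line would be a tight budget for full address tags, so either the eDRAM is tracked at a coarser granularity (1KB chunks would leave 128 bits each) or there's more to the scheme than the 2MB alone; nothing above settles which.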
name99 - Monday, October 7, 2013 - link
HT adds a pretty consistent 25% performance boost across an extremely wide variety of benchmarks. 50% is an unrealistic value.
And, for the love of god, please stop with this faux-naive "I do not understand why Intel does ..." crap.
If you do understand the reason, you are wasting everyone's time with your lament.
If you don't understand the reason, go read a fscking book. Price discrimination (and the consequences thereof, INCLUDING lower prices at the low end) is hardly a deep, secret mystery.
(And the same holds for the "Why oh why does Apple charge so much for RAM upgrades or flash upgrades" crowd. You're welcome to say that you do not believe the extra value is worth the extra cost to YOU --- but don't pretend there's some deep unresolved mystery here that only you have the wit to notice and bring to our attention; AND don't pretend that your particular cost/benefit tradeoff represents the entire world.
And heck, let's be equal opportunity here --- the Windows crowd have their own version of this particular fool, telling us how unfair it is that Windows Super Premium Plus Live Home edition is priced at $30 more than Windows Ultra Extra Pro Family edition.
I imagine there are equivalent versions of these people complaining about how unfair Amazon S3 pricing is, or the cost of extra Google storage. Always with this same "I do not understand why these companies behave exactly like economic theory predicts; and they try to make a profit in the bargain" idiocy.)
tipoo - Monday, October 7, 2013 - link
Wow, the gaming performance gap between OS X and Windows hasn't narrowed at all. I had hoped that, two major OS releases after the Snow Leopard article, it would have gotten better.
tipoo - Monday, October 7, 2013 - link
I wonder if AMD will support OS X with Mantle?
Flunk - Monday, October 7, 2013 - link
Likely not; I don't think they're shipping GCN chips in any Apple products right now.
AlValentyn - Monday, October 7, 2013 - link
Look up Mavericks; it supports OpenGL 4.1, while Mountain Lion is still at 3.2: http://t.co/rzARF6vIbm
Good overall improvements in the Developer Previews alone.
tipoo - Monday, October 7, 2013 - link
ML supports a higher OpenGL spec than Snow Leopard, but that doesn't seem to have helped lessen the real-world performance gap.
Sm0kes - Tuesday, October 8, 2013 - link
Got a link with real numbers?
Hrel - Monday, October 7, 2013 - link
The charts show the Iris Pro taking a pretty hefty hit any time you increase quality settings. HOWEVER, you're also increasing resolution. I'd be interested to see what happens when you increase resolution but leave detail settings at low-medium.
In other words, is the bottleneck the processing power of the GPU (I think it is) or the memory bandwidth? I suspect we could run Mass Effect or something similar at 1080p with medium settings.
Kevin G - Monday, October 7, 2013 - link
"OS X doesn’t seem to acknowledge Crystalwell’s presence, but it’s definitely there and operational (you can tell by looking at the GPU performance results)."I bet OS X does but not in the GUI. Type the following in terminal:
sysctl -a hw.
There should be a line about the CPU's full cache hierarchy, among other cache information.
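For anyone following along at home, a minimal sketch of that query (the hw.* keys below are the standard ones on OS X; I'm assuming a stock install, and nothing guarantees the eDRAM shows up under any of them):
sysctl hw.cachelinesize hw.l1dcachesize hw.l2cachesize hw.l3cachesize
# or dump everything and filter for cache-related entries
sysctl -a | grep -i cache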
Anand Lal Shimpi - Monday, October 7, 2013 - link
Good call - unfortunately no, Crystalwell isn't reported there either.
Dman23 - Monday, October 7, 2013 - link
Great review! Thanks for taking the time to review the new iMacs. The Iris Pro 5200 looks extremely interesting.
Guspaz - Monday, October 7, 2013 - link
It's amusing to me that the 27" iMac is smaller and thinner than my Dell 27" monitor (the U2711), even though the iMac is a full computer while my U2711 is just a monitor.
1andrew - Monday, October 7, 2013 - link
The iMacs have a great display, and Target Display Mode is a cool feature. I would really like to see them expand it to non-Thunderbolt/Apple devices. I would love to play my 360 without needing a second display.
twistedgamez - Monday, October 7, 2013 - link
Why don't reviewers test Dota 2? I know it's not a particularly intensive game, but lots of people play it (often 10x the player count of the second most-played game on Steam) and would like to know how it performs, especially when you up the resolution. Lots of people only play Dota 2, so it's not particularly easy to judge its performance from Metro/Tomb Raider numbers at lower resolutions.
But other than that, a great review!
bluevaping - Monday, October 7, 2013 - link
I know you guys wanted to get a review out, but I think you will see a jump in performance with the upcoming OS X release next month. I bet there will be better optimized drivers and support for OpenCL 1.2 and OpenGL 4.0.
jasonelmore - Monday, October 7, 2013 - link
+$250 for nothing but a 128GB SSD for the Fusion Drive is rape.
repoman27 - Monday, October 7, 2013 - link
Well, in the US it's only a $200 adder, and it's a rather performant PCIe-based SSD. Have you priced out an alternative that's as fast or faster for less money? Don't get me wrong, Apple maintains a 36% gross margin, which is probably considerably higher than, say, Newegg's, but what were you really expecting for this type of CTO option?
nerd1 - Tuesday, October 8, 2013 - link
You can get a 256GB SATA 3 SSD for much less than $200. They are also charging $200 for an extra 8GB of RAM.
repoman27 - Wednesday, October 9, 2013 - link
Yes, you can get a considerably slower SSD with a crappy controller and lower-endurance NAND for less money than Apple's offering.
The RAM is another matter entirely. It's unfortunate they made it such a hassle to upgrade the RAM yourself in the 21.5" model.
foolio5 - Monday, October 14, 2013 - link
The Samsung 840 EVO 128GB SSD is $99 and is up there in the speed/longevity department.
akdj - Tuesday, October 15, 2013 - link
The Samsung 840 will read at about half the speed of the new PCIe SSD in the 2013 iMac; Apple's drive will be at least twice as quick. These aren't 2.5" SATA 3 drives, they're PCIe. Big difference, and a small price to pay for the upgrade. For most consumer workflows you'll never know you don't have a 1.15TB SSD in there. They. Flat. Fly. It's a bargain. That said, I'm with you on the RAM. $100 for 16GB seems fair and would set you up nicely for three to five years, while still being able to sell it for 50% after half a decade :-)
djscrew - Monday, October 7, 2013 - link
Putting a hard drive in, even at the base level, should be considered a crime against computing in this day and age... a 256GB SSD should be stock. For a company that pushes quality and doesn't concern itself as much with price, Apple should have ditched the HDD this generation.
saarek - Tuesday, October 8, 2013 - link
256GB is not enough storage for a large proportion of computer users; a more realistic and useful answer would have been for Apple to make the 1TB Fusion Drive the default option.
iwod - Monday, October 7, 2013 - link
I hope Apple can cut costs somewhere in their next model and put in an SSD as default.
idget - Tuesday, October 8, 2013 - link
Personally I prefer the design of the '09 models. Sure, it doesn't look as "pretty" as these ones (subjective), but I feel that on the iMacs, Apple's "thin is good" mentality is annoying.
The "bulge" at the back looks ugly. Also, because it's so thin, everything is moved to the back, including the SD card slot, which I feel is better on the side of the computer. It's a pain having to reach around the computer (especially on the 27-inch). Also, it needs more USB ports.
I feel that Apple has gone for looks and forgotten function on these iMacs.
alpha754293 - Tuesday, October 8, 2013 - link
How much power does it consume at full load and at idle?
MF2013 - Tuesday, October 8, 2013 - link
I'm not sure the graphics comparison to the DDR3 version of the GT 750M is fair. The DDR3 cripples the graphics performance, and while I don't know what the sales volume numbers are, my impression is that most 750Ms that ship do so with GDDR5. Also, anybody who games, cares about GPU performance, and is the least bit informed would make sure to get the GDDR5 version. And it's not like the GDDR5 version is expensive - the Lenovo Y500s that occasionally go on sale around $830 have 2GB of GDDR5.
So all this demonstrates is that Iris Pro graphics can compete with bandwidth-crippled discrete graphics. But if you don't care about gaming, you shouldn't care about whether the integrated graphics are competitive with discrete GPUs. And if you do care about gaming, this performance is still not good enough. The positive tone of this review doesn't seem justified.
speculatrix - Tuesday, October 8, 2013 - link
I just hope you never have to have it repaired out of warranty.
mschira - Tuesday, October 8, 2013 - link
Nice review, and interesting to see that with Crystalwell there is an integrated GPU that does have acceptable performance. It will be interesting to see it in a 13" rMBP.
What would also be interesting is a performance comparison of the new discrete GPUs: whether, and by how much, they improve on the 2012 models.
GPU performance is still a bottleneck for iMacs, I think (i.e., for gaming).
M.
mschira - Tuesday, October 8, 2013 - link
Could one buy a super-fast USB 3.0 SSD and build one's own Fusion Drive on the cheap?
Cheers
M.
name99 - Wednesday, October 9, 2013 - link
Yes --- if you're willing to be daring. Essentially you'd need to boot off a third drive (or the network), then use diskutil cs commands to create an LVG and then an LV tying the two drives together, roughly as sketched below. There are instructions on the web giving details.
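A minimal sketch, assuming disk0 is the internal SSD and disk1 is the external drive (those identifiers are placeholders, and this wipes both drives, so run it from Recovery or another boot volume entirely at your own risk):
diskutil list
# tie the two drives together into one CoreStorage logical volume group
diskutil cs create "FusionLVG" disk0 disk1
# note the LVG UUID printed above (or run 'diskutil cs list'), then
# carve a single logical volume out of the whole group
diskutil cs createVolume <lvgUUID> jhfs+ "Fusion HD" 100%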
On a different point, I don't think Anand is correct in saying that Fusion works better than other hybrid solutions because it tracks blocks. I think the real answer is that it does a MUCH smarter job of tracking file "temperature". OS X has, since about 10.3, tracked file temperature (which is essentially a combination of how large the file is and how often it's accessed). This was done back then (and is still done) to move the hottest files to a small "hot files" area at the start of a disk, for the obvious performance reasons.
Details here:
http://osxbook.com/book/bonus/misc/optimizations/#...
My guess is that Fusion essentially hooks into this mechanism, and just redefines the constants controlling how large the hot file area is to have it cover all of the SSD (minus of course the area for file system metadata, the area that is reserved for fast writes, and so on).
I can't think of any realistic situation (within pure OS X) where tracking by blocks rather than files is useful, and it would require a whole new way of looking at the problem. I think the obvious way to test this would be to look at the behavior of VM images, which one assumes don't count as hot as a whole because they are very large, but which do have hot blocks inside them. If you look at I/O when, say, starting up a VM, do you see all the I/O coming from the HDD, or do you see it all come from the SSD, with HDD accesses coming later once the VM is booted and we're pulling in less frequently accessed blocks?
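One crude way to watch that experiment, with disk0/disk1 standing in for whatever identifiers diskutil list reports for the SSD and HDD halves of the Fusion Drive:
# print per-disk throughput once a second while the VM boots
iostat -d -w 1 disk0 disk1
If hot blocks really are promoted individually, most of the traffic during boot should land on the SSD device rather than the HDD.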
Risas - Thursday, October 10, 2013 - link
What a silly review and comparison. Apple vs. Apple; conclusion: Moore's law still works.
CPU performance: you write as if Apple did some work here, when all they did was change the Intel CPU. An Apple-only CPU performance comparison? Is there no comparison against other Intel CPUs? Just look at them; what do you expect?
And over and over... Apple iMac vs. Apple iMacs. Who wins? Apple, of course.
Final words: "Apple continues to have the strongest Mac lineup of its history." Does anyone else have a Mac lineup? Of course not, since it's a monopoly by definition, so it's unnecessary to state the obvious.
"Apple's Haswell"?? You write the article as if Apple had anything to do with Haswell's design or manufacture... Maybe Intel has something to say about that...
And here is what really matters: The iMac’s industrial design is beautiful.
No complaints about the small, wrist-breaking keyboard or anything else; when Apple is only up against Apple, the winner will always be Apple.
Commodus - Thursday, October 17, 2013 - link
I'd say the Apple vs. Apple comparisons are valid because of the limited all-in-one market. It's hard to find something comparable; a lot of iMac alternatives are either budget models or use significantly different components (see the ASUS Transformer AIO or Lenovo IdeaCentre A730 as examples). The company practically dominates the category, at least in North America and Europe.
Haplodepatrijn - Thursday, October 10, 2013 - link
Does this mean we can finally use GPU rendering in Blender?
shweetuant - Friday, October 25, 2013 - link
Hi, if I'm planning to run Windows 7 64-bit in a virtual machine side by side with OS X, is the default 8GB of RAM sufficient?
kkirk - Tuesday, December 3, 2013 - link
I was wondering if there will be a late 2013 MacBook Pro 15" review coming soon?