He said "Hey Intel"; Intel is evangelist of x86, so he probably meant "Hey Intel! It's time to release new updated x86 CPU (read: Broadwell/Skylake) not only for portables, but for bigger machines too" :)
Heard of i960, StrongARM, or Itanium? No? Intel is very much a one trick pony. They, may be really good at that trick, but past endeavors have proven they are not good at pushing new arch. Granted most of the reason Itanium was so bad was because they let HP and other partners influence its instruction set and design, but it still sucks.
The 'only' problem with Itanium ISA is that it's VLIW, but that was the basic idea of the whole thing (what had the chance back then to turn out to be a good idea as well), the rest of the ISA is the most well-thought I have ever seen.
Relying on the compiler to determine order, parallelisms, and branch predictions was too ambitious. Borderline insane to think it was possible at this point in time. Despite the large transistor count it would make a better coprocessor.
Are you kidding me? No Displayport 1.3, No Thunderbolt 3, & not even Broadwell? They didn't even bother adding USB Type-C.
They should have done what they did with the Mac Pro last year and have such things part of the product and just allowed people to pre-order.
This device still does admittedly blows alternatives like the Dell 5K monitor for mainstream professionals w/o intensive power needs (otherwise they would've opted for a specialized workstation or a Mac Pro).
There are no DP 1.3 or USB Type-C products on the market right now and probably won't be for several months. The TB3 spec has not been finalized and won't be until next year with Skylake. The retina iMacs use desktop class processors which Intel is skipping with Broadwell. I really don't see what the problem is.
The Displayport 1.3 standard was just released in September 2014. So no one has the parts for it yet. Apple had to custom modify the internal DisplayPort 1.2 hardware to run it like Displayport 1.3 internally.
"Too bad this thing isn't using the 970/980M from Nvidia."
I'm pretty sure Apple weighed the pros and cons of using the 970/980M. Ultimately the better deal was to partner with AMD. We will found on the reasons for this partnership in the future, but for now I wouldn't jump to conclusions and assume that the consumer was better off with an Nvidia part.
There plenty of Pros for going with AMD. Though maybe not directly for the benefit of Apples customer. 1) Apple is having a patent fight with Nvidia. I'm sure they are not happy with Nvidia. 2) Apple, Microsoft and Sony all want to see AMD around to have competition for Intel / Nvidia, Giving AMD business to keep it afloat helps, even if they sacrifice some performance 3) AMD has less leverage in price negotiations compared to Nvidia due to it's much poorer financial situation. Most likely they are willing to work with smaller margins just to stay alive. And they might be willing to do more customization job to get these deals, as they are really struggling.
Not to mention that not supporting CUDA on the OSX basically forces software vendors to write OpenCL support into their applications. They might hate it, but they don't really have good options except abandoning OSX all together which doesn't seem like an option as Apple has surprising marketshare in some sectors.
It'd be interesting if Apple bought Nvidia. Nvidia has a market cap of only about 10 Billions. Less than half a year of Apples profits...
Apple would not buy nVidia, as Apple does not sell graphics cards and other components, only complete consumer devices. So why pay for entire well running business, when you need just their R&D?
AMD could have won it due to their willingness(?) to customize the DP things. And maybe give some exclusivity on M295X + lower prices. It is also question whether 970M/980M were not late to the party.
Same reason in why Apple built their own ARM chips for their iPhone and iPad? Why bother with the in house processor while Apple only sell complete devices? One definite answer is lower production costs. So I don't see no difference between Apple bought nVidia and Apple bought a few ARM design firms in the past.
Not even close. The new Mac Mini is dual-core only, with Intel Iris (not Pro) graphics.
If your workload is simple (view high-res data without GPU compute transforms,) then it would work fine. But if you do CPU-intense or GPU-intense things, the iMac will blow it out of the water.
According to their website, GPU can be not only new R9 M295X (4 GB VRAM), supposed to be some kind of Tonga GPU, but also R9 M290X (4 GB VRAM), which is "good old Pitcairn/Curacao", if I'm not mistaken. Didn't know Pitcairn/Curacao is 5K-capable. May be a new chip revision though.
Maybe someone can help me out here: Why are you so sure that they're using DP at all? I've never seen DisplayPort used to drive an internal display so far...
Displayport was introduced in laptops around 2 years ago. Many of the ultrabooks have an eDP connector (eDP, embedded displayport, is the same as normal DisplayPort, but with an different form factor). All laptops with 3D panels also have an eDP connector, as well as most IPS panels. I would like an better display in met 15.6" laptop, but the only IPS panel with lvds available in this form factor is the sony orangegate panel. At least that was the case about one year ago.
Not sure what you're saying here. LVDS is the de-facto standard in used in notebooks, that why I said I've never seen eDP (or iDP) being used there instead of the usual FPD. As far as I know DP also uses LVDS (rather than TMDS or even CML) so adapting displays from FPD to eDP might be even possible.
I'm surprised at the starting price. This is the same as the price of the new Dell 5k machi e which is supposed to,use two 2880 panels, and costs the same $2,500 list price.
I wonder if Dell will now have to lower the price of their monitor, as it now looks overpriced.
Semiaccurate correctly predicted that Apple would switch from using Nvidia GPUs to AMD in Mac products. The reason why is retalitation for Nvidia suing Apple for patent infringment over iPhone GPU. Nvidia is suing Samsung and Qualcomm, but they are also suing Apple for the same patents, although that hasn't been made public yet. More details here: http://semiaccurate.com/2014/09/04/n...accurate-sa...
Apple has played Nvidia and ATI/AMD against each other for more than a decade. I highly doubt they care at all about a patent lawsuit, as opposed to things like delivery date, volume, and pricing. Remember that they still use Samsung as a major supplier to this day.
Like the author said. I bet it has more to do with AMD's capabilities in customising the Display port output and building custom PCB's for Apple much like they have done with the GPU on the Mac Pro. AMD is apparently much better equipped to build custom-design GPU's over nVidia. Hence why AMD is powering the XBONE and PS4.
The M295X is more than likely just a rebrand. No customisation, although it supports FreeSync - that's just standard on AMD GPU's. That's completely different to Xbone and PS4 which use semi-custom APU's.
No, it has more to do with OpenCL and pushing away from CUDA. Both need that to happen: Apple wants to do it to always have more options on what GPU they have available to consider for their new products, and ATI definitely needs to better penetrate the workstation/professional market--this is a market that NVidia destroyed them in (67%+ of the market).
"At 5120x2880 pixels, the new Retina 5K Display is precisely 4x the pixels of the 2560x1440 panel in last year’s model."
4x? Granted, I'm really bad at math, but when you double something, isn't it only 2x?
Also, even though Apple has stated that this display can't be used in Target Display Mode, that would change if the firmware for the DisplayPort could be updated later to 1.3.
When you double a rectangle along both sides, you end up with four times the area. So when you double the number of pixels in both directions, you end up with four times as many in total.
I guess I was half expecting ATI's to be in the new Mac's, but how will this effect pro users? Especially Adobe After Effect/Premiere users who rely on Cuda to some extent? I can't recall if ATI had a Cuda alternative and if Adobe was going to even bother supporting it.
Adobe has long supported the OpenCL open standard for acceleration of their entire Suite. Modern AMD GPUs are very fast at OpenCL which is why this wouldn't be an issue.
This Russian is talking complete sense. I also wanted to add that not only is OpenCL beginning to come into it's own(it's still slightly behind CUDA in terms of some processes - but that will improve over time), you can also still get the advantages of CUDA by plugging in a Thunderbolt expansion chassis with a CUDA enabled card. The CUDA card will be recognized across the Thunderbolt bus, and then you can assign that card to be used by Adobe CC for whatever tasks you want. It's really smooth.
Has anyone confirmed if this iMac will work properly with Windows installed? Most Apple systems run Windows fine, but I can see the monitor's custom TCON potentially being a problem here on the graphics driver side.
On http://www.apple.com/imac-with-retina/osx/ it says : "If you want to run Windows on your Mac, you can do that with Boot Camp." So I suppose they have included Windows drivers as well for the display - would be awesome!
I wouldn't consider that as confirmation. Fact is - you cannot run Windows on a machine with a Fusion Drive, meaning that no standard configuration iMac 5k will run Windows. So this text is just irrelevant.
The other bloke responded quite aggressively. He's right however. Fusion Drive equipped Macs can run Windows just fine, you're not entirely off bat however. The Windows part of the machine will use only the spinning platter part of the disk, and ignore the SSD.
Something tells me that if Apple had the time to make custom silicone, they likely took the time (or are taking the time now) to write a driver as well. Of course, there are no guarantees... but they know that people rely on that functionality, so it would stand to reason that they would.
Amazing that dell is going to release a similar monitor for the same price. Apple is including a free computer lol. I thought Apple was the expensive one? Finally apple is taking advantage of it's amazing supply chain to offer crazy technology at the lowest prices and the highest profit margins.
If one wanted to use a 2560 x 1440 secondary display connected by Thunderbolt, would it be necessary to get the upgraded GPU with 4GB of video RAM or might the R9 M290x with 2GB be sufficient?
In case anyone missed iFixit's teardown ( https://www.ifixit.com/Teardown/iMac+Intel+27-Inch... ), the TCON is a semi-custom chip made by Parade for Apple. The DP665, as it's marked, looks to be an 8-lane eDP 1.3 timing controller with PSR.
So if the GPU isn't treating the display as two separate tiles and using two separate DP 1.2 outputs, the sneaky bit would be how Apple managed to get a single display controller and digital encoder / transmitter block to support 5120x2880, and then bind both links of a UNIPHY transmitter block into a single 8-lane eDP interface. I guess this works fine with TMDS for Dual-Link DVI, so maybe it was possible all along for DisplayPort as well. It's odd to think that Pitcairn can pull off driving a 5120x2880 panel as a single tile though.
I'm guessing the flexibility of AMD's digital encoder / transmitter blocks and PHYs is the primary reason Apple went with their GPUs for the Retina iMac.
I am kind of curious if they are just doing something like running at 18-bit color (6bpc) with dithering or something. Not even using that extreme of timings (regular cvt-r) one can run 5120x2880 @ 60Hz via a 938 Mhz pixelclock which is within spec of what DP 1.2 can do @ 6 bpc (limit is 960 Mhz). The 8bpc limit is 720 Mhz pixel clock and if they are staying true to that no timings would be capable of of fitting it in 720 Mhz as absolutely 0 extra blanking would still be 884 Mhz pixelclock.
Yeah, but like I said, if you look at the teardown photos which show the TCON, it's pretty clearly an 8-lane eDP job, which is more than enough for 5120x2880, 24 bpp, 60 Hz using HBR2. I seriously doubt they would double the number of signaling pairs between the GPU and TCON unless they were actually using them.
What I'm wondering--for future proofing purposes, as I'm about to drop over $3k on one of these--is: Do you or others you might check in with think it's possible and therefore probable that a third party or perhaps software upgrade from Apple will allow this iMac to be used in *4k* (yes, 4, not 5) Target Display Mode at some point? That'd at least be a nice consolation prize for having bought early and getting to use it with computer at 5k now, knowing that at some point--even with the DP 1.2, unlikely to ever be upgradable--that this could be used as a secondary 4k display, at the least?
Hi, I have stock of Brand New Apple iPhone 6 - 64GB Unlocked phones for sale at $650 only, sealed in box with 1 year warranty. Available in Gold , silver and space grey colors
We’ve updated our terms. By continuing to use the site and/or by logging into your account, you agree to the Site’s updated Terms of Use and Privacy Policy.
83 Comments
ezinner - Thursday, October 16, 2014 - link
Hey Intel, it's time for a new processor family to launch. How much longer will we push the x86 architecture?
widL - Thursday, October 16, 2014 - link
Are you aware that moving to a new instruction set architecture would break every piece of software ever developed until now?
TiGr1982 - Thursday, October 16, 2014 - link
He said "Hey Intel"; Intel is evangelist of x86, so he probably meant"Hey Intel! It's time to release new updated x86 CPU (read: Broadwell/Skylake) not only for portables, but for bigger machines too" :)
Morawka - Thursday, October 16, 2014 - link
Not if you had a translator or emulator. We've done this many times before.
StevoLincolnite - Saturday, October 18, 2014 - link
Meet: Binary Translation.
CharonPDX - Thursday, October 16, 2014 - link
We did. A few years ago. It's called x86-64.
ppi - Friday, October 17, 2014 - link
It was AMD :-P
hpglow - Thursday, October 16, 2014 - link
Heard of i960, StrongARM, or Itanium? No? Intel is very much a one-trick pony. They may be really good at that trick, but past endeavors have proven they are not good at pushing a new arch. Granted, most of the reason Itanium was so bad was because they let HP and other partners influence its instruction set and design, but it still sucks.
xdrol - Thursday, October 16, 2014 - link
The 'only' problem with the Itanium ISA is that it's VLIW, but that was the basic idea of the whole thing (which had a chance back then of turning out to be a good idea as well); the rest of the ISA is the most well-thought-out I have ever seen.
ZeDestructor - Thursday, October 16, 2014 - link
That, and it having three non-virtualizable instructions...
hpglow - Thursday, October 16, 2014 - link
Relying on the compiler to determine ordering, parallelism, and branch prediction was too ambitious. Borderline insane to think it was possible at that point in time. Despite the large transistor count, it would make a better coprocessor.
widL - Thursday, October 16, 2014 - link
The i7 CPU is actually the 4790K, since the 4790 has a 3.6 GHz base clock.
TiGr1982 - Thursday, October 16, 2014 - link
That is probably the case, because their website states: "Configurable to 4.0GHz quad-core Intel Core i7 (Turbo Boost up to 4.4GHz)"
Laxaa - Thursday, October 16, 2014 - link
Most exciting Apple announcement in a while for me. The price is surprisingly competitive as well.
lilkwarrior - Friday, October 17, 2014 - link
Are you kidding me? No DisplayPort 1.3, no Thunderbolt 3, & not even Broadwell? They didn't even bother adding USB Type-C.
They should have done what they did with the Mac Pro last year and made such things part of the product and just allowed people to pre-order.
This device still admittedly blows away alternatives like the Dell 5K monitor for mainstream professionals w/o intensive power needs (otherwise they would've opted for a specialized workstation or a Mac Pro).
SirKnobsworth - Friday, October 17, 2014 - link
There are no DP 1.3 or USB Type-C products on the market right now and probably won't be for several months. The TB3 spec has not been finalized and won't be until next year with Skylake. The retina iMacs use desktop class processors which Intel is skipping with Broadwell. I really don't see what the problem is.
jameskatt - Friday, October 17, 2014 - link
ARE YOU KIDDING US?? Duh: DisplayPort 1.3, Thunderbolt 3, USB Type-C and Intel's Broadwell desktop CPUs are ALL VAPORWARE. They aren't out yet. Double Duh.
Apple cannot make products out of vaporware. That is why they are not in the new iMac.
In fact, since no one else makes a 5K timing controller for the monitor, even Apple had to design and manufacture the timing controller chip itself.
jameskatt - Friday, October 17, 2014 - link
The DisplayPort 1.3 standard was just released in September 2014. So no one has the parts for it yet. Apple had to custom-modify the internal DisplayPort 1.2 hardware to run it like DisplayPort 1.3 internally.
abrowne1993 - Thursday, October 16, 2014 - link
Too bad this thing isn't using the 970/980M from Nvidia.
When are we going to see displays with DisplayPort 1.3? Should be able to do 5K with no compromises, right?
anandreader106 - Thursday, October 16, 2014 - link
"Too bad this thing isn't using the 970/980M from Nvidia."I'm pretty sure Apple weighed the pros and cons of using the 970/980M. Ultimately the better deal was to partner with AMD. We will found on the reasons for this partnership in the future, but for now I wouldn't jump to conclusions and assume that the consumer was better off with an Nvidia part.
zepi - Friday, October 17, 2014 - link
There are plenty of pros for going with AMD, though maybe not directly for the benefit of Apple's customers.
1) Apple is having a patent fight with Nvidia. I'm sure they are not happy with Nvidia.
2) Apple, Microsoft, and Sony all want to see AMD around to have competition for Intel/Nvidia. Giving AMD business to keep it afloat helps, even if they sacrifice some performance.
3) AMD has less leverage in price negotiations compared to Nvidia due to its much poorer financial situation. Most likely they are willing to work with smaller margins just to stay alive. And they might be willing to do more customization work to get these deals, as they are really struggling.
Not to mention that not supporting CUDA on OS X basically forces software vendors to write OpenCL support into their applications. They might hate it, but they don't really have good options except abandoning OS X altogether, which doesn't seem like an option as Apple has surprising market share in some sectors.
It'd be interesting if Apple bought Nvidia. Nvidia has a market cap of only about $10 billion. Less than half a year of Apple's profits...
ppi - Friday, October 17, 2014 - link
Apple would not buy nVidia, as Apple does not sell graphics cards and other components, only complete consumer devices. So why pay for an entire well-running business when you need just their R&D?
AMD could have won it due to their willingness(?) to customize the DP things. And maybe give some exclusivity on the M295X + lower prices. It is also a question whether the 970M/980M were not late to the party.
iSayuSay - Saturday, October 18, 2014 - link
Same reason why Apple built their own ARM chips for the iPhone and iPad? Why bother with an in-house processor when Apple only sells complete devices? One definite answer is lower production costs. So I don't see any difference between Apple buying nVidia and Apple buying a few ARM design firms in the past.
tipoo - Thursday, October 16, 2014 - link
I wonder how comparable the new Mac Mini and the 999 dollar Vizio 4K TV would be... Though that would be too massive for professional work maybe.
CharonPDX - Thursday, October 16, 2014 - link
Not even close. The new Mac Mini is dual-core only, with Intel Iris (not Pro) graphics.
If your workload is simple (viewing high-res data without GPU compute transforms), then it would work fine. But if you do CPU-intense or GPU-intense things, the iMac will blow it out of the water.
jameskatt - Friday, October 17, 2014 - link
jameskatt - Friday, October 17, 2014 - link
This is where Apple went backwards. The old Mac Mini could do a quad-core i7 CPU.
TiGr1982 - Thursday, October 16, 2014 - link
According to their website, the GPU can be not only the new R9 M295X (4 GB VRAM), supposed to be some kind of Tonga GPU,
but also the R9 M290X (4 GB VRAM), which is "good old Pitcairn/Curacao", if I'm not mistaken.
Didn't know Pitcairn/Curacao was 5K-capable. Maybe a new chip revision, though.
Daniel Egger - Thursday, October 16, 2014 - link
Maybe someone can help me out here: Why are you so sure that they're using DP at all? I've never seen DisplayPort used to drive an internal display so far...
TiGr1982 - Thursday, October 16, 2014 - link
There is an embedded DisplayPort standard (eDP): http://en.wikipedia.org/wiki/DisplayPort#eDP
Though I'm not sure they're actually using it here.
Darkstone - Thursday, October 16, 2014 - link
DisplayPort was introduced in laptops around 2 years ago. Many of the ultrabooks have an eDP connector (eDP, embedded DisplayPort, is the same as normal DisplayPort, but with a different form factor). All laptops with 3D panels also have an eDP connector, as well as most IPS panels. I would like a better display in my 15.6" laptop, but the only IPS panel with LVDS available in this form factor is the Sony orangegate panel. At least that was the case about one year ago.
Daniel Egger - Friday, October 17, 2014 - link
Not sure what you're saying here. LVDS is the de-facto standard used in notebooks; that's why I said I've never seen eDP (or iDP) being used there instead of the usual FPD. As far as I know DP also uses LVDS (rather than TMDS or even CML), so adapting displays from FPD to eDP might even be possible.
melgross - Thursday, October 16, 2014 - link
I'm surprised at the starting price. This is the same as the price of the new Dell 5K monitor, which is supposed to use two 2880 panels and costs the same $2,500 list price.
I wonder if Dell will now have to lower the price of their monitor, as it now looks overpriced.
tynopik - Thursday, October 16, 2014 - link
tynopik - Thursday, October 16, 2014 - link
"through enough""single 5120x2880 single"
ThomasS31 - Thursday, October 16, 2014 - link
But is it a smooth and responsive experience?
TiGr1982 - Thursday, October 16, 2014 - link
Well, for the price it costs, it should be (and I guess it is).
lefty2 - Thursday, October 16, 2014 - link
Semiaccurate correctly predicted that Apple would switch from using Nvidia GPUs to AMD in Mac products. The reason why is retaliation for Nvidia suing Apple for patent infringement over the iPhone GPU. Nvidia is suing Samsung and Qualcomm, but they are also suing Apple for the same patents, although that hasn't been made public yet.
More details here:
http://semiaccurate.com/2014/09/04/n...accurate-sa...
lefty2 - Thursday, October 16, 2014 - link
sorry, link was wrong: http://semiaccurate.com/2014/09/04/nvidia-sues-sam...
Cygni - Thursday, October 16, 2014 - link
Apple has played Nvidia and ATI/AMD against each other for more than a decade. I highly doubt they care at all about a patent lawsuit, as opposed to things like delivery date, volume, and pricing. Remember that they still use Samsung as a major supplier to this day.
scottrichardson - Thursday, October 16, 2014 - link
Like the author said, I bet it has more to do with AMD's capabilities in customising the DisplayPort output and building custom PCBs for Apple, much like they have done with the GPU on the Mac Pro. AMD is apparently much better equipped to build custom-design GPUs than nVidia. Hence why AMD is powering the XBONE and PS4.
lefty2 - Thursday, October 16, 2014 - link
The M295X is more than likely just a rebrand. No customisation, although it supports FreeSync - that's just standard on AMD GPUs. That's completely different to the Xbone and PS4, which use semi-custom APUs.
vFunct - Thursday, October 16, 2014 - link
M295X is actually their new Tonga architecture, and the Desktop version is the R9 275X, a mid-grade consumer desktop part.
TiGr1982 - Friday, October 17, 2014 - link
There is no desktop R9 275X (these were old invalid rumors); the only released desktop Tonga-based card is called R9 285 (no X).
ppi - Friday, October 17, 2014 - link
You know, they could have working FreeSync over whatever interface the screen uses (they demoed an eDP notebook a year ago).
Spunjji - Tuesday, October 21, 2014 - link
I personally have some hopes for this, although Apple would probably have made up a name for it and marketed it as their own if this were true...
lilkwarrior - Friday, October 17, 2014 - link
No, it has more to do with OpenCL and pushing away from CUDA. Both need that to happen: Apple wants to do it so it always has more options on what GPUs are available to consider for its new products, and ATI definitely needs to better penetrate the workstation/professional market--this is a market that NVidia destroyed them in (67%+ of the market).
monstercameron - Thursday, October 16, 2014 - link
How funny would it be if the panel was FreeSync capable.
tipoo - Sunday, October 19, 2014 - link
Funny? That would be awesome!
Ulf Hednar - Thursday, October 16, 2014 - link
"At 5120x2880 pixels, the new Retina 5K Display is precisely 4x the pixels of the 2560x1440 panel in last year’s model."4x? Granted, I'm really bad at math, but when you double something, isn't it only 2x?
Also, even though Apple has stated that this display can't be used in Target Display Mode, that would change if the firmware for the DisplayPort could be updated later to 1.3.
lowlymarine - Thursday, October 16, 2014 - link
When you double a rectangle along both sides, you end up with four times the area. So when you double the number of pixels in both directions, you end up with four times as many in total.
Ulf Hednar - Thursday, October 16, 2014 - link
Thanks! :)
Ulf Hednar - Thursday, October 16, 2014 - link
Man, I am really bad at math. Phew! But the second part still stands.
MamiyaOtaru - Monday, October 20, 2014 - link
Try actually doing the arithmetic :) 5120*2880 compared to 2560*1440. Compare 4*4 vs 2*2. Is the former only twice as much?
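Spelled out: 5120 x 2880 = 14,745,600 pixels, while 2560 x 1440 = 3,686,400 pixels, and 14,745,600 / 3,686,400 = exactly 4.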
palladium - Friday, October 17, 2014 - link
What's the refresh rate? 30Hz or 60Hz?
André - Friday, October 17, 2014 - link
If it wasn't 60Hz, you would have noticed it immediately just by moving the cursor across the screen.
It seems silky smooth from the videos that have been posted of it in action from last night's event.
joos2000 - Monday, October 27, 2014 - link
It appeared to be above 60 Hz in the 30-60Hz videos someone posted on YouTube? That's funny!
deeviousgenius - Friday, October 17, 2014 - link
I guess I was half expecting ATI GPUs to be in the new Macs, but how will this affect pro users? Especially Adobe After Effects/Premiere users who rely on CUDA to some extent? I can't recall if ATI had a CUDA alternative and if Adobe was going to even bother supporting it.
RussianSensation - Friday, October 17, 2014 - link
Adobe has long supported the OpenCL open standard for acceleration of their entire Suite. Modern AMD GPUs are very fast at OpenCL, which is why this wouldn't be an issue.
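To make the vendor-neutral point concrete, here's a minimal sketch using the PyOpenCL bindings (not anything Adobe ships; the vector-add kernel is purely illustrative). The same OpenCL C source runs unchanged on whatever GPU the system exposes, whether it comes from AMD, Nvidia, or Intel:

```python
# A minimal, vendor-neutral OpenCL sketch using the PyOpenCL bindings.
# Nothing here is Adobe-specific; the vector-add kernel is purely illustrative.
import numpy as np
import pyopencl as cl

KERNEL_SRC = """
__kernel void add(__global const float *a,
                  __global const float *b,
                  __global float *out)
{
    int i = get_global_id(0);
    out[i] = a[i] + b[i];
}
"""

# List every OpenCL platform/device the drivers expose (AMD, Nvidia, Intel...).
for platform in cl.get_platforms():
    for device in platform.get_devices():
        print(platform.name, "->", device.name)

# Build and run the same kernel source on whichever device gets picked.
ctx = cl.create_some_context(interactive=False)
queue = cl.CommandQueue(ctx)

a = np.random.rand(1024).astype(np.float32)
b = np.random.rand(1024).astype(np.float32)
out = np.empty_like(a)

mf = cl.mem_flags
a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, out.nbytes)

prg = cl.Program(ctx, KERNEL_SRC).build()
prg.add(queue, a.shape, None, a_buf, b_buf, out_buf)
cl.enqueue_copy(queue, out, out_buf)

assert np.allclose(out, a + b)
print("OpenCL vector add OK on:", ctx.devices[0].name)
```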
iPhil - Friday, October 17, 2014 - link
This Russian is talking complete sense. I also wanted to add that not only is OpenCL beginning to come into its own (it's still slightly behind CUDA in terms of some processes - but that will improve over time), you can also still get the advantages of CUDA by plugging in a Thunderbolt expansion chassis with a CUDA-enabled card. The CUDA card will be recognized across the Thunderbolt bus, and then you can assign that card to be used by Adobe CC for whatever tasks you want. It's really smooth.
tipoo - Sunday, October 19, 2014 - link
Yup, in fact GCN is in many ways even better at compute.
JDG1980 - Friday, October 17, 2014 - link
Has anyone confirmed if this iMac will work properly with Windows installed? Most Apple systems run Windows fine, but I can see the monitor's custom TCON potentially being a problem here on the graphics driver side.
fredzer - Friday, October 17, 2014 - link
On http://www.apple.com/imac-with-retina/osx/ it says: "If you want to run Windows on your Mac, you can do that with Boot Camp." So I suppose they have included Windows drivers as well for the display - would be awesome!
milleron - Saturday, October 18, 2014 - link
Hope that's not just boilerplate text copied and pasted from the features list of prior iMacs.
odedia - Friday, October 24, 2014 - link
I wouldn't consider that as confirmation. Fact is - you cannot run Windows on a machine with a Fusion Drive, meaning that no standard configuration iMac 5k will run Windows. So this text is just irrelevant.
ImRightYoureWrong - Sunday, October 26, 2014 - link
Fact is - You CAN run Windows on a Fusion Drive.
Please GTFO this website if you're not going to do actual research. You're just spouting pure BS.
casperes1996 - Monday, November 24, 2014 - link
The other bloke responded quite aggressively. He's right, however. Fusion Drive-equipped Macs can run Windows just fine; you're not entirely off base, though. The Windows part of the machine will use only the spinning platter part of the disk, and ignore the SSD.
iPhil - Friday, October 17, 2014 - link
Something tells me that if Apple had the time to make custom silicon, they likely took the time (or are taking the time now) to write a driver as well. Of course, there are no guarantees... but they know that people rely on that functionality, so it would stand to reason that they would.
Deelron - Friday, October 17, 2014 - link
Agreed, but I'd be in no rush to be one of the first ones to find out!
TEAMSWITCHER - Saturday, October 18, 2014 - link
PCs don't work properly with Windows installed.
tralalalalalala40 - Friday, October 17, 2014 - link
Amazing that Dell is going to release a similar monitor for the same price. Apple is including a free computer, lol. I thought Apple was the expensive one? Finally Apple is taking advantage of its amazing supply chain to offer crazy technology at the lowest prices and the highest profit margins.
tipoo - Sunday, October 19, 2014 - link
And isn't the Dell one multiple tiles, while this is single tile?
SirKnobsworth - Wednesday, October 22, 2014 - link
Because you can't drive that many pixels over a single external connection.
milleron - Friday, October 17, 2014 - link
If one wanted to use a 2560 x 1440 secondary display connected by Thunderbolt, would it be necessary to get the upgraded GPU with 4GB of video RAM or might the R9 M290x with 2GB be sufficient?
milleron - Saturday, October 18, 2014 - link
Hope that's not just boilerplate text copied and pasted from the features list of prior iMacs.
GC2:CS - Saturday, October 18, 2014 - link
Holy **** a 27" screen with 200 ppi....
repoman27 - Sunday, October 19, 2014 - link
In case anyone missed iFixit's teardown ( https://www.ifixit.com/Teardown/iMac+Intel+27-Inch... ), the TCON is a semi-custom chip made by Parade for Apple. The DP665, as it's marked, looks to be an 8-lane eDP 1.3 timing controller with PSR.
So if the GPU isn't treating the display as two separate tiles and using two separate DP 1.2 outputs, the sneaky bit would be how Apple managed to get a single display controller and digital encoder / transmitter block to support 5120x2880, and then bind both links of a UNIPHY transmitter block into a single 8-lane eDP interface. I guess this works fine with TMDS for Dual-Link DVI, so maybe it was possible all along for DisplayPort as well. It's odd to think that Pitcairn can pull off driving a 5120x2880 panel as a single tile though.
I'm guessing the flexibility of AMD's digital encoder / transmitter blocks and PHYs is the primary reason Apple went with their GPUs for the Retina iMac.
houkouonchi - Saturday, October 25, 2014 - link
I am kind of curious if they are just doing something like running at 18-bit color (6bpc) with dithering or something. Not even using that extreme of timings (regular CVT-R), one can run 5120x2880 @ 60Hz via a 938 MHz pixel clock, which is within spec of what DP 1.2 can do @ 6 bpc (the limit is 960 MHz). The 8bpc limit is a 720 MHz pixel clock, and if they are staying true to that, no timings would be capable of fitting it in 720 MHz, as absolutely 0 extra blanking would still be an 884 MHz pixel clock.
repoman27 - Sunday, October 26, 2014 - link
Yeah, but like I said, if you look at the teardown photos which show the TCON, it's pretty clearly an 8-lane eDP job, which is more than enough for 5120x2880, 24 bpp, 60 Hz using HBR2. I seriously doubt they would double the number of signaling pairs between the GPU and TCON unless they were actually using them.
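For anyone who wants to check the numbers in this exchange, here's a quick back-of-the-envelope sketch. It assumes HBR2 signaling (5.4 Gbit/s per lane) with 8b/10b encoding, as in DP 1.2; the ~938 MHz CVT-R pixel clock is the figure quoted above, not something measured here:

```python
# Back-of-the-envelope DisplayPort link budget for 5120x2880 @ 60 Hz.
# Assumes HBR2 signaling (5.4 Gbit/s per lane) with 8b/10b encoding overhead.
HBR2_LANE_GBPS = 5.4      # raw line rate per lane
EFFICIENCY = 8 / 10       # 8b/10b leaves 80% of the raw rate for pixel data
H, V, HZ = 5120, 2880, 60

def max_pixel_clock_mhz(lanes: int, bpp: int) -> float:
    """Highest pixel clock the link can carry at a given bits-per-pixel."""
    payload_gbps = lanes * HBR2_LANE_GBPS * EFFICIENCY
    return payload_gbps * 1000 / bpp

floor_mhz = H * V * HZ / 1e6  # zero-blanking lower bound: ~885 MHz

print(f"5K60 needs at least        {floor_mhz:7.1f} MHz (with no blanking at all)")
print(f"4-lane DP 1.2 @ 24 bpp max {max_pixel_clock_mhz(4, 24):7.1f} MHz  -> too slow")
print(f"4-lane DP 1.2 @ 18 bpp max {max_pixel_clock_mhz(4, 18):7.1f} MHz  -> barely fits 938 MHz CVT-R")
print(f"8-lane eDP    @ 24 bpp max {max_pixel_clock_mhz(8, 24):7.1f} MHz  -> ample headroom")
```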
likethesky - Wednesday, November 5, 2014 - link
Hey repoman27 or others ~
What I'm wondering--for future-proofing purposes, as I'm about to drop over $3k on one of these--is: do you, or others you might check in with, think it's possible and therefore probable that a third party or perhaps a software upgrade from Apple will allow this iMac to be used in *4k* (yes, 4, not 5) Target Display Mode at some point? That'd at least be a nice consolation prize for having bought early and getting to use it with the computer at 5k now, knowing that at some point--even with the DP 1.2, unlikely to ever be upgradable--this could be used as a secondary 4k display, at the least?
I'd love to know the answer to that question.
jporomaa - Tuesday, October 21, 2014 - link
Will AnandTech do some benchmarking on it?
odedia - Friday, October 24, 2014 - link
I'm waiting for this review any day. This is the only thing that keeps me from ordering this machine!
biketourist - Sunday, November 23, 2014 - link
Same here. Is AnandTech working on a full review? I would have expected it by now.
leonhk1 - Wednesday, October 29, 2014 - link
Hi, I have stock of Brand New Apple iPhone 6 - 64GB Unlocked phones for sale at $650 only, sealed in box with 1 year warranty. Available in Gold, silver and space grey colors.
Interested buyer should E-mail me at: [email protected]
casperes1996 - Monday, November 24, 2014 - link
So when's the review coming?