220 Comments

  • Darkworld - Wednesday, May 20, 2020 - link

    10500k?
  • Chaitanya - Wednesday, May 20, 2020 - link

    Pointless given R5 3000 family of CPUs.
  • yeeeeman - Wednesday, May 20, 2020 - link

    Yeah right. Except it will beat basically all and lineup in games. Otherwise it is pointless.
  • yeeeeman - Wednesday, May 20, 2020 - link

    All AMD lineup*
  • SKiT_R31 - Wednesday, May 20, 2020 - link

    Yeah with a 2080 Ti the flagship 10 series CPU beats AMD in most titles, generally by a single digit margin. Who is pairing a mid-low end CPU with such a GPU? Also if there were to be a 10500K, you probably don't need to look much further than the 9600K in the charts above.

    This may have been lost on you, but what CPU reviews like the above show is: unless you are running the most top-end flagship GPU and are doing low-resolution, high-fps gaming, AMD is better at every single price point. Just accept it, and move on.
  • Drkrieger01 - Wednesday, May 20, 2020 - link

    It also means that if you have purchased an Intel 6th gen CPU in i5 or i7 form, there's not much reason to upgrade unless you need more threads. And it will only be faster if you're using those threads effectively. I'm still running an i5 6600K, granted it's running at 4.6GHz - there's no reason for me to upgrade until either Intel and/or AMD come up with a better architecture and frequency combination (IPC + clock speed).
    I'll likely be taking the jump back to AMD for the Ryzen 4000's after a long run since the Sandy Bridge era.

    Anyone needing only 4-6 cores should wait until then as well.
  • Samus - Thursday, May 21, 2020 - link

    That's most people, including me. I'm still riding my Haswell 4C/8T because for my applications the only thing more cores will get me is faster unraring of my porn.
  • Lord of the Bored - Thursday, May 21, 2020 - link

    Hey, that's an important task!
  • Hxx - Wednesday, May 20, 2020 - link

    At 1440p Intel still leads in gaming. It may not lead by much, or may not lead by enough to warrant buying it over AMD, but the person buying this chip is rocking a high-end GPU and will likely upgrade to the next high-end GPU, and the performance gap will only widen in Intel's favor as the GPU becomes less of a bottleneck. So yeah, pairing this with a 2060 makes no sense, go AMD. But pairing this with a 2080 Ti and a soon-to-be-released 3080 Ti? Oh yeah, this lineup will be a better choice.
  • DrKlahn - Thursday, May 21, 2020 - link

    By that logic the new games released since the Ryzen 3000 series debut last year should show a larger gap at 1440p+ between Intel and AMD. But they don't. And judging by past trends I doubt they will in the future either. As GPUs advance, so does the eye candy in the newer engines, keeping the bottleneck at higher resolutions and detail levels pretty much where it always is: the GPU.
  • Gastec - Friday, May 22, 2020 - link

    "pairing a high-end GPU with a mid-range CPU" should already be a meme, so many times I've seen it copy-pasted.
  • dotjaz - Thursday, May 21, 2020 - link

    What funny stuff are you smoking? In all actual configurations, AMD doesn't lose by any meaningful margin at a much better value.
    AnandTech is running CPU tests where you set the quality low and get 150+ fps or even 400+ fps; nobody actually plays like that.
  • deepblue08 - Thursday, May 21, 2020 - link

    Intel may not be a great value chip all around. But a 10 FPS lead in 1440p is a lead nevertheless: https://hexus.net/tech/reviews/cpu/141577-intel-co...
  • DrKlahn - Thursday, May 21, 2020 - link

    If that's worth the more expensive motherboard, beefier (and more costly) cooling, and increased heat, then go for it. If you put 120fps next to 130fps without an fps counter on screen, how many people could tell? Personally I don't see it as worth it at all. Nor do I consider it a dominating lead. But I'm sure there are people out there that will buy Intel for a negligible lead.
  • Spunjji - Friday, May 22, 2020 - link

    An entirely unnoticeable lead that you get by sacrificing any sort of power consumption / cooling sanity and spending measurably larger amounts of cash on the hardware to achieve the boost clocks required to get that lead.

    The difference was meaningful back when AMD had lower minimum framerates, less consistency and -30fps or so off the average. Now it's just silly.
  • babadivad - Thursday, May 21, 2020 - link

    Do you need a new motherboard with these? If so, they make even less sense than they already did.
  • MDD1963 - Friday, May 22, 2020 - link

    As for Intel owners, I don't think too many 8700K, 9600K or above owners would seriously feel they are CPU limited and in dire or imminent need of a CPU upgrade as they sit now, anyway. Users of prior generations (I'm still on a 7700K) will make their choices at a time of their own choosing, of course, and not simply because 'a new generation is out'. (I mean, look at the 8700K vs. 10600K results... it looks almost like a rebadging operation.)
  • khanikun - Wednesday, May 27, 2020 - link

    I was on a 7700k and didn't feel CPU limited at all, but decided to get an 8086k for the 2 more cores and just cause it was an 8086. For my normal workloads or gaming, I don't notice a difference. I do reencode videos maybe a couple times a year, which is about the only time I'll see the difference.

    I'll probably just be sitting on this 8086k for the next few years, unless something on my machine breaks or Intel does something crazy ridiculous, like making some 8 core i7 on 10nm at 5 ghz all core, in a new socket, then making dual socket consumer boards for it for relatively decent price. I'd upgrade for that, just cause I'd like to try making a dual processor system that isn't some expensive workstation/server system.
  • Spunjji - Friday, May 22, 2020 - link

    Yes, you do. So no, they don't make sense xD
  • Gastec - Friday, May 22, 2020 - link

    Games...framerate is pointless in video games, all that matters now are the "surprise mechanics".
  • Gastec - Friday, May 22, 2020 - link

    Basically you just have to type "allyourbasearebelongtous +$50/surprisemechanic" and you get all the framerate you want in your favorite multiplayer FPShooter.
  • Boshum - Wednesday, May 20, 2020 - link

    I think it's a viable alternative to Ryzen 3000, so it's not pointless. It's about equal in performance for most people. A little more expensive and power hungry core for core, but it's more of a flavor thing now. It's still better for certain gaming and application scenarios. Hyperthreading makes the low to midrange a much more reasonable option too, with heat and power being no big deal there. The only place it can't compete with Ryzen is at the very high end for power users doing heavy multi-core work.
  • Dribble - Wednesday, May 20, 2020 - link

    I'd be the sort of person to look at a 10700K but power usage is just too high. I want to be able to stick a high end air cooler on it, o/c and still have it run pretty quiet. I'd have to go water with one of these and I can't be bothered with that. Not worth it for the small performance increment over more efficient chips.
  • IBM760XL - Wednesday, May 20, 2020 - link

    Agreed. The 10700K and 10900K use more power per core than my ancient-but-trusty 2500K, at least with stock settings. Sure, the new chips get somewhat better IPC, but I can't justify switching from a Sandy Bridge that's nice and quiet even at 100% load, to a Comet Lake that will require Serious Cooling to have an outside chance of being as quiet.

    I could look at lower-end hex-core Comet Lake chips instead, but why would I do that when I could just as well get an octo-core Ryzen 7 3700X, or a Ryzen 5 3600 that will have better performance than an i5-10500?
  • Boshum - Wednesday, May 20, 2020 - link

    I should think the 10500 and 3600 would be pretty close at stock, though you have more overclocking options with the 3600. It's the future Rocket Lake vs Ryzen 4000 options that are more interesting.
  • warrenk81 - Wednesday, May 20, 2020 - link

    typo in the dropdown for the final page, move/more.
  • colonelclaw - Wednesday, May 20, 2020 - link

    Grammar error, too. Less/fewer.
  • Flunk - Wednesday, May 20, 2020 - link

    Well, Intel's back on top for gaming, by a small margin, with chips that can fry an egg. Maybe it'll force AMD to lower prices on their high-end chips. I don't really fancy a 250+ Watt CPU.
  • DrKlahn - Wednesday, May 20, 2020 - link

    You can already get the 3900x for $410 on Amazon. Unless you have a use case that heavily favors Intel that would seem to be a pretty good value already. A good B450 board capable of handling it could be had for not much more than the difference in chip cost (provided that fits your needs).
  • Irata - Wednesday, May 20, 2020 - link

    Yup, and like the article says that includes an HSF that will do the job.

    Contrast that with the 10900K, which retails for $530 on Newegg (not available) and requires you to spend $200+ for a proper cooling setup, and you are looking at $410 vs. $730, i.e. paying 78% more for the 10900K. And that does not even include case fans, mainboard, PSU.

    If gaming is what one is after, the 9700k looks much more attractive than the 10900k.
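
    For concreteness, the arithmetic sketched in code (a rough illustration using the street prices quoted above; the $200 cooler figure is an estimate, not a measured requirement):

    ```python
    # Rough platform-cost comparison using the street prices quoted above.
    ryzen_3900x = 410          # Amazon price, usable stock HSF included
    i9_10900k = 530            # Newegg listing
    cooler = 200               # estimated cooling budget for the 10900K

    intel_total = i9_10900k + cooler
    premium = intel_total / ryzen_3900x - 1
    print(f"${intel_total} vs ${ryzen_3900x}: {premium:.0%} more")  # 78% more
    ```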
  • DrKlahn - Wednesday, May 20, 2020 - link

    My biggest issue with gaming is that these reviews rarely show anything other than low-resolution scenarios. I realize a sizable slice of the gaming community uses 1080p and that some of them are trying to hit very high frame rates. But there are also a lot of us with 1440p+ or ultrawide monitors, and I think it gets overlooked that Intel's gaming "lead" largely evaporates for anyone not trying to hit very high frames at 1080p.
  • ElvenLemming - Wednesday, May 20, 2020 - link

    Honestly, I think it's ignored because it's well understood that at 1440p+ the CPU just doesn't matter very much. There's not much value in anything above 1080p for a CPU review; the vast majority of games are going to be GPU limited. That said, plenty of other outlets include them in their reviews if you want to see a bunch of charts where the top is all within 1% of each other.
  • DrKlahn - Wednesday, May 20, 2020 - link

    I do agree with you that a lot of us do understand that as resolution and detail increases, CPUs become almost irrelevant to gaming performance. However you do see a fair few posters parroting "Intel is better for gaming" when in reality for their use case it really isn't any better. That's why I feel like these reviews (here and elsewhere) should spotlight where this difference matters. If you are a competitive CS:GO player that wants 1080p or lower with the most frames you can get, then Intel is undoubtedly better. But a person who isn't as tech savvy that games and does some productivity tasks with a 1440p+ monitor is only spending more money for a less efficient architecture that won't benefit them if they simply see "Intel better for gaming" and believe it applies to them.
  • shing3232 - Thursday, May 21, 2020 - link

    The 3900X or 3800X can beat the Intel 9900KF in CS:GO with PBO on, if I remember correctly.
  • silencer12 - Saturday, May 23, 2020 - link

    CS:GO is not a demanding game
  • vanilla_gorilla - Monday, June 15, 2020 - link

    >If you are a competitive CS:GO player that wants 1080p or lower with the most frames you can get, then Intel is undoubtedly better.

    It's actually more complicated than that. Even a midrange Zen 2 CPU can hit well over 200 fps in CS:GO. So unless you have a 240Hz monitor, it won't make any difference whether you buy Intel or AMD in that case.
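
    To put the frame-time arithmetic behind that in concrete terms (a quick sketch; the figures are illustrative):

    ```python
    # Refresh interval vs frame time: beyond the panel's refresh rate,
    # extra rendered frames mostly go unseen.
    for hz in (60, 144, 240):
        print(f"{hz:>3}Hz panel: new image every {1000 / hz:.2f} ms")
    for fps in (200, 300):
        print(f"{fps} fps: new frame every {1000 / fps:.2f} ms")
    ```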
  • Irata - Wednesday, May 20, 2020 - link

    Techspot shows a seven-game average, and there the avg fps / 1% min difference to the Ryzen 3 3300X is less than 10% using a 2080 Ti.
  • CrimsonKnight - Thursday, May 21, 2020 - link

    This review's benchmarks go up to 4K/8K resolution. You have to click the thumbnails under the graphs.
  • Meteor2 - Wednesday, July 15, 2020 - link

    To be clear: Anandtech tests at low resolutions so the bottleneck is the CPU, not the GPU. A Ryzen 5 won’t bottleneck a 2080 Ti at 4K.
  • kmmatney - Wednesday, May 20, 2020 - link

    Those of us who live near a Microcenter can get the 3900X for $389, along with a $20 discount on a motherboard (and a serviceable heatsink). The Ryzen 5 (what I bought) is $159, also with a $20 motherboard discount and a decent cooler. So my effective motherboard cost was $79, and total cost of $240 + tax, with a motherboard that can (most likely) be upgraded to Zen 3
  • SKiT_R31 - Wednesday, May 20, 2020 - link

    Intel never left "the top". Top of 720p low graphics settings, by a single percentage margin. Totally worth the 50% higher price.
  • silencer12 - Saturday, May 23, 2020 - link

    Give it more than 3 years
  • tracker1 - Wednesday, May 20, 2020 - link

    AMD has already shifted their pricing quite a bit from launch in anticipation of this... it is clearly the better option for most people. Unless you literally only care about gaming, in which case a 10900K or 10700K might be an okay option at their respective price points, and only if you're using at least an RTX 2080 Super. If you're going anything lower on the GPU, then AMD is probably the better option all the way around (and you'll probably save a bit on your annual power bill as a result).
  • VoraciousGorak - Wednesday, May 20, 2020 - link

    Finally, a sane product stack from Intel with regards to naming versus core/thread count.
  • Hifihedgehog - Wednesday, May 20, 2020 - link

    Sane and thermal meltdown don't mix.
  • ElvenLemming - Wednesday, May 20, 2020 - link

    Unfortunate that their product stack finally makes sense now that the name sounds so stupid I get angry every time I read it.
  • Spunjji - Tuesday, May 26, 2020 - link

    Whether it's "Eye-Nine Ten-Nine-Hundred-Kay" or "Eye-Nine Ten-Thousand-Nine-Hundred-Kay", it sounds equally daft.
  • tipoo - Wednesday, May 20, 2020 - link

    Chasing clocks and high power to counter AMD. Ah, Netburst, good times. Ish.
  • WaltC - Wednesday, May 20, 2020 - link

    I had forgotten Netburst...;) "The Intel CPU that accelerated the Internet"! Thanks for the laugh!
  • trparky - Wednesday, May 20, 2020 - link

    Yep, I agree.
  • catavalon21 - Wednesday, May 20, 2020 - link

    +1
  • Lord of the Bored - Friday, May 22, 2020 - link

    The nostalgia is strong these days.
  • Bidz - Wednesday, May 20, 2020 - link

    So... where is the temperature chart? Given the power usage and the tier level of the product I would say many users want to know how practical it is to use.
  • LawRecords - Wednesday, May 20, 2020 - link

    Agreed. It's odd that thermals are missing given the high power draw.
  • shabby - Wednesday, May 20, 2020 - link

    I'd imagine it would be pegged at 90C since the CPU is constantly clocking itself as high as it can.
  • DannyH246 - Wednesday, May 20, 2020 - link

    It's not odd at all. It's to make Intel look better, we all know this.
  • shady28 - Wednesday, May 20, 2020 - link

    LTT has a video on thermals. The thermals for the gen 10 are better than gen 9, despite the higher clocks and core counts. Intel redesigned the conductive layer between the die and the lid. It worked.
  • Spunjji - Tuesday, May 26, 2020 - link

    Seriously? The thermals are better despite the higher power draw?

    I'm guessing this is a case of being able to get the heat out more easily *if you have a cooling system capable of subsequently dealing with the heat being pulled out*. That would make sense given the changes involved, but it involves the assumption that people are prepared to go for 280mm+ radiators.
  • mrvco - Wednesday, May 20, 2020 - link

    I get that this is a CPU review and not a GPU or system review, but it would be helpful to also include gaming resolutions w/ quality settings that people actually use for gaming rather than just benchmarking... especially when building a gaming system and making decisions on how to allocate budget between the CPU (+PSU +cooling) and the GPU.
  • TheUnhandledException - Wednesday, May 20, 2020 - link

    I agree. Yes, the results will show nearly identical performance from a 10900 down to a Ryzen 3600, but that is kinda the point. You don't really need an ultra-high-end CPU for gaming at high resolution. Even if it was just one game, it would be nice to see how CPU performance scales at 1080p, 1080p high quality, 1440p, and 4K.
  • yankeeDDL - Wednesday, May 20, 2020 - link

    I think the main idea was to show if the CPU gets in the way when the GPU is definitely not the bottleneck.
  • mrvco - Wednesday, May 20, 2020 - link

    That's difficult to discern without all the relevant data, i.e. the diminishing returns as the bottleneck transitions from the CPU to the GPU at typical resolutions and quality settings. I think better of the typical AnandTech reader, but I would hate to think that someone reads this review, extrapolates 720p / medium quality relative FPS performance to 1440p or 2160p at high or ultra settings, and blows their build budget on a $400+ CPU and the components required to power and cool it, with little or no improvement in actual gaming performance.
  • dullard - Wednesday, May 20, 2020 - link

    Do we really need this same comment with every CPU review ever? Every single CPU review for years (Decades?) people make that exact same comment. That is why the reviews test several different resolutions already.

    Anandtech did 2 to 4 resolutions with each game. Isn't that enough? Can't you interpolate or extrapolate as needed to whatever specific resolution you use? Or did you miss that there are scroll over graphs of other resolutions in the review.
  • schujj07 - Wednesday, May 20, 2020 - link

    “There are two types of people in this world: 1.) Those who can extrapolate from incomplete data.”
  • diediealldie - Thursday, May 21, 2020 - link

    LMAO you're a genius
  • DrKlahn - Wednesday, May 20, 2020 - link

    In some cases they do higher than 1080p and some they don't. I do wish they would include higher resolution in all tests and that the "gaming lead" statements came with the caveat that it's largely only going to be beneficial for those seeking low resolution with very high frame rates. Someone with a 1080p 60Hz monitor likely isn't going to benefit from the Intel platform, nor is someone with a high resolution monitor with eye candy enabled. But the conclusion doesn't really spell that out well for the less educated. And it's certainly not just Anandtech doing this. Seems to be the norm. But you see people parroting "Intel is better for gaming" when in their setup it may not bring any benefit while incurring more cost and being more difficult to cool due to the substantial power use.
  • Spunjji - Tuesday, May 26, 2020 - link

    It's almost like their access is partially contingent on following at least a few of the guidelines about how to position the product. :/
  • mrvco - Wednesday, May 20, 2020 - link

    Granted, 720p and 1080p resolutions are highly CPU dependent when using a modern GPU, but I'm not seeing 1440p at high or ultra quality results which is where things do transition to being more GPU dependent and a more realistic real-world scenario for anyone paying up for mid-range to high-end gaming PCs.
  • Meteor2 - Wednesday, July 15, 2020 - link

    Spend as much as you can on the GPU and pair with a $200 CPU. It’s actually pretty simple.
  • yankeeDDL - Wednesday, May 20, 2020 - link

    I have to say that this fared better than I expected.
    I would definitely not buy one, but kudos to Intel.
    Can't imagine what it means to have a 250W CPU + 200W GPU in a PC next to you while you're playing. Must sound like an airplane.
  • yeeeeman - Wednesday, May 20, 2020 - link

    The CPU won't consume anywhere near 250W during gaming. 250W is valid only for short all-core scenarios. Otherwise it will stay within its 130W TDP. Go and read other reviews and you will see I am right.
  • yankeeDDL - Thursday, May 21, 2020 - link

    According to this (https://images.anandtech.com/doci/15785/10900K%20y... it stays at 230W for almost 4 min.
    In any case, you can read my sentence again and use 130W instead of 250W, and it doesn't change anything.
  • arashi - Saturday, May 23, 2020 - link

    You can't blame him, he's on the Intel payroll and has to act the idiot.
  • dirkdigles - Wednesday, May 20, 2020 - link

    Ian, I think the pricing on the charts is a bit misleading. The $488 price for the 10900K is the 1,000-unit bulk pricing, and the $499 price on the 3900X hasn't been seen since January 2020... it's currently $409 on Amazon. This skews comparisons for the reader.

    I know MSRP is a good metric, but street price is more important. What can I buy these chips for, today? If I'm a consumer, I likely can't get that $488 bulk per chip price for the 10900K, and the 3900X is not going to cost me anywhere near $409. Please update.
  • dirkdigles - Wednesday, May 20, 2020 - link

    *anywhere near $499. Typo.
  • WaltC - Wednesday, May 20, 2020 - link

    Yes, I paid ~$409 for my 3900X, and on top of that AMZN offered me 6-months, same-as-cash, which I was more than happy to accept...;) Good times!
  • AnarchoPrimitiv - Wednesday, May 20, 2020 - link

    Exactly, the 3900x is over $100 cheaper and is nowhere "around the same price"
  • yeeeeman - Wednesday, May 20, 2020 - link

    Well, Intel has the 10900F at $400. Locked, with no iGPU, almost the same frequencies. That is a better buy than the 10900K.
  • Spunjji - Tuesday, May 26, 2020 - link

    Right - the 10900F is likely a better deal, but the comparison was with the 10900K.
  • Irata - Wednesday, May 20, 2020 - link

    Waiting for comments on how the two small fans on the mainboard make this an unacceptable option. If I remember correctly, that applied to X570 boards.
  • AnarchoPrimitiv - Wednesday, May 20, 2020 - link

    Should I repost the countless comments made by Intel fanboys claiming that the fans on x570 meant the sky is falling? Don't try to ambush people with the accusation of a double standard when your side drew first blood
  • Irata - Wednesday, May 20, 2020 - link

    The double standard was exactly my point. End of the world for X570 and its one 50-60mm fan back then; crickets chirping for several 40mm fans on Z490 now.
  • Makaveli - Wednesday, May 20, 2020 - link

    It's one small fan and it's inaudible, I haven't ever heard mine. The only people complaining about this are people who still think they are dealing with motherboards from the 1990s.
  • shing3232 - Thursday, May 21, 2020 - link

    They're worrying about longevity of the fans.
  • yeeeeman - Wednesday, May 20, 2020 - link

    All X570 motherboards had fans, that was the problem. Here only some specific models do.
  • RSAUser - Thursday, May 21, 2020 - link

    The above.

    I've tweaked the fan curve on my motherboard; it's never kicked in yet.
  • ryao - Wednesday, May 20, 2020 - link

    Why are there data points from AMD missing in a number of tests? For example, the Crysis CPU render is missing data points for all of AMD's processors except the 3600.
  • schujj07 - Wednesday, May 20, 2020 - link

    Crysis CPU render "This is one of our new benchmarks, so we are slowly building up the database as we start regression testing older processors."

    They are in the middle of updating the entire suite. That means that not every CPU has been tested with in the new suite so the only data available is from CPUs that have been tested.
  • gagegfg - Wednesday, May 20, 2020 - link

    Bad AnandTech policy, it confuses users. Judging by the number of unsubstantiated comments, those graphs are having their intended effect, misleading users with "Intel superiority"... and that is not the case.
  • DannyH246 - Wednesday, May 20, 2020 - link

    haha - yeah exactly. Anything where AMD would be ahead...."oh our database is light"
  • schujj07 - Wednesday, May 20, 2020 - link

    That doesn't make any sense. The Crysis CPU render is new as of the Ryzen 3300X review from 2 WEEKS ago. https://www.anandtech.com/show/15774/the-amd-ryzen...
  • catavalon21 - Wednesday, May 20, 2020 - link

    AMD is shown leading in many CPU tests dollar for dollar or watt for watt.
  • Achaios - Wednesday, May 20, 2020 - link

    Chipzilla, due to its greed, got us back to the heat output of 2008 processors such as the Yorkfield QX9650.

    Look up "Overclocking Intel's New 45nm QX9650: The Rules Have Changed" by AnandTech, and check the thermal output of the QX9650.

    I don't see why any enthusiast would buy these overpriced and bad CPUs. I certainly won't.
  • t.s - Wednesday, May 20, 2020 - link

    Never underestimate Intel fanboys. This statement is copied from an FB comment section about the rumour that Ryzen 4000 will have a 20% IPC uplift:

    "AMD is made for applications like streaming. Intel is made for the other 90% of the market that relies on the single core performance. Yes, most of the industry still relies on Single core performance. Maybe know your industry before making a comment"
  • tipoo - Wednesday, May 20, 2020 - link

    TIL I'm a streamer. I didn't think data science was all that interesting!
  • Cooe - Wednesday, May 20, 2020 - link

    Seems Intel took a good hard look a the FX-9590 and was like... "Yup. Let's do that. It'll work for sure this time, I promise!"
  • WaltC - Wednesday, May 20, 2020 - link

    Bingo...;)
  • plonk420 - Wednesday, May 20, 2020 - link

    a) thanks for showing core to core latencies! b) so this doesn't have TSX?
  • Calypto - Wednesday, May 20, 2020 - link

    Why not throw in a Coffee Lake inter-core latency chart for comparison?
  • Calypto - Wednesday, May 20, 2020 - link

    ignore me I'm stupid
  • catavalon21 - Wednesday, May 20, 2020 - link

    The ability to edit (or ^Z) would be most welcome, trust me.
  • eastcoast_pete - Wednesday, May 20, 2020 - link

    Isn't that Skylake running a bit dry by now? But, seriously, Intel really risks losing a lot of market share in future years by selling these "classics" at high prices, and that is if one can get one in the first place.
    Curious: how many commercial customers buy Intel desktops just because they have iGPUs, but want more CPU oomph than the 3200G has? Is that why Intel still dominates the OEM desktop market?
  • AnarchoPrimitiv - Wednesday, May 20, 2020 - link

    Intel dominates the OEM market through intimidation and threats of retribution... They were literally convicted of bribing OEMs to NOT use AMD CPUs all throughout the 2000s in several courts around the world. The trials uncovered emails between Intel executives that stated, and I quote, "Dell is the best friend money can buy".... The proof is in the fact that currently, the Ryzen 4000 mobile CPUs are the best mobile chips offered right now, but Dell only puts them in the low end laptops. Why? Because Intel is probably giving huge financial incentives to bar AMD from premium designs to perpetuate the myth that AMD isn't a premium brand
  • Retycint - Wednesday, May 20, 2020 - link

    Do keep in mind that these are baseless speculations, based on something that happened 2 decades ago. Both Intel and AMD have changed since then (new engineering team, new management etc) and there has been no evidence of Intel providing incentives to cripple AMD systems. Go take your conspiracy elsewhere.

    And before you inevitably accuse me of being an Intel shill, this isn't about Intel or AMD, it's about facts to support your claim, of which there have been none
  • Irata - Wednesday, May 20, 2020 - link

    Baseless speculation? Financial horsepower, MDF and meet the comp funds are current and no secret.

    Why do you think there are no Ryzen 4000 laptops with a GPU above a 2060?
  • Spunjji - Tuesday, May 26, 2020 - link

    Not entirely baseless, as they made two distinct claims. I've been a party to how Intel's "Marketing Development Funds" work - and work it does, at all levels from OEM to reseller to retailer. These days they don't explicitly punish anyone for not buying AMD - they simply tie rebates that will improve the profit margins on a product to specific quantities of those products being sold. It's "nobody's fault" if those quantities happen to make the sale of an AMD product by a given retailer or reseller distinctly unlikely.

    As for incentivizing bad *builds* of AMD systems, though, I'm not so sure. Intel clearly do a lot of work building reference platforms, and the economics of doing integration testing for a new vendor is not trivial. Honestly though, it's hard to tell how we *would* know if this were going on, because it would absolutely be made to look innocent - just like last time.
  • brantron - Wednesday, May 20, 2020 - link

    "literally convicted of bribing"

    1) No. That's not what "literally" means.
    2) No. No one was even *charged* with a crime, much less convicted.
    3) No. It wasn't about bribery.

    The reason Athlon 64s weren't ubiquitous way back when is the same reason the 4000 APUs aren't today - there aren't enough to go around.

    If your post were to be rephrased without hyperbole, baseless accusations, and whataboutism unrelated to the topic of this article, it would read something like this:

    "6 months after AMD's announcement of Renoir, the number of 4000 APUs sold for desktops is literally zero (see how that works?) because TSMC is still slammed."
  • WaWaThreeFIVbroS - Thursday, May 21, 2020 - link

    Your ignorance is amusing.
    It is technically bribery:

    https://www.extremetech.com/computing/184323-intel...
  • Spunjji - Tuesday, May 26, 2020 - link

    First 3 points: accurate, if not entirely on-topic. Nobody was charged with a crime, but Intel sure were fined a lot for collusion.

    Which gets to the 4th point: again, accurate, but not entirely relevant. AMD were definitely not able to match Intel for manufacturing, which is why they couldn't have beaten Intel out of the market entirely, but that was barely related to why they weren't getting into Dell systems. See the aforementioned proven-and-fined-for collusion.
  • drothgery - Friday, May 22, 2020 - link

    Or because premium designs take longer when the new chip isn't just another respin of the same thing, and AMD hadn't produced a viable high-end notebook chip in well over a decade so it made sense to wait and see if Ryzen 4000 was any good rather than designing in advance?
  • Spunjji - Tuesday, May 26, 2020 - link

    Mixed disagree.

    In all likelihood, Intel is incentivizing OEMs to continue working with their products.

    It certainly looks like there is some sort of unspecified agreement between OEMs, Intel and Nvidia - hence the seemingly universal limitation of the 2060 with an AMD CPU.

    But then... this absolutely is AMD's first proper crack at a high-end notebook chip that performs up to its billing in a very, very long time. It will take time for it to filter though, so the current state of the market may not be a good indicator - especially with COVID-19 about.
  • Tunnah - Wednesday, May 20, 2020 - link

    Regarding your gaming suite test and GTA V/Steam limitations: why not switch to the cracked, offline version? It's not like you're pirating it as you already bought it.

    Also you could keep a monolithic version in which you could insert any scripts you want via the modding capabilities, and because it's offline, updates won't come in and screw up your files. I keep a pirate version separate for messing around with modding on, and I never have to worry about an update rolling things back.
  • arashi - Sunday, May 24, 2020 - link

    I'm sure the legal liability would be very welcome.
  • Hxx - Wednesday, May 20, 2020 - link

    I'm excited for the 10700K for my gaming rig. Almost as good as the 10900K but cheaper and less power hungry.
  • HammerStrike - Wednesday, May 20, 2020 - link

    The lack of PCIe 4.0 is a deal breaker for any gaming-focused box. The one area where the new consoles have an undisputed lead is in their SSDs and I/O infrastructure. As game engines and game design are transformed by this, I think within a few years we are going to see game performance improvements with faster SSDs. Much more so than the few percent Intel currently has, based on CPU alone. Which is only really of practical benefit if you have a monitor with a 165+ refresh rate and game at those settings. I love a high refresh rate, but I'd much rather have the pretty bells and whistles on and get 80-120Hz vs setting everything to low for 165.

    AMD chips are just much more compelling. Of course, unless you absolutely have to upgrade now, I'd wait a few months for Zen 3. Fair chance they take the performance crown, or get so close as not to matter. Plus they will run a lot cooler - even if you don't care about the power draw per se, the cooler a chip runs the cheaper / quieter the cooling solution is. Take that savings and put it into a GPU, RAM or a PCIe 4.0 SSD.
  • Boshum - Wednesday, May 20, 2020 - link

    I don't think lack of PCIe 4.0 is that bad, but is it certain that the LGA1200 won't support PCIe 4.0 when a Rocket Lake chip is plugged in?
  • WaWaThreeFIVbroS - Thursday, May 21, 2020 - link

    The board may support PCIe 4.0 signaling, but the Z490 chipset doesn't, so when a Rocket Lake chip is plugged in, PCIe 4.0 will probably only come from the CPU.
  • ImNotARobot - Wednesday, May 20, 2020 - link

    I feel like there is a lack of testing between PCIe 4.0 and 3.0. The way I look at it, Nvidia is right around the corner from launching their PCIe 4.0 lineup, so these processors are going to be powering that. I haven't seen anyone review an AMD 5700 XT on both an Intel and an AMD machine just to see what real-life gaming impact that can have. Agreed, if you're a hardcore gamer you might not want a 5700 XT... but it gives insight into what a next-gen PCIe 4.0 link can get you.
  • haukionkannel - Thursday, May 21, 2020 - link

    No impact at all. Today's and near-future GPUs are too weak to saturate PCIe 3.0... maybe in a few years we will get GPUs that are faster with PCIe 4.0... but that time has not yet arrived. (Unless you have a 4GB AMD 5500, which has a narrow x8 bus.)
    PCIe 4.0 is for M.2 SSDs at the moment!
  • prophet001 - Thursday, May 21, 2020 - link

    Can't really argue, but clock performance does matter a lot in WoW, which is what I mainly play. No gen 4.0 is wack, but so is only 16 lanes into the CPU.
  • UltraWide - Wednesday, May 20, 2020 - link

    Intel's 10th gen is a hard pass for me.

    I'll wait patiently with my 4770K.
  • Spunjji - Tuesday, May 26, 2020 - link

    Haswell was the last time I remember being excited about an Intel CPU.
  • AnarchoPrimitiv - Wednesday, May 20, 2020 - link

    Why is the article stating that the 10900K is "around the same price" as the 3900X when it's literally around $100 more (the 3900X currently goes for $417 and the 10900K is listed at $522; $488 is only the tray price when you buy 1,000 or more CPUs)? In my opinion, a 25% more expensive CPU isn't "around the same price".
  • dirkdigles - Wednesday, May 20, 2020 - link

    Same thoughts - I commented on that earlier. Quite misleading IMO.
  • drothgery - Wednesday, May 20, 2020 - link

    comparing retail prices of something just released vs something that's been out for months is silly, so they went by MSRP (which for CPUs is the tray price)?
  • GreenReaper - Wednesday, May 20, 2020 - link

    Don't see how that works. You buy based on the performance available now, that is what the charts are based on - so why not the price now?
  • duploxxx - Wednesday, May 20, 2020 - link

    Perhaps a review site should start testing with the defaults... so put a stock cooler on this system, test again in a case with heat around it, and see how much is really left of this marketing turbo and theoretical benchmarking...
  • jameslr - Wednesday, May 20, 2020 - link

    What's a "default cooler"? None of these CPUs come with a "cooler" or HSF unit.
  • GreenReaper - Wednesday, May 20, 2020 - link

    So test it anyway, see what happens when you don't include a vital bit of kit in the comparison price.
  • Spunjji - Tuesday, May 26, 2020 - link

    The AMD ones do. They could throw in a known-equivalent cooler on the Intel side and repeat a few of the tests with it to see how it fares - one of those $30 Coolermaster jobs should do the trick.

    At least that way you'd get an idea of the extremes - "properly" cooled with a water loop vs. cooled the way most people used to do home builds.
  • Khenglish - Wednesday, May 20, 2020 - link

    Ian, for the Crysis CPU render test you'd probably get higher FPS by disabling the GPU in Device Manager and setting Crysis to use hardware rendering. Disabling the GPU driver enables software rendering by default on Windows 10. The Win10 rendering does stutter worse than the reported FPS suggests though, so take from it what you want.
  • shaolin95 - Wednesday, May 20, 2020 - link

    "But will the end-user want that extra percent of performance, for the sake of spending more on cooling and more in power?"

    Such retarded comment. More power...do you actually know who little difference this makes in a year. Wow this place is going down hill fast.
    Oh and a cooler you know we don't have to change our cooler with every CPU purchase so don't make it seem like this HUGE issue...your AMD fanboy colors are showing VERY clearly.
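
    Tone aside, the "how little difference in a year" claim is easy to put numbers on. A back-of-envelope sketch, where the extra draw, hours and electricity rate are all illustrative assumptions rather than measured values:

    ```python
    # Back-of-envelope yearly electricity cost of a higher-draw CPU.
    extra_watts = 100       # assumed additional draw under load
    hours_per_week = 20     # assumed time spent at that load
    usd_per_kwh = 0.13      # assumed electricity rate

    kwh_per_year = extra_watts / 1000 * hours_per_week * 52
    print(f"{kwh_per_year:.0f} kWh/yr -> ${kwh_per_year * usd_per_kwh:.2f}/yr")
    # -> 104 kWh/yr -> $13.52/yr: small in dollars, but every extra watt
    #    still has to leave the case, which is what the cooling complaints
    #    in this thread are about.
    ```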
  • schujj07 - Wednesday, May 20, 2020 - link

    If you think you can use the 212 EVO you have from a 6700k or 7700k to keep the 10900k cool you are absolutely nuts. "Speaking with a colleague, he had issues cooling his 10900K test chip with a Corsair H115i, indicating that users should look to spending $150+ on a cooling setup. That’s going to be a critical balancing element here when it comes to recommendations." This isn't any form of fanboyism. This is stating a fact that to squeeze out the last remaining bits of performance in Skylake & 14nm Intel had to sacrifice massive amounts of heat/power to do so.
  • Maxiking - Wednesday, May 20, 2020 - link

    If you have issues cooling 10900k with H115i, the problem is always between the monitor and chair.

    They were able to cool OC 10900k with 240m AIO just lol

    Incompetency of some reviewers is just astonishing
  • schujj07 - Wednesday, May 20, 2020 - link

    All depends on the instructions that you are running. From Tomshardware: "We tested with the beefier Noctua NH-D15 and could mostly satisfy cooling requirements in standard desktop PC applications, but you will lose out on performance in workloads that push the boundaries with AVX instructions. As such, you'll need a greater-than-280mm AIO cooler or a custom loop to unlock the best of the 10900K. You'll also need an enthusiast-class motherboard with beefy power circuitry, and also plan on some form of active cooling for the motherboard's power delivery subsystem." https://www.tomshardware.com/reviews/intel-core-i9...
    "While Intel designed its 250W limit to keep thermals 'manageable' with a wide variety of cooling solutions, most motherboard vendors feed the chip up to ~330W of power at stock settings, leading to hideous power consumption metrics during AVX stress tests. Feeding 330W to a stock processor on a mainstream motherboard is a bit nuts, but it enables higher all-core frequencies for longer durations, provided the motherboard and power supply can feed the chip enough current, and your cooler can extract enough heat.

    To find the power limit associated with our chip paired with the Gigabyte Aorus Z490 Master motherboard, we ran a few Prime95 tests with AVX enabled (small FFT). During those tests, we recorded up to 332W of power consumption when paired with either the Corsair H115i 280mm AIO watercooler or a Noctua NH-D15S air cooler. Yes, that's with the processor configured at stock settings. For perspective, our 18-core Core i9-10980XE drew 'only' 256W during an identical Prime95 test." https://www.tomshardware.com/reviews/intel-core-i9...

    Think it is still a PEBKAC error?
  • alufan - Thursday, May 21, 2020 - link

    Try this, he doesn't slate Intel or AMD, just a proper review with live power draw at the socket. OMG lol, you need your own power plant when you run these, let alone overclock it.

    https://www.kitguru.net/components/leo-waldock/int...
  • Spunjji - Tuesday, May 26, 2020 - link

    "They were able to cool OC 10900k with 240m AIO just lol"
    Who were? Everyone I've read indicates that with a 240mm AIO, CPU temps hit 90+

    Pathetic comment troll is pathetic.
  • Retycint - Wednesday, May 20, 2020 - link

    It is, in fact, a huge issue, because most people won't have the high-end coolers necessary to keep the thermals under control. Personal attacks such as accusing people of being a "fanboy" just degrade your argument (if there was any in the first place) and make you look dumb.
  • Spunjji - Tuesday, May 26, 2020 - link

    "Such retarded comment."
    The pure, dripping irony of using a slur to mock someone else's intelligence, but screwing up the grammar of the sentence in which you do it...

    Some people build from scratch. Some people have uses for their old system. Larger PSUs and suitable cooling to get optimal performance from this CPU don't come cheap. Go home, troll.
  • watzupken - Wednesday, May 20, 2020 - link

    Not surprising, Intel managed to keep their advantage in games by pushing for higher frequency. However the end result is a power hungry chip that requires some high end AIO or custom water cooler to keep cool. I agree that Intel is digging themselves deeper and deeper into a hole that they will not be able to get out so easily. In fact I don't think they can get out of it until their 7nm is ready and mature enough to maintain a high frequency, or they come out with a brand new architecture that allows them to improve on Comet Lake's performance without the crazy clockspeed. Indeed, they will not be able to pull another generation with their Skylake + 14nm combination looking at the power consumption and heat generation issue. Intel should consider bundling that industrial chiller they used to cool their 20 core chip during the demo.
  • watzupken - Wednesday, May 20, 2020 - link

    Sorry for typo, its a 28 core, not 20 core.
  • blaktron - Wednesday, May 20, 2020 - link

    No one else wondering how Ian manages to get only a 5% drop in performance going from H.264 Faster to H.265 Fast? That should be well over a 50% drop, and it suggests he is running his HEVC tests with an H.264 profile.

    Am I crazy here, or is the idea that an 8-core CPU gets 200 fps of H.265/HEVC encoding just plain wrong?
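
    One way to sanity-check this is to time both presets on the same clip. A minimal sketch using ffmpeg's null output (input.mkv is a placeholder file name; the exact ratio will depend heavily on the source, resolution and encoder build):

    ```python
    # Time x264 "faster" vs x265 "fast" on one clip. Requires an ffmpeg
    # build with libx264/libx265 on the PATH.
    import subprocess
    import time

    def encode_seconds(codec: str, preset: str) -> float:
        start = time.perf_counter()
        subprocess.run(
            ["ffmpeg", "-y", "-i", "input.mkv", "-c:v", codec,
             "-preset", preset, "-f", "null", "-"],
            check=True, capture_output=True)
        return time.perf_counter() - start

    t264 = encode_seconds("libx264", "faster")
    t265 = encode_seconds("libx265", "fast")
    print(f"x264 faster: {t264:.1f}s | x265 fast: {t265:.1f}s "
          f"| ratio {t265 / t264:.2f}x")
    ```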
  • WaWaThreeFIVbroS - Thursday, May 21, 2020 - link

    This place is owned by the dudes running Tom's Hardware, what do you expect?
  • Icehawk - Saturday, May 23, 2020 - link

    I have asked numerous times how they get their HEVC numbers, as they are almost quadruple what I get. My 3900X gets in the 70s encoding and my 8700 was in the 60s. I can only guess they use the hardware encoders, which isn't how anyone who cares about quality is going to do it, and it doesn't show the full CPU vs CPU difference; it shows the built-in encoder. But Anand still thinks people who bother to read CPU reviews don't use XMP.
  • lucasdclopes - Wednesday, May 20, 2020 - link

    "Intel's turbo has a recommended length of 56 seconds according to the specification sheets, and on our test system here, the motherboard manfuacturer is confident that its power delivery can support a longer-than-56 second turbo time. "
    So performance of those chips will have significant differences depending on the motherboard? Maybe cheaper boards will result in worse sustained performance then.
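
    For what it's worth, the 56 seconds is not a hard timer but a time parameter (tau) in a rolling power-budget calculation, which is why boards with stronger power delivery can stretch or ignore it. A toy model of the commonly described exponentially-weighted-average scheme, using the 10900K's spec-sheet PL1=125W, PL2=250W and tau=56s (a simplification, not Intel's exact firmware algorithm):

    ```python
    # Toy EWMA model of Intel's PL1/PL2/tau turbo budget. Boards routinely
    # raise or ignore these limits, so treat this as illustrative only.
    PL1, PL2, TAU = 125.0, 250.0, 56.0   # watts, watts, seconds
    DT = 0.1                             # simulation timestep, seconds

    ewma, t = 0.0, 0.0                   # chip assumed idle before load starts
    while ewma < PL1:                    # boost at PL2 while the average allows
        ewma += (PL2 - ewma) * (DT / TAU)
        t += DT
    print(f"PL2 held for ~{t:.0f}s before the budget forces a drop to PL1")
    ```

    Note that the boost duration this toy model produces (about 39s) is not tau itself; the real window depends on the load and on how the firmware implements the average.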
  • jcc5169 - Wednesday, May 20, 2020 - link

    Intel fanboys are gasping for air, looking for excuses not to buy the obvious choice, AMD
  • DannyH246 - Wednesday, May 20, 2020 - link

    www.IntelTech.com does it again!! Every element designed to show Intel in the best possible way.
    How about this instead...
    The Core i9-10900K is priced so that its clear competitor is the Ryzen 9 3900X. However, AMD's offering is still >=15% cheaper, offers PCIe 4.0 compatibility, uses less power, is more secure, and can be used on older, cheaper boards that also support the 16-core 3950X, allowing for an upgrade path. The Core i9 is a moderately reasonable chip at best, but as it requires a new motherboard it is effectively a dead end.
  • vanilla_gorilla - Wednesday, May 20, 2020 - link

    I always know it's a good review when half of the comments claim the author is an Intel shill and the other half claim they are an AMD shill.
  • Beany2013 - Wednesday, May 20, 2020 - link

    Ain't it beautiful?

    Honestly, I'm kinda surprised how well Intel has managed to maintain their performance on a pure math basis, but oh *goodness* that power usage.

    I think things will get really interesting when Intel hits a sub-10nm* process (by which time AMD should be on 5nm*) and we'll see how much fight Intel and AMD both have.

    That it means we can all get solid multicore, multithread (fucking finally) CPUs from both vendors at prices that can be described as 'not entirely crazy' is a win win no matter which side of the fence you're on.

    Steven R
  • Beany2013 - Wednesday, May 20, 2020 - link

    * yeah, nm is a bit of a poor measurement these days, but you get the idea.
  • Beany2013 - Wednesday, May 20, 2020 - link

    Aside - I just priced up the 3600, 3700x, 10600k and 10900k in the UK.

    Oh dear Intel. Oh dear, oh dear, oh dear.

    And that's before you talk about cooling. My word.
  • WaWaThreeFIVbroS - Thursday, May 21, 2020 - link

    Half claimed the author is an AMD shill? I only saw one in the entire comment section lol, keep living in your fantasies, shill.

    Besides, what Danny said is true, the prices listed in this article are highly misleading: the 3900X can be found for the low $400s, and $488 for the 10900K is a bulk price. This place has been going downhill ever since the Purch dudes, or whatever their name is right now, bought AnandTech lol.
  • Meteor2 - Wednesday, July 15, 2020 - link

    Are you OK?
  • trivik12 - Wednesday, May 20, 2020 - link

    Terrible product, but it's still impressive that it has the single-core performance to hold up against the latest Ryzen processors. I wonder when Rocket Lake will release, and then hopefully we see 10/7nm desktop in 2022 at least.

    At least Tiger Lake in notebooks looks good, as clock speeds are up big time and Ice Lake has already shown good single-threaded performance. Tiger Lake-H/U/Y should be welcome additions.
  • casperes1996 - Wednesday, May 20, 2020 - link

    Reminds me of back when Steve Jobs went on stage and talked about how the advertising team got inspired by how hot Intel chips ran compared to IBM's Power chips, and showed this:
    https://www.youtube.com/watch?v=g15RwcVMXsc

    Replace the "Apple Computer" bit with AMD and it's almost applicable today
  • Maxiking - Wednesday, May 20, 2020 - link

    Great cpu, god bless Intel, AMD duct tape technology demolished in gaming again. How does Intel keep doing it? It is like watching magician pulling rabbits out of a hat!
  • Dug - Wednesday, May 20, 2020 - link

    Except several benchmarks show the 9900K beating their new CPU! How is that magic?
  • Beany2013 - Wednesday, May 20, 2020 - link

    Magical *thinking*, Dug. You know, not based in reality.
  • Boshum - Wednesday, May 20, 2020 - link

    Pfft. You are hilarious.
  • Spunjji - Tuesday, May 26, 2020 - link

    Maxipad, the latest in the line of Gondalf imitators.
  • Adm_SkyWalker - Wednesday, May 20, 2020 - link

    Once again I find myself debating if I should upgrade. My current i7-6950X has held up better than I thought it would. I guess it's another year or two wait for me.
  • Boshum - Wednesday, May 20, 2020 - link

    I would be good with a beast like that for 5 more years.
  • Icehawk - Saturday, May 23, 2020 - link

    I’d wait until a component like mobo dies, that’s what got me to move from a 3770 about a year ago to a 8700 - mobo died and they were pricy and old. Replaced my wife’s i5 from same gen with a 3900X though recently and gave her the intel box. I’m a gamer but I do a lot of encoding so felt AMD offered a better mix and allows me to use my 450W fanless PSU. But aside from encoding speed I barely notice a difference from that 3700.
  • Dug - Wednesday, May 20, 2020 - link

    The problem with all these charts is that they are inconsistent.

    There are so many variables that aren't shown that it doesn't make sense to show these.

    Most of this has to do with how motherboards handle the CPUs and what their default settings do.
    There can be a 15% swing in AMD motherboard default settings between brands, not to mention things like PBO on or off, Infinity Fabric, memory timings, etc.

    I don't know about the Intel side. I remember their settings made less difference unless it was just CPU clock speed.
  • shady28 - Wednesday, May 20, 2020 - link

    Agree with the sentiment, but you kinda stacked the deck with that last statement.

    Most of the Z490s now support much higher-speed RAM (up to DDR4-5000), and even Intel 9th gen was good with overclocked RAM, while AMD systems rarely get above 3600MHz. It shows if you look at something like PCMark 10, where the top 100 systems on almost all of the charts are completely dominated by Intel. All of them are overclocked of course, but all of the top AMD systems are also overclocked.

    What I would like to see is something along the lines of an i5-10600K vs AMD 3600 vs AMD 3600X, but not using 'all the same components other than mobo and CPU'. Take those 3 chips and build the fastest system you can with them. Use that PCIe 4.0 NVMe drive and GPU on AMD, use that 4800MHz CAS 18 RAM on the Intel. See what happens.
  • mrvco - Wednesday, May 20, 2020 - link

    Ok, part of me would be curious to see what Intel could (or couldn't) do with an 11th Gen spin of their 14nm process.
  • Findecanor - Wednesday, May 20, 2020 - link

    The "Security" portion of this article is not really comprehensible. I can't guess what the author is thinking. The author needs to write it down in actual words what these things mean.

    Security on Intel processors is what is holding me off from buying any Intel CPU for the time being.
    I consider myself pretty knowledgeable about the actual vulnerabilities themselves, and how they work, and how they can be mitigated -- in theory --, but if I have not kept up with every little tidbit of news about security on Intel's processors in particular, that portion of the article tells me absolutely NOTHING.
  • quadibloc - Wednesday, May 20, 2020 - link

    These chips are impressive, and for people with a need to build a system today, and a preference for Intel, they are reasonably competitive. So I am favorably impressed, even if AMD would remain my own choice at the moment. I still do believe that in the long run, Intel does have the means to regain leadership, so that in a year or two or five, AMD will be back to being in second place (but in second place like the previous generations of Ryzens, not like the Bulldozer years). I don't know, though, if even Intel will be able to keep up at the process end; even it may have to go fabless after 10nm, which would have significant implications for the industry.
  • Boshum - Wednesday, May 20, 2020 - link

    I generally agree, but I'm not so certain AMD will be in 2nd place within 5 years (from a best CPU architecture point of view). They should be considering the difference in resources, but Intel is so spread out and AMD seems so focused.
  • poohbear - Wednesday, May 20, 2020 - link

    OK, I'll bite. Why would anyone buy this generation of Intel processors when AMD's are just as powerful and more efficient on 7nm? Especially with Ryzen 4000 coming out this fall.
  • dguy6789 - Wednesday, May 20, 2020 - link

    AMD is ahead in a few key areas- price vs performance, total number of cores/threads, power.

    Intel is still ahead in the per-core/per-thread area. An Intel 8-core 16-thread will beat an AMD 8-core 16-thread in absolutely everything because of just how high Intel chips can clock. In short, Intel is the higher-performing, albeit more expensive, option for low-thread-count workloads.
  • Boshum - Wednesday, May 20, 2020 - link

    I don't think the power and heat are too big a deal until you hit the 8 and 10-core K chips. The people that buy those are enthusiast gamers who want the highest possible FPS in games (whether they are able to perceive it or not, but I am sure they can in certain scenarios). A lot of those ultra-enthusiasts have a lot of fun with overclocking too, and Intel gets more out of that.
    Ryzen 4000 will undoubtedly be a better overall chip, but Rocket Lake should be coming to the LGA 1200 platform in the not too distant future. It may pass up Ryzen 4000 in gaming for those benchmark enthusiasts. It will be no match for Ryzen 4000 in heavy multi-core scenarios.
  • gagegfg - Wednesday, May 20, 2020 - link

    At the end of the day, AMD continues to hold the performance crown at a price premium (3950X).
    Also, it seems to me a bad AnandTech policy that many graphs do not have an equivalent AMD CPU and only include the 3600.
  • mandoman - Wednesday, May 20, 2020 - link

    I can't imagine anyone being the slightest bit concerned about power on the HEDT! It's simply ludicrous to even bring it into the discussion. Frankly the whole emphasis in this review smacks loudly of "tree hugger" philosophy, which has no place in the high-end computing arena at all.
  • Beany2013 - Wednesday, May 20, 2020 - link

    Some of us actually care about good engineering rather than pushing an old, inefficient process node as hard as technically possible.

    Enjoy dropping an extra £100 just to cool your CPU.
  • Hxx - Wednesday, May 20, 2020 - link

    WHAAT? You think this is not good engineering? This is BALLS engineering, they basically achieved a miracle on the 14nm platform. You are basically standing in front of a miracle. Step back and think about it. A 5-year-old technology that competes with, and in many tests beats, the competitor's 7nm process. Yes, overall AMD may be the better purchase, but again that's not what I'm saying.
    Just think about that. On top of that they added good overclocking, controlled temps, plenty of features, etc. Can't say I'm impressed with the Z490 platform itself, since it's the same old Z390/70/270/170 with better connectivity, but the CPUs themselves will make history. I mean, the 14nm process sure is effing OLD, but man, what these guys did with this, the refinement it went through to achieve this performance on this OLD tech, is amazing in my opinion, and for that I applaud them. I want them to hurry up and wrap up Rocket Lake, but this is definitely, for sure, no doubt, great engineering.
  • alufan - Thursday, May 21, 2020 - link

    So what exactly do you think would happen if AMD did the same thing, threw the power limits out the window, and used a 14++++++ node with the extra thermal headroom available to the 3000 series chips? Intel has not released its new process node chips because they can't make them work; AMD has, and the limitations are simply down to the node size and physics. They have engineered a way around the issue, while Intel even now is talking about backporting designs. It stinks, this is a "new" chip from Intel with more top end, period. AMD has released 3 nodes in 3 years and has a new version coming in a few months with a rumored 20% uplift in IPC, but let's wait and see. Not to mention 5nm is designed and being sampled, and 3nm is in design. That is engineering.
  • Hxx - Thursday, May 21, 2020 - link

    ROFL, AMD? AMD struggles with getting a BIOS right, let alone fine-tuning a platform? Nah, they are too busy now supposedly giving us a beta BIOS for the 4xx series, and that's a very scary thought given AMD's track record. In case you didn't know, AMD doesn't make their own chips. If TSMC moves to a different node, then so will AMD, that's how it works. So yes, I applaud TSMC for good engineering, AMD not so much.
  • arashi - Sunday, May 24, 2020 - link

    Replacing Stewart with xx does not a clone account make.

    Try again.
  • Spunjji - Tuesday, May 26, 2020 - link

    Good catch XD
  • Spunjji - Tuesday, May 26, 2020 - link

    You're talking past yourself.

    Sure, it's impressive what Intel's disaster-management engineers managed to pull out of the wreckage of their failure at 10nm. That failure was an engineering failure too, though, and they still haven't managed to backport their 10nm-planned architecture to 14nm.

    In other words, those engineering failures are the only reason they had to build this crazy nonsense - of which you express such admiration - in the first place.
  • extide - Wednesday, May 20, 2020 - link

    This is not HEDT
  • Spunjji - Tuesday, May 26, 2020 - link

    He's still reading from the 2016 Intel playbook :D
  • Icehawk - Saturday, May 23, 2020 - link

    I care because I like silent machines and use fanless PSUs. I can't afford to blow 250-300W of the power budget on the CPU when I am limited to 450W; the small difference in real-world gaming isn't worth springing for a higher-wattage PSU that brings fan noise with it. I should be able to run my 3900X with an nV 3070 on what I have; I don't think I could with this i9.

    If power budget isn’t a concern then it’s down to brand preference, usage mix, etc to me. I have an intel 8700 as well, at the time I felt that was the best CPU choice, when I needed another new machine a few months ago the 3900 was - I still feel it would be today for me.

    YMMV
  • Spunjji - Tuesday, May 26, 2020 - link

    Cool, another person who thinks their personal views on a topic outweigh all others and is psychologically projecting that onto the reviewer. This is how 90% of disinformation works now...
  • prophet001 - Wednesday, May 20, 2020 - link

    I'm curious as to why this only has 16 pcie lanes into the CPU. How much does running your high performance SSD through the PCH or running your GPU in x8 mode affect performance?
  • GreenReaper - Wednesday, May 20, 2020 - link

    Conveniently, there is an article (almost) about that: https://www.anandtech.com/show/15720/intel-ghost-c...
  • azfacea - Wednesday, May 20, 2020 - link

    With Intel's DIY PC market share being well below 50%, and 10th gen itself having to compete with 9th, 8th, and 7th gen amid a supply shortage and everything else, I doubt these new LGA1200 motherboards can reach 10% of the DIY PC market, which means the

    " ... 44+ entrants ranging from $150 all the way up to $1200 ..."

    are all massive cash-burning operations that would never make sense in a million years without Intel "development funding". They are literally squandering billions of dollars that they took from ripping off their customers. Intel is so stupid: gouging its customers like this and then squandering the money for what? So LGA 1200 has the option of PCIe 4 by the time it's irrelevant? My god, WTF is going on there.
  • ByteMag - Wednesday, May 20, 2020 - link

    I'm wondering why the 3300X wasn't in the DigiCortex benchmark? This $120, 4c/8t banger lays waste to the selected lineup. Or is it too much of a foreshadowing of how Zen 3 may perform? I guess benchmarks can sometimes be like a box of chocolates.
  • ozzuneoj86 - Wednesday, May 20, 2020 - link

    Just a request, but can you guys consider renaming the "IGP" quality level something different? The site has been doing it for a while, and it kind of seems like they may not even know why at this point. Just change it to "Lowest" or something. Listing "IGP" as a test, when running a 2080 Ti on a CPU that doesn't have integrated graphics, is extremely confusing to readers, to say the least.

    Also, I know the main reason for not changing testing methods is so that comparisons can be done (and charts can be made) without having to test all of the other hardware configs, but I have one small request for the next suite of tests (I'm sure they'll be revised soon). I'd request that testing levels for CPU benchmarks should be:

    Low Settings at 720P
    Max Settings at 1080P
    Max Settings at 1440P
    Max Settings at 4K

    (Maybe a High Settings at 1080P thrown in for games where the CPU load is greatly affected by graphics settings)

    Drop 8K testing unless we're dealing with flagship GPU releases. It just seems like 8K has very little bearing on what people are realistically going to need to know. A benchmark that shows a range from 6fps for the slowest to 9fps for the fastest is completely pointless, especially for CPU testing. In the future, replacing that with a more common or more requested resolution would surely be more useful to your readers.

    Often, the visual settings in games have a significant impact on CPU load, so tying the graphical settings to the resolution for each benchmark really muddies the waters. Why not just assume worst-case performance (max settings) for each resolution and go from there? Obviously anti-aliasing would need to be selected based on the game and resolution, with the focus being on higher frame rates (maybe no or low AA) for faster-paced games and higher fidelity for slower-paced games.

    Just my 2 cents. I greatly appreciate the work you guys do and it's nice to see a tech site that is still doing written reviews rather than forcing people to spend half an hour watching a video. Yeah, I'm old school.
  • Spunjji - Tuesday, May 26, 2020 - link

    Agreed 99% with this (especially that last part; all hail the written review), but I'd personally say it makes more sense for the CPU reviews to be limited to 720p Low, 1080p High and 1440p Max.

    My theory behind that:
    720p Low gives you that entirely academic CPU-limited comparison that some people still seem to love. I don't get it, but w/e.
    1080p High is the kind of setting people with high-refresh-rate monitors are likely to run: having things look good, but not burning frames for near-invisible changes. CPU limiting is likely to be in play at higher frame rates. We can see whether a given CPU will get you all the way to your refresh-rate limit.
    1440p Max *should* take you to GPU-limited territory. Any setting above this ought to be equally limited, so that should cover you for everything, and if a given CPU and/or game doesn't behave that way then it's a point of interest.
  • dickeywang - Wednesday, May 20, 2020 - link

    With more and more cores being added to the CPU, it would've been nice to see some benchmarks under Linux.
  • MDD1963 - Wednesday, May 20, 2020 - link

    Darn near a full 2% gain in FPS in some games! Quite ...uhhh..... impressive! :/
  • MDD1963 - Wednesday, May 20, 2020 - link

    Doing these CPU gaming comparisons at 720p is just as silly as when HardOCP used to include 640x480 CPU scaling... 1080p is low enough; go medium details if needed.
  • Spunjji - Tuesday, May 26, 2020 - link

    Personally agreed here. It just gives more fodder to the "15% advantage in gaming" trolls.
  • croc - Wednesday, May 20, 2020 - link

    It would be 'nice' if the author could use results from the exact same stack of chips for each test. If the same results cannot be obtained from the same stack, then whittle the stack down to those chips for which the full set of tests can be obtained. I could understand the lack of results on newly added tests...

    For a peer-review exercise it would be imperative, and here at AnandTech I am sure that there are many peers...
  • 69369369 - Thursday, May 21, 2020 - link

    Overheating and very high power bills happen with Intel.
  • Atom2 - Thursday, May 21, 2020 - link

    Dear Ian, you must be the only person on the planet who goes to such lengths not to use AVX, to the point of comparing Intel's AVX-512 instructions to GPU-based OpenCL just to have a reason not to use them. Consequently, AMD only wins the synthetic benchmarks, while all real-world math is held by Intel. Additionally, all those synthetics are "not" compiled with Intel C++. Forget it... GCC is only used by universities. The level of bias towards AMD is becoming surreal.
  • Spunjji - Tuesday, May 26, 2020 - link

    Complaining at the reviewer for failing to test something that doesn't really get used is... a thing.
  • Datawhite - Thursday, May 21, 2020 - link

    Bring on Zen 3, AMD, then Intel can R.I.P. ...
    Still waiting for RDNA 2!
  • Samus - Thursday, May 21, 2020 - link

    No quad core under $100 basically just gave AMD the entire budget segment.

    Overall, this pricing is ridiculous but at least the 6C parts are somewhat competitive.
  • ph1nn - Thursday, May 21, 2020 - link

    Does Intel realize global climate change is a thing? This power consumption is an embarrassment. This company used to have the most efficient CPUs; now they draw 200W?!
  • Gastec - Friday, May 22, 2020 - link

    I don't understand what the climate change has to do with a 200W CPU power consumption. I would have understood something like "does Intel realize we have limited or non-existent incomes, given the current Pandemic situation?"
  • Beaver M. - Friday, May 22, 2020 - link

    I hope you buy a new PC only every 10 years.
  • pegnose - Friday, May 22, 2020 - link

    It looks to me like simply re-ordering the core-to-core latency chart for the 10900K would remove the apparent 3-4 ns jump. You already mentioned that the core "names" don't necessarily represent hardware positions, Ian.

    Btw, I am curious why a higher core/thread index seems to come with higher latency. Adjacent cores should have low core-to-core latency, but 16-to-18 takes longer than 4-to-6. Is this due to address checking in the ring-bus communication taking longer for higher indices?
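
    For anyone who wants to poke at this on their own machine, here is a minimal ping-pong sketch of how core-to-core latency is typically measured. To be clear, this is my own illustration, not necessarily how AnandTech generates its chart; it assumes Linux with glibc, a build like "g++ -O2 -pthread pingpong.cpp", and core IDs that follow the OS enumeration rather than physical ring-stop order:

        #include <atomic>
        #include <chrono>
        #include <cstdio>
        #include <cstdlib>
        #include <pthread.h>
        #include <thread>

        // Pin the calling thread to one logical CPU (Linux/glibc).
        static void pin_to_cpu(int cpu) {
            cpu_set_t set;
            CPU_ZERO(&set);
            CPU_SET(cpu, &set);
            pthread_setaffinity_np(pthread_self(), sizeof(set), &set);
        }

        int main(int argc, char** argv) {
            const int cpu_a = argc > 1 ? std::atoi(argv[1]) : 0;  // first core under test
            const int cpu_b = argc > 2 ? std::atoi(argv[2]) : 1;  // second core under test
            const long iters = 1000000;
            std::atomic<long> flag{0};

            // Responder: waits for each odd value, answers with the next even one.
            std::thread responder([&] {
                pin_to_cpu(cpu_b);
                for (long i = 1; i <= iters;) {
                    if (flag.load(std::memory_order_acquire) == 2 * i - 1) {
                        flag.store(2 * i, std::memory_order_release);
                        ++i;
                    }
                }
            });

            pin_to_cpu(cpu_a);
            const auto t0 = std::chrono::steady_clock::now();
            for (long i = 1; i <= iters; ++i) {
                flag.store(2 * i - 1, std::memory_order_release);         // ping
                while (flag.load(std::memory_order_acquire) != 2 * i) {}  // wait for pong
            }
            const auto t1 = std::chrono::steady_clock::now();
            responder.join();

            // Each iteration is one round trip, so halve it for the one-way latency.
            const double ns = std::chrono::duration<double, std::nano>(t1 - t0).count();
            std::printf("core %d <-> core %d: ~%.1f ns one-way\n",
                        cpu_a, cpu_b, ns / (2.0 * iters));
            return 0;
        }

    Sweeping the second core ID across all cores while holding the first one fixed should reproduce one row of the chart, and re-ordering the IDs afterwards would answer my own question about whether the jump is real or just an artifact of enumeration.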
  • Shaquille_Oatmeal - Friday, May 22, 2020 - link

    X570 chipset AMD boards can't be found in stock almost anywhere. This isn't news. But even today, days after Intel's 10th-gen LGA1200 CPUs launched and the arguably subjective reviews finally went public, there's an endless supply of Z490 boards. PC enthusiasts do want the fastest CPUs, for sure, but we also consider cost and overall efficiency. We are not 12-year-old kids wanting colorful RGB lights for our COD rig. The RGB lighting is a nice feature, but we're not idiots. These Intel CPUs are garbage even by Intel's own standards over the years, yet they are being marketed as if they were the best CPUs. Intel, we can see the truth, and the truth is we won't touch these CPUs; perhaps if you dropped the price on the 10700K to $250 we could have a serious conversation. Hopefully Intel gets their game together. I'm sure their OEM buyers are thinking the same.
  • Gastec - Friday, May 22, 2020 - link

    The way this is going, I'm looking forward to that 32-core Intel consumer CPU, with a 1000 W power draw, that will definitely give us those much-needed 1000 fps @ 1080p.
  • boozed - Saturday, May 23, 2020 - link

    Got a question about the game benchmarks. The table has an "IGP" column but the charts in that column have "GTX 1080" written on them. So which is it?
  • Ryan Smith - Tuesday, May 26, 2020 - link

    To be sure, it's GTX 1080. IGP is the name of the setting.
  • F123Nova - Saturday, May 23, 2020 - link

    I am trying my best to be nice, but this article has the most dubious set of benchmarks I have seen, and the omission of competing AMD CPUs from certain charts where the competition is better makes me wonder whether this article is a cash handout. Can't say for sure if this is another "Just buy it" piece, but it sure smells foul. I expected more from AnandTech...
  • Ryan Smith - Tuesday, May 26, 2020 - link

    Hi Nova,

    As has been the case for the past 23 years, we always strive to have accurate reporting, to the best of our abilities.

    Given that we're in the process of rolling out some new benchmarks (such as the Crysis software render), we haven't yet had a chance to backfill in results for a number of processors. Unfortunately that's going to take some time. But in the meantime, was there any specific benchmark(s) you were concerned about? That might at least help us better prioritize what to backfill first.

    And to be sure, there's no cash handout. That's not how we operate. (Selling out for anything less than an incredibly comfortable retirement isn't very helpful for our future employment prospects)
  • tvdang7 - Wednesday, May 27, 2020 - link

    Why couldn't AT use a 3800X instead of a 3700X?
  • pcgpus - Friday, July 10, 2020 - link

    Nice review. The 10600K might be the new king in games (for a fair price).

    If you want to compare this article with other sites, you can find aggregated results at this link:
    https://warmbit.blogspot.com/2020/06/intel-core-10...

    It gathers results from 9 sites across 32 games!

    After the page loads, pick your language from Google Translate (right side of the page).
  • pcgpus - Friday, July 10, 2020 - link

    Nice review. The 10900K is the new king in games!

    If you want to compare this article with other sites, you can find aggregated results at this link:
    https://warmbit.blogspot.com/2020/06/intel-core-i9...

    It gathers results from 9 sites across 35 games!

    After the page loads, pick your language from Google Translate (right side of the page).
  • Meteor2 - Wednesday, July 15, 2020 - link

    A new microarchitecture doesn't require a new process. When PAO (Process-Architecture-Optimization) immediately went south, I don't understand why Intel didn't just implement a new microarchitecture on 14 nm. Surely Ice Lake hasn't taken four years to develop?
  • Meteor2 - Wednesday, July 15, 2020 - link

    *Sunny Cove. God, Intel's code-names are dumb.
  • miss5tability - Saturday, August 8, 2020 - link

    I just discovered this INTEL SCAM. Now I don't freaking understand how these 10th-gen CPUs work. I want to buy an i3-10300, and from what I'm reading, this is not a 65W chip? What is the real f@#%$@ power draw for these CPUs?
  • damian101 - Monday, August 10, 2020 - link

    As far as I know, Intel never used a single bidirectional ring bus on CPUs with more than 10 cores.
    On Ivy Bridge CPUs with 12 or more cores (up to 15), Intel used three unidirectional ring buses. There were also no Sandy Bridge CPUs with more than 10 cores, and Intel used two bidirectional ring buses connected with buffered switches for their high-core-count Haswell CPUs.
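
    As a toy illustration of why bigger rings hurt (my own sketch, not a model of the real interconnect; actual ring stops also include LLC slices and the system agent, which this ignores), average hop count grows with stop count, and a unidirectional ring averages roughly twice the hops of a bidirectional one:

        #include <cstdio>
        #include <cstdlib>

        // Hops from stop a to stop b on a ring of n stops, one direction only.
        static int uni_hops(int a, int b, int n) { return (b - a + n) % n; }

        // Same, but taking the shorter way around a bidirectional ring.
        static int bi_hops(int a, int b, int n) {
            int d = uni_hops(a, b, n);
            return d < n - d ? d : n - d;
        }

        int main(int argc, char** argv) {
            const int n = argc > 1 ? std::atoi(argv[1]) : 10;  // e.g. 10 stops
            long uni = 0, bi = 0;
            int pairs = 0;
            for (int a = 0; a < n; ++a)
                for (int b = 0; b < n; ++b)
                    if (a != b) { uni += uni_hops(a, b, n); bi += bi_hops(a, b, n); ++pairs; }
            std::printf("%d stops: avg %.2f hops unidirectional, %.2f bidirectional\n",
                        n, (double)uni / pairs, (double)bi / pairs);
            return 0;
        }

    For 10 stops this prints about 5.00 vs 2.78 average hops, and both numbers keep growing with the stop count, which is presumably why Intel split the high-core-count dies across multiple rings rather than stretching a single one.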
