Comments Locked

111 Comments


  • goatfajitas - Monday, January 29, 2024 - link

    Whoa, 428 watts at peak for the i9-14900K? I have not paid close attention to the last few rounds of releases and I knew it was bad, but holy crap. WTF Intel?
  • [email protected] - Monday, January 29, 2024 - link

    Yeah, they definitely get pretty toasty, so it's imperative to plan out a cooling strategy to keep the CPU and other components from roasting. Of course, it does allow you to use it as a space heater in addition to a computer in cold weather. The highest TDP Ryzen 7000 CPUs run "cool" by comparison.
  • goatfajitas - Monday, January 29, 2024 - link

    Yes, it would make a good "Winter PC" LOL
  • shabby - Monday, January 29, 2024 - link

    Are they available in russia? They need them desperately to heat their frozen homes 😂
  • GeoffreyA - Tuesday, January 30, 2024 - link

    Maybe the US should donate a few out of the kindness of their hearts.
  • ricebunny - Monday, January 29, 2024 - link

    Peak power is an irrelevant metric. It’s more of a motherboard feature than anything else - Intel’s Raptor Lake chips will pull as much power as you give them.

    For those who are concerned about power, there is a TDP ceiling feature. Once set, the Intel CPU will adhere closely to the limit. Laptop tests have shown Intel's Raptor Lake to be about as power efficient as Zen 4. Take a look at Ars's review of the Framework 13.
  • goatfajitas - Monday, January 29, 2024 - link

    It is not irrelevant, and it is not a mobo feature. It is how much power is drawn under heavy load. For short bursts that can be fine, but under sustained load it will get too hot and therefore not operate at or near turbo; it will run closer to the base clock, which is lame.
  • TheinsanegamerN - Tuesday, January 30, 2024 - link

    You are ignorant. A 14900K will shovel gobs of power if you let it. Set it to a 250 watt TDP and it will stick to 250 watts while losing MAYBE 2% peak performance.

    Turbo isn't SUPPOSED to be sustained; that's a mobo feature. Have you tried reading?
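For what it's worth, the power ceiling being argued about here can be set from the OS as well as the BIOS. A minimal sketch for Linux, assuming the standard `intel-rapl` powercap sysfs interface (the zone numbering and the 250 W figure are illustrative; BIOS PL1/PL2 settings accomplish the same thing):

```shell
# Illustrative sketch: cap the CPU package (RAPL zone 0) at 250 W.
# Requires root; zone paths vary by platform.
echo 250000000 | sudo tee /sys/class/powercap/intel-rapl:0/constraint_0_power_limit_uw

# Read back the active limit (reported in microwatts).
cat /sys/class/powercap/intel-rapl:0/constraint_0_power_limit_uw
```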
  • goatfajitas - Tuesday, January 30, 2024 - link

    Don't be so pedantic... I didn't say it was supposed to sustain it, I am saying the power draw is too damn high, period. Intel is compensating for an inefficient design and has been doing so since they got stuck on 14nm several years back.

    Are you trying to claim Intel doesn't have issues with heat here, or are you just being pissy?
  • temps - Tuesday, January 30, 2024 - link

    He's not being pedantic. Not in the slightest.

    If you can dissipate hundreds of watts of heat there is no issue. If you let the chip run uncapped and it draws lots of power... who cares... if you're willing to pay the electrical bill, I'm sure you're very happy to have that last few percent of performance.
  • t.s - Tuesday, January 30, 2024 - link

    well, I care.
  • goatfajitas - Tuesday, January 30, 2024 - link

    I do see your point. You can put in some high-end cooling and take advantage of the speed. That still doesn't fix the fact that it runs extremely hot and power hungry compared to its competition.
  • ricebunny - Tuesday, January 30, 2024 - link

    Highly dependent on application. In games they pull around 125W, roughly the same as high end Zen 4 CPUs.
  • goatfajitas - Tuesday, January 30, 2024 - link

    "Highly dependent on application. In games they pull around 125W, roughly the same as high end Zen 4 CPUs."

    Agreed on some setups. I have seen some equivalently equipped SFF/tiny desktop setups (with obviously limited thermals), and Intel drops off a lot earlier. The point being, it runs too hot.
  • WaffleTech - Sunday, February 4, 2024 - link

    "roughly the same"

    ComputerBase measured this over more than a dozen games and on average it's 149W for the 14900K and 72W for the 7950X3D, which is less than half. Even the Non-X3D 7950X with 105W uses almost 30% less in games.
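A quick arithmetic check of the figures quoted above (numbers exactly as stated in the comment):

```python
# Average in-game package power as quoted: 14900K vs 7950X3D vs 7950X.
i9_14900k_w = 149
r9_7950x3d_w = 72
r9_7950x_w = 105

print(r9_7950x3d_w / i9_14900k_w)    # ≈ 0.48, i.e. less than half
print(1 - r9_7950x_w / i9_14900k_w)  # ≈ 0.30, i.e. almost 30% less
```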
  • Thunder 57 - Tuesday, January 30, 2024 - link

    It's the enormous heat output that is the problem, not the electric bill.
  • yankeeDDL - Thursday, February 1, 2024 - link

    "no issue"? Are you joking?
    Cost aside, to dissipate 400+W you need massive cooling, and the mobo needs to be way over-designed.
    Intel peaks at 4x the power of Ryzen to, maybe, marginally beat it in a few benchmarks. That's not irrelevant at all: this kind of efficiency delta also applies on mobile (albeit not so extreme), and there it translates into battery life (and slower performance, since on laptops you can't peak at those levels).
  • is4u2p - Wednesday, January 31, 2024 - link

    Was this in the 12th gen mobile CPUs?
  • lakedude - Friday, February 9, 2024 - link

    "Peak power is an irrelevant metric." maybe to you.

    Going AMD this time allowed me to save money on the cooler and the power supply. Gonna save money on electricity as well.

    I hate the way Intel plays games with their TDP these days. Don't say something is 125w TDP if it can pull over 300w. Call it a 300w chip that can be underclocked or power capped to 125w. Or have it only pull 125w out of the box rather than leave it up to the user to fiddle around with turning it down.
  • wwenze - Monday, February 5, 2024 - link

    Well, Intel definitely has the advantage when it comes to "not dying when pushed hard" or even "not dying when running at the actual stated specs," but outdated efficiency/IPC, while AMD is the opposite on both fronts.

    So Intel put the pieces together and said, "what if we clock it to a 250W turbo TDP while the competition only has 100W at that price point?"
  • ToTTenTranz - Monday, January 29, 2024 - link

    It might be worth mentioning the 8700G's larger iGPU gains almost 25% performance when paired with DDR5-6000 instead of the slower DDR5-5200:

    https://www.techspot.com/review/2796-amd-ryzen-870...

    I don't know what advantages there are for DIY clients to hold on to JEDEC-approved speeds instead of just going with XMP and EXPO "pre-validated 1-click overclocks". Especially considering how it's only called "overclock" because JEDEC seems to take their sweet time to validate new clock/timing setups.
  • Slash3 - Monday, January 29, 2024 - link

    The gap is likely even larger than that, as I believe HWU's 5200MT entry is still using faster XMP/Expo timings, rather than CL44 base spec. Steve simply scaled the speed for the other entries (the exception being 7200MT, which is also at Gear 2).
  • meacupla - Monday, January 29, 2024 - link

    The DDR5 RAM has to run at 1.1V to get JEDEC validation. It's not that JEDEC takes their sweet time; it's that DRAM makers can't produce fast DRAM at that voltage.
    This is especially true when you want low latency.

    AFAIK, the fastest JEDEC compliant DDR5 available is 6400, but with awful latency.

    Also, I am surprised techspot got their chip to work at DDR5-7200. I didn't think it was possible to hit more than 6000 with any stability on the AM5 platform.
  • Slash3 - Monday, January 29, 2024 - link

    Recent AGESA updates have made Gear 2 speeds in the 7200-8000MT range quite possible, but improvement over Gear 1 6000MT operation is fairly small due to the I/O die fabric speed being a limiting factor.

    These APUs are monolithic, though, so higher FCLK and MCLK/UCLK speeds are likely possible. Hopefully some reviewers and users will have time to dig into it now that the chips are officially out. I suspect we'll see some people running 6600MT in Gear 1 or 8000MT+ in Gear 2.
  • nandnandnand - Tuesday, January 30, 2024 - link

    These APUs are bandwidth starved when it comes to iGPU performance. This review is already an outlier for using DDR5-5200, although pairing more expensive memory with 8700G/8600G doesn't make it a better choice than a CPU+discrete combo for budget users.
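The "bandwidth starved" point is easy to quantify. A rough sketch of peak theoretical DRAM bandwidth, assuming the usual dual-channel, 64-bit-per-channel DDR5 configuration:

```python
def ddr5_bandwidth_gbs(mt_per_s, channels=2, bits_per_channel=64):
    """Peak theoretical bandwidth in GB/s for a DDR5 configuration."""
    return mt_per_s * 1e6 * (bits_per_channel / 8) * channels / 1e9

print(ddr5_bandwidth_gbs(5200))  # 83.2 GB/s, as tested in this review
print(ddr5_bandwidth_gbs(6000))  # 96.0 GB/s
print(ddr5_bandwidth_gbs(7200))  # 115.2 GB/s
```

Even the fastest of these is a fraction of what a budget dGPU gets from dedicated GDDR6, which is why the iGPU scales so readily with memory speed.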
  • FWhitTrampoline - Tuesday, January 30, 2024 - link

    The main reason the FCLK can reach higher rates and better FCLK:MCLK/UCLK ratios with proper stability on the Ryzen 8000G APUs is that the memory controllers sit on the same monolithic slab of silicon as the CPU cores. There are no die-to-die SerDes hops across clock domains to slow things down, so the path from the Infinity Fabric to the memory controllers is simpler on any monolithic APU design.
  • thestryker - Monday, January 29, 2024 - link

    JEDEC did something odd with DDR5 in that there are multiple sets of compliant latencies, and every maker has gone with the middle set. The lowest set is identical to JEDEC-compliant DDR3/DDR4 memory, with the middle being about 14% higher. Still worse than XMP/EXPO kits, as those just throw voltage at the problem to lower latency.

    I'm hoping the DDR5-8500 DRAM Micron mentioned in a press release means there will be some lower-latency parts I can get for my server box.
  • is4u2p - Wednesday, January 31, 2024 - link

    The Legion Go runs 7500 RAM with the Z1 Extreme.
  • is4u2p - Wednesday, January 31, 2024 - link

    Yep, they peak around DDR5-6400 speeds, which is why you're seeing all these handhelds with 6500 or higher.
  • barich - Wednesday, January 31, 2024 - link

    DIY clients, sure. But these are going to end up in a lot of Dells and HPs, probably more by far than people who build their own. It's good to have representative benchmarks of that sort of configuration, and they'll definitely only ship JEDEC-specced RAM.

    Personally, I actually used Crucial's JEDEC DDR5-5600 in my 13th gen Intel build because stability is of paramount importance to me and anything higher is technically overclocking. Intel doesn't guarantee that any given CPU's memory controller will function properly faster than that. They pretty much all do, to a greater or lesser degree, but it's still a guess as to how far you can go.
  • James5mith - Monday, January 29, 2024 - link

    "These limitations primarily come in highly intensive multi-threaded workloads such as rendering or encoding, where the performance of processors such as the Ryzen 7000 desktop series, but the key point is that these APUs aren't inherently designed for these tasks in mind, and users looking for more CPU grunt are almost certainly likely to opt for a higher grade processor with faster cores, more cores, and more threads. "

    That is a) a massive run-on sentence, and b) doesn't make much actual sense. For example:

    "where the performance of processors such as the Ryzen 7000 desktop series"

    Where the performance does what? Or is what? There is no coherent thought in that comma delimited side note.
  • GeoffreyA - Tuesday, January 30, 2024 - link

    The performance of the 7000 series excels at multithreading.
  • yankeeDDL - Monday, January 29, 2024 - link

    I wish there were some more remarks vs Intel's offering.
    CPU-wise, Ryzen is more efficient, generally speaking. The performance seems the same or slightly lower compared with CPUs that burn 400W to reach crazy boost rates. Still, that's my view.
    Comparing it only against Ryzen 5*** seems a bit limited, no?
  • meacupla - Monday, January 29, 2024 - link

    Ryzen 5000G is the primary competitor for Ryzen 8000G.
    Its next closest competition comes from the mobile segment in the form of mini-PCs.
    Intel hasn't offered an APU for its desktop socket in ages.
  • yankeeDDL - Thursday, February 1, 2024 - link

    Not correct.
    The Ryzen 8000G is the only Zen 4 desktop CPU on TSMC's 4nm process (Ryzen 7000 is also Zen 4, but on 5nm).
    So - iGPU aside - I expect the 8000G to be more efficient than the 7000. Hence, I am curious how it would perform against Intel and also against the 7000.
  • Grapple - Monday, January 29, 2024 - link

    Comparing against Intel’s 65W T-series processors would have made this article much more interesting.
  • FWhitTrampoline - Monday, January 29, 2024 - link

    The T series is 35W and mostly for thin-client mini desktop business PCs, but at least the T series is socket packaged! Intel made a big mistake not releasing at least a 65W socket-packaged Meteor Lake SKU. And Intel is perfectly capable of offering a socket-packaged mobile processor, because I'm still using my HP ProBook 4540s laptop with a socket-packaged Ivy Bridge 3632QM, and that laptop can get a processor upgrade or a motherboard replacement while reusing the same processor!
  • meacupla - Monday, January 29, 2024 - link

    I disagree. But only on processor choice.
    Ryzen 7840HS, Intel i7-13700H and i7-1370P would be my choice.
  • TheinsanegamerN - Tuesday, January 30, 2024 - link

    Given that this is a desktop chip, it'd make more sense to compare it to other desktop chips.

    Otherwise, throw in an M3 and an Nvidia t100 car processor while you're at it.
  • meacupla - Tuesday, January 30, 2024 - link

    Hahaha, nope.
    8000G are mobile chips using a desktop socket.
  • TheinsanegamerN - Wednesday, January 31, 2024 - link

    If its being sold as a desktop chip it should be compared against the desktop chips it is competing against. If I am building a mini desktop with an APU the 7840u is irrelevant.
  • meacupla - Wednesday, January 31, 2024 - link

    Which, again, is hilarious.
    The facts point to 8700G and AM5 platform costing too much to make sense. If you want a cost effective gaming setup, you are better off with an i3 or R5 with dGPU.

    8700G's one niche is ultra compact mITX without a dGPU. But if you go that route, it's now competing with.... oh look at that, the mini-PC segment.
  • maxijazz - Saturday, February 3, 2024 - link

    Low-power APUs like the 7840 and 8x00G are perfect for fanless (noiseless) mini desktops (audio servers) in the so-called computer-audio industry/hobby. Intel has been lacking in these applications recently.
  • ermg_chips - Tuesday, January 30, 2024 - link

    What I'm wondering is, are they going to come out with an 8700GE, 8600GE, etc. eventually? The 5xxxGE series were the same CPUs but with the TDP set to 35W, to fit in the same ultra-compact systems that the 35W Intel T chips do.

    I have a weird reason for caring: I live a nomadic/unstable lifestyle but still like to self-host all my shiz, so cramming as much compute into a "1 liter" PC as possible is important to me, and at my price point, I've been very happy with a 5750GE in an HP EliteDesk Mini stuffed with RAM as a teeny Proxmox box.
  • FWhitTrampoline - Monday, January 29, 2024 - link

    It's just too bad that Intel did not release any 65W Meteor Lake-S socket-packaged variants, as I would rather have built the Intel variant of the ASRock DeskMini with Meteor Lake instead. And I have to LOL that the tech press can only show CPU-only Blender Cycles rendering tests, as there's no Radeon iGPU ROCm/HIP support for any iGPU-accelerated Cycles rendering testing.

    And Intel's Meteor Lake iGPUs have proper, working iGPU compute API support via Intel's oneAPI/Level Zero, whereas on Linux AMD's ROCm/HIP GPU compute support is just not there for Radeon iGPUs, or for most consumer Radeon dGPUs currently.
  • TheinsanegamerN - Tuesday, January 30, 2024 - link

    It still wont fit in your in win chopin.
  • FWhitTrampoline - Tuesday, January 30, 2024 - link

    I do not own an InWin Chopin, but I do own an ASRock X300 DeskMini, and having to Blender Cycles render on the CPU cores is just too much for that OEM cooling solution to handle without thermal throttling. And because of the lack of proper AMD ROCm/HIP support for the Vega iGPU, I cannot use it for iGPU-accelerated Cycles rendering, where there are plenty more FP units to accelerate the ray tracing calculations (ray tracing is a compute-intensive workload).

    And I love how the tech press uses Blender's CPU Cycles rendering as a CPU stress test and totally ignores iGPU and dGPU Cycles rendering tests, to the point where the subject is basically uncovered. But at least Intel cares more for creators there than AMD! Intel's iGPUs support compute workloads on Linux via oneAPI and Level Zero, while AMD just lets the console makers do the hard part of tweaking AMD's APUs, for gaming graphics workloads only!

    But I'm going to have to wait for Arrow Lake to get any socket-packaged Intel SKUs with the better tile-based graphics in a desktop/socketed offering!
  • meacupla - Monday, January 29, 2024 - link

    When you compare the 8700G results to 7840HS (Beelink GTR7), the performance difference is negligible.
    It seems to me that if you don't pair the 8700G with premium RAM, you would be further wasting your money.
  • t.s - Tuesday, January 30, 2024 - link

    Seconded! With better power consumption too.
  • AndrewJacksonZA - Monday, January 29, 2024 - link

    Question: If a person has an APU, why use Blender CPU only?

    So on float8, there's a 2% difference between the 8700G and the i7-14900K. Wow.

    Thank you
  • TheinsanegamerN - Tuesday, January 30, 2024 - link

    guys WHY would you use a CPU only test in a CPU REVIEW??!?!?!?
  • t.s - Tuesday, January 30, 2024 - link

    Cause it has the best iGPU in its class. If you won't test that, why bother testing? It's almost certain that the 7700X or 7700 will be better.
  • AndrewJacksonZA - Tuesday, January 30, 2024 - link

    👍
  • TheinsanegamerN - Wednesday, January 31, 2024 - link

    Ok, and in the CPU bench section they use a CPU test. How would you know if a 7700 would be better under CPU load if you don't test it?
  • AndrewJacksonZA - Tuesday, January 30, 2024 - link

    guys WHY would you only test PART of a CPU in a CPU REVIEW??!?!?!?
  • TheinsanegamerN - Wednesday, January 31, 2024 - link

    How DARE we want to see what a CPU does in a CPU review. WAAAH I NEED IGPU OR ILL CRY WAAAAH
  • FWhitTrampoline - Tuesday, January 30, 2024 - link

    Because AMD does not support ROCm/HIP for its iGPUs, and its ROCm/HIP support for consumer dGPUs on Linux is lacking as well. And the Blender Foundation, starting with Blender 3.0 and later editions, dropped OpenCL as the GPU compute API. So since Blender 3.0, the Blender Foundation only supports Nvidia's CUDA for non-Apple PCs/laptops and Apple's Metal for Apple silicon.

    So without Ryzen iGPU support for ROCm/HIP, there's nothing to take the CUDA Intermediate Language Representation (ILR) and convert it into a form that can be executed on Radeon iGPU/dGPU hardware. For Intel's iGPUs and dGPUs, it's Intel's oneAPI/Level Zero that does the translating of the CUDA ILR into a form that can be executed on Intel hardware, and for Intel that works on both Windows and Linux!

    Blender generates CUDA PTX ILR, and all GPU makers use intermediate languages, so GPU makers and others ship no pre-compiled binaries where software gets compiled directly into the GPU's native instruction set in advance. That way the ILR code remains portable across OSes/ecosystems, and GPU makers are free to modify their GPU ISA and still maintain comparability with software that only gets compiled into a portable Intermediate Language Representation (ILR).
  • FWhitTrampoline - Tuesday, January 30, 2024 - link

    Edit: maintain comparability
    to: maintain compatibility

    I hate Firefox's spell checker, it's a train wreck as always!
  • thestryker - Monday, January 29, 2024 - link

    Feels like these APUs deserve a DRAM scaling article comparing the IGP performance.
  • GeoffreyA - Tuesday, January 30, 2024 - link

    Yes, that would be a nice one. Always necessary for APUs.
  • zodiacfml - Monday, January 29, 2024 - link

    Thanks, but it would have been nicer for me with an i3-12100 in the charts. The 13100F or 12300F tests from old reviews aren't comparable.
  • TheinsanegamerN - Tuesday, January 30, 2024 - link

    Take 13100f and extrapolate. Not that hard.
  • PeachNCream - Tuesday, January 30, 2024 - link

    Not a bad CPU overall, though it does absolutely devour electrical energy. The competition is far worse, but that shouldn't justify a CPU alone consuming more power than would be required to illuminate an entire home - it's worth eleven 800-lumen lights! In the evenings or night, I usually have four or fewer bulbs active, for half the power consumption or less of this CPU at sub-maximum workloads, WITHOUT the rest of the supporting components a PC requires to provide useful functions. Perspective makes it obvious that's quite terrible when we live in a world that is overpopulated, polluted, and hanging on the precipice of being unable to sustain enough food production to feed us, and we all know what happens when humans are hungry and forced to compete for limited resources.
  • TheinsanegamerN - Tuesday, January 30, 2024 - link

    If you're that scared of power use, buy a celeron mini PC and be quiet.
  • PeachNCream - Tuesday, January 30, 2024 - link

    Some of us have to care because it's obvious a lot of us don't and have our heads buried in the sand.
  • TheinsanegamerN - Wednesday, January 31, 2024 - link

    Why are you using a PC if you care? why are you not in a commune growing organic crops by hand if you care so much?

    Nobody cares about your virtue signaling.
  • PeachNCream - Wednesday, January 31, 2024 - link

    Clearly you feel threatened enough by your own lifestyle choices to care, going on the attack and suggesting some extreme alternative like this commune nonsense, as if suggesting it eliminates any slight adjustment to your own actions that could reduce the guilt you're coping with by lashing out.
  • erotomania - Wednesday, January 31, 2024 - link

    How exactly is a 65W processor with graphics "gobbling power"? If you mean inefficient, I suppose we could discuss, with facts. But these are modern Ryzen cores, with some mobile genetics - I don't think inefficient applies.

    In the past I had some Richland APUs (with FX cores) that were definitely inefficient but still idled as low as anything else. I have a 5600G system that idles so low my UPS can't detect it, even though the system is overclocked when not idle. I would not characterize either as gobbling power.
  • PeachNCream - Thursday, February 1, 2024 - link

    I've already somewhat pointed out why the consumption is a considerable factor. ~87W at full load, as indicated in AT's measurements, is enough to illuminate an entire home. That isn't a comment on efficiency (work accomplished for power expended), and I didn't frame it that way in my initial post. It's an observation about the power cost implications of a PC when a single component consumes that much energy, and that component alone is still an incomplete representation of the overall power consumption of a PC built around it.

    And, it's fair to point out that it is NOT an Intel CPU with far higher consumption. I mentioned that as well in the same post. AMD's CPUs demonstrate a better work-to-wattage ratio, so please realize that I'm aware that among all desktop CPUs, this particular chip is far from the worst possible option.
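The bulb comparison holds up arithmetically; a tiny check, assuming a typical 800-lumen LED bulb draws about 8 W (that wattage is my assumption, not a figure from the review):

```python
cpu_full_load_w = 87  # full-load package power cited from AT's measurements
led_800lm_w = 8       # assumed draw of a typical 800-lumen LED bulb

print(cpu_full_load_w / led_800lm_w)  # ≈ 10.9, i.e. about eleven bulbs
```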
  • maxijazz - Saturday, February 3, 2024 - link

    Maybe people feel threatened because proud woke people (aka communists-fascists) want to enforce their lifestyle on others? Through lobbied-out new laws, propaganda, and censorship.
    Otherwise nobody would feel threatened.
    Live your life and let others live theirs.
    If you like forcing others for the good of society, the world, or the universe, go to North Korea. They have communism as the state religion.
  • TesseractOrion - Saturday, February 3, 2024 - link

    Maybe take your meds Maxijazz if you feel so "threatened" LMAO.

    Right trash snowflakes get triggered so easily *sigh*
  • TesseractOrion - Saturday, February 3, 2024 - link

    You do, since it's triggered your usual inane response LOL
  • t.s - Tuesday, January 30, 2024 - link

    As stated in other review comments, you'll be better served by a 7840HS mini-PC. Better priced, better idle, better upper power limit, and almost as good as the 8700G (if given headroom).
  • FWhitTrampoline - Tuesday, January 30, 2024 - link

    There's no processor upgrade path for the 7840HS, as it's BGA and soldered to the motherboard! And there's third-party software for upping the TDP past 65W on the mobile and desktop Ryzen APUs!
  • nandnandnand - Tuesday, January 30, 2024 - link

    You don't need the upgrade path. The path you need to take is just use the thing for a few years and retire it for lighter duty (e.g. HTPC) or give it to a poor kid when you're done.
  • Thunder 57 - Wednesday, January 31, 2024 - link

    Tell that to everyone who upgraded CPU's on AM4.
  • meacupla - Wednesday, January 31, 2024 - link

    AM4 had a very long socket life. Long enough that AMD ran out of CPU numbers they could use in EEPROM with a single BIOS.

    AM5's life expectancy remains questionable.
  • meacupla - Tuesday, January 30, 2024 - link

    8700G costs so much that it is a terrible choice to begin with.
    For the same price, you can get an i3-12100F with RX 6600, and it'll spit out more frames.
  • Thunder 57 - Wednesday, January 31, 2024 - link

    I wouldn't recommend a 4 core CPU these days.
  • meacupla - Wednesday, January 31, 2024 - link

    Yes, but if you are budget constrained and think the 8 core 8700G is good value, you would be sorely mistaken.
  • TheinsanegamerN - Wednesday, January 31, 2024 - link

    For basic use a 4 core is fine. Plenty of games still run fine on 4 cores, and if you are using the iGPU your CPU cores are not going to be your limiting factor.
  • FatFlatulentGit - Wednesday, January 31, 2024 - link

    That setup would also eat over 2x as much power. The 8000G line is for people who want something with some expansion options and decent performance, but also want to keep it lower power/heat/noise, among other things. I'm eyeballing one for an HTPC upgrade.
  • t.s - Tuesday, January 30, 2024 - link

    The problem with the 8700G is the price. Too pricey. I'll take a socketed CPU over soldered anytime if the price is right ($200-$250). Because when you go the 7840HS route, you can get a complete package for just $490 (16GB 5600MHz, 512GB SSD, BT 5.2 + WiFi 6), or about $390 with no RAM and no SSD.
  • GeoffreyA - Tuesday, January 30, 2024 - link

    Quite a nice boost in performance over the 5000G APUs, with only a slight increase of power. Though I suppose for those already on AM4 APUs, it may be better to wait for the Zen 5 ones, and get an even bigger boost in one shot.
  • cp0x - Tuesday, January 30, 2024 - link

    I've been reading Anandtech for decades, since it started. In all that time, this is the worst review I've ever seen. We already know these chips are budget/entry level (of the latest generation). There's only one thing that is truly interesting about these chips: the integrated graphics. And specifically, the IGP at 1080p. And while we get 4 pages full of tests of a discrete GPU with this CPU, we get a total of one page with the IGP at 1080p, and it contains only one chart comparing this chip with other options (e.g. Intel), and that one chart is of ... drum roll ... Civ6 🤦‍♂️ WTF!?!?!?!

    We do get a few more graphs of 720p gaming. Why?!? WHEN THE REVIEW ALREADY EXPLAINED THAT THE STEAM STATS SHOW MOST GAMERS PLAY AT 1080P?!?!?!?

    I'm not going to assume that this is a conspiracy theory. After all: "Never attribute to malice that which is adequately explained by stupidity."

    Come on. Do better. This article should have been centered around and anchored on the 1080p results with this chip's IGP. It should have compared with more than just AMD's previous gen of this chip; it should have compared with Intel's offerings in this space. The questions the reader has are: What $$$ video card does this thing save me from buying? Will this play my game? How does it compare with other CPUs and their IGPs?

    End rant. Do better.
  • cp0x - Tuesday, January 30, 2024 - link

    And of course (as other commenters have pointed out): What effect does slower or faster DDR5 make on the 1080p results? (Especially since faster DDR5 was actually being used in the Intel system.)
  • FWhitTrampoline - Tuesday, January 30, 2024 - link

    The tech press in general ignores iGPUs and small form factor systems, save ETA Prime and his YouTube channel. But ETA Prime is mostly gaming-benchmark focused, so any Blender Cycles iGPU-accelerated testing is ignored there too. AMD tends to focus only on games development for its APUs, as the console makers using AMD's APUs are more responsible for games performance on them than AMD is.

    And really, the gaming hardware review sites are focused on higher-end desktop processors and dGPUs, while very small form factor builds (the InWin Chopin and ASRock X300/other DeskMini SKUs, where socket-packaged processors with powerful iGPUs make sense) are ignored!

    Both the ASRock X300 DeskMini and the InWin Chopin are too tiny to accommodate any dGPU, so the Ryzen 5000G was popular in those builds for its iGPU. There will be an ASRock X600/AM5 DeskMini in essentially the same case form factor as the X300/AM4 DeskMini. The InWin Chopin barely accommodates a Mini-ITX motherboard and lacks room for any dGPU in the x16 slot, without some case modding!
  • Bruzzone - Tuesday, January 30, 2024 - link

    1080 gaming, well, the real application for gamers running greater than a 6000 MHz memory bus is high frequency trading. mb
  • nandnandnand - Tuesday, January 30, 2024 - link

    I think the worst crime in the review was using DDR5-5200.
  • Thunder 57 - Wednesday, January 31, 2024 - link

    AT isn't the same since Dr. Ian Cutress left.
  • GeoffreyA - Wednesday, January 31, 2024 - link

    On another point, I still think Anand's CPU reviews were better than Ian's. Ian was good but tended to get lost in the details, whereas Anand had an abstraction to his writing, and made computers inspiring, as if you were reading a story. Even the titles were memorable.
  • TheinsanegamerN - Wednesday, January 31, 2024 - link

    If you go by Steam, there are nearly as many sub-1080p gamers as there are 1080p gamers. And those sub-1080p users would love the iGPU.

    If you don't like it, feel free not to read it. Or do your own review if you are so much smarter.
  • FWhitTrampoline - Tuesday, January 30, 2024 - link

    Gamers Nexus has found an issue with the Ryzen 8000G APUs: some firmware settings from the mobile 7000/Phoenix variants that should have been removed for the desktop Ryzen 8000G derivatives. It has to do with laptop/handheld skin temperature regulation; to account for it, the desktop Ryzen 8000G SKUs are being throttled as if they were in a mobile laptop/handheld form factor device! So watch GN's latest video from today; more re-testing may be in order!
  • nandnandnand - Tuesday, January 30, 2024 - link

    It's Skin Temperature Aware Power Management (STAPM).
  • Kinematics - Tuesday, January 30, 2024 - link

    The 88 W peak draw is about 25 W above the 65 W TDP, not 35.
  • mikato - Tuesday, January 30, 2024 - link

    Are we not doing idle power consumption any more?
  • t.s - Tuesday, January 30, 2024 - link

    That depends significantly on the PSU and motherboard you're using. Anything with an x70 chipset and a 24-pin ATX PSU will boost the idle power versus an x20 chipset with 12VO. But yes, I'm curious about their setup's idle power too. My guess: around 50W, with the 24-pin ATX 1000W PSU and 6950 GPU.
  • mannen - Tuesday, January 30, 2024 - link

    You missed comparing these against the Ryzen 5 7600 and Ryzen 7 7700 CPUs.
    That would have been very interesting, since they are equivalent (but desktop) parts with the same number of cores/threads. Comparing to the Ryzen 9 makes no sense since it's not in the same budget class.

    I for example am interested in the 8600G, but also looking at the 7600 and keeping my old GPU. Since I don't game a lot, I can easily sacrifice the old GPU but I want to know how much CPU performance I'm giving up.
  • TheinsanegamerN - Wednesday, January 31, 2024 - link

    Well the other commenters here are throwing huge temper tantrums over CPU benchmarks being included in this review, so you may be better off looking elsewhere.
  • lorribot - Tuesday, January 30, 2024 - link

    Would have been nice to see a comparison with discrete cards like an RTX 2060/3060.
  • nandnandnand - Tuesday, January 30, 2024 - link

    I could have sworn that was in here, but I was probably thinking of other reviews.

    Short story long, AnandTech is dying and you should go to other sites for reviews now. And the 780M iGPU in the 8700G is not going to do well against a 2060/3060. It seems to be around a GTX 1650 in performance.
  • is4u2p - Wednesday, January 31, 2024 - link

    Uh, the M1 has a dedicated NPU in it, and it is a desktop processor.

    As for these, they're rebranded Z1 and Z1 Extreme APUs; you can get them in handheld gaming machines.
  • Stu7nm3dflash - Thursday, February 1, 2024 - link

    Certainly Apple has a lot of machine learning built in. This iPad Mini 6 has 16 TFLOPS of ML performance, more than the M1, but little access to large amounts of short-term memory. My Ryzen 5 8600G also only has 16 TFLOPS, but I've given it 64 GB of DDR5 and PCIe 4.0. Fingers crossed for my creative work. ARM is more power efficient; the M1/2/3, Pro, Max, Ultra, iPhone, Apple TV, and iPads all have ML, plus unified memory and PCIe 4.0. But the amount of memory is also important: 64 GB of DDR5 only cost me $A200, PCIe 4.0 storage $A100, motherboard $A125, chip $A375. My Mac mini M1 only has 8 GB of unified memory; it crashed immediately under an AI model load.
  • Stu7nm3dflash - Thursday, February 1, 2024 - link

    My use case is a bit different: turning 2 stories, 11 pages, into 80 pages. At first, online, I got 3 paragraphs. Then on an 8-core 4000-series Ryzen 7 with 32 GB of DDR4 and PCIe 3.0, I got 4 pages. Now, with 64 GB of DDR5, PCIe 4.0, and a Ryzen 5 8600G with 16 TFLOPS of AI performance, I'm hoping for more pages before it crashes. That's double the short-term memory at double the speed, double-speed long-term memory, a specific AI architecture, and 4 nm with nearly 3 times the transistor density of 7 nm. Last time I used Jan.ai plus a model; easiest build yet. Fingers crossed: more specific processing and 4 times the short-term memory power. I hope I'm getting closer; memory, processing, and software continue to advance.
  • GeoffreyA - Thursday, February 1, 2024 - link

    Perhaps you'd get better results with a GPU upgrade?
  • peevee - Thursday, February 1, 2024 - link

    Too bad 8700G is hamstrung by only 65W and PCIe4 (and only 2 RAM channels).

    Maybe their graphics department insisted on these things to preserve sales? APUs would be so nice at 4 DDR5 channels and 200W...
  • meacupla - Friday, February 2, 2024 - link

    Socket AM5, with its 1718 pins, doesn't have enough pins for quad-channel DDR5.
    While I don't think it would require 4844 pins like sTR5 for Threadripper, it would require a new socket.
  • vertigoz - Saturday, February 3, 2024 - link

    I would love to see AI/3D benchmarks using the iGPU; being able to allocate much more RAM could be a major plus.
  • blackie333 - Wednesday, February 7, 2024 - link

    I really care more about IDLE power consumption than maximum power, because I use my PC mostly for reading and listening to music. Gaming or crunching videos is not my daily routine.
    These integrated AMD CPUs have been chosen by many because of their much better IDLE power efficiency compared to normal desktop models.
    I haven't found an idle power consumption comparison with older-generation models in the article.
    I'm getting old and my eyes don't serve me as well as before; maybe it's there somewhere, but I can't find it.
  • masb - Friday, February 9, 2024 - link

    Excellent comparison, especially regarding the OpenFOAM topic. Where can I access the complete specifications for the Intel Core i5-14600K system?
  • Violet Giraffe - Monday, March 11, 2024 - link

    The article doesn't list iGPU specs?
