Frequency, Temperature, and Power

Many questions will be asked about the frequency, temperature, and power of this chip: splitting 280 W across this many cores might result in a low all-core frequency and require a very high current draw, and there have also been recent reports of AMD CPUs not meeting their rated turbo frequencies. We wanted to put our data right here in the front half of the review to address these concerns straight away.

We kept this test simple – we used our new NAMD benchmark, a molecular dynamics compute solver, which is an example workload for a system with this many cores. It’s a heavy all-core load that continually cycles around the ApoA1 test, simulating as many picoseconds of molecular movement as possible. We ran a frequency and thermal logger, left the system idle for 30 seconds to reach an idle steady state, and then fired up the benchmark until it too reached a steady state.

For the frequencies we saw an ‘idle’ of ~3600 MHz, which spiked to 4167 MHz when the test began, and averaged 3463 MHz across all cores over the first six minutes or so of the test. We saw a frequency low point of 2935 MHz; however, in this context it’s the average that matters.

For thermals on the same benchmark, using our Thermaltake Riing 360 closed-loop liquid cooler, we saw 35ºC reported on the CPU at idle, which rose to 64ºC after 90 seconds or so and reached a steady state of 68ºC after five minutes. This is a near-ideal scenario, as the system was on an open test bed, but the thing to note here is that despite the high overall power of the CPU, the power per core is not that high.


This is our usual test suite for per-core power, although I’ve condensed it horizontally as having all 64 cores is a bit much. At low loads, we’re seeing the first few cores take 8-10 W of power each at 4.35 GHz; at the other end of the scale, each core is barely touching 3.0 W at 3.45 GHz. At this end of the spectrum we’re definitely seeing AMD’s Zen 2 cores operate at a very efficient point, and that’s without even using all 280 W, given that around 80-90 W is required for the chipset and inter-chip Infinity Fabric: all 64 cores, running at almost 3.5 GHz, for around 200 W. From this data, at least 20 cores need to be active in order to hit the full 280 W of the processor.

We can compare these values to other AMD Threadripper processors, as well as the high-end Ryzens:

AMD Power/Frequency Comparison

CPU     Cores   TDP     1-Core Power / Freq      Full-Load Power per Core / Freq
3990X   64      280 W   10.4 W / 4350 MHz        3.0 W / 3450 MHz
3970X   32      280 W   13.0 W / 4310 MHz        7.0 W / 3810 MHz
3960X   24      280 W   13.5 W / 4400 MHz        8.6 W / 3950 MHz
3950X   16      105 W   18.3 W / 4450 MHz        7.1 W / 3885 MHz

The 3990X exhibits a much lower power-per-core value than any of the other CPUs, which means a lower per-core frequency, but the frequency isn’t all that far off: less than half the power for only 400 MHz less. This is where the real efficiency of these CPUs comes into play.

Comments

  • GreenReaper - Saturday, February 8, 2020

    64 sockets, 64 cores, 64 threads per CPU - x64 was never intended to surmount these limits. Heck, affinity groups were only introduced in Windows XP and Server 2003.

    Unfortunately they hardcoded the 64-CPU limit by using a DWORD, and had to add Processor Groups as a hack in Win7/2008 R2 for the sake of a stable kernel API.

    Linux's sched_setaffinity() had the foresight to use a length parameter and a pointer:

    I compile my kernels to support a specific number of CPUs, as there are costs to supporting more, albeit relatively small ones (it assumes that you might hot-add them).
  • Gonemad - Friday, February 7, 2020

    Seeing a $4k processor club a $20k processor to death and take its lunch (in more than one metric) is priceless.

    If you know what you need, you can save 15 to 16 grand building an AMD machine, and that's incredible.

    It shows how greedy and lazy Intel has become.

    It may not be the best chip for, say, a gaming machine, but it can beat a 20-grand Intel setup, and that alone ensures a spot for the chip.
  • Khenglish - Friday, February 7, 2020

    I doubt that really anyone would practically want to do this, but in Windows 10 if you disable the GPU driver, games and benchmarks will be fully CPU software rendered. I'm curious how this 64 core beast performs as a GPU!
  • Hulk - Friday, February 7, 2020

    Not very well. Modern GPUs have thousands of specialized processors.
  • Kevin G - Friday, February 7, 2020

    The shaders themselves are remarkably programmable. The only real capability separating them from more traditional CPUs is how they handle interrupts for IO; otherwise they'd be functionally complete. Granted, the per-thread performance would be abysmal compared to modern CPUs, which are fully pipelined, OoO monsters. One other difference is that since GPU tasks are embarrassingly parallel by nature, these shaders have hardware thread management to quickly switch between them and partition resources to achieve fairly high utilization rates.

    The real specialization is in the fixed-function units: the TMUs and ROPs.
  • willis936 - Friday, February 7, 2020

    Will they really? I don’t think graphics APIs fall back on software rendering for most essential features.
  • hansmuff - Friday, February 7, 2020

    That is incorrect. Software rendering is never done by Windows just because you don't have rendering hardware. Games no longer come with software renderers like they used to many, many moons ago.
  • Khenglish - Friday, February 7, 2020

    I love how everyone had to jump in and say I was wrong without spending 30 seconds to disable their GPU driver, try it themselves, and find out that they're wrong.

    There are a lot of issues with the Win10 software renderer (full-screen mode is mostly broken, and only DX11 seems supported), but it does work. My Ivy Bridge gets fully loaded at 70W+ just to pull off 7 fps at 640x480 in Unigine Heaven, but it is something you can do.
  • extide - Friday, February 7, 2020

    No -- the Windows UI will drop back to software mode but games have not included software renderers for ~two decades.
  • FunBunny2 - Friday, February 7, 2020

    " games have not included software renderers for ~two decades."

    which is a deja vu experience: in the beginning DOS was a nice, benign control program. Then Lotus discovered that the only way to run 1-2-3 faster than molasses uphill in winter was to fiddle with the hardware directly, which DOS was happy to let it do. It didn't take long for the evil folks to discover that they could too, and the virus was born. One has to wonder how much exposure this latest GPU hardware presents.
