Aardvark Daily

New Zealand's longest-running online daily news and commentary publication, now in its 25th year. The opinion pieces presented here are not purported to be fact but reasonable effort is made to ensure accuracy.

Content copyright © 1995 - 2019 to Bruce Simpson (aka Aardvark), the logo was kindly created for Aardvark Daily by the folks at


Cores vs clock speed

7 November 2018

There's a very real battle going on between Intel and AMD.

For a long time, Intel has been "the" CPU manufacturer, with AMD running a distant second. However, the arrival of the Ryzen family of CPUs from AMD has changed all that and now people considering a new PC really need to do their homework if they want to get the best computer for their applications.

Time was when making a faster computer simply required throwing the word "turbo" in the brochure and bumping up the clock speed.

Who fondly recalls the days of the original IBM PC, when the 4.77MHz clock of the "true blue" was quickly gazumped by cheeky clone-makers running their 8088 chips at 8MHz? Those clones blew the original IBM, with its leisurely clock speed, right out of the water.

Today, however, things are not quite so simple. Whilst clock speed remains important for some tasks, other aspects of CPU design matter more in other applications, and that's why the Intel vs AMD battle has gathered new pace of late.

You see, modern software tends to be quite complex and operating systems are designed with multi-tasking at their core. In order to multi-task, the computer effectively shares the CPU (and other) resources between many programs or processes. If your CPU has only a single core then this sharing is done solely on a temporal basis -- each process gets a slice of time in turn. If, however, you have multiple cores, several processes can genuinely run at the same time and things can go a lot faster.

So there are two simple ways to boost the performance of a CPU... bump up its clock speed (ie: make it process instructions more quickly), or provide multiple cores within that CPU which can operate concurrently so as to allow several programs or processes to run at once.
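The difference between temporal sharing and true concurrency is easy to sketch in code. Here's a minimal Python illustration (the function names and workload are invented for this sketch, and it's a toy, not a benchmark): the same CPU-bound jobs run one after another on a single process, or spread across worker processes that can each occupy their own core.

```python
# Toy illustration: temporal sharing vs multiple cores.
# The workload (sum of squares) is a stand-in for any CPU-bound task.
from concurrent.futures import ProcessPoolExecutor

def busy_work(n):
    """A deliberately CPU-bound task: sum of squares below n."""
    return sum(i * i for i in range(n))

def run_sequential(jobs):
    # One process: jobs share the CPU temporally, one after another.
    return [busy_work(n) for n in jobs]

def run_parallel(jobs, workers=4):
    # Several worker processes: with multiple cores, jobs run at once.
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(busy_work, jobs))

if __name__ == "__main__":
    jobs = [200_000] * 8
    # Same answers either way; the parallel run just finishes sooner
    # on a multi-core machine.
    assert run_sequential(jobs) == run_parallel(jobs)
```

On a machine with plenty of cores the parallel version finishes in roughly `1/workers` of the sequential time, which is exactly the appeal of AMD's many-core approach.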

But which is the best way to go?

Well, Intel seems to be strongly in the camp of "just turn the damned wick up" and has worked very hard to produce processors that will run at speeds of up to 5GHz.

On the other hand, AMD has placed an emphasis on fitting as many cores as it can into each CPU package.

The current top-line desktop CPUs from Intel have between 6 and 8 cores (the 8700 and 9900 families respectively). By comparison, AMD's Ryzen Threadripper family can deliver as many as 32 cores in a desktop CPU.

That is a *significant* difference in favour of AMD.

However, the Ryzen family doesn't reliably support anything like a 5GHz clock, so these chips aren't always the fastest kids on the block.

The problem is that some applications (and even Windows 10 itself) aren't particularly well suited to harnessing the power of a large number of cores, so the potential of a 32-core processor often isn't realised in practice.

It is very interesting to compare the performance of the 8-core Intel 9900K with the 12-core AMD Ryzen Threadripper 2920X (there are plenty of videos and sites comparing the two CPUs). At first glance you could be forgiven for thinking that the 50% greater core count of the AMD part would more than make up for the 20% faster clock speed on Intel's parts and indeed, sometimes this is the case. However, there are still a good many applications where the raw clock speed of the Intel CPUs allows them to outperform the more core-laden AMD part.

The reality is that a good many applications just don't lend themselves to parallel processing via multiple cores and right now, games are a good example of this. Most of the popular PC-based games run better on Intel than on AMD -- although I think you'd have to be a real hard-core gamer to really notice the difference of a few percent in the frames-per-second metric.
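There's a well-known back-of-envelope formula, Amdahl's law, that explains why extra cores only pay off when most of the work can actually be parallelised. A quick sketch (the clock speeds and "parallel fraction" figures below are illustrative assumptions, not measured numbers):

```python
# Amdahl's law: speedup on n cores when a fraction p of the work
# can be parallelised is 1 / ((1 - p) + p / n).

def speedup(cores, parallel_fraction):
    """Amdahl's-law speedup relative to a single core."""
    p = parallel_fraction
    return 1.0 / ((1.0 - p) + p / cores)

def relative_throughput(cores, clock_ghz, parallel_fraction):
    """Crude figure of merit: clock speed times Amdahl speedup."""
    return clock_ghz * speedup(cores, parallel_fraction)

# A mostly-serial, game-like workload (say only 30% parallel):
# the faster clock wins despite having fewer cores.
intel = relative_throughput(cores=8, clock_ghz=5.0, parallel_fraction=0.3)
amd = relative_throughput(cores=12, clock_ghz=4.3, parallel_fraction=0.3)
assert intel > amd

# A highly parallel, renderer-like workload (say 95% parallel):
# the extra cores claw back the clock-speed deficit and then some.
intel = relative_throughput(cores=8, clock_ghz=5.0, parallel_fraction=0.95)
amd = relative_throughput(cores=12, clock_ghz=4.3, parallel_fraction=0.95)
assert amd > intel
```

In other words, whether cores or clocks win depends almost entirely on how parallelisable the workload is -- which is exactly the pattern the benchmarks show.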

The tables turn though, when you throw tasks such as 3D rendering into the mix.

Raytracing and other 3D rendering functions can be massively accelerated by the parallelism offered by a higher core-count. It's easy to break up a scene into chunks that can all be processed in parallel so that's generally what these programs do. Programs like Blender really get excited (and fast) when you throw lots of cores at them.
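The "break the scene into chunks" idea is simple enough to sketch. Here's a minimal Python illustration (the `shade()` function is an invented stand-in for real raytracing maths): the image is cut into horizontal bands, each band is rendered independently in a worker process, and the results are stitched back together.

```python
# Minimal sketch of tile/band-based parallel rendering.
from concurrent.futures import ProcessPoolExecutor

WIDTH, HEIGHT = 64, 48

def shade(x, y):
    # Placeholder for per-pixel raytracing; any pure function works.
    return (x * 31 + y * 17) % 256

def render_band(rows):
    """Render one horizontal band; bands don't depend on each other."""
    y0, y1 = rows
    return [[shade(x, y) for x in range(WIDTH)] for y in range(y0, y1)]

def render(workers=4, band_height=12):
    # Split the image into bands, render them concurrently...
    bands = [(y, min(y + band_height, HEIGHT))
             for y in range(0, HEIGHT, band_height)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        pieces = pool.map(render_band, bands)
    # ...then stitch the bands back together, top to bottom.
    return [row for piece in pieces for row in piece]

if __name__ == "__main__":
    image = render()
    assert len(image) == HEIGHT and len(image[0]) == WIDTH
```

Because each band is independent, throwing more cores at the job scales almost linearly -- which is why renderers like Blender love high core-count CPUs.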

Video rendering is a little bit of a different (and more complex) story.

Traditionally, rendering a video into a format such as H264 or H265 has been incredibly CPU-intensive and thus required a super-grunty CPU when time is money. The advent of powerful GPUs has changed this somewhat and these days, most video editing software will use the GPU to perform that rendering, placing far less load on the system's CPU.

However, this is an area where Intel's CPUs have another clever advantage. They include dedicated video encode/decode hardware (separate from the standard cores) which can be recruited to handle H264/H265 work. My own video editing computer has an i7 8700 processor with this QSV (Quick Sync Video) capability. I marvel that it can render video files at or faster than realtime rates whilst the main CPU idles along at about 10% utilisation. Bloody marvellous.

This offloading of a very processor-intensive task onto a dedicated core also means that while rendering video, the machine remains fast and responsive -- allowing other tasks to be performed (such as creating some VFX or whatever) without impacting the render speed.

No comparison of cores vs clocks would be complete without mention of operating systems.

It has been widely reported that Windows 10 doesn't do a very good job of working with high core-count CPUs such as the AMD Threadripper series. This is borne out by tests running the same apps on Linux, where that OS's superior multi-core performance really shows up. I guess Microsoft will eventually update Windows to provide better support but in the meantime, if you're going with a high core-count CPU, you'd be well advised to make the most of that power by using Linux rather than Windows, when possible. Install Linux and run Windows in a VM if you have to.

One down-side to all this go-fastery (whether by cores or clock) is power consumption.

The top-line AMD Ryzen Threadripper (with 32 cores) sucks a gobsmacking 250W (before overclocking) and that's an enormous amount of energy. Getting rid of the heat this produces all but demands a water-cooling solution -- or hearing protection if you're using an air-based cooling system. Even the Intel part gets pretty toasty when driven hard.

And finally -- it's bang for buck time.

Almost everyone agrees that the AMD Ryzen range of processors may not always deliver the best ultimate performance, but they do deliver unbeatable "bang per buck". An AMD-based solution seems to come in quite a lot cheaper than the equivalent Intel-based one so unless you really need those higher clock speeds and/or that dedicated video decoding/encoding hardware, AMD looks like it's going to be able to pull out a lead in the long-running war with Intel.

Of course for the vast masses of "users", it doesn't make the slightest bit of difference which CPU is in their computer. They just want to turn it on, play a few games, update their Facebook page and watch a little porn. They care not whether it's done with fast clocks or a surfeit of cores. More important to them is whether they can buy with affordable finance or whether the colour of the box matches the drapes.

However, as "geeks" *we* would like to think that we choose our next system with a little more care, knowledge and understanding.

But wait... that one looks shiny doesn't it.... :-)


Have your say in the Aardvark Forums.
