Aardvark Daily
New Zealand's longest-running online daily news and commentary publication, now in its 25th year. The opinion pieces presented here are not purported to be fact, but reasonable effort is made to ensure accuracy.
Content copyright © 1995 - 2019 to Bruce Simpson (aka Aardvark), the logo was kindly created for Aardvark Daily by the folks at aardvark.co.uk
I'm old and cynical; it takes a lot to impress me these days.
However, AMD have impressed me with their new range of Threadripper CPUs.
The 24 and 32-core parts seem to have set an impressive new benchmark for high-end desktop systems and effectively blown away any price-performance benefits that Intel may have had left in this sector of the market.
Who'd have thought, back in the 1970s, when I was enjoying my first hands-on experience with an 8-bit micro running at a scorching 1MHz clock speed, that half a century later we'd have bumped clocks by well over three orders of magnitude, hiked data-bus widths by a factor of eight and actually managed to squeeze 32 processors into a single package?
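As a back-of-envelope check on that claim (assuming roughly 1MHz then and roughly 4GHz now -- both figures are ballpark, not measured):

```python
import math

# Rough orders-of-magnitude comparison of clock speeds,
# late-1970s 8-bit micro (~1 MHz) vs a modern Threadripper (~4 GHz).
old_hz = 1e6   # assumed 1970s clock
new_hz = 4e9   # assumed modern boost clock

print(math.log10(new_hz / old_hz))  # about 3.6 orders of magnitude
```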
I'd love to have seen my face if someone had dumped a Threadripper-based computer in my lap way back then!
The only negative aspect to the new AMD parts is the pricing.
Two grand US for a desktop CPU is pretty steep -- but then again, it's not just a desktop CPU, is it? It's more than 32 desktop CPUs by 1990s standards.
One really has to wonder where Intel will go from here.
Aside from a few niche markets, the Intel products just don't offer the same bang per buck or even the same raw performance (at any price).
Previously, Intel could at least claim that their cores were clocked faster and therefore outperformed AMD's offerings on tasks not well suited to multi-core/multi-threaded architectures. Sadly for Intel, even this advantage has been nuked by Threadripper.
Due to massive levels of internal caching, the Threadripper parts ticking along at a little over 4GHz now effectively outrun Intel's 5GHz parts.
I doubt there was much in the way of champagne corks popping at Intel when the first Threadripper benchmarks were published this week.
The big problem for Intel is that coming up with newer, faster, more cost-effective CPUs is not something that can be done overnight, regardless of how much money you throw at the job. AMD seems to have nailed the issues associated with producing parts using the 7nm process, while Intel are still struggling to get beyond 10nm -- and this leaves them very much on the back foot.
Of course the question, as always, is where to from here, in terms of CPU evolution?
We can't just keep on adding cores (or can we?) because we soon run into the law of diminishing returns. Even the most complex problems can only be broken down into so many parallel processes. As Fred Brooks put it in that wonderful book The Mythical Man-Month, no matter how many women you put on the job, it still takes nine months to have a baby.
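That diminishing-returns effect is captured by Amdahl's law: if only a fraction of a job can run in parallel, the serial remainder caps the overall speedup no matter how many cores you throw at it. A minimal sketch (the 90% parallel fraction here is an illustrative assumption, not a measurement of any real workload):

```python
def speedup(parallel_fraction: float, cores: int) -> float:
    """Amdahl's law: overall speedup on `cores` cores when only
    `parallel_fraction` of the work can be parallelised."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# Even with 90% of the work parallelisable, speedup flattens fast.
for n in (2, 8, 32, 1024):
    print(f"{n:>5} cores: {speedup(0.90, n):.2f}x")
```

With a 90% parallel fraction, 32 cores deliver under 8x, and no number of cores can ever exceed 10x -- which is why piling on cores alone stops paying off.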
The laws of physics have also brought the increase in clock speeds to a grinding halt. 5GHz seems to be it, at least for the foreseeable future.
Even creating wider data paths by way of a shift to 128-bit processors won't produce much in the way of an improvement -- given that much of the data being handled still consists of 8 or 16-bit values (such as characters).
Could we have hit the effective limits of traditional computing architectures?
Might the future consist of hugely parallel systems that function as neural networks running AI algorithms?
Instead of programming the next generation of computers, might we just define the problem and leave them to "learn" the best way to produce the solution, all by themselves?
The laws of physics are immutable and CPU designers are now banging their heads against those laws with monotonous regularity. Could the next giant step forward be a total rethink of the entire architecture and paradigms being used?
You tell me... what will a "computer" look like in another 20 years? Will the von Neumann architecture be just a distant memory?
Have your say in the Aardvark Forums.