Aardvark Daily

New Zealand's longest-running online daily news and commentary publication, now in its 25th year. The opinion pieces presented here are not purported to be fact but reasonable effort is made to ensure accuracy.

Content copyright © 1995 - 2019 to Bruce Simpson (aka Aardvark), the logo was kindly created for Aardvark Daily by the folks at aardvark.co.uk




We all get four transistors!

11 September 2020

GPUs (that's graphics cards to the great unwashed) are a key component in any modern high-performance computer system.

Although they're primarily used for creating the display we see on our screens, GPUs are also often used for other computationally intensive operations and are increasingly playing a part in artificial intelligence.

The earliest GPUs were very humble devices and perhaps the two most memorable ones were the IBM CGA card and the Hercules monochrome adaptor.

Actually, these weren't really GPUs in the modern sense, since they had no processors, simply some dedicated logic to provide access to memory that was effectively mapped onto the coordinates of a screen.

Today's GPUs are indeed a completely different beast and may actually be many times more powerful than the CPU we think of as the core of every computer.

Of late there have only really been two players in the GPU market: NVIDIA and AMD.

There was a time when these companies fought head-to-head in the battle for the desktop market, a competition reflected in some quite aggressive pricing.

Then came the crypto-mining craze and GPU prices went through the roof due to demand. It seems that the super-powerful number-crunching capabilities of these cards allowed for much quicker crypto-mining than did a regular old CPU and pretty soon high-end cards were changing hands for much more than their recommended retail price.

Thanks to their superior raw power and better efficiency, NVIDIA's cards were most popular and prized in this application. Indeed, the company even began to produce GPUs specifically targeted at cryptominers rather than game-players or regular PC users.

Then the crash came... and the focus returned to gaming, video-editing and such.

NVIDIA brought out their 2000 series cards and added some fluff in the form of ray-tracing cores and tensor cores. Although they promised much, they delivered far less than most people expected. The ray-tracing was pretty weak and hugely impacted gaming frame-rates, whilst few games seem to make much (if any) use of the tensor cores.

Meanwhile, AMD brought out its new Radeon RX series of cards with "RDNA" -- the special sauce that was going to improve efficiency, increase speed and add extra spice to the mix.

Well, the RX 5700 and RX 5600 are adequate cards, but early drivers were pretty crappy.

Sadly, they simply didn't really bring much-needed competition to a market that was rapidly being "pwnd" by NVIDIA. Without any need to sharpen its prices, a top-line NVIDIA GPU such as the RTX 2080 Ti actually cost more than the entire rest of a computer put together.

But oh my, how things have changed in the last week or two!

AMD has been promising its next generation of GPUs using RDNA2 (even hotter and more spicy). They promised to reset the price of GPUs by delivering better performance than NVIDIA for less money -- and who wouldn't like that?

Not to be outdone, NVIDIA has just announced its new 3000 series of GPUs and, in doing so, they have made a world of gamers and video editors very, very happy.

Perhaps in anticipation of AMD's pending announcements and almost certainly with a goal of completely destroying any threat they may pose, NVIDIA has rolled out some super-powerful cards at incredibly low prices. In fact, they've virtually halved the cost per TFLOP of GPU cards overnight.
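To put a rough number on that cost-per-TFLOP claim, here's a back-of-the-envelope sketch. The launch prices and FP32 throughput figures below are my own approximations, not figures from this column, so treat them as illustrative only:

```python
# Rough cost-per-TFLOP comparison between generations.
# Prices and TFLOPS figures are approximate, assumed values:
#   RTX 2080 Ti: ~US$999 launch price, ~13.4 FP32 TFLOPS
#   RTX 3080:    ~US$699 launch price, ~29.8 FP32 TFLOPS
cards = {
    "RTX 2080 Ti": (999, 13.4),
    "RTX 3080": (699, 29.8),
}

for name, (price_usd, tflops) in cards.items():
    # Dollars of purchase price per TFLOP of raw FP32 throughput
    print(f"{name}: ~${price_usd / tflops:.0f} per TFLOP")
```

On those assumed figures the cost per TFLOP drops from roughly $75 to roughly $23, which is even better than a straight halving.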

Now I'm not a gamer -- but I am a video content creator so even I am excited by this.

Get ready to be blown away... the top-of-the-line GPU, the RTX 3090, has an astronomical 28 billion transistors on its main die. Are you shirting me?

TWENTY EIGHT BILLION transistors on a single chip?

That's nearly four transistors for every person on planet Earth... in EACH GPU!

Compare this to the average high-end desktop CPU, which has just 7 or 8 billion, or even the AMD Epyc enterprise CPUs with their amazing 19 billion, and you'll see why this is a big deal.

Of course, all of this is just peanuts compared to the world's biggest processor chip, the Cerebras WSE, which has an astonishing 1.2 TRILLION transistors.

Wow... and I remember the first time I held a transistor in my hand (an OC71 in black glass encapsulation, three thin wires hanging out the bottom, a red dot of paint signifying which was the collector) and marvelled at how much smaller it was than the valves it was destined to replace.

At that time I was probably one of a very small percentage of people who'd ever held a "real" transistor in their hands, so there were far more people than transistors. Now every top-end GPU will have nearly four times as many transistors as the entire earth's population.

So much progress in one short lifetime!


Have your say in the Aardvark Forums.
