Aardvark Daily
New Zealand's longest-running online daily news and commentary publication, now in its 24th year. The opinion pieces presented here are not purported to be fact but reasonable effort is made to ensure accuracy.
Content copyright © 1995 - 2019 to Bruce Simpson (aka Aardvark), the logo was kindly created for Aardvark Daily by the folks at aardvark.co.uk
I ran a CPU benchmark on my video rendering machine last night and was gobsmacked at how much slower it is than "state of the art" silicon.
Right now that's not too much of a problem because I'm only churning out a video every day or two, so 1/5th real-time rendering speed is acceptable. However, I do have plans to significantly increase my video output -- in total minutes rendered, if not in the number of videos released.
Perhaps the biggest problem with the old piece of iron is that it gets pretty hot when chugging its way through large renders -- especially in the heat of mid-summer. The fan roars away and I can watch the CPU temperature soar from an idle of 39 degrees to beyond 85 degrees. At one stage it was getting so hot that it would reboot, even though the heatsink was clear of dust and everything was clean-as.
After I removed the heat-sink, cleaned off the old (dry) heat-transfer compound and replaced it with new stuff, things improved significantly and now it no longer reboots and rarely exceeds 75 degrees -- but it's still not going to shatter any speed records.
So, as mentioned in a previous column, I've been keeping an eye out for new iron to power the render-farm :-)
To be honest, the days when choosing a CPU was simple are long gone.
It used to be that all you had to do was choose the highest-numbered CPU from the family (ie: 286/386/486) and then the fastest clock-speed. Job done!
These days it's quite a bit more complex than that.
We have i3, i5, i7, i9 and within that range, different versions and families -- as well as different clock speeds, turbo rates, hyperthreading... what the?
It would be easy to say "just give me the most cores and the fastest clock speed" but that's almost certainly not going to deliver the best *value* for my limited dollars.
I did look at refurbished ex-lease computers but the i7 variants are inevitably somewhat dated and, although they have a much better CPU benchmark than my old i5, they are also quite a bit slower than the latest generation of i7s and, in some cases, even slower than newer i5 CPUs.
Then there's the issue of video performance.
Most of the old ex-lease iron simply has onboard graphics which are "okay" for your average desktop user but can struggle with things like 4K video at higher framerates.
To be honest, I really don't have the time to produce a massive matrix of cost/performance for all the ever-increasing options of CPU family, iteration, video configuration, etc.
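That said, a rough back-of-envelope version of such a matrix is easy enough to script. Here's a minimal Python sketch of the idea -- the benchmark scores and prices below are made-up placeholders purely for illustration, not real quotes or real benchmark numbers:

```python
# Rank candidate CPUs by benchmark points per dollar spent.
# All scores and prices are illustrative placeholders only.
candidates = {
    "old ex-lease i7": {"score": 8000, "price": 450},
    "current-gen i5": {"score": 13000, "price": 550},
    "current-gen i7": {"score": 17000, "price": 900},
}

def value_per_dollar(cpu):
    """Benchmark score divided by price: higher is better value."""
    spec = candidates[cpu]
    return spec["score"] / spec["price"]

# Sort best-value first and print the ranking.
ranked = sorted(candidates, key=value_per_dollar, reverse=True)
for cpu in ranked:
    print(f"{cpu}: {value_per_dollar(cpu):.1f} points per dollar")
```

With these (invented) numbers the mid-range current-generation chip wins on value, which is often how these comparisons shake out: the top-end part buys the most speed, but not the most speed per dollar.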
From that perspective it's tempting to simply pull out the credit card and buy a top-end gamer rig that will likely last me another 4 years or so -- but that's probably not going to represent best use of my limited capital (and besides which, I hate buying stuff on tick).
Another factor which shouldn't be underestimated is the issue of the recently disclosed CPU vulnerabilities. The mitigations for the Spectre and Meltdown flaws have not been wholly successful to date, with both performance and reliability impacts making the patches somewhat undesirable.
Trust me, there's little more annoying in life than having your computer spontaneously reboot two minutes from the completion of a three-hour render and, from what I've read, the patches released so far tend to reduce the reliability of the systems to which they have been applied.
Now you might think -- hey, this is a rendering machine, it doesn't even need to be connected to the internet so why worry?
Well to be honest, I'm not worried. Even though this machine is hooked up to the LAN, the only site it ever connects to is YouTube so I can firewall it to stop all other access and feel pretty safe that it's secure from Meltdown or Spectre. No, the reason for hesitating is because I think that I might be able to pick up a real bargain pretty soon.
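For what it's worth, that kind of lockdown is only a handful of firewall rules. A minimal sketch for a Linux box using iptables follows -- the choice of allowing only DNS and HTTPS outbound is my assumption (YouTube's IP ranges shift constantly, so matching by port rather than by address is the pragmatic compromise):

```shell
#!/bin/sh
# Default-deny all outbound traffic from the render box, then
# allow only what a YouTube upload needs. Assumes iptables on
# Linux; the ports chosen are an illustrative minimum.
iptables -P OUTPUT DROP                                   # drop everything by default
iptables -A OUTPUT -o lo -j ACCEPT                        # allow loopback
iptables -A OUTPUT -m state --state ESTABLISHED,RELATED -j ACCEPT
iptables -A OUTPUT -p udp --dport 53 -j ACCEPT            # DNS lookups
iptables -A OUTPUT -p tcp --dport 443 -j ACCEPT           # HTTPS (YouTube uploads)
```

A box restricted like this can still be attacked via the content it fetches, but it can't quietly exfiltrate anything to an arbitrary host.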
Why is that?
Well I'm pretty sure that when Intel, AMD or whoever releases a CPU design that has these flaws fixed in the silicon, a great many companies, government departments and individuals will be tossing their current iron to buy new "secure" machines. When that happens, the second-hand market will be awash with high-performance PCs going for much lower prices than they command today.
I'm thinking that if I wait (and with an old i5 I mean *wait*) for another 6-9 months, I'll probably be able to pick up "vulnerable" second hand high-end systems for half of what they might cost today.
Is this sound thinking?
If you had/have a top-end PC-based system operating in a critical role, would you be upgrading to the new "fixed" version of that silicon as soon as you could and would you then be flogging the old stuff for whatever you could get for it?
In short, why would anyone buy an expensive high-end PC today when, if they wait a little, they'll be able to get a "fixed" machine in a few months or, like me, pick up the same high-end machine with a current generation CPU for cents on the dollar?
Or am I just being incredibly optimistic?
Have your say in the Aardvark Forums.