AMD Llano APU Review - Slow CPU, Fast GPU
Vigile writes "Though we did see the fruits of AMD's Fusion labor in the form of the Brazos platform late in 2010, Llano is the first mainstream part to be released that combines traditional x86 CPU cores with Radeon-based SIMD arrays for a heterogeneous computing environment. The A-series of APUs reviewed over at PC Perspective starts with the A8-3850 that is a combination of a true quad-core processor and 400 shader processors similar to those found in AMD's Radeon HD 5000 series of GPUs. The good news for the first desktop APU is that the integrated graphics blows past the best Intel has to offer on the Sandy Bridge platform by a factor of 2-4x in terms of gaming. The bad news is the CPU performance: running at only 2.9 GHz the Phenom-based x86 portion often finds itself behind even the dual-core Intel Core i3-2100. On the bright side you can pick one up next month for only $135."
Slower than an i3... (Score:5, Interesting)
On Newegg that Core i3-2100 is retailing for $124; how do the graphics in the Llano stack up against the i3's graphics? Might not be such a bad deal at all.
The article (or at least the material they got from AMD) indicates that graphics is precisely where it shines, so an i3-class CPU with nearly-discrete-class graphics, at an i3 price tag, sounds quite compelling.
Re: (Score:2)
That is AMD's plan with this unit: same relative cost and performance as the i3, but a much better GPU.
Re:Slower than an i3... (Score:5, Informative)
Re: (Score:3)
The open source driver won't catch up; the open source drivers have never even come close to the closed drivers in 3D performance. They're for people who want to always run the latest kernel without worrying about incompatibility.
Re:Slower than an i3... (Score:4, Informative)
Performance is one thing, but it's not close in features or stability either. The 5850 was released in September 2009, and I still can't get HDMI audio, there's no video acceleration, OpenGL is at 2.1 (the card supports OpenGL 4.1), and last I checked it was rather easy to hang. I'm not blaming the guys who work on it, because they're few and working as hard as they can, but they're no match for the 100+ developer Catalyst team. It didn't help that in the long years when both ATI and nVidia were closed source, the graphics stack really didn't get much love. But the info is there now; all it really needs is the manpower.
Re: (Score:2)
how do the graphics in the llano stack up against the i3's graphics?
This is not only answered in TFS but even in TFT :)
And arguably your question is kind of senseless, as Intel's i3 is not a CPU/GPU combination but "only" a processor; though if you use your i3 with Intel on-board graphics, the AMD will run circles around it.
Re:Slower than an i3... (Score:4, Informative)
One of the Sandy Bridge selling points was "our integrated graphics no longer suck, and are now semi-decent". And calling the Llano a CPU/GPU combo while not doing the same for Intel is kind of pointless; both have integrated graphics, and both have it as a selling point. Since the prices are comparable, "one gives me good graphics and the other sucks" isn't a hard choice to make.
Re: (Score:3)
Since the prices are comparable, "one gives me good graphics and the other sucks" isn't a hard choice to make.
You left out part of the equation. The choice is more like:
i3-2100 - Fast CPU / Mediocre GPU
Llano - Slow CPU / Good GPU
For most non-gamers the choice will be the i3. For light gamers, HTPC, and notebooks the choice will be Llano. For more serious gamers the choice is obviously the i3 since the Llano CPU is too slow and the Llano onboard GPU isn't anywhere near good enough. These people will use higher end discrete graphics cards. I chose i3 for my gaming box for exactly that reason.
Re: (Score:3)
Re: (Score:3)
Just a nitpick here, but Sandy Bridge supports full h.264 hardware decoding up to 1080p, 3D TV support, and Bitstreaming of HD Audio formats.
Re: (Score:2)
Since the prices are comparable, "one gives me good graphics and the other sucks" isnt a hard choice to make.
The reason people have moaned about Intel's abysmal integrated performance is that it's been the low bar of the market: all those computers that weren't built to game and didn't have a discrete graphics card. It turns out a lot of people got a used computer from work, or borrowed their dad's work machine or whatnot, to game on the integrated graphics. With the Sandy Bridge graphics Intel raised that low bar quite a bit. Even if you buy a business machine you get that graphics performance for "free".
Re: (Score:2)
Intels i3 is not a CPU/GPU combination but "only" a processor
Argh, call me stupid; like you I read only half of TFS and ignored the Sandy Bridge sentence :/
Re: (Score:2)
Very stupid - you aren't supposed to read any of it.
Re: (Score:3)
Re: (Score:3)
So, if, in fact, the Llano's graphics are "2-4x better than the best Sandy Bridge has to offer" they should crush the i3's IGP like a bug, and be a better gaming part generally unless a given game is atypically CPU bound.
I suspect that AMD will have themselves a cheapskate (and/or space-constrained) hit, since their part would
Re: (Score:2)
Ah, but if you get a discrete ATI card, it looks like the integrated graphics teams up with it in some kind of bizarre Crossfire setup, so the AMD processor would be even better than the i3. Good luck setting up dual-rendering between intel integrated and an nVidia or ATI card.
Re: (Score:2)
Re: (Score:3)
I've always heard people talk about how faster cards need a faster CPU, and that if you pair a 2.2 GHz dual-core AMD with a high-end 6990 card you will end up bottlenecking it, but I've never really seen it quantified or explained. Surely the CPU isn't processing data that the GPU spits out onto the DVI port, and we are well past the days of needing the CPU to intervene on RAM and HDD requests; a lot of the point of AGP and PCIe (IIRC) is that they do not require CPU intervention to access memory-- they have a direct link
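A toy sketch of the bottleneck idea, with invented numbers: each frame needs CPU work (game logic, draw-call submission) and GPU work (rendering), and whichever side finishes last sets the frame rate, so a fast GPU behind a slow CPU just sits idle.

# Toy model (Python): per-frame time is set by whichever processor finishes last,
# since the GPU can't render a frame the CPU hasn't finished preparing.
def fps(cpu_ms_per_frame, gpu_ms_per_frame):
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

print(fps(12, 4))  # ~83 FPS: CPU-bound, so a faster GPU changes nothing
print(fps(6, 4))   # ~167 FPS: halving the CPU time doubles the frame rate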
Re: (Score:2)
Re: (Score:2)
When I play games, I generally always see the CPU spiked to 100% -- even doing something like WoW several years ago on a Core 2 Duo. Changing graphics cards definitely upped my FPS, but the CPU tends to stay pegged.
My understanding is that no matter what game it is (generally), it's going to peg the CPU and GPU as hard as it can to get as much physics and rendering done as possible, and whatever it can't get done it just skips.
As for the business scenario you mentioned, my understanding is that the CPU will take over
Re: (Score:2)
It's worth noting that WoW is a VERY bad example here; that game is a massive exception to the rule, and is bottlenecked on the CPU rather than the GPU on any system with a reasonably modern graphics card.
Re: (Score:2)
I found the same problem with my 790GX. The highest-performance card which would operate in CrossFire mode with the chipset GPU was too small to be worth buying, and lacked the ports that I wanted anyway. Having the integrated graphics is still handy though f
Re: (Score:2)
I don't think the benchmarks were very helpful, because for most people the performance of this chip should be excellent. I have a dual-core hyperthreaded Atom-based server which is very responsive and usable, but which sucks going by raw CPU benchmarks. For desktop use you don't need that much CPU power, and in fact simply having more cores is a better bet, as it improves responsiveness massively.
AMD are expecting the GPU to do a lot of the heavy processing, like video decoding; we just need more software
Re: (Score:3)
I'm watching the development of OpenCL fairly closely, because it's probably going to end up making or breaking Llano in the long run.
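For the curious, offloading data-parallel work to the GPU through OpenCL looks roughly like this. A minimal sketch assuming the third-party pyopencl Python bindings, with a vector-add kernel as a stand-in workload (nothing Llano-specific):

import numpy as np
import pyopencl as cl

ctx = cl.create_some_context()   # picks an available OpenCL device (GPU if present)
queue = cl.CommandQueue(ctx)

a = np.random.rand(1000000).astype(np.float32)
b = np.random.rand(1000000).astype(np.float32)
mf = cl.mem_flags
a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

prg = cl.Program(ctx, """
__kernel void add(__global const float *a, __global const float *b, __global float *out) {
    int i = get_global_id(0);
    out[i] = a[i] + b[i];
}
""").build()

prg.add(queue, a.shape, None, a_buf, b_buf, out_buf)  # one work-item per element
out = np.empty_like(a)
cl.enqueue_copy(queue, out, out_buf)  # copy the result back to host memory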
Pretty well sounds like (Score:2)
The i3 does not have the best graphics for Sandy Bridge; the i7 does. They say it is 2x-4x what that is. Well, that means pretty reasonable lower-midrange graphics: enough to play modern games, though probably not with all the eye candy.
That could make it worthwhile for budget systems: $135 for an all-inclusive solution rather than $124 for a CPU plus $50 on a video card.
Of course there are some downsides too, in that it is a weaker CPU and some games (Bad Company 2 and Rift come to mind) need better CPUs, and of co
Re: (Score:2)
Intel pre-emptively released an i3 with their top-of-the-line HD 3000 graphics a short while ago, so the i3 is on par with the best Intel can offer, iGPU-wise. The i3-2105, I think.
Re: (Score:2)
Even so, the benchmarks show it well ahead of an HD 3000. The chip has some reasonable graphics, equal to or in most cases better than a $50 card. That's not bad.
Re: (Score:2)
Agreed
Re:Pretty well sounds like (Score:4, Insightful)
What should be noted, and what isn't well understood, is that these "APUs" coming out from AMD are all Bobcat chips. Bobcat is a design directly targeting Intel's Atom market. The review here is for the king of the Bobcats: the high-powered variant weighing in at 100 W peak, built on the 32 nm process. The low-power Bobcats only have 80 stream processors (5.9 W, 9 W, and 18 W variants) instead of the 400 stream processors (100 W) that this thing has, and are on the 40 nm process.
All the Bobcat modules have only 2 ALUs and 2 FPUs, and only a single-channel memory controller, so it is no surprise that it has trouble competing with the i3s. What is surprising is that, nevertheless, it is competing with the i3s.
Re: (Score:2)
The review here is for the king of the Bobcats: the high-powered variant weighing in at 100 W peak, built on the 32 nm process.
All the Bobcat modules have only 2 ALUs and 2 FPUs, and only a single-channel memory controller, so it is no surprise that it has trouble competing with the i3s. What is surprising is that, nevertheless, it is competing with the i3s.
It has twice as many cores and from the numbers you give here uses about 3x as much power. I'm not too surprised that you can compete with a cheaper chip in that case.
Power Consumption (Score:2)
The problem is that these chips are not competitive with the Atom when it comes to power consumption. They are about on par with Sandy Bridge i3s in that regard, which is why everyone is comparing their performance against the i3s. There is no chance they will replace the Atom in netbooks (especially after the Atom moves to 32 nm later this year), but they will be good for low-end laptops.
Re: (Score:2)
Globally, twice as fast. Extremely memory-constrained though, so shell out for at least 1600 MHz DDR3; 1866 is best.
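Back-of-the-envelope on why that matters; the channel width and transfer rates below are standard DDR3 figures, not numbers from the review, and the IGP has to share this bandwidth with the CPU cores instead of having its own GDDR5:

# Peak theoretical DDR3 bandwidth: channels x 8 bytes/transfer x transfer rate
def ddr3_bandwidth_gbs(mt_per_s, channels=2):
    return channels * 8 * mt_per_s * 1e6 / 1e9

print(ddr3_bandwidth_gbs(1333))  # ~21.3 GB/s
print(ddr3_bandwidth_gbs(1600))  # ~25.6 GB/s
print(ddr3_bandwidth_gbs(1866))  # ~29.9 GB/s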
Slow? (Score:5, Insightful)
This new AMD product specifically targets the budget user who games occasionally. It allows entry-level gaming for the price of a very cheap CPU + GPU, at a lower TDP. It's also a better solution than a CPU plus discrete graphics, because it already gives you entry-level gaming without taking up a PCI-E slot, while at the same time allowing for asymmetrical CrossFire, so if you later add a matching discrete card you can see a benefit (in DX10 & DX11 titles).
This new APU from AMD shoots down any budget graphics Intel has to offer, while giving you more CPU power than any Atom.
At the end of the day, a Core i3 + HD 3000 costs more and has higher idle power usage.
IMO the title should read: "Brilliant new budget gaming APU from AMD!"
Re: (Score:3, Informative)
I know this article is about the desktop APUs, but as I've been running the C-50 Ontario on my netbook (Acer AO522-BZ897) for a few months now, I think I can share some real-world experience.
Overall: It's a dual-core netbook, and still gets 6 hours of battery life if I'm writing code with the brightness down, a little less if I'm listening to music. It may be slower on the individual cores than a comparable Atom, but if your program is threaded it's great. I'm very happy with the performance. It replaced a Po
Re: (Score:2)
For those who haven't seen it a thousand times, (Score:2)
Re: (Score:2)
Ladies and gentlemen, I remind you about how well-documented this sort of thing is: the wheel of reincarnation [catb.org]. Personally, I'm betting that hardware is now so disposable that we'll eventually get to having our machines in one hunk of silicon, and the wheel will stall.
Exactly. I'll bet it will be called a "Tablet".
Actually, I envision the day when all phones will have a compatible interface that allows keyboards, mice, and monitors to be hooked up to them. You take your "phone" to work, plug it in, do work. Pull it out, browse the web on your way home, and plug it into your dock at home, where you play games or whatever it is you do with your current PC. You go and visit your buddy and want to show him some new whiz-bang app you have, you plug your phone in.
Re: (Score:2)
Not quite slow (Score:3)
To say it's slow is a little ridiculous. Compared to a 286? I know this is in comparison to other modern CPUs, but any modern CPU is pretty fast.
I wonder if AMD or Intel will ever manage to develop an x86 integrated chip for handheld devices. It would be pretty interesting to have binary compatibility between desktop and handheld devices.
Re: (Score:3)
For most people CPU power is a non-issue. The truth is that most office and home PCs are very overpowered for what they are doing. Honestly, most users would see the biggest improvement in performance if they put their money into more RAM, faster storage, and a half-decent GPU rather than a faster CPU.
The APU idea really has so much merit that it just isn't funny. If AMD can get this pushed out, and if more software starts to take advantage of the GPU, you will see a big benefit. This isn't all that diff
Re: (Score:2)
Comparing power usage of an x86 to an ARM is just laughable:
http://netbooked.net/blog/arm-vs-atom-size-vs-power-vs-performance/ [netbooked.net]
Re: (Score:2)
Comparing ARM to EMULATED x86 (which is what that is today) is just as laughable.
WHOOPS. Might wanna throw your link away, quickly.
Re: (Score:2)
Re: (Score:3)
Slower than an i3, yes. But...it's waay faster than the fastest Atoms
And costs waay more and uses waay more power. Atoms are mostly being used for cheap, low power systems, and this chip fails on both counts.
Re: (Score:2)
Slower than an i3, yes. But...it's waay faster than the fastest Atoms
And costs waay more and uses waay more power. Atoms are mostly being used for cheap, low power systems, and this chip fails on both counts.
Exactly! The Llano chip alone is $130. For $100 you can get both an Atom CPU and motherboard.
Re: (Score:2)
For $100 you can get both an Atom CPU and motherboard.
I don't know about current prices, but back in 2008 I paid about $100 for my dual-core Atom motherboard, CPU and 2GB of RAM.
Faulty Testing Methodology (Score:2)
Llano (Score:2)
A6 reviews, anyone? (Score:2)
I'll be building a mini-itx system this summer, and I find the cheaper (and possibly cooler) versions of Llano more interesting. Since the GPU side of the chip is rather bandwidth-limited, I wonder whether the lower-clocked and/or lower shader count (320 instead of 400) versions of the chip might perform almost as well as the highest-end chip all the sites I've seen have tested. Anybody seen reviews of any of the rest of the lineup?
Re: (Score:2)
The 65W versions are not out yet, haven't even seen a single test anywhere, and I've been looking.
FYI, I couldn't wait and built a mini-ITX rig with Asus' E-350 board, and I'm fairly happy with it: dual screen, SD video on one, office stuff on the other, no real slowdowns, very quiet, no games newer than 4+ years old though. The challenge was finding a nice VESA-mountable mini-ITX case. Logicsupply.com has plenty (the M-350 or T3410 caught my eye; I bought both for funsies), or the elementQ is OK if you want a sh
Re: (Score:2)
I'll be building a mini-itx system this summer, and I find the cheaper (and possibly cooler) versions of Llano more interesting. Since the GPU side of the chip is rather bandwidth-limited, I wonder whether the lower-clocked and/or lower shader count (320 instead of 400) versions of the chip might perform almost as well as the highest-end chip all the sites I've seen have tested. Anybody seen reviews of any of the rest of the lineup?
If you don't game you'd be better off with an i3. Foxconn has a nice 1155 ITX board for $70. It's on newegg.
Re: (Score:2)
An Intel Atom with Nvidia ION can't even run half of these games.
And the entire Ion system will probably use about a quarter as much power as this CPU/GPU combo. My Ion XBMC box with a cheap SSD and 4 GB of RAM takes about 25 W from the wall when playing HD video.
I honestly don't understand why people are comparing this chip to the Atom when it's in a totally different market; people buying Atoms are not going to replace them with a 100W CPU. The lower power versions may be more competitive, but they'll also be far less powerful.
Re: (Score:2)
It makes no sense to compare this to an Atom; they are nowhere near the same market segment. If you want to compare the Atom to something, its closest competitors are the AMD C-50 or the VIA Nano.
uh... (Score:2)
That review, like all the others I've seen, only covered the A8-3850. Totally irrelevant to what I was asking.
A quad-core @ 2.9Ghz isn't slow! (Score:4, Insightful)
It's just not. Maybe it's "slow" compared to the newest chip, but, if you want to pull that crap, the newest chips are "slow" compared to a new Cray.
If you're doing things on a regular basis that are CPU-intensive, then, sure, you need speed. But 99% of applications aren't even going to stress a quad core @ 3 GHz.
Re: (Score:2)
Indeed, I've got a dual-core Zacate clocked at 1.6 GHz, and I'm not having any performance problems, even when I unplug and start working cordless. Sure, it can get hot and the battery life sucks when I turn it all the way up, but the entire laptop maxes out at about 25 watts.
In fact, the next time my folks are in need of a new computer, I'll probably recommend that they go with whatever equivalent is available at that time. Apart from gamers and people that regularly engage in computationally stressful tasks
Re: (Score:2)
It's just not. Maybe it's "slow" compared to the newest chip, but, if you want to pull that crap, the newest chips are "slow" compared to a new Cray.
If you're doing things on a regular basis that are CPU-intensive, then, sure, you need speed. But 99% of applications aren't even going to stress a quad core @ 3 GHz.
FREQUENCY ALONE MEANS NOTHING. You have completely missed IPC (instructions per clock). You must take both frequency and IPC into account when judging performance. The Llano has worse IPC than Phenom II.
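Roughly, throughput is clock rate times average instructions per clock; the IPC values below are invented purely to illustrate the point, not measured Llano or Intel figures:

# Effective throughput ~ frequency (GHz) x IPC, in billions of instructions/sec
def perf(ghz, ipc):
    return ghz * ipc

print(perf(2.9, 1.0))  # 2.9: higher clock, lower IPC
print(perf(3.1, 1.3))  # ~4.0: similar clock, better IPC wins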
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:3)
My best computers peak under 300 W. And they aren't old or slow (not the fastest ones available either, but near them).
I'd understand if you have 2 or more GPUs...
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Add in the motherboard and other basics and you're talking 1000 watts constantly.
It ends up being closer to $70-80 a month.
Plus the cards become worthless, because you're running them so hot they are probably going to die and not be resellable either.
Re:Perfect for Bitcoin mining! (Score:5, Informative)
Two 5870s running at full will be 350-400 watts each.
Add in the motherboard and other basics and you're talking 1000 watts constantly.
Nice job pulling those numbers out of your ass.
Here's the real power consumption of a 5870 right off of AMD's spec sheets: http://www.amd.com/us/products/desktop/graphics/ati-radeon-hd-5000/hd-5870/Pages/ati-radeon-hd-5870-overview.aspx#2 [amd.com]
I'll pull the relevant part out for you: maximum board power: 188 watts.
Assuming people who mine Bitcoin use at least a decent power supply that is 80% efficient at a given load (realistically most decent ones are 82%+ in their optimal load range), you're going to be pulling 235 watts from the wall per card, max.
235 watts is way less than 350-400 watts, by a long shot.
The rest of the system isn't going to be pulling huge amounts of power. Nobody who is mining Bitcoin for real cash does it on a CPU; they do it on GPUs, and the amount of power a motherboard, RAM, disk drive, and CPU use while they aren't really working is pretty low, usually in the 30-60 watt range depending on your CPU, but nowhere near 200 watts of draw.
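The wall-draw figure above is just board power divided by PSU efficiency; a quick check with the numbers from the post:

# Wall draw = DC load / PSU efficiency (the difference is lost as heat in the PSU)
def wall_watts(dc_load, efficiency):
    return dc_load / efficiency

print(wall_watts(188, 0.80))  # 235.0 W per card at 80% efficiency
print(wall_watts(188, 0.82))  # ~229 W with a slightly better PSU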
Re: (Score:2)
Re: (Score:3)
I don't necessarily have an opinion regarding your discussion, but I wanted to point out that power draw specs can be incorrect, especially now that everyone is trying to be green. We have a large cluster here that we ended up having to install extra power for because the machine would shut down during HPL runs. The vendor (and this is not a small vendor) told us that for HPL, you have to spec power for 130% utilization instead of 100%. Now HPL is pretty intense, but it's something to keep in mind.
Re: (Score:2)
Well, unless the currency undergoes something of a collapse (a glut of btc on the market is very possible).
Re: (Score:2)
Re: (Score:2)
If by "massive" you mean "trading at about $2 (12%) less than before MtGox got pwnd", then sure.
Other than that, hey, my PC currently makes me $20-$30 a day for about $0.90 worth of electricity. Yeah, definitely not worth it. I fully encourage all SlashDotters to run screaming from BitCoin. Nothing to see here, move along.
Re: (Score:2)
Who cares about the bitcoin scam? I want to know if this is good for building a xbmc nettop!
Re: (Score:2)
Right, because some types of applications that are heavily hardware-dependent are more on-topic than others. It's fine to talk about servers, benchmarks, gaming, video encoding, and other topics that might have some relevance to hardware performance, but not Bitcoin!
Seriously, I don't use bitcoins and don't care about them, and yet some of the overreactions regarding them and the outcries of "Stop talking about Bitcoin on Slashdot" are more annoying than the mentions of Bitcoin themselves. Since when is appl
Re: (Score:3, Informative)
Well, people that don't want to reward Intel's illegal behavior, for starters. I recently got a Llano-based laptop and was shocked at how well the chip handles the things that I do on a day-to-day basis. Sure, there's no chance of playing The Witcher or DNF on it, but it handles casual gaming just fine, especially the older games that I tend to like to play.
In practice, the dual core is much more responsive than the Celeron I was using a couple of years back, even though it's a third slower than that older Int
Re: (Score:3)
You see, the $230 device you suggest buying instead has no integrated graphics, and thus you'll want to add $100 or more for a decent matching GPU (or you can settle for shit-tier integrated graphics along with your high-end CPU).
Or you simply settle for a lower-mid-tier system, buy the Llano device from the above article, and end up with a $200 cheaper system.
Re: (Score:2)
The Intel Sandy Bridge parts (which I assume the GP is referring to) do have integrated graphics, but as the article says, the point here is that the Llano graphics outperformed the Sandy Bridge integrated graphics by 2-4x. Enough to make the difference between entry-level 3D gaming and no 3D gaming.
Re: (Score:2)
To be clear, the Sandy Bridge chipset has integrated graphics, not the CPU, but you can't have one without the other is the point.
Re: (Score:2)
To be clear, the Sandy Bridge chipset has integrated graphics, not the CPU, but you can't have one without the other is the point.
No. GPU is on the CPU, like Llano.
Re: (Score:2)
No, the GPU is integrated with the CPU in Sandy Bridge. In fact, you could even say they are better integrated than in Llano since in Sandy Bridge they share the same L3 cache.
Re: (Score:2)
Re: (Score:2)
Interestingly, Ars Technica agrees with you, but clearly states that this is also the GPU's weakest link on Llano.
http://arstechnica.com/business/news/2011/06/another-look-at-amds-llano.ars [arstechnica.com]
Re:Who buys AMD? (Score:5, Insightful)
If you don't need that kind of performance, then that extra $100 is wasted.
My server currently runs on an AMD. For one, it was the lowest-energy quad core I could find (45 W). For two, at the time it was cheaper than most Intel quad cores, and used less power than all but their lowest-end dual cores.
Then again, my gaming rig is an i7 and my notebook is a Core2 Duo.
So, to answer your question: when it is the right tool for the job.
Re: (Score:2)
FWIW I did the same thing. Athlon II 610e: part of 2010's awesomest series of server CPUs. But let's not kid ourselves: if you were building a server from scratch today (not late 2010), you wouldn't use Sandy Bridge? I sure as hell would.
I can see some niches where this Llano stuff fits, though. Not sure if any of these are on my upcoming computer menu, but I've got one particular box where if it su
Re: (Score:2)
Kinda depends on your expected server workload, no?
Sandy Bridge is faster, has lower peak power consumption for a given performance level and lower idle power consumption. I can't really see any expected workload where AMD is a better choice unless you plan to have lots of CPUs in your system.
Re: (Score:2)
Your server doesn't smell like a server. ;-) But fair enough; my Sandy-Bridge-now-always-beats-AMD-on-servers position is pretty prejudiced toward certain workloads. YMMV and all that.
Re: (Score:2)
Forgot to mention that it houses about 1 TB (mdadm RAID6) of music, videos, and pictures that are streamed to my HTPC, smartphone, and laptop. It runs a few services like iTunes, SSH, Privoxy, and Apache in separate VMs. While it's not a pure server, it does do a lot of server stuff. I should call it a Servstation!
Re: (Score:2)
Re: (Score:2)
ok pop quiz hotshot.
ok pop quiz hotshot, on a scale of 1 to infinity, how cool does this phrase make you in 1990?
Re: (Score:2)
Re: (Score:2)
We use cultural references all the time.
There is a difference between using them, and shoe-horning them in where they don't add to the conversation in an attempt to make yourself look cool. As for adding to the conversation: pot, meet kettle.
Re: (Score:2)
Yeah, who would want some bang for their buck? Such idiots...
Re: (Score:2)
My biggest reason not to go with AMD is the build quality of the CPU/motherboard and the quality of system drivers. I've had several AMD systems, going back to the first time AMD crushed Intel on the CPU front with its first Slot A Athlon, and two others after that.
General issues with everything from driver quality to random breakages were clearly in favor of the Intel systems every time. On the other hand, AMD usually beat Intel on bang/cost ratio by at least 10%, usually more. But honestly, I value time spent troubleshooting random stuff, a
Re: (Score:2)
Llano definitely has a future in laptops, but in desktops? There isn't much reason to go with it, I'm afraid.
Re: (Score:2)
90% of users who don't do video encoding or demand to play the latest games at the highest resolution available would be much better served by buying the AMD system and spending that spare $100 on a 60GB SSD to use as a boot drive. For most workloads, that'll make the system feel five times "faster".
Re: (Score:2)
You'd get much more bang for your buck spending the money you would have spent on cooling and spending it on a faster CPU+Discrete Graphics combo.
The only reason to overclock one of these would be shits and/or giggles.
Re: (Score:2)
Re: (Score:2)
The new i3 has VT-x, but not VT-d (which also rules out SR-IOV). For that you need a new i7.
Re: (Score:2)
These processors are for tablets and netbooks running Windows 8 or browsing the web with the GPU taking part of the load.
I'll be impressed the day I see someone using a tablet with a 100W CPU.
Re: (Score:2)
Re: (Score:2)
One of the noisiest and least power efficient CPUs to ever make it to mass market. Enron *loved* those CPUs. As the guy said, "Burn, baby, burn!"
CPU requirements for most applications have not increased all that fast in the meantime, except the ones recoded from C++ into JavaScript, or the heavy-duty applications (math, engineering, visualization, adenoidal frittering).
Software developers optimize their software relative to