
AMD Llano APU Review - Slow CPU, Fast GPU

Posted by CmdrTaco
from the welcome-to-the-doldrums dept.
Vigile writes "Though we did see the fruits of AMD's Fusion labor in the form of the Brazos platform late in 2010, Llano is the first mainstream part to be released that combines traditional x86 CPU cores with Radeon-based SIMD arrays for a heterogeneous computing environment. The A-series of APUs reviewed over at PC Perspective starts with the A8-3850, which combines a true quad-core processor with 400 shader processors similar to those found in AMD's Radeon HD 5000 series of GPUs. The good news for the first desktop APU is that the integrated graphics blows past the best Intel has to offer on the Sandy Bridge platform by a factor of 2-4 in gaming performance. The bad news is the CPU performance: running at only 2.9 GHz, the Phenom-based x86 portion often finds itself behind even the dual-core Intel Core i3-2100. On the bright side, you can pick one up next month for only $135."
  • Slower than an i3... (Score:5, Interesting)

    by LordLimecat (1103839) on Thursday June 30, 2011 @09:39AM (#36621924)

    On Newegg that Core i3-2100 is retailing for $124; how do the graphics in the Llano stack up against the i3's graphics? Might not be such a bad deal at all.

    Article (or at least the material they got from AMD) indicates that graphics is precisely where it shines, so an i3-class CPU with nearly-discrete-class graphics, at an i3 price tag, sounds quite compelling.

    • by h4rr4r (612664)

      That is AMD's plan with this unit: the same relative cost and performance as the i3, but a much better GPU.

      • by butalearner (1235200) on Thursday June 30, 2011 @10:00AM (#36622160)
        I did a little digging for those wondering: it does run Linux [phoronix.com], but only with the proprietary Catalyst driver at the moment. Might be interesting once the open source driver catches up (assuming AMD shares the required info).
        • The open source driver won't catch up; the open source drivers have never even come near to the closed drivers in 3D performance. They're for people who want to always use the latest kernel without worrying about incompatibility.

          • by Kjella (173770) on Thursday June 30, 2011 @01:12PM (#36624496) Homepage

            Performance is one thing; it's not close in features or stability either. The 5850 was released in September 2009, and I still can't get HDMI audio, there's no video acceleration, OpenGL is at 2.1 (the card supports OpenGL 4.1), and last I checked it was rather easy to hang it. I'm not blaming the guys who work on it, because they're few and working as hard as they can, but they're no match for the 100+ developer Catalyst team. It didn't help that in the long years when both ATI and nVidia were closed source, the graphics stack really didn't get much love. But the info is there now; all it really needs is the manpower.

    • by rbrausse (1319883)

      how do the graphics in the llano stack up against the i3's graphics?

      this is not only answered in TFS but even in TFT :)

      and arguably your question is kind of senseless, as Intel's i3 is not a CPU/GPU combination but "only" a processor; though if you use your i3 with Intel on-board graphics, the AMD will run circles around it.

      • by LordLimecat (1103839) on Thursday June 30, 2011 @10:04AM (#36622196)

        One of the Sandy Bridge selling points was "our integrated graphics no longer suck, and are now semi-decent". And calling the Llano a CPU/GPU combo while not doing the same for Intel is kind of pointless; both have integrated graphics, and both have it as a selling point. Since the prices are comparable, "one gives me good graphics and the other sucks" isn't a hard choice to make.

        • Since the prices are comparable, "one gives me good graphics and the other sucks" isn't a hard choice to make.

          You left out part of the equation. The choice is more like:

          i3-2100 - Fast CPU / Mediocre GPU

          Llano - Slow CPU / Good GPU

          For most non-gamers the choice will be the i3. For light gamers, HTPC, and notebooks the choice will be Llano. For more serious gamers the choice is obviously the i3 since the Llano CPU is too slow and the Llano onboard GPU isn't anywhere near good enough. These people will use higher end discrete graphics cards. I chose i3 for my gaming box for exactly that reason.

          • The Llano is about as fast as a Radeon 5550. Good for an integrated GPU, but lousy in the grand scheme of things. A $60 5570 handily outruns it.
          • For most non-gamers the choice will be the i3. For light gamers, HTPC, and notebooks the choice will be Llano. For more serious gamers the choice is obviously the i3 since the Llano CPU is too slow and the Llano onboard GPU isn't anywhere near good enough. These people will use higher end discrete graphics cards. I chose i3 for my gaming box for exactly that reason.

            Just a nitpick here, but Sandy Bridge supports full h.264 hardware decoding up to 1080p, 3D TV support, and Bitstreaming of HD Audio formats.
        • by Kjella (173770)

          Since the prices are comparable, "one gives me good graphics and the other sucks" isn't a hard choice to make.

          The reason people have moaned about Intel's abysmal integrated performance is that it's been the low bar of the market, all those computers that weren't built to game and didn't have a discrete graphics card. Because it turns out a lot of people got a used computer from work or borrowed their dad's work machine or whatnot to game, using the integrated graphics. With the Sandy Bridge graphics Intel raised that low bar quite a bit. Even if you buy a business machine you get that graphics performance for "free

      • by rbrausse (1319883)

        Intels i3 is not a CPU/GPU combination but "only" a processor

        argh, call me stupid; like you I read only half of TFS and ignored the Sandy-Bridge-sentence :/

        • call me stupid; like you I read only half of TFS

          Very stupid - you aren't supposed to read any of it.

      • by wagnerrp (1305589)
        No. The i3 and i5 lines integrate the graphics core on the same package as the CPU. The only thing the board provides are video transmitters. Intel has not produced a chipset with graphics since the G45 and Core 2 line.
    • IIRC, the contemporary i3s are Sandy Bridge parts (or older), and Intel's on-die graphics options come in a few tiers, depending on the tier of the CPU they are integrated with.

      So, if, in fact, the Llano's graphics are "2-4x better than the best Sandy Bridge has to offer" they should crush the i3's IGP like a bug, and be a better gaming part generally unless a given game is atypically CPU bound.

      I suspect that AMD will have themselves a cheapskate (and/or space-constrained) hit, since their part woul
      • Ah, but if you get a discrete ATI card, it looks like the integrated graphics teams up with it in some kind of bizarre Crossfire setup, so the AMD processor would be even better than the i3. Good luck setting up dual-rendering between intel integrated and an nVidia or ATI card.

        • That does help to seal the fate of the lower-end i3s as the budget CPU of choice only for the must_have_intel brigade (I'm guessing that a lot of corporate typing boxes will be sold therewith...); and it certainly won't help Nvidia's chances of selling lower-end expansion boards to AMD users. However, at the higher end, I suspect that, while nice, the asymmetric Crossfire won't matter much: in the battle between two ~$50-80 expansion boards, having a bit of help from the most competent integrated graphics ye
          • I've always heard people talk about how faster cards need a faster CPU, and that if you use a 2.2GHz 2-core AMD you will end up bottlenecking your high-end 6990 card, but I've never really seen it quantified or explained; surely the CPU isn't processing data that the GPU spits out onto the DVI port, and we are well past the days of needing the CPU to intervene on RAM and HDD requests; a lot of the point of AGP and PCIe (IIRC) is that they do not require CPU intervention to access memory-- they have a direct link

            • It depends on what you are using it for: As you say, the GPU does not directly lean on the CPU to any significant extent; but most people buying fancy GPUs(with the specific exception of people using them for entirely GPU-based compute tasks), are buying them to run applications that eat both considerable CPU time and considerable GPU time. If somebody is buying some serious GPU power, this usually means that they are running something, or cranking up their game's settings, or whatever it happens to be, in
              • When I play games, I generally always see the CPU spiked to 100%-- even doing something like WoW several years ago on a Core 2 Duo. Changing graphics cards definitely upped my FPS, but the CPU tends to stay pegged.

                My understanding is that no matter what game it is (generally), it's going to peg the CPU and GPU as hard as it can to get as much physics and rendering done as possible, and whatever it can't get done it just skips.

                As for the business scenario you mentioned, my understanding is that the CPU will take ove

                • by Luckyo (1726890)

                  It's worth noting that WoW is a VERY bad example here - that game is a massive exception to the rules, and is bottlenecked on the CPU rather than the GPU on any system with a reasonably modern graphics card.
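
The bottleneck question in this subthread can be sketched with a toy pipeline model (the numbers below are invented purely for illustration; real games overlap CPU and GPU work in more complicated ways): whichever stage takes longer per frame caps the frame rate, so a faster GPU stops helping once the CPU is the slower stage.

```python
def frame_rate_fps(cpu_ms_per_frame: float, gpu_ms_per_frame: float) -> float:
    """Throughput of a two-stage pipelined renderer, capped by the slower stage."""
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

# Hypothetical per-frame costs: CPU 20 ms (physics, draw calls), GPU 10 ms.
print(frame_rate_fps(20.0, 10.0))  # 50.0 fps -- CPU-bound
# A GPU twice as fast changes nothing while the CPU stays the bottleneck:
print(frame_rate_fps(20.0, 5.0))   # still 50.0 fps
# Halving the CPU work is what finally raises the frame rate:
print(frame_rate_fps(10.0, 5.0))   # 100.0 fps
```

This is why a CPU-bound title like WoW sees little gain from a GPU upgrade while a GPU-bound one sees a lot.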

      • by AmiMoJo (196126)

        I don't think the benchmarks were very helpful, because for most people the performance of this chip should be excellent. I have a dual-core hyperthreaded Atom-based server which is very responsive and usable, but which sucks going by raw CPU benchmarks. For desktop use you don't need that much CPU power; in fact, simply having more cores is a better bet, as it improves responsiveness massively.

        AMD are expecting the GPU to do a lot of the heavy processing like video decoding, we just need more softwa

        • by hedwards (940851)

          I'm watching the development of OpenCL fairly closely, because it's probably going to end up making or breaking Llano in the long run.

    • The i3 does not have the best Sandy Bridge graphics; the i7 does. They say it is 2x-4x what that is. Well, that means pretty reasonable lower-midrange graphics. Enough to play modern games, though probably not with all the eye candy.

      That could make it worthwhile for budget systems: $135 for an all-inclusive solution rather than $124 for a CPU plus $50 for a video card.

      Of course there are some downsides too in that it is a weaker CPU and some games (Bad Company 2 and Rift come to mind) need better CPUs and of co

      • Intel pre-emptively released an i3 with their top-of-the-line HD3000 graphics GPU a short while ago, so the i3 is on a par with the best Intel can offer, iGPU-wise. 2105 I think.

      • by Rockoon (1252108) on Thursday June 30, 2011 @10:55AM (#36622888)
        Bulldozers won't have on-die graphics like these Llano (Bobcat) CPUs until mid to late 2012 at the earliest.

        What should be noted, and what isn't well understood, is that these "APUs" coming out from AMD are all Bobcat chips. Bobcat is a design directly targeting Intel's Atom market. The review here is for the king of the Bobcats, the high-powered variant weighing in at 100W peak, built on the 32nm process. The low-power Bobcats only have 80 stream processors (5.9W, 9W, and 18W variants) instead of the 400 stream processors (100W) that this thing has, and are on the 40nm process.

        All the Bobcat modules have only 2 ALUs and 2 FPUs, and only a single-channel memory controller, so it is no surprise that it has trouble competing with the i3s. What is surprising is that, nevertheless, it is competing with the i3s.
        • by 0123456 (636235)

          The review here is for the king of the Bobcats, the high-powered variant weighing in at 100W peak, built on the 32nm process.
          All the Bobcat modules have only 2 ALUs and 2 FPUs, and only a single-channel memory controller, so it is no surprise that it has trouble competing with the i3s. What is surprising is that, nevertheless, it is competing with the i3s.

          It has twice as many cores and from the numbers you give here uses about 3x as much power. I'm not too surprised that you can compete with a cheaper chip in that case.

        • The problem is that these chips are not competitive with the Atom when it comes to power consumption. They are about on par with Sandy Bridge i3s in that regard, which is why everyone is comparing their performance against the i3s. There is no chance they will replace the Atom in netbooks (especially after the Atom moves to 32nm later this year), but they will be good for low-end laptops.

    • Globally, twice as fast. It's extremely memory-constrained though, so shell out at least for 1600MHz DDR3; 1866 is best.
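
Back-of-the-envelope math on why memory speed matters so much here (a sketch using standard DDR3 figures; a dual-channel, 64-bit-per-channel setup is assumed, and the IGP shares this bandwidth with the CPU cores):

```python
def ddr3_peak_bandwidth_gbs(transfer_rate_mts: int, channels: int = 2) -> float:
    """Peak theoretical DDR3 bandwidth in GB/s: MT/s x 8 bytes per 64-bit channel."""
    return transfer_rate_mts * 8 * channels / 1000.0

# Stepping up the memory speed is the cheapest bandwidth an IGP can get:
print(ddr3_peak_bandwidth_gbs(1333))  # ~21.3 GB/s
print(ddr3_peak_bandwidth_gbs(1600))  # 25.6 GB/s
print(ddr3_peak_bandwidth_gbs(1866))  # ~29.9 GB/s
```

For comparison, even low-end discrete cards of the era had dedicated GDDR memory, which is why an IGP leans so hard on system RAM speed.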

  • Slow? (Score:5, Insightful)

    by Anonymous Coward on Thursday June 30, 2011 @09:42AM (#36621946)

    This new AMD product specifically targets the budget user who games occasionally. It allows entry-level gaming for the price of a very cheap CPU + GPU, at a lower TDP. It's also a better solution than a CPU + discrete graphics because it already gives you entry-level gaming without taking up a PCI-E slot; at the same time it allows for asymmetrical CrossFire, so in case you want to add a discrete GPU later you can see a benefit (in DX10 & DX11 titles).

    This new APU from AMD shoots down any budget graphics Intel has to offer whilst giving you more CPU power to do anything Atom does.

    At the end of the day, Core i3 + HD3000 costs more and has a higher idle power usage.

    IMO the title should read: "Brilliant new budget gaming APU from AMD!"

    • Re: (Score:3, Informative)

      by cshake (736412)

      I know this article is about the desktop APUs, but as I've been running the C-50 Ontario on my netbook (Acer AO522-BZ897) for a few months now, I think I can share some real-world experience.

      Overall: It's a dual-core netbook, and still gets 6 hours battery life if I'm writing code with the brightness down, a little less if I'm listening to music. It may be slower on the individual cores than a competitive Atom, but if your program is threaded it's great. I'm very happy with the performance. It replaced a Po

      • If you're interested in smooth h.264 playback, I recommend downloading the most recent MPlayer SVN snapshot from here [mplayerhq.hu] (they recently added multithreading), compiling with -march=native, and pointing SMPlayer at the resulting binary. I used this method to get functional 1520x1080 playback on a 2100 MHz Core 2 Duo.
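
A rough sketch of that workflow (the snapshot filename, URL, and configure flags are assumptions based on the comment; check the MPlayer download page for the current snapshot name, and install your distro's codec/dev packages first):

```shell
# Hypothetical build recipe -- adjust the snapshot name to whatever is current.
wget http://www.mplayerhq.hu/MPlayer/releases/mplayer-export-snapshot.tar.bz2
tar xjf mplayer-export-snapshot.tar.bz2
cd mplayer-export-*/
./configure --extra-cflags="-march=native"   # tune the build for the local CPU
make -j"$(nproc)"                            # parallel build across all cores
# Then point SMPlayer's "MPlayer executable" setting at the freshly built ./mplayer.
```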
  • Ladies and gentlemen, I remind you about how well-documented this sort of thing is: the wheel of reincarnation [catb.org]. Personally, I'm betting that hardware is now so disposable that we'll eventually get to having our machines in one hunk of silicon, and the wheel will stall.
    • by ArcherB (796902)

      Ladies and gentlemen, I remind you about how well-documented this sort of thing is: the wheel of reincarnation [catb.org]. Personally, I'm betting that hardware is now so disposable that we'll eventually get to having our machines in one hunk of silicon, and the wheel will stall.

      Exactly. I'll bet it will be called a "Tablet".

      Actually, I envision the day when all phones will have a compatible interface that will allow for keyboards, mice and monitors to be hooked up to them. You take your "phone" to work, plug it in, do work. Pull it out, browse the web on your way home and plug it into your dock at home where you play games or whatever it is you do with your current PC at home. You go and visit your buddy and want to show him some new whiz-bang-app you have, you plug your phone

      • I've always voted for that future, too. I think it would be a nice future. Although see the comment below; it mentions wireless. Wireless is good. Also see this concept thing [youtube.com], which amounted, sadly/predictably, to nothing.
  • by Eravnrekaree (467752) on Thursday June 30, 2011 @09:51AM (#36622042)

    To say it's slow is a little ridiculous. Compared to a 286? I know this is in comparison to other modern CPUs, but any modern CPU is pretty fast.

    I wonder if AMD or Intel will ever manage to develop an x86 integrated chip for handheld devices. It would be pretty interesting to have binary compatibility between desktop and handheld devices.

    • by LWATCDR (28044)

      For most people CPU power is a non-issue. The truth is that most office and home PCs are very overpowered for what they are doing. Honestly, most users would see the biggest improvement in performance if they put their money into more RAM, faster storage, and a half-decent GPU over a faster CPU.
      The APU idea really has so much merit that it just isn't funny. If AMD can get this pushed out, and if more software starts to take advantage of the GPU, you will see a big benefit. This isn't all that diff

  • The article does not test using Quick Sync technology for the video rendering portion. When this is turned on, an Intel HD3000 is 6 times faster at video encoding than a top-of-the-line Radeon. (Benchmarks here [tomshardware.com].) And also some of the tests show the Core i7-970 is twice as SLOW as a Core i5?? Gotta call B.S. on that one. And what's the point of testing a dual card (APU + Radeon) against a single Intel integrated graphics? We all know the HD3000 isn't for gaming; that's why you get a $65 Radeon to run
  • Can the Llano be tapped for green?
  • I'll be building a mini-itx system this summer, and I find the cheaper (and possibly cooler) versions of Llano more interesting. Since the GPU side of the chip is rather bandwidth-limited, I wonder whether the lower-clocked and/or lower shader count (320 instead of 400) versions of the chip might perform almost as well as the highest-end chip all the sites I've seen have tested. Anybody seen reviews of any of the rest of the lineup?

    • The 65W versions are not out yet, haven't even seen a single test anywhere, and I've been looking.

      FYI, I couldn't wait and built a mini-ITX rig with Asus' E-350 board, and I'm fairly happy with it: dual screen, SD video on one, office stuff on the other, no real slowdowns, very quiet; no games newer than about 4 years old, though. The challenge was finding a nice VESA-mountable mini-ITX case. Logicsupply.com has plenty (the M-350 or T3410 caught my eye; I bought both for funsies), or the elementQ is OK if you want a sh

    • I'll be building a mini-itx system this summer, and I find the cheaper (and possibly cooler) versions of Llano more interesting. Since the GPU side of the chip is rather bandwidth-limited, I wonder whether the lower-clocked and/or lower shader count (320 instead of 400) versions of the chip might perform almost as well as the highest-end chip all the sites I've seen have tested. Anybody seen reviews of any of the rest of the lineup?

      If you don't game you'd be better off with an i3. Foxconn has a nice 1155 ITX board for $70. It's on newegg.

  • by Anonymous Coward on Thursday June 30, 2011 @10:27AM (#36622514)

    It's just not. Maybe it's "slow" compared to the newest chip, but, if you want to pull that crap, the newest chips are "slow" compared to a new Cray.

    If you're doing things on a regular basis that are CPU-intensive, then, sure, you need speed. But 99% of applications aren't even going to stress a quad-core @ 3GHz.

    • by hedwards (940851)

      Indeed, I've got a dual-core Zacate clocked at 1.6GHz, and I'm not having any performance problems, even when I unplug and start working cordless. Sure, it can get hot and the battery life sucks when I turn it all the way up, but the entire laptop maxes out at about 25 watts.

      In fact, the next time my folks are in need of a new computer, I'll probably recommend that they go with whatever equivalent is available at that time. Apart from gamers and people who regularly engage in computationally stressful tasks

    • It's just not. Maybe it's "slow" compared to the newest chip, but, if you want to pull that crap, the newest chips are "slow" compared to a new Cray.

      If you're doing things on a regular basis that are CPU-intensive, then, sure, you need speed. But 99% of applications aren't even going to stress a quad-core @ 3GHz.

      FREQUENCY ALONE MEANS NOTHING. You have completely missed IPC (instructions per clock). You must take both frequency and IPC into account when judging performance. The Llano has worse IPC than the Phenom II.
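
A worked illustration of the IPC point (the IPC values below are invented for the example, not measured numbers for any real chip): single-threaded throughput is roughly IPC x clock, so a lower-clocked chip with higher IPC can still come out ahead.

```python
def throughput_gips(ipc: float, clock_ghz: float) -> float:
    """Rough single-thread throughput in billions of instructions per second."""
    return ipc * clock_ghz

# Invented numbers purely to show the shape of the trade-off:
chip_a = throughput_gips(ipc=1.5, clock_ghz=3.4)  # ~5.1 -- high clock, low IPC
chip_b = throughput_gips(ipc=2.0, clock_ghz=2.9)  # 5.8 -- lower clock, higher IPC
print(chip_b > chip_a)  # True: frequency alone doesn't decide it
```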

"You tweachewous miscweant!" -- Elmer Fudd

Working...