Comment Re:A4 is NOT just a Hummingbird (Score 1) 244

You're right, I was referring to the A8 core within the two SOCs. However, they both use the same PowerVR SGX 535 GPU and were designed by the same firm (Intrinsity, which Apple acquired last year) in collaboration with Samsung. The A4 is slightly smaller, owing its decreased die size to a smaller cache and other tweaks. Functionally, they're very similar, much more so than any other A8-based SOC (TI OMAP, Qualcomm Snapdragon, etc.).

Comment Re:Can't see why "dual core" would be a selling po (Score 4, Informative) 244

1) The same argument was made before dual core made it to consumer PCs. If you build it, they will come.

In any case, waiting on either a PC or a phone is usually due to some I/O task, not heavy CPU usage. By far, most of the waiting I do is for web pages to load.

Media playback and games are primarily where users will see the most benefit from dual-core in the foreseeable future. Having a heavy webpage with Flash running smoothly doesn't hurt either. :)

2) Today, chips have very good power-gating. If only one core is being used, only one core is being powered. And since dynamic power scales with voltage and frequency, total power grows much less than linearly as cores are added. For these reasons, having a second core doesn't double the TDP of the chip.
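The power argument above can be sketched with the standard CMOS dynamic-power approximation P = C·V²·f. The capacitance, voltage, and frequency figures below are made up purely for illustration, not measurements from any real SOC:

```python
# Rough sketch of why a second core need not double power draw:
# CMOS dynamic power scales roughly as P = C * V^2 * f, so two cores
# at a slightly lower voltage and clock can draw less in total than
# one core pushed hard. Numbers are illustrative only.

def dynamic_power(capacitance_f, voltage_v, freq_hz):
    """Approximate CMOS dynamic power in watts: P = C * V^2 * f."""
    return capacitance_f * voltage_v ** 2 * freq_hz

# Hypothetical single fast core vs. two slower, lower-voltage cores.
single_core = dynamic_power(1e-9, 1.2, 1.0e9)    # one core at 1.2 V, 1.0 GHz
dual_core = 2 * dynamic_power(1e-9, 0.9, 0.8e9)  # two cores at 0.9 V, 0.8 GHz

print(f"single: {single_core:.2f} W, dual: {dual_core:.2f} W")
```

Because voltage enters squared, even a modest voltage drop buys back most of the cost of the second core; power gating then takes care of the idle case entirely.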

Also, most of these dual-core chips spend only a fraction of extra die space on the second core. The SOCs already dedicate a minority of their area to the ARM core; the rest is taken up by the GPU, memory, radio, and other miscellaneous controllers.

And thanks to die shrinks with every generation, many dual-core chips will draw less power than their single-core predecessors. Case in point: the third-generation Snapdragon with dual Scorpion cores is claimed (at least by Qualcomm) to use less power than the Snapdragons in current smartphones. Going from 65nm to 45nm (with 28nm expected by the end of 2011!) provides that kind of headroom.

Besides, the biggest drain on the battery is usually the screen, then the radios (WiFi, 3G/4G, Bluetooth, etc.), with the CPU a distant third.

Double core- Double battery usage? Right, whatever.

Comment Re:Can't see why "dual core" would be a selling po (Score 4, Informative) 244

Why not? Multi-core was marketed successfully for PCs; what makes smartphones any different? Tech specs are pretty important to the Android crowd. Besides, now that certain devices will have docks that let them become netbook and HTPC replacements, people will find uses for that extra core.

Comment Re:Sorry.. (Score 2) 123

I hate to break it to you, but Fusion, Bobcat, and Bulldozer have been in development for quite a long time; all of these projects started when Hector was at the helm. Dirk can hardly be credited with these product releases, other than keeping AMD afloat long enough to let these products see the light of day.

Comment Re:Hard call for GPU selection (Score 4, Insightful) 123

Intel has also promised OpenCL support on Sandy Bridge and later integrated GPUs. Not to mention S3 and VIA support.

I predict that CUDA will quickly become irrelevant and die a long, slow death (i.e., just legacy support, no new features), much like Cg did after GLSL and HLSL matured. No one wants to be stuck on a single hardware platform, despite any performance advantages.

Comment Re:Sorry.. (Score 4, Interesting) 123

I'll agree with this. AMD's been seeing some triumphs lately: their graphics division has been very successful, despite a minor delay with the Radeon HD 6900 GPUs. Nvidia may hold the performance crown this generation, but its previous generation was shaky and its 40nm chips haven't been as available as AMD's, allowing AMD to gain considerable market share.

I noticed a few netbooks with AMD Bobcat cores appear at CES, and Bobcat has enough performance and power efficiency to give both Atom and Ion some serious competition.

While Llano doesn't appeal to me personally, it's nice to see Fusion reaching the desktop soon. I'm also anxious to see how Bulldozer will perform once it's released in a few months.

With the delay of Intel's Ivy Bridge into 2012, AMD has a lot of potential to make this year a profitable one.

Comment Re:The ridiculous problem is... (Score 3, Informative) 380

In a utopian future, people would pay the actual cost of manufacturing the console, plus a reasonable profit margin. Anyone could write games, and their cost would be reduced because publishers wouldn't have to pay the "Sony Tax" on each one. For people who'll own very few games over the life of the console, this is not so attractive, but for people who buy more than the average number of games, it's a huge win. But at least we're honest about it.

I already live in that future. I have a console hooked to my TV that runs code that doesn't have to be signed by Sony, Microsoft, Nintendo, et al. I can also run multiple OSes on it without having to jailbreak it. And I have hundreds* of legally-purchased games to play on it that probably cost me less than what 20 new PS3/360 games would (at $60).

It's called an HTPC. It does pretty much everything a PS3/360 does, and better (including Blu-ray playback), not to mention backwards compatibility with at least a dozen older consoles via emulators. I still have my PS3, but primarily for GT5 and not much else.

*My Steam account alone has 300+ titles, mostly bought in holiday sale packs at a huge discount. I've probably played less than half of them so far, and I'm still discovering games I bought more than a year ago.

Comment Re:From the No-shit-sherlock department (Score 1) 716

I had a cat that learned how to open doors. He was a big cat (about 16 lbs, not fat, just long) and would reach the doorknob with both paws, use his pads to turn the knob in the correct direction, and lean backwards. He was only strong enough to open interior doors.

We had to lock the pantry because of him. Cats may not be as trainable as dogs, but they've surprised me with their ability to solve problems.

Comment Re:Ghost Recon (Score 1) 520

Many newer graphics cards with HDMI output actually have audio controllers integrated into them. My HTPC used to require a discrete sound card that did realtime encoding of lossy 5.1 digital formats (DTS Connect, DDL). Last year I removed the sound card, and since then my ATI/AMD Radeon 5450 has fed 7.1 digital lossless audio to my receiver over the same cable that carries video. It has a Realtek audio controller onboard, and all audio is routed through the Realtek/AMD HDMI audio driver. It not only supports 24-bit/192 kHz LPCM, but also bitstreams the HD audio formats from Blu-rays (DTS-HD MA and Dolby TrueHD). I believe Nvidia's GeForce 400 series supports bitstreaming as well. As long as you're going HDMI-only (there are no DACs on the graphics card), the video card should be adequate for sound.

I can't comment on gaming performance; the 5450 is inadequate for gaming at 1920x1080. For games, I use a different PC (with a Radeon 5850) and a dedicated Creative X-Fi Titanium connected to headphones.

Comment Re:Let's get this right. (Score 2, Interesting) 518

Descent was a victim of its own success. For a while after Descent I & II, the market for 6DOF shooters got crowded. I remember playing Forsaken (a colorful Descent clone with awesome graphics for its time) and at least two or three other demos with similar gameplay that came on my PC Gamer CDs. Then Descent III came out and, despite being a pretty decent game, flopped. Descent IV was then canceled, and its engine ended up being used in Red Faction. The FreeSpace spin-offs (space sims seem to have similarly disappeared lately) were successful in their own right, but they're not what I'd call groundbreaking. Unfortunately, Interplay eventually went broke, and that was the end of Descent.

As with the Fallout series, I was hoping someone would be willing to buy the IP from Interplay and release a new title. I'd definitely love to see a new Descent with current-gen graphics, but beyond that, it's hard to improve on the original.
