Intel Demos 4.7-GHz Pentium 366
richmlpdx writes "Silicon Strategies has an article about Intel's latest demo...
"Providing a sneak preview of its future developments, Intel Corp. here today demonstrated its fastest microprocessor to date--a 4.7-GHz chip for high-end desktop PCs."
But, (Score:2, Funny)
Tony.
Opps!.... (Score:2)
Tony.
Re:Opps!.... (Score:3, Informative)
Re:Opps!.... (Score:2)
Maybe, but only if your entire application and all its data can fit into the on-chip cache, and you make sure the cache is loaded before you start your measurements.
In the real world, there are no such applications. As I said in another post yesterday, the bottleneck in the majority of computing tasks is not the CPU but the memory and I/O bandwidth. A fast CPU starved of useful work by a bus that can't keep up will spend most of its time idle.
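A back-of-envelope sketch of that starvation; the 100 ns DRAM latency is an illustrative assumption, not a measured figure:

```python
# How many cycles a 4.7 GHz core burns waiting on one main-memory access.
clock_hz = 4.7e9
cycle_ns = 1e9 / clock_hz            # ~0.21 ns per cycle
miss_latency_ns = 100                # assumed round-trip to DRAM
stall_cycles = miss_latency_ns / cycle_ns
print(f"cycles lost per cache miss: {stall_cycles:.0f}")  # -> 470
```

Hundreds of cycles lost per miss is why the clock rate alone says little about delivered performance.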
Re:Opps!.... (Score:2)
Now, if you ask if it can do the same job in 1/4 the time. . . that's another story. . .
The Weather Channel (Score:5, Funny)
Hammer & Intel (Score:5, Interesting)
Re:Hammer & Intel (Score:5, Informative)
Sure, they *could* manage to start manufacturing the Truly Final Non-Hammer Core sometime in mid-2003, but by then the Hammers should be out (?) and I'd definitely go for an AMD Athlon (Clawhammer) 3400+ in Q1 2003. Mwhaha
But they might plan on having
Re:Hammer & Intel (Score:2)
4.7 GIGAhertz? (Score:2, Funny)
Re:4.7 GIGAhertz? (Score:2)
burp! (excuse me) (Score:5, Funny)
And in the other news... (Score:5, Funny)
Re:And in the other news... (Score:2, Funny)
"a second group of teenagers in Sweden blew themselves up today in what appears to be a weird underground computer ritual called 'overclocking'. Is your child in danger??!"
report at 10..
Awesome (Score:2, Funny)
can be VERY poorly written and still probably
maybe run somewhat fast hopefully.
Slightly misrepresented....I think (Score:5, Interesting)
Anand Tech [anandtech.com] has more information from their IDF report.
Re:Slightly misrepresented....I think (Score:2, Informative)
Right you are. And any editor worth his salt might have noticed that this news is several weeks old. The article is dated (09/09/02 06:04 p.m. EST)
This was part of Paul Otellini's keynote at the Intel Developer Forum [intel.com]. Just the boys in the lab showing that they can overclock with the best of them [slashdot.org].
Re:Slightly misrepresented....I think (Score:2)
Re:Slightly misrepresented....I think (Score:2, Informative)
In reality it is more like reporting that Kyle over at [H]ard|OCP [hardocp.com] managed to get a few samples of P4 CPUs to run at 4.68 GHz for a few minutes without crashing.
There is nothing evil about Intel overclocking their own hardware, but it is getting totally misrepresented as an actual new product. Which it is not.
Don't worry. (Score:2)
Don't worry. One of the designers of the new Pentium IV told me that they will definitely release a Pentium IV of 5 GHz or more.
GHz Hunting (Score:5, Insightful)
What do I have against high frequencies? For starters, high-speed, fully synchronized digital designs rely on switching millions of transistors at the same time (each clock cycle); this burns lots of power, which is a limiting factor today.
Also, high frequency does not imply high performance; the CPU still needs to do something in each stage. For example, the Pentium 4 has a 20 (yes, twenty) stage pipeline, which yields huge penalties for branch mispredictions etc.
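To make that penalty concrete, here is a toy model; the branch frequency, predictor accuracy, and base IPC are made-up illustrative numbers, not real P4 figures:

```python
def effective_ipc(depth, branch_freq=0.2, predict_accuracy=0.95, base_ipc=1.0):
    """Instructions per cycle once misprediction flushes are charged.

    Each mispredicted branch is assumed to waste `depth` cycles (a full flush).
    """
    mispredicts_per_instr = branch_freq * (1 - predict_accuracy)
    cycles_per_instr = 1 / base_ipc + mispredicts_per_instr * depth
    return 1 / cycles_per_instr

print(f"10-stage pipe: {effective_ipc(10):.2f} IPC")  # -> 0.91
print(f"20-stage pipe: {effective_ipc(20):.2f} IPC")  # -> 0.83
```

Doubling the depth lets you raise the clock, but every misprediction costs twice as much.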
This GHz hunting also leads to other problems, such as huge electromagnetic disturbances in the chip, in buses, etc. The solution is to add more wires and pull them in different directions to compensate, which only wastes more power and emits even more heat.
What I suggest, now that we have lots of transistors to play with, are asynchronous designs! Yes, they are harder to design and verify, but that is largely because of the lack of supporting tools.
This would reduce power consumption, let designers use longer critical paths in their designs (just clock that part slower), and reduce the need for the registers used to balance pipelines etc.
Another move could be to introduce simpler but parallel CPUs, perhaps on the same piece of silicon. The software systems of today are multi-threaded already, so why not make the hardware capable of _true_ multitasking...
Re:GHz Hunting (Score:5, Informative)
But what about the P4's Hyper Pipeline tech that allows it to do 3 pipeline stages per clock cycle? The P4's Branch Prediction Unit (BPU) is also said to be improved by around 30% compared to the one found in the P3. Perhaps these improvements even things out a bit while still making it easy to achieve high clock speeds?
Re:GHz Hunting (Score:2)
As for Hyper Pipeline, it requires that the stage you intend to jump to is empty, i.e. a bubble in the pipeline, which probably makes its usefulness limited. Amdahl's law is a good thing to apply here (Intel seems to miss that sometimes). I would say that these kinds of small improvements simply increase the complexity of the design.
As for the P4's BPU, good or bad, it will still fail sometimes. It is not possible to predict all jumps properly, and with a large number of pipeline stages you take big penalties whenever it fails.
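For reference, Amdahl's law as invoked above; the 80% figure is just an example:

```python
def amdahl_speedup(fraction, factor):
    """Overall speedup when `fraction` of the work is sped up by `factor`."""
    return 1 / ((1 - fraction) + fraction / factor)

# Even a near-infinite speedup of 80% of the work caps out at 5x overall:
print(f"{amdahl_speedup(0.8, 1e9):.2f}x")  # -> 5.00x
```

The untouched 20% dominates no matter how fast the improved part gets, which is the point being made about small localized tricks.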
Re:GHz Hunting (Score:2)
Re:GHz Hunting (Score:2)
Re:GHz Hunting (Score:2)
Actually, wasn't the 486 DX4 the designation Intel used for a 486 that ran at 3x 33 MHz? DX3 would have been a less deceptive appellation.
Re:GHz Hunting (Score:2)
As for the Pentium 4, I remember reading an interview with an Intel engineer who said the P4 architecture is able to run at up to around 6 GHz, and that they could announce a 6 GHz processor anytime, but it would be an economic disaster for Intel. People will buy newer, faster processors anyway, so why jump from 1.7 GHz to 6 GHz when you can milk 'em with 2.4 GHz, 2.8 GHz, 3.0 GHz...
Re:GHz Hunting (Score:2)
Re:GHz Hunting (Score:2)
Athlon uses an emulator (Score:2)
so far every platform that has tried to emulate x86 processors in software has dismally failed to make inroads into the PC market
What about Athlon processors and late-model Pentium processors? They devote half their silicon to what amounts to an emulator that translates x86 bytecode into instructions for a RISC backend.
Why keep x86 bytecode? Two words: Code density.
Re:GHz Hunting (Score:2)
But what if designing these complex asynchronous systems efficiently requires 5 GHz processors?
-me
Re:GHz Hunting (Score:2)
Interestingly, Transmeta Crusoe processors are being used to build clusters. They give the most bang per watt, as far as I understand. Since cooling systems in clusters cost (serious) money, the reduced heat signature of the Crusoes pays off.
Re:GHz Hunting (Score:2)
Re:GHz Hunting (Score:2, Insightful)
Why the hell do you care about power consumption? (Score:2)
- A.P.
Re:Why the hell do you care about power consumptio (Score:2)
Re:GHz Hunting (Score:2)
Sun Microsystems is already planning this for their UltraSPARC IIIi CPU [sun.com].
One theory I have is that Sun recognizes that super-high frequencies result in less reliability than Sun will tolerate, driving them to new CPU architectures. Remember, Intel cares more about marketing and big business than they do about truly high-availability, zero-error CPUs, which leads to their high-frequency yet terribly inefficient Pentium 4. Sun's chip designers are just as talented as Intel's, and if Sun wanted to release a 5 GHz CPU they would. It's interesting that Sun chose the asynchronous architecture instead of taking Intel's route of over-the-horizon pipelines and other tricks.
Re:GHz Hunting (Score:2, Interesting)
And that would be why a Pentium IV 2.8 GHz is the fastest processor tested on SpecInt [spec.org]? (Faster than any other processor in the world.) That would also be why SpecFP [spec.org] is dominated by the Intel Itanium 2 (with, notably, the P4 not too far behind; the fact that the Itanium runs at 1 GHz versus the P4's 2.8 GHz is irrelevant, as both speeds are the fruits of their respective designs)?
Note that I'm not an Intel "fanboy": I have an Athlon in my machine, and if I bought a machine today it'd have an Athlon in it. However, the strategy of Intel for their P4 is just a different variation on the pursuit of speed, and obviously it works because it's the fastest processor in the world at SpecInt. Saying that it's just marketing is clearly not true when seeing the results of their efforts.
It's interesting that Sun chose the asynchronous architecture instead of taking Intel's route of over-the-horizon pipelines and other tricks.
Let the results do the talking. As it is, clearly Intel is winning the processor war.
Re:GHz Hunting (Score:2)
Re:GHz Hunting (Score:2)
That's only true if you need to drive a signal clear across the die in one clock. You could build pipelined architectures that keep each signal in a tiny area for any given clock interval.
The speed of light limits latency, but it doesn't necessarily limit throughput or clock speed.
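A quick sketch of that latency/throughput distinction; the 20-stage depth and 4.7 GHz clock are illustrative assumptions:

```python
stages = 20                            # assumed pipeline depth
clock_hz = 4.7e9
latency_ns = stages / clock_hz * 1e9   # time for one result to cross the pipe
throughput_per_s = clock_hz            # one result retires per cycle, pipe full
print(f"latency {latency_ns:.2f} ns, throughput {throughput_per_s:.1e}/s")
# -> latency 4.26 ns, throughput 4.7e+09/s
```

Each individual result takes many cycles to emerge, yet a new one finishes every cycle once the pipeline is full.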
Branch misprediction will kill you (Score:2)
The speed of light limits latency, but it doesn't necessarily limit throughput or clock speed.
And latency combined with branch misprediction will kill performance.
Once we begin to approach the light speed limit, the best way to achieve more performance on a chip will probably be chip multiprocessing (compare IBM's Power4) rather than cranking up the clock frequency.
It all makes sense now! (Score:5, Funny)
It's been those punks at Intel with this chip all along!!
So we're 1000 times faster now (Score:5, Insightful)
Re:So we're 1000 times faster now (Score:2)
Re:So we're 1000 times faster now (Score:2)
Which brings me to a topic I would like to see discussed or even polled -- what is the real percentage of technical types whose work (developing, integrating or maintaining) is 100% Internet-related? (Me, for one.) I am betting, even during the heyday, that it is/was a lot lower than most people think.
"The idea of Heaven and Hell was the first big power scam. If they can get you to believe that, they know they can get you to believe anything."
Re:So we're 1000 times faster now (Score:5, Funny)
Way back when, I would have believed that, since I knew Moore's law.
I would have burst out in laughter if you told me it would still take 10 minutes to boot my PC.
Far more than that (Score:2)
Dude, it's WAY more than that. I would venture to say it's more like 100,000x. Take into account cache size and speed (did the 8088 even HAVE SRAM? If it did, it was on the motherboard), memory speed (5ns vs. 70ns), and in general the overall efficiency of the CPU (superscalar, speculative execution, etc.).
I would post the link to CPUScoreCard.com comparing the 8088 and the P4 2.6 GHz, but they went pay-for-access for older benchmarks.
Jeezus, I just realized my CPU ranking is considered "historical". Damnit. What, 800 MHz isn't good enough anymore? Pfft!
Re:No...it's much faster than 1000x as fast (Score:5, Insightful)
For the most part, for most apps, SIMD is irrelevant. Yeah, maybe you can use it for data copying or a few other general things, but until SIMD is further developed and SIMD-savvy compilers are common, it only helps with specific types of data processing.
I do think MIPS can be compared due to the similarity in instruction sets.
The 8088 ran at about 0.3 MIPS (howstuffworks.com) and Sandra benchmarks a P4 1.6 at 3004 MIPS (theregister.com), so estimate ~8700 MIPS for a 4.7 GHz P4. That's a little crude, obviously.
=> 8700/.3 = 29000 times more MIPS, which is only 1 order of magnitude higher than the straight MHz difference. If SIMD had an order of magnitude effect (which it doesn't), that would be 2 orders of magnitude difference.
-Kevin
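Reproducing the arithmetic above (the MIPS figures are the ones quoted there, not independent measurements):

```python
mips_8088 = 0.3                  # 8088, per howstuffworks.com
mips_p4_1600 = 3004              # P4 1.6 GHz, Sandra via theregister.com
mips_p4_4700 = mips_p4_1600 * 4.7 / 1.6   # crude linear scaling
ratio = mips_p4_4700 / mips_8088
print(f"~{mips_p4_4700:.0f} MIPS, ~{ratio:.0f}x the 8088")  # ~8824, ~29414x
```

About 29,000x, matching the parent's figure: roughly one extra order of magnitude over the raw clock ratio.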
but my 700 Mhz Apple Mac is faster (Score:2, Funny)
I want to see 4.77 (Score:5, Funny)
I'd hit it.
Re:I want to see 4.77 (Score:2)
For graphics, the original PC had 80x25 text with a 256-glyph character set. That's actually just 2KB of graphics RAM, so a 2MB card seems a bit feeble. Maybe something capable of displaying recognisable text at 800x250?
Re:I want to see 4.77 (Score:5, Informative)
So, when I said two out of three ain't bad, I meant there is no way in hell an anniversary PC would give you a choice of OSes. Microsoft just wouldn't permit it.
P.S. No, it's not that funny. I have no idea why it's easier to get slightly humorous posts modded up to 5 while posts with serious thought and hopeful insight never get modded up, or often get modded down by someone who just doesn't agree with you.
Whatever, not like it all matters anyway...
Getting around Microsoft OEM contracts (Score:2)
So, when I said two out of three ain't bad, I meant there is no way in hell an anniversary PC would give you a choice of OSes. Microsoft just wouldn't permit it.
Even if the top-secret OEM contract with Microsoft rules out selling PCs without an operating system or with anything other than Windows pre-installed, what stops a PC vendor from including FreeDOS with the machine [slashdot.org], along with a voucher for a CD of FreeBSD or Red Hat Linux?
Re:Getting around Microsoft OEM contracts (Score:2)
At least that was the case, maybe not now, now that the Justice Department had their noses up Microsoft's bum there for a while...
Getting around the dual-boot ban (Score:2)
The OEM contract prohibits dual boot arrangements too.
The OEM contract prohibits dual-boot systems from being pre-installed. I don't think even Microsoft could prohibit OEMs from including a FreeBSD CD with every computer.
Free sig: "Anti competition's gone too far, here's your Antitrust Superstar."
Turbo button? (Score:2)
If so, what would it clock the PC down to when deselected?
Re:Turbo button? (Score:2)
In at least the Dell PowerEdge servers, there is a BIOS setting called something like "x86 compatibility" or somesuch which takes a nice dual Pentium II (that's what we had in these a few years ago) and makes it run at a nice slow 60 MHz or so... We had a bitch of a time remotely trying to figure out why simple tasks were pegging the machine. Luckily we had a competent service tech who got the call, and he was able to walk through the BIOS settings with us over the phone.
Re:I want to see 4.77 (Score:2)
Also, as someone else pointed out below, the CGA used 2KB of memory for video, and 2MB video RAM is pretty small by today's standards.
This is turning into a summary of the other posts. Why not?
No hard disk.
One or two 180MB floppies.
2.38Gbps to the expansion cards.
Supports up to 640MB of RAM, but only comes with 64MB as standard (or 16MB if you get one of the first ones).
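The list above is roughly the original IBM PC's spec sheet scaled by the clock ratio. A quick check (the original figures are from memory, so treat them as assumptions):

```python
scale = 4.7e9 / 4.77e6          # new clock over the 8088's 4.77 MHz
floppy_mb = 180 * scale / 1024  # one 180 KB floppy, scaled, in MB
ram_mb = 640 * scale / 1024     # the 640 KB ceiling, scaled, in MB
print(f"scale {scale:.0f}x, floppy ~{floppy_mb:.0f} MB, RAM ~{ram_mb:.0f} MB")
```

That gives roughly a 985x scale factor, close to the 180 MB floppies and 640 MB RAM ceiling quoted above (which round the factor to 1000).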
Vaporware (Score:2, Interesting)
For any motherboard that still uses conventional RAM flushing, the CPU will top out at ~3 GHz and stay there, I don't care what kind of data bus you're using.
Mark my words, AMD's next generation of motherboards (now documented to support async RAM flushing) will blow Intel out of the water. Hold on to your asses, ass-holders.
And while everybody is spending money (Score:5, Insightful)
C'mon, people... I'm not saying nobody needs this (it does say high-end), or that 166 MHz is enough for everybody (it certainly isn't for a desktop), but why do people still not smarten up? Why do they keep buying a completely new PC every 2 years when they don't need it to write their Word document? (And I'm not even asking why they buy such crap that a PC with only half the specifications could perform equally well.)
Please. (Score:3, Interesting)
The answer is simple: people perceive it as being of some VALUE. People buy new PCs because they look better, or because Internet Explorer will take less time to load, or because right now it's just taking too damn long to print out that document, or the Internet is too slow. Yes, some of these reasons are misguided, and it's our job as those "in the know" to tell people when they have a misguided assumption ("A Pentium 4 will make my Internet connection faster..."). It's also our job to explain to them how best to spend their money if they ask us for advice -- perhaps their money would be better spent on a broadband connection or a memory upgrade or a better video card. Maybe they don't need a new computer.
Whining about why people buy new computers is futile. People buy new things constantly. Don't forget that people buying and upgrading new computers is what keeps our industry afloat, as well. Not only does it make hardware prices go down, thus benefiting more of us, but we get the added benefit of easier tech support (for the most part, computers have dramatically improved in this area since Windows 95 first hit the shelves) and better software. (My personal favorite is finally dragging those last few holdouts off of Netscape 4.7 so I can make great-looking dynamic websites that actually work with their browser.)
Next time, instead of wringing your hands and saying "Why?!", encourage those who are upgrading to spend their money in the wisest way possible. The more people who enjoy using their computers, the more successful the industry will be as a whole, and the more jobs we will all have as a result.
Re:Please. (Score:2)
People do need to be educated about these things, because a LOT of people don't think of a new computer as being of value, but as a necessity, which it often isn't.
And although it happens more often than not, having people constantly buy things they don't need (yet) just to keep the economy afloat is one of the worst reasons I've heard to support capitalism. If that's what keeps us going, then something is terribly wrong with the western system. And yes, I knew that already...
And the argument of better tech support because of the quantity of new PCs is a little shaky as well... If so many people buy new computers, the vendors should have funds enough to make quality products so we don't need that tech support. And isn't tech support better when they have products to support that don't change every 6 months?
Sure, it might work for your server... (Score:2)
I guess without any content, it doesn't take much to run your server
Re:Why we need faster CPUs (Score:2)
Re:Why we need faster CPUs (Score:2)
But memory management is a good thing. It means fewer processes hanging around doing nothing or just eating resources. If they keep hanging around, you will run out of resources even with computers 100x better than current ones. Not to mention that more memory would be more helpful than a faster CPU.
Good memory management IS part of the quality of software.
And I'm not even talking about CPU-hungry apps that could do with half the power if they were better written.
Re:And while everybody is spending money (Score:2)
Actually... that is a very good reason to DO buy one. As you've read I mentioned feeding my family. By buying a new CPU... you feed your sim-family......
Question. (Score:5, Insightful)
Background:
Remember the BBC Micro and the ZX Spectrum? When they first came out, games were slow and blocky. But then several years went by without any significant improvement in processor performance.
Therefore, in order to produce better software and better games, developers had to learn how to write better code on their favourite platforms. They developed techniques and tricks to make every Hz count.
Today, you can do impressive stuff with crap code, simply by virtue of the raw grunt of the processor.
Hence the question. Do they cancel out? If Intel had not brought out a new processor in the last 5 years, where would software be in relation? Better, worse, or same?
Re:Question. (Score:2, Informative)
Does rapid improvement in processor technology cancel out the need for developers to learn how to write better code on a particular platform in order to achieve the maximum possible benefit from Information Technology?
No, that's
It's no longer practical to hand-code assembler for speed: chances are your C compiler will do it much better than you can, and in a fraction of the time, too. Nowadays, if you get the basic algorithms right, your compiler should do all the rest. (And if it doesn't, go contribute to gcc [gnu.org] until it does.)
Re:Question. (Score:2, Insightful)
You can spend weeks optimizing a hand-rolled assembly loop, to no avail if a better design allows for a faster approach to the problem to be solved.
Thanks to compiler technology, programmers can now spend more time on design. But do they? Optimization is still an issue, because solutions today seem to only get more and more bloated.
Re:Question. (Score:2)
But bloat doesn't necessarily mean that something is not well optimized. Some kinds of optimization (like unrolling loops) can improve performance at the cost of increased binary size. That's not always a win, since bloating the binary beyond a certain point has diminishing returns as it prevents the whole thing from residing in the fastest cache, but it is an important example of how big doesn't necessarily mean bad.
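A minimal illustration of that trade-off, sketched in Python for brevity (real compilers unroll at the machine-code level): the unrolled version is bulkier source but executes a quarter as many loop iterations.

```python
def sum_rolled(xs):
    total = 0
    for x in xs:
        total += x
    return total

def sum_unrolled4(xs):
    """Same result, four additions per iteration; assumes len(xs) % 4 == 0."""
    total = 0
    for i in range(0, len(xs), 4):
        total += xs[i] + xs[i + 1] + xs[i + 2] + xs[i + 3]
    return total

data = list(range(1000))
assert sum_rolled(data) == sum_unrolled4(data) == 499500
```

Same answer, less loop overhead, more code: exactly the size-for-speed trade described above.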
Honestly, is bloat really that big of a problem for a typical computer, anyway? RAM and disk memory seem to be growing even faster than processor speed, so that bloat really shouldn't be a serious issue. When was the last time you had to clear out your hard drive because it was getting too full? And when you did, was it because the binaries were too big, or because there were too many data files taking up space?
Bloat is still a factor (Score:2)
Honestly, is bloat really that big of a problem for a typical computer, anyway?
A "typical computer" is not a PC. A typical computer is an embedded system in a microwave oven with a 0.5 MHz processor, 1 KB of ROM, and 256 bytes of RAM, if that.
Next step up from an embedded system is a handheld device such as the Palm or the Game Boy Advance. You get a processor in double-digit MHz, only about 384 KB of work RAM, and storage measured in single or double digit MB.
Then you have the typical six-year-old Pentium computers in public schools: 100 MHz, 24 MB of RAM, unaccelerated video, an 800 MB hard drive, and a 4x CD-ROM (if that).
Then you get to DVD-based game consoles, which have 32 to 64 MB of RAM. Bloat begins to disappear, but the less bloat you have, the more triangles you can push, and the faster your game will load. That was one of Mr. Shigeru Miyamoto's biggest complaints about the Sega CD and the old Nintendo Playstation project[1], that disc technology wasn't fast enough to provide a seamless experience. Only recently have engineers developed the hardware to load data faster and the software tricks to cover up loading time.
Only after all those do you get to a relatively modern PC.
[1] The Nintendo Playstation was originally a project between Sony and Nintendo to develop a 32-bit CD-ROM system that connected to the Super NES. When Nintendo dropped the project in favor of the Nintendo 64 console, Sony finished it up and released it as a stand-alone game console.
Re:Question. (Score:2, Informative)
Nowadays it would be pure madness to even attempt to optimize a program the same way as "back then". Programs have simply become too large (in size and features) and too complex to optimize in the same manner.
Not to mention the fact that the average system in use today is simply overkill for 99% of all applications.
Sure, it would be possible, but would it be worth it? It would cost lots of money, take more time of larger development teams, driving up the costs of software.
Optimization is a good thing, but only up to a certain point, beyond which it just doesn't make any sense.
Re:Question. (Score:2)
The speed gain was extreme, and we fixed it by programming slightly differently. I think most apps can benefit if some care is taken in how things are done.
I don't think they cancel each other out, and crappy code will always be crappy code. Slap a pile of code together over a weekend and you have a stinking pile of shit that works much worse than it could.
My view is that this is like building bridges or houses. Cheat on the planning and you've got a bridge/house that's worthless and dangerous. Time teaches us that there aren't any shortcuts to doing advanced stuff. The abstraction strides being made in some unnamed programming languages give us crappy programs. Look at some unnamed applications from a certain company that makes much of its software in an unnamed programming environment that seems to squirt out much worse code than other environments.
Re:Question. (Score:2)
That may be true, but programmers were likelier to jump ship when the next latest-and-greatest computer came out: the VIC-20 killed development on the PET, and the C-64 killed the VIC-20 (and the Atari 400/800).
There would have been a lot more tangible incentives to jump in those days too; the leaps were phenomenal: the C-64 had a real synthesizer and 20X more RAM than the VIC, and the Amiga whomped the C-64 with 20,000X more colours and stereo sound. Each advance was just as tantalizing to the developer as it was to the consumer.
Nowadays, we're so completely divorced from the actual computer, what with APIs and hardware abstraction layers, why would you bother trying to squeeze every MHz out of a machine when how well the machine operates is really up to the OS maker? (*cough* Microsoft *cough* *cough* planned obsolescence *cough* conspiracy with chipmakers *cough*)
Re:Question. (Score:2)
Right now code is out of control -- just look at how easy it is to simply throw lines of source at a problem until it is solved. It is the easiest method because CPUs continue getting faster -- the burden is on the CPU designers.
However, once gains in CPU power stall, there will be no choice but for developers to take stock of their bloaty code and make changes. Having recently attended a panel with the original implementors from Atari, it is very clear that necessity drives invention and creativity.
Right now there is simply no need for creative coding with respect to efficiency. There may not be a need for a long time.
Dr Strabismus of Utrecht... (Score:5, Funny)
(With apologies to the irreplaceable J.B. Morton, who for many years wrote the bellowing-out-loud-it's-so-funny "Beachcomber" column in the London Daily Express. Wish I had the (longer) original of this list to hand to post it.)
This is like a dragster race (Score:5, Insightful)
Re:This is like a dragster race (Score:2)
What do you mean? It runs at 4.7 GHz.
a) That's not even close to super-high.
b) How is it a super-short duration?
If you've been paying attention for longer than 2 years, you will clearly notice that EVERY processor Intel demonstrates becomes mainstream within a few quarters. Period. They've never failed.
You know you're a nerd when... (Score:5, Funny)
If they don't make it by thanksgiving, don't worry! Just use your Athlon.
Joy, yet another CPU I can't afford. (Score:2, Insightful)
When you think about it, the average user (AKA Joe and Jane Sixpack) does three basic things with a computer: Internet (including e-mail, browsing and the occasional multimedia site), music, and games. That's it. They're not ubergeeks like most of us.
They'll get all wide-eyed and tickled pink at the thought of that kind of power, but all they'll really notice is windows opening faster. It's a huge waste of money, and they'd be too blinded by the thought of "this will make everything so much better" to notice.
It won't make MP3s play any clearer, it won't filter out the spam that clogs 90% of their inbox, and it sure won't make "HotChicksPorn.com" load any faster. Unless the Sixpacks are running SETI@Home [berkeley.edu], they wouldn't notice much of a difference and would feel ripped off. Those FFTs would crunch rather quickly on a 4.7 GHz machine, though, which I wouldn't mind.
Production people like me would kill for a machine that fast. I do a lot of digital video and audio work, and that kind of processing power would be most welcome. But people like me (and you, the ubergeeks of the world) are a relatively rare breed. Maybe it's time for Intel and friends (or is it enemies?) to start splitting demographics a little better and targeting specific types of "Joe and Jane Sixpacks" with different processors, instead of just offering up the same two processors (Pentium and Celeron) to everyone as if we're all the same. The need to upgrade constantly isn't that big a deal, or at least it shouldn't be treated as such...
Re:Joy, yet another CPU I can't afford. (Score:2)
You forget, games are one of the power eaters, especially now that games are starting to get moderately decent physics [mathengine.com]
Comments on this story : Digest Edition (Score:5, Funny)
New Thread
- Someone complains that they should be changing the architecture, not the speed.
- Reply about how he just described the G4
- Further reply that G4 is now behind
- Sulky Apple - Intel speculation
New Thread
- AMD Roolz
- Intel Roolz
- Motorola Roolz
- Crusoe Roolz
- ARM roolz
- No AMD roolz (repeat to fade)
New Thread
- Complaint that no-one needs that power
- You said that last time and we did
- I don't, I like my 486
- Ever Rendered, played a game, video edited
- Reasons for needing that much power
- Offtopic bitch about CmdrTaco and reference to 640k being enough for everyone
New Thread
- Comment digest
- complaints about comment digest
missed one thread (Score:5, Funny)
- U can make coffee with new proc
- I can bake a Turkey with it
- No, I can spit-cook a yak with it
- offtopic rant about u damned meat eaters.
Redundant? (Score:2, Funny)
As opposed to their 4.7-GHz chip for low-end desktop PCs?
coffee? (Score:5, Funny)
Comment removed (Score:4, Funny)
Here is an application that could make use of it! (Score:2)
Re:4.7 is 1337 d00dz (Score:2, Funny)
Re: 4.7 is 1337 d00dz (Score:5, Funny)
> but what type of application requires that much horse power?
Locomotives. You use the heat to drive the steam engine.
Re:4.7 is 1337 d00dz (Score:2)
How about a more serious reply?
Umm... faster pr0n movie encoding?
If that isn't a serious advantage to a nerd, I don't know what is...
Re:4.7 is 1337 d00dz (Score:2)
In this business [computermusic.co.uk], you can't get enough GHz!
Re:4.7 is 1337 d00dz (Score:2)
Re:4.7 is 1337 d00dz (Score:2)
Re:Okay, I want some more data here. (Score:2)
4.7 doesn't seem unrealistic, and with Intel's recent moves on cutting prices and introducing chips faster, I wouldn't be surprised if 4.7 GHz PCs were available for Christmas shoppers.
Re:Processors get faster fast! But how about hd:s? (Score:2)
And considering the power requirements - that costs!
a grrl & her (26 watt) server [danamania.com]
Re:Liers liers pants on fire. (Score:2)
This of course would have nothing to do with the evils of Palladium [slashdot.org], would it?
Yes of course it's some big Intel conspiracy to make you want their newfangled DRM processors, it's certainly not like AMD is going to be doing the exact same thing [slashdot.org]
This is about processor advances, not processor crippling, which both companies will be a party to, and which may very well scare off many geeks from said advances, though it's fairly certain that mainstream users will care less.
Re:Liers liers pants on fire. (Score:2)
Better whip out that voodoo doll..
Re:Nice try intel. (Score:2)
Intel went optical with the P4? tricky devils...
a grrl & her server [danamania.com]
Re:Moor's law (Score:2)
Re:Moor's law (Score:2)
(The ALPHA is probably the best example of a purebred RISC chip IMO)