Comment Re:The important question is... (Score 3, Informative) 163

Yup, if you shelled out for a Socket 1366 board (high-end i7), you're going to be sticking with Nehalem until Socket R comes out down the line.
If you went with 1156, which I did (P55 Classified + i7 860 @ 4.0 GHz), then you're screwed too, just sooner, since the replacement is Socket 1155, which isn't compatible even though it's only a one-pin difference.
I wasn't very happy with Intel when I found this out, since they've been churning through sockets lately after holding on to 775 for so long. From my understanding, AMD has also done something similar with the AM2/AM3 sockets, where some motherboards are backward/forward compatible but others aren't. I think there is a derivative socket, AM2+/AM3+, that is backward compatible, but the standard AM2/AM3 version isn't forward compatible. Don't take my word on it though; my builds have been Intel since the Q6600 came out. AMD has done a better job of backward compatibility, but the sweet spot for price/performance plus overclocking has been Intel chips whenever I've done my last few builds, and I only build every few years, usually after new architectures are released, so my motherboards usually get replaced as well.
AnandTech covered the upcoming socket changes in more detail in their writeup.

Comment Re:Zuse? (Score 2, Interesting) 737

OK -- there were two reports. Eckert & Mauchly put out a memo describing EDVAC, which Goldstine (who coauthored the other report with von Neumann) classified. He then deemed the other report unclassified and began distributing it to academics around the globe, which popularized the concept. EDSAC, led by Maurice Wilkes, was the first completed device, because the friction between Eckert/Mauchly and Goldstine/von Neumann delayed EDVAC.
Eckert had already designed the memory system and implemented it in his previous work before von Neumann arrived at UPenn. Von Neumann definitely deserves some credit for writing up and distributing the paper, but if you read the First Draft he gives no indication that anyone else designed any portion of the machine, and he cites Mauchly for only one idea. Von Neumann's contributions to EDVAC were primarily in the logic areas, especially the design of the instruction set, but at best the First Draft glaringly omitted the work of Eckert, Mauchly, and the many other engineers at the Moore School who built ENIAC and EDVAC.
Source: ENIAC -- The Triumphs and Tragedies of the World's First Computer (a great book on the early years of electronic computers -- it basically convinced me to major in ECE)

Comment Re:Zuse? (Score 3, Informative) 737

Considering von Neumann basically stole the idea from Eckert & Mauchly, who had already developed the stored-program architecture and implemented it with mercury delay lines for EDVAC, I would prefer not to give him credit for this and instead focus on his numerous other achievements.
Von Neumann's self-appointment as head of the EDVAC project did more to prevent EDVAC from being the first stored-program computer (the British completed EDSAC first instead) than he ever contributed to the design of any of the early electronic general-purpose computers, although his First Draft did disseminate the theory behind these early machines.
If anyone should be on that list (along with Zuse), it's Eckert & Mauchly. In addition to building the first Turing-complete electronic computer, they went on to found the first computer company and gave other pioneers, including a number of women such as Grace Hopper, the opportunity to work in the industry.
Earth

Submission + - Himalayan Glacier Disaster Claims Melt Away 1

Hugh Pickens writes: "VOA News reports that leaders of the United Nations' Intergovernmental Panel on Climate Change have apologized for making a "poorly substantiated" claim that Himalayan glaciers could disappear by 2035, as scientists who identified the mistake say the IPCC report relied on news accounts that appear to misquote a scientific paper that estimated the glaciers could disappear by 2350, not 2035. Jeffrey Kargel, an adjunct professor at the University of Arizona who helped expose the IPCC's errors, said the botched projections were extremely embarrassing and damaging. "The damage was that IPCC had, or I think still has, such a stellar reputation that people view it as an authority -- as indeed they should -- and so they see a bullet that says Himalayan glaciers will disappear by 2035 and they take that as a fact," says Kargel, one of four scientists who addressed the issue in a letter that will be published in the Jan. 29 issue of the journal Science. Experts who follow climate science and policy say they believe the IPCC should re-examine how it vets information when compiling its reports. "These errors could have been avoided had the norms of scientific publication, including peer review and concentration upon peer-reviewed work, been respected," write the researchers."

Submission + - Crazy Firewall Log Activity by Country and Hour (youtube.com) 1

arkowitz writes: I happened to have access to five days' worth of firewall logs from a US state government agency. I wrote a parser to grab the unique IPs out and sent several million of them to a company called Quova, who gave me back full location info on every 40th one. I then used Green Phosphor's Glasshouse visualization tool to have a look at the count of inbound packets, grouped by country of origin and hour. And it's freaking crazy looking. So I made this video of it and I'm asking the Slashdot community: What the frak is going on?
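
For anyone who wants to poke at the same question, here is a rough sketch of that pipeline in Python. The log layout, the SRC= field regex, and the dictionary-based geolocation lookup are all assumptions for illustration, not the poster's actual setup (the original used Quova for geolocation and Green Phosphor's Glasshouse for the visualization).

#!/usr/bin/env python3
# Sketch only: parse firewall log lines, geolocate the source IPs, and count
# inbound packets per (country, hour). The iptables-style timestamp/SRC=
# layout assumed here is a placeholder, not the agency's actual log format.

import re
from collections import Counter
from datetime import datetime

LINE_RE = re.compile(
    r"^(?P<ts>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}).*?SRC=(?P<src>\d+\.\d+\.\d+\.\d+)"
)

def lookup_country(ip, geo_db):
    # Placeholder geolocation: map an IP to a country code using whatever
    # database or service you have access to.
    return geo_db.get(ip, "??")

def count_by_country_hour(log_path, geo_db):
    counts = Counter()
    with open(log_path) as fh:
        for line in fh:
            m = LINE_RE.search(line)
            if not m:
                continue
            ts = datetime.strptime(m.group("ts"), "%Y-%m-%d %H:%M:%S")
            hour = ts.strftime("%Y-%m-%d %H:00")
            counts[(lookup_country(m.group("src"), geo_db), hour)] += 1
    return counts

if __name__ == "__main__":
    geo = {}  # e.g. {"203.0.113.7": "US", ...} from your geolocation provider
    tallies = count_by_country_hour("firewall.log", geo)
    for (country, hour), n in sorted(tallies.items()):
        print(country, hour, n, sep="\t")
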
PlayStation (Games)

Submission + - PS3 Hacked (blogspot.com)

An anonymous reader writes: Hello hypervisor, I'm geohot

I have read/write access to the entire system memory, and HV level access to the processor. In other words, I have hacked the PS3. The rest is just software. And reversing. I have a lot of reversing ahead of me, as I now have dumps of LV0 and LV1. I've also dumped the NAND without removing it or a modchip.

3 years, 2 months, 11 days...that's a pretty secure system

Comment Re:Not a fan of (P/NG/LT/Berkeley)SPICE (Score 1) 211

I agree on the myth of HSPICE vs. Spectre -- my models were SPICE-syntax (22nm PTM BSIM4 models), so it's just a couple of syntax changes to get them into Spectre.
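
For what it's worth, here's a minimal sketch of how one might automate that, assuming the models really are plain SPICE syntax: Spectre will read SPICE-mode sections bracketed by "simulator lang=spice" / "simulator lang=spectre" directives, so a tiny script can emit an includable wrapper. The filenames are placeholders, and corner sections or parameter-name quirks in a real PDK would still need checking by hand.

#!/usr/bin/env python3
# Sketch: wrap a SPICE-syntax model file (e.g. 22nm PTM BSIM4 cards) so it can
# be included from a Spectre netlist via language-switch directives. The
# filenames below are placeholders.

from pathlib import Path

def wrap_spice_models(spice_model_file, out_file):
    body = Path(spice_model_file).read_text().rstrip()
    wrapper = "\n".join([
        "simulator lang=spice",
        "* auto-generated wrapper around SPICE-syntax device models",
        body,
        "simulator lang=spectre",
        "",
    ])
    Path(out_file).write_text(wrapper)

if __name__ == "__main__":
    wrap_spice_models("ptm_22nm_bsim4.sp", "ptm_22nm_for_spectre.scs")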

In my results, Spectre fit much closer to the predictions, while HSPICE's numbers were a bit farther off on simple process characterization tests as well as on simple designs (FO4 inverters, etc.). Since Spectre matched the trends more closely, I'd definitely have to give it the nod.

The increasing foundry support is definitely a major plus, though all the classes here are taught on Cadence's gpdk090, since it provides full models, does a lot of the calculations automatically (so nobody is forced to learn SKILL on their own just to speed things up), and provides transistor layouts plus plenty of extras and integration with Assura, etc.

I didn't mean to come off as against Spectre -- my experiences with it have been great -- most of the negatives I heard came from some engineers on DeepChip. Personally, I was most impressed with UltraSim -- it maintains about 98% accuracy on both power consumption and delay of full chips versus Spectre, with about a 10x speedup on small (~15k transistor) designs and even more significant gains on larger ones. I've tested up to ~1 million transistors with ease on UltraSim, even with large amounts of mismatch across devices.

Comment Re:Not a fan of (P/NG/LT/Berkeley)SPICE (Score 2, Interesting) 211

I just spent the past summer doing research at the 22nm level (designing L1/L2 caches with DVFS and other low-power techniques) and I can't agree more on SPICE/HSPICE's inability to converge.

I shrank my designs down to the critical paths (~12k transistors), and even when provided with the proper nodesets/initial conditions, HSPICE either failed to converge or segfaulted quickly. Fortunately, my university has a deal with Cadence through their University Alliance program -- Spectre may not be quite as accurate as HSPICE for analog circuits, but both it and UltraSim (a FastSPICE simulator for large designs) can handle much larger digital designs without complaint.

To the original submitter: Is there a good reason behind the no-network-connection requirement? If the university has a proper setup, students should be fine either on or off campus. It may be worth checking whether your university has any deals with Cadence, Synopsys, or Magma -- their tools are primarily Unix-based (Solaris, AIX, and Linux are supported), so it's just a matter of having the students SSH in with X forwarding or connect over VNC. This even allows users with underpowered machines to simulate large designs quickly, since everything is done remotely. I primarily run Windows on my local box, but either a Linux VM or PuTTY with Xming works fine for all of these tools.
Television

Submission + - Why Can't I Buy a CableCARD-Ready Set-Top Box? (arstechnica.com) 1

Al E Usse writes: "Ars Technica does a write-up of the problems that haven't been solved by the July 1, 2007 ban on integrated security in cable boxes. Three months after the ban went into effect, digging up a third-party, CableCARD-ready set-top box can be an exercise in hair-pulling frustration. The companies that make the boxes don't seem interested in selling to consumers, cable companies still push their own branded devices, and Best Buy employees... well, the less said the better. We've heard the pain of our readers on this issue. One of them described his own epic (and fruitless) quest to secure such a device. His conclusion? "Although I should be able to buy a set-top box of my own, nobody will sell me one. I am standing on the doorstep, wad of cash in hand, yelling, 'Please take my money! I want to buy!' but am turned away.""
Microsoft

Submission + - Details of Microsoft's new analytics tool leaked (computerworld.com)

hhavensteincw writes: "Details of Microsoft's answer to Google Analytics have leaked online. Screenshots have been posted on the Net of the new "Gatineau" Web analytics tool that Microsoft now says (http://www.computerworld.com/action/article.do?command=viewArticleBasic&taxonomyId=9&articleId=9027638&intsrc=hm_topic) will be available in beta this summer. In a blog post, Microsoft's Ian Thomas also reveals that Microsoft will use Live ID (formerly Microsoft Passport) profiles to get its demographic data."
Sun Microsystems

Submission + - Sun releases ODF plugin for MS Office (heise.de)

extra88 writes: Heise online reported that Sun has released their OpenDocument Format (ODF) plug-in for Microsoft Office 2000, XP, and 2003. The plug-in allows Microsoft Office (for Windows) users to open ODF files and save their work in the ODF formats used by OpenOffice, StarOffice, and other programs. According to the ReadMe, the plug-in adds "ODF Text Document (*.odt)" as a format to Word's Open and Save dialogs, and adds Import and Export options to Excel and PowerPoint. Support for Excel and PowerPoint in the Microsoft-sponsored OpenXML/ODF Translator Add-in for Office is not yet completed, and that add-in supports only Office 2003 and 2007.
Businesses

Submission + - The First Thing IT Managers Do in the Morning?

An anonymous reader writes: When I was a wee little IT manager, I interviewed for an IT management position at an online CRM provider in San Francisco, a job I certainly was qualified for, at least on paper. One of the interviewer's questions was "What is the first thing you do when you get to work in the morning?" I thought saying "Read Slashdot" wouldn't be what he was looking for -- so I made up something, I'm sure, equally lame. Needless to say, I didn't get the job. But the question has stuck with me over the years. What do real IT and MIS managers do when they walk into the office in the morning? What web sites or tools do they look at or use first thing? Tell me. And remember, this is for posterity, so be honest.
IBM

Submission + - The Mainframe Lives! (networkworld.com)

coondoggie writes: "If you believe the stories, IBM's mainframe has in the last 10 years been knocked down and gotten back up more times than most of the characters in all the Terminator, Die Hard and Rocky movies combined. And while there are some out there who'd like to see its demise, a true threat to the Big Iron has never really amounted to much. Even today, the proponents of commodity boxes offering less expensive x86/x64 or RISC technologies say the mainframe is doomed. But the facts say otherwise. For example, IBM recently said the mainframe has achieved three consecutive quarters of growth, marked by new customers choosing the platform for the first time and existing customers adding new workloads, such as Linux and Java applications. http://www.networkworld.com/community/?q=node/17133"
