Comment Re:Not news (Score 3, Informative) 89

It's been "not news" now for at least 20 years. Tsividis did significant work on subthreshold FETs for neural networks back in the '80s and early '90s. Subthreshold design isn't common, but it's by no means a new field.

Subthreshold has its place, but it's not a pleasant place to work. Mismatch between transistors is about 4x higher, gate leakage is a huge factor below 130nm, the models you get from your foundry aren't reliable or accurate, etc.

I make subthreshold circuits all the time when I need bandwidth and have no headroom (hello 32nm, how unpleasant to meet you when doing analog design!). But I'm not doing low-power subthreshold design; rather, it's for very, very high-speed analog signal processing designs.

Comment Re:Sure (Score 1) 578

You guys are somewhat right. Bits ARE written to disk as 1's and 0's logically, but as +1 and -1 in a magnetic sense, since the direction switches as you go over each magnetic pole. Those pulses interfere with and destroy each other if they get too close, a phenomenon known in the business as high-frequency zeros: write a 101010 pattern to the disk, and if you pack those bits too close together you get 000000 back when you read it.

The UBD (user bit density) is the number of bits stored within the width of an isolated readback pulse measured at 50% of its peak amplitude (the PW50).
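
If you want to see the cancellation for yourself, here's a toy superposition model: treat each isolated readback pulse as a Lorentzian of width PW50, write a 1010... pattern as transitions of alternating polarity, and watch the peak readback amplitude collapse as the UBD goes up. This is only a sketch of the superposition effect, not real head/media physics; the pulse shape and numbers are illustrative.

    import numpy as np

    PW50 = 1.0                               # width of an isolated readback pulse at 50% amplitude
    lorentzian = lambda t: 1.0 / (1.0 + (2.0 * t / PW50) ** 2)

    def readback_peak(ubd, n_bits=40):
        T = PW50 / ubd                       # UBD = PW50 / bit period
        t = np.linspace(0, n_bits * T, 4000)
        r = np.zeros_like(t)
        for k in range(n_bits):              # 1010... pattern: a transition every bit, alternating polarity
            r += (-1) ** k * lorentzian(t - k * T)
        mid = (t > 10 * T) & (t < 30 * T)    # ignore edge effects at the ends of the burst
        return np.max(np.abs(r[mid]))

    for ubd in (0.5, 1.0, 2.0, 3.0):
        print(f"UBD {ubd}: peak readback = {readback_peak(ubd):.2f} of an isolated pulse")

At low UBD the peaks stay near full amplitude; by UBD 3 the alternating pulses have nearly cancelled, which is exactly the high-frequency-zeros problem.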

Old drives (pre-'96 or so) used peak detectors to find the patterns. With those, we used high-frequency "boost" (pulse slimming) to read the 1s and 0s and work around the high-frequency-zeros problem, but UBD was limited to less than 1.

Around '95 IBM introduced a much more complicated technology called PRML, for Partial Response, Maximum Likelihood detection. These detectors exploit intersymbol interference in a controlled way, using things like Viterbi detectors and even more complicated back ends, and they let drives push UBD above 3. It's not totally unlike QAM, but most of the details are quite different. Besides, when I was doing QAM systems the data rate (not the carrier frequency) was much, much lower than it is in a drive, so it was a heck of a lot easier to do.
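
For the curious, here's a bare-bones sketch of what the "maximum likelihood" half means for the classic PR4 (1 - D^2) target: a four-state Viterbi detector picking the +/-1 magnetization sequence that best explains the noisy samples. Real read channels use higher-order targets, noise-predictive metrics, and iterative back ends; the channel model, noise level, and the known two-symbol preamble here are just assumptions to keep the demo self-contained.

    import itertools
    import numpy as np

    def pr4_viterbi(r, start=(1, 1)):
        """Maximum-likelihood detection for a PR4 (1 - D^2) channel.
        State is (x[k-1], x[k-2]) with magnetization symbols x in {-1, +1}."""
        states = list(itertools.product((-1, 1), repeat=2))
        INF = float("inf")
        metric = {s: (0.0 if s == start else INF) for s in states}
        path = {s: [] for s in states}
        for rk in r:
            new_metric = {s: INF for s in states}
            new_path = {s: [] for s in states}
            for (x1, x2), m in metric.items():
                for xk in (-1, 1):
                    bm = (rk - (xk - x2)) ** 2      # Euclidean branch metric vs. expected x[k] - x[k-2]
                    ns = (xk, x1)
                    if m + bm < new_metric[ns]:
                        new_metric[ns] = m + bm
                        new_path[ns] = path[(x1, x2)] + [xk]
            metric, path = new_metric, new_path
        return path[min(metric, key=metric.get)]

    rng = np.random.default_rng(1)
    prev = np.array([1, 1])                          # assumed known channel memory before the data block
    x = rng.choice([-1, 1], size=200)
    full = np.concatenate((prev, x))
    y = full[2:] - full[:-2]                         # ideal PR4 samples, values in {-2, 0, +2}
    r = y + 0.4 * rng.standard_normal(y.size)        # additive noise
    xhat = np.array(pr4_viterbi(r, start=(1, 1)))
    print("symbol errors:", int(np.count_nonzero(xhat != x)))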

Oh, and writing exactly what you want to a drive is almost impossible anyway. Remember that PRML stuff? It requires coding, meaning that only certain patterns are allowed (this has to be a DC-neutral system). Further, in more modern systems parity bits are written to the disk, special randomizers are added to improve coding efficiency and spread the spectrum, and so on. You could no more write an arbitrary pattern to the disk than you could use a soldering iron to patch an i7's microcode.
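
As a flavor of why the platter never holds your raw bits, here's a toy additive scrambler: the data gets XORed with an LFSR keystream before it goes through the coding and write path, and the same keystream undoes it on readback. The 3-bit register and taps are made up for illustration; real channels use much longer polynomials, with RLL/parity coding on top.

    def lfsr_randomizer(bits, taps=(0, 2), state=0b101):
        """Additive (synchronous) scrambler: XOR the data with an LFSR stream.
        A toy 3-bit register; real read channels use much longer polynomials."""
        out = []
        for b in bits:
            fb = 0
            for t in taps:
                fb ^= (state >> t) & 1
            out.append(b ^ (state & 1))              # keystream depends only on the register, not the data
            state = (state >> 1) | (fb << 2)
        return out

    user = [1, 0, 1, 0, 1, 0, 1, 0]                  # the pattern you asked to write
    platter = lfsr_randomizer(user)                  # roughly what ends up heading to the media
    recovered = lfsr_randomizer(platter)             # XOR with the same stream recovers the data
    print(user, platter, recovered)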

Comment Re:Sure (Score 1) 578

Nobody in the drive industry has built a stand-alone controller in the last 8 years (longer in some cases). These days all the controllers are integrated into SoCs consisting of the controller, the read channel, SATA, and a memory controller. You can't chain controllers, nor can you easily overwrite their ROM code.

Further, the software required to control the read-channel part of the SoC is very difficult to write, so even if you could get at the controller, knowing what to control to get a proper write is very difficult without documentation.

Science

Antarctic's First Plane, Found In Ice 110

Arvisp writes "In 1912 Australian explorer Douglas Mawson planned to fly over the southern pole. His lost plane has now been found. The plane – the first off the Vickers production line in Britain – was built in 1911, only eight years after the Wright brothers executed the first powered flight. For the past three years, a team of Australian explorers has been engaged in a fruitless search for the aircraft, last seen in 1975. Then on Friday, a carpenter with the team, Mark Farrell, struck gold: wandering along the icy shore near the team's camp, he noticed large fragments of metal sitting among the rocks, just a few inches beneath the water."

Comment Re:Bad Karma (Score 1) 1231

Good Karmic to me. I had a BIOSTAR TA790GXB motherboard with built-in Radeon HD 3300 graphics that I was using as an HTPC. I never could get graphics to be smooth under Jaunty, and DVD playback was a nightmare.

I did a clean install of Karmic (not an upgrade); it went cleanly and the graphics worked flawlessly. I'm a strong believer in reinstalls rather than upgrades; having started back at Slackware 1.0, I've been trained never to believe a company when it claims an in-place upgrade will be trouble-free.

Data Storage

Build Your Own $2.8M Petabyte Disk Array For $117k 487

Chris Pirazzi writes "Online backup startup BackBlaze, disgusted with the outrageously overpriced offerings from EMC, NetApp and the like, has released an open-source hardware design showing you how to build a 4U, RAID-capable, rack-mounted, Linux-based server using commodity parts that contains 67 terabytes of storage at a material cost of $7,867. This works out to roughly $117,000 per petabyte, which would cost you around $2.8 million from Amazon or EMC. They have a full parts list and diagrams showing how they put everything together. Their blog states: 'Our hope is that by sharing, others can benefit and, ultimately, refine this concept and send improvements back to us.'"
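
Back-of-the-envelope check on the summary's numbers, assuming decimal terabytes and raw capacity with no RAID overhead or spares:

    pod_cost_usd = 7867          # parts cost for one 67 TB pod (from the summary)
    pod_capacity_tb = 67
    tb_per_petabyte = 1000       # decimal petabyte

    pods_needed = tb_per_petabyte / pod_capacity_tb          # about 15 pods
    cost_per_pb = pods_needed * pod_cost_usd
    print(f"{pods_needed:.1f} pods, about ${cost_per_pb:,.0f} per raw petabyte")

That lands right around the quoted $117,000 per petabyte figure.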

Comment Re:Who is running Nielsen anyway, Leslie? (Score 1) 248

Depends on the cable system. When I had Comcast I had internet only (the company was paying, and I used Dish for TV), so they put a trap on the line to block all the channels. Of course, when I had internet-related issues they pulled the trap off to fix them. After many calls, and before they finally realized they had to replace the cable coming into the house, the techs gave up and pulled the trap. Not that I cared...

Comment Re:Netbooks? (Score 1) 211

Your problem is your circuit. You're doing an analog circuit with PWM, which means you're driving the FETs into and out of saturation, which means you've just increased your simulation time 10x. Plot the actual solution points sometime and you'll see why full-swing circuits like PWM, ADCs, and most synthesizers are CPU hogs. Throw in the fact that you've likely got many different time constants and you're asking for something that takes a long time to simulate.

Depending on whether you're doing this for a class or for business there are workarounds for the time constant issues, but full swing analog is always going to be slow.
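
If you want to see the effect without a full SPICE deck, here's a crude illustration using a plain ODE solver on a single RC node: drive it with a full-swing 20 kHz square wave versus a smooth ramp and compare how many timesteps the adaptive stepper needs. The component values, frequencies, and tolerances are arbitrary, and a bare square-wave source is only a stand-in for a switched FET stage; the point is just that every full-swing edge forces the solver down to tiny steps.

    import numpy as np
    from scipy.integrate import solve_ivp

    R, C = 1e3, 1e-6                    # RC load, tau = 1 ms
    f_sw = 20e3                         # 20 kHz switching frequency
    t_end = 2e-3

    def rc_node(t, v, source):
        return [(source(t) - v[0]) / (R * C)]

    pwm  = lambda t: 5.0 * (np.floor(2.0 * f_sw * t) % 2)   # full-swing square wave
    ramp = lambda t: 5.0 * t / t_end                        # smooth drive over the same range

    for name, src in (("full-swing PWM", pwm), ("smooth ramp", ramp)):
        sol = solve_ivp(rc_node, (0.0, t_end), [0.0], args=(src,), rtol=1e-6, atol=1e-9)
        print(f"{name}: {sol.t.size} accepted steps, {sol.nfev} RHS evaluations")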

Personally, I remember when an 8-MHz machine was considered fast and 1 MB of memory was da bomb. But then again, in those days you cut your own rubylith :-) These days, near tapeout, I'll keep 4-20 Core i7 boxes busy for a couple of months. My bosses don't care too much, since hardware and software are cheaper than people and time to market.

Comment Re:For pro software, the OS is secondary (Score 1) 211

If the class teaches the design techniques and not the application, then maybe students can use whatever they want.

You've not taught classes, right? What happens when a student comes in and asks why he's got a problem with his circuit? If the teacher is familiar with the tool, he can generally figure out quickly whether it's the tool or the student causing the problem. But with 25 students coming in with 10 different simulators, each with its own quirks, you're way too overloaded.

Then there's the issue of checking the circuit when a student comes in with a "unique" solution that would never work and you've got to figure out what the heck they did and how much partial credit they might get. Tracking down the simulator to see if they got the right answer in that case is a big waste of time.

As to "buy the machine that runs the software" that's what Cadence used to do: you'd get the University license and they'd throw in the Sun workstations for "free." The software was pricey enough that throwing in the hardware was an afterthought and it removed many of the support issues. Now that Cadence runs under Red Hat, though, it's a different story. But it's a still royal pain trying to figure out exactly what rev and what patches you need for the exact version of Cadence you've got since Cadence is pretty touchy about RH versions.

Comment Re:Not a fan of (P/NG/LT/Berkeley)SPICE (Score 1) 211

SPICE isn't perfect, but it is perfectly workable. HSPICE is better at converging, and Spectre better yet. None of them is particularly "brittle." Brittleness usually comes because you've stripped too much reality out of your circuit (ideal elements, etc.), or because you have bad device models with badly behaved second derivatives.

As for the tendency to go hunting for whichever simulator converges best: what's that old saw about the poor workman and his tools? For circuits at the level this poster wants, pretty much any of the better-known simulators will work fine.
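
For anyone wondering what "converging" actually means here: at every timepoint the simulator runs a Newton-Raphson iteration on the nonlinear nodal equations, and the diode/MOS exponentials are what make it touchy. Below is a toy single-node sketch (one diode fed through a resistor); the 100 mV step clamp is only a crude stand-in for the junction-voltage limiting real simulators apply, and the part values are made up.

    import math

    IS, VT = 1e-14, 0.025      # diode saturation current and thermal voltage
    VS, R  = 5.0, 1e3          # 5 V source through 1 kOhm into the diode

    def f(v):   return IS * (math.exp(v / VT) - 1) - (VS - v) / R   # KCL at the diode node
    def fp(v):  return IS / VT * math.exp(v / VT) + 1 / R           # its derivative

    def newton(v0, limit=None, iters=50):
        v = v0
        for k in range(iters):
            step = f(v) / fp(v)
            if limit is not None:                    # crude per-iteration voltage-step clamp
                step = max(-limit, min(limit, step))
            v_new = v - step
            if abs(v_new - v) < 1e-9:
                return v_new, k + 1
            v = v_new
        return v, iters

    v, n = newton(0.0)              # plain Newton overshoots, then crawls back a VT at a time
    print(f"plain Newton:  v = {v:.3f} V after {n} iterations")
    v, n = newton(0.0, limit=0.1)   # clamped steps walk up the diode curve and then snap in
    print(f"limited steps: v = {v:.3f} V after {n} iterations")

With these numbers, plain Newton from 0 V hasn't settled after 50 iterations, while the clamped version lands on the diode operating point in about a dozen. That's the kind of limiting and damping the commercial simulators spend a lot of effort on.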

I've taught circuit courses. When it's undergrad I generally use LTSpice since it's cheap (free), relatively unlimited, and pretty robust. That it runs so well under WINE is a bonus since that means I can run it at home on my Ubuntu box. I'm not a big fan of the interface, though, since I was exposed to the various big, full commercial CAD packages before I ever tried it. Still, for undergrad stuff LTSpice is the best of the packages I've seen.

When I've taught grad-level courses (analog integrated circuits) I tend to like to teach with Cadence tools, which is definitely not free but the cost is pretty low for the department. It's nicer to not have to go to multiple environments, and besides, the grad students could use Cadence for their MOSIS designs.

PlayStation (Games)

Developer Panel Gives Its Verdict On Sony's PSP Go 55

An anonymous reader writes "A panel of games industry veterans has given its final verdict on Sony's PSP Go. David Perry thinks the handheld is an excellent step in the right direction, though he wants it to include free-to-play games. Andrew Oliver of Blitz Games Studios was also optimistic: 'The iPod has demonstrated that, given a nice small device and a good interface and easy buying process, people are happy to download content. I think this will work and move gamers to accepting legal digital downloads, which is the way we want the market to go.' In total, a panel of eight developers discussed four key issues surrounding the handheld, including whether or not they will develop for it."

Comment Re:Verilog - larger market share and dangerous (Score 1) 301

I've done both languages, but I come at them from an analog perspective.

I do ADCs, so there's quite a bit of analog and digital interaction and the digital is fairly good sized but not enormous. From that standpoint, I prefer Verilog since what you get out more closely matches what someone who doesn't do digital 100% of the time expects. There are fewer chances for derived states and things like that.

But back when I first started, I got pulled into a digital design group to "save a project," and they taught us VHDL. I can see why VHDL is popular with large groups (as ours was): it has structures that help integrate larger design teams without the need for as many stylistic rules.

Personally, I'd teach Verilog first. In my experience it's more friendly to a variety of disciplines and better for small groups and projects.

That's not to say VHDL is without its good points, with its stronger typing and other requirements.

Both are good languages, and you really can't go too far wrong with either.
