Comment Re: Passed data with a ton of noise? (Score 1) 391

> Why? Can't you just tell me what you want to say?

Alright. It's not just ground loops. Anyone who attributes electrical noise only to ground loops doesn't understand noise.

Ground loops matter, but the way current and signal flow in the wire matters too. Ott's book (pages 58 and 59) shows ten configurations, five with ground loops and five without, all with different noise-suppression characteristics.

Cutting a ground loop will cut ground-loop currents and the associated noise, but there's another 25 dB of noise suppression between the worst-performing and best-performing non-ground-loop configurations.
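
To put that 25 dB figure in perspective, here's the standard dB-to-ratio conversion (nothing specific to Ott's configurations, just what the number means as a plain ratio):

```python
# Convert a 25 dB difference in noise suppression into plain ratios.
db = 25.0
voltage_ratio = 10 ** (db / 20)   # amplitude: ~17.8x
power_ratio = 10 ** (db / 10)     # power: ~316x
print(f"{db} dB is ~{voltage_ratio:.1f}x in voltage, ~{power_ratio:.0f}x in power")
```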

The wiki page is about ground loops, so it talks about ground loops. That doesn't mean ground loops are the only consideration, or even the most important one. If your problem is noise in wires, there's more to it than ground loops.

Comment Re: Passed data with a ton of noise? (Score 1) 391

> Such as the A, A not, B, B not. It is impossible to induce both a positive and a negative noise onto a wire at the same point in time, so as long as both signals are true it's a one, if not it's a zero. More wires, but zero errors.

Because it's massively inefficient. You're using four wires to push through one wire's worth of data, when each of those wires could carry 80% goodput with 20% error-correction redundancy. That's 3.2 wires' worth of data, or 3.2 times better than your four-wire scheme.
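
Rough arithmetic behind that 3.2x figure (my own illustration of the claim, with the 80/20 split taken from the sentence above):

```python
# Compare the parent's 4-wire redundant scheme against 4 independent wires
# each carrying 80% payload and 20% forward-error-correction overhead.
wires = 4
redundant_goodput = 1.0            # A, /A, B, /B: four wires, one wire's worth of data
fec_goodput = wires * 0.80         # 3.2 wires' worth of data
print(fec_goodput / redundant_goodput)   # 3.2x
```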

Comment Re:Passed data with a ton of noise? (Score 1) 391

> That needs to go away. We need an Ethernet protocol extension with BCH or Hamming code support.

The 1970s called; they want their error-correction scheme back.
Try LDPC or turbo codes, or Reed-Solomon if you like it old school (but not too old school), maybe mixed with soft-decision decoding or fancy DDFSE schemes.
But never meddle in the ways of MIMO, for therein madness lies.
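
For reference, here's roughly what the 1970s scheme in question looks like: a toy Hamming(7,4) encoder/decoder, good for exactly one flipped bit per 7-bit block (a sketch for illustration, not any real Ethernet FEC):

```python
# Toy Hamming(7,4): 4 data bits, 3 parity bits, corrects any single bit error.
def hamming74_encode(d1, d2, d3, d4):
    """Return the 7-bit codeword [p1, p2, d1, p3, d2, d3, d4]."""
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(c):
    """Correct a single flipped bit (if any) and return the 4 data bits."""
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]      # parity over positions 1,3,5,7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]      # parity over positions 2,3,6,7
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]      # parity over positions 4,5,6,7
    syndrome = s1 + 2 * s2 + 4 * s3     # 1-based position of the bad bit, 0 if clean
    if syndrome:
        c = c[:]
        c[syndrome - 1] ^= 1
    return [c[2], c[4], c[5], c[6]]

# Flip any one bit of a codeword and the data still comes back intact.
word = hamming74_encode(1, 0, 1, 1)
word[5] ^= 1
assert hamming74_decode(word) == [1, 0, 1, 1]
```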

Comment 30 Times Faster? (Score 5, Interesting) 223

For most specific problems thrown at supercomputers, you can go 30 times faster with a custom hardware architecture baked into silicon.

To go 30 times faster for general-purpose supercomputing, you use the latest silicon (2X), add more chips (15X), and come up with a super new interconnect to make it not suck. That would involve making chips that support low-latency IPC in hardware.
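
The budget above, spelled out (illustrative numbers straight from that paragraph, not a real roadmap):

```python
# How the hypothetical 30x general-purpose speedup decomposes.
process_node_gain = 2     # latest silicon
scale_out = 15            # more chips, assuming the interconnect keeps up
print(process_node_gain * scale_out)   # 30
```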

They are free to send me a few billion dollars; I'll get right on it and deliver a 30X faster machine, and I'll even use some blue LEDs on the front panel.

Comment Re:Swift (Score 2) 365

Agreed. Swift makes it easier to program, but the notion that "anyone" can write apps is definitely a laugh. There are a lot of programmers who don't understand that some people have a really hard time with the core concepts and skills involved in creating software. It reminds me of math teachers who don't seem to understand that some people have a fairly difficult time with advanced mathematical subjects. People have different areas of competence, and not all are suited to be programmers. It's not just logic... you need to do some creative problem solving in formulating that logic, and you need to keep a LOT of complex things in your head all at the same time to get them to all mesh together at the end.

And that's how I became a developer. In college I was going to major in Economics with a minor in Computer Science - but then I took an "Intro to programming" class after 8 years of home computer BASIC - and I was amazed that these engineering students had no ability to understand the logic and problem-solving required for programming.

I have a degree in computer science. I've been programming since I was 9. I learned Swift. It's quite good as languages go. But no amount of language knowledge or computer science knowledge will make the Apple APIs simple. They're not. They're complicated and hard to use. Swift will not make the APIs simple or logical. Making the APIs simple and logical will make the APIs simple and logical.

Comment Re:Newegg (Score 1) 172

> It's DRAM that's in the crosshairs.

Only to a small extent. This would reduce the need for DRAM caching of SSD data, but computers will still need huge amounts of DRAM for workspace. Workspace memory needs trillions of times more write cycles than this provides.

Or more SRAM cache local to the CPU, with cache lines being merrily lobbed twixt the SRAM and the magic new memory. Maybe. A non-volatile PC would be neat.
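
A back-of-the-envelope look at the write-cycle problem (all numbers below are placeholders I picked for illustration, not anyone's published specs):

```python
# Compare lifetime writes to one hot memory location against an assumed
# per-cell endurance for the new non-volatile part.  Placeholder numbers.
seconds_per_year = 3.15e7
writes_per_second = 1e8        # a hot counter/lock line hammered by one core
years = 5

lifetime_writes = writes_per_second * seconds_per_year * years
assumed_endurance = 1e8        # placeholder, not a published spec

print(f"writes to one hot line over {years} years: {lifetime_writes:.1e}")
print(f"ratio vs assumed endurance: {lifetime_writes / assumed_endurance:.1e}x")
```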

Comment Re:Newegg (Score 1) 172

Usually I'd agree... there have been countless up-and-coming new types of memory that never make it.

But I'm cautiously optimistic here because:

a) It's Intel, not some tiny, obscure VC-funded startup.

b) They say they already have wafers and mention 2016. O_o

No wonder they ditched their awesome SSD controllers.

It's DRAM that's in the crosshairs.
