
Comment Re:What could possibly go wrong? (Score 4, Informative) 174

Most of the conversion of CO2 to O2 is done by algae and other marine life (93%, iirc). Trees contribute only a very small percentage. You can grow more algae to absorb CO2, but more algae is not a good thing: algal blooms create toxic environments that kill other forms of life: http://en.wikipedia.org/wiki/A...

By the way, this is what a lot of people get wrong when they say 'CO2 is plant food!!'

The CO2 problem is a huge problem we've created that both environmentalists and anti-environmentalists usually vastly underestimate.

Comment Re:100+F or 38+C typical annual high (Score 0) 62

Portland is cool, yes. But that's mostly down to the bookshops and tea shops. Temperature-wise, it doesn't get "hot" per se, but it does get very humid. And the air is horribly polluted. I spent the time moving up there reading about dangerously high levels of mercury in the air, the deadly pollutants in the river, the partially dismantled nuclear reactor and the highly toxic soil (the reactor has since gone; the soil remains drenched in contaminants), the extremely high levels of acid rain due to excessive cars (which are driven by maniacs), and the lethal toxins flowing through the rivers that have been built over to level out the ground.

In short, I landed there a nervous wreck.

Things didn't improve. I saw more dead bodies (yes, dead bodies) in Portland and heard more gunfire in my five years there than I heard in the suburbs of Manchester, England, in 27 years. You will find, if the archives let you get back that far, that I was almost normal before that time.

Comment Re:Souinds like the data center of the future, cir (Score 3, Interesting) 62

1955. The Manchester Computing Centre was designed to be one gigantic heat sink for the computers in its basement, using simple convection currents, ultra-large corridors and strategically placed doors to regulate the temperature. It worked OK. Not great, but well enough. The computers generated enormous heat all year round, reducing the need for heating in winter. (Manchester winters can be bitingly cold, as the Romans discovered. Less so now that Global Warming has screwed the weather systems up.)

The design that Oregon is using is several steps up, yes, but it is designed on basically the same principles and uses essentially the same set of tools to achieve the results. Nobody quite knows the thermal properties of the location where Alan Turing built the Manchester Baby; the laboratory was demolished a long time ago. Bastards. However, we know where his successors worked, because that's the location of the MCC/NCC. A very unpleasant building, ugly as hell, but "functional" for the purpose for which it was designed. Nobody is saying the building never got hot - it did - but the computers didn't generally burst into flames, which they would have done if there had been no cooling at all.

Comment Re:BRILLIANT (Score 1) 131

[satire]That's a fucking brilliant idea! I really really really mean it. Sincerely.

I think that's sarcasm, not satire.

Is it? I wasn't aware. Clearly sarcasm must have some association with satire, because making sardonic statements seems to be the first thing I want to do when I'm writing satire. Then I take someone's stupid idea, and extend it, by including absurd examples of where their (il)logic would/should take them....

... Which I did on the very next fucking line.

Stephen Colbert's show is satire of Bill O'Reilly.

Do tell. Next you'll be telling me that The Daily Show isn't real news.

Because Jon Stewart never uses sarcasm when he indulges in acts of satire.

-----------------
P.S. I'm still being sarcastic. And by aping your tone, satirical, too.

Comment Re:Self Serving Story? (Score 5, Insightful) 267

Adding to this, a number of existing altcoins do, in fact, attempt to address bitcoin's weaknesses. Litecoin attempts to resist custom-hardware mining and to make the blockchain update faster. Primecoin solves a somewhat useful mathematical problem instead of completely wasting compute cycles, as Bitcoin does. There are other examples.

Anyway, it seems only natural that, as time goes on, better and better cryptocurrencies will be developed incrementally. To ask everyone to use ONLY the first iteration of this tech would be silly.

Of course, there are "me-too!" cryptocurrencies as well, typically with only minor 'improvements' and designed to make the creators rich. I'm all for educating people about how they could be taken advantage of. But boycotting? Come on.

Comment Re:Seems simple enough (Score 1) 168

Let's start with basics. Message-passing is not master-slave, because it can be instigated in any direction. If you look at PCI Express 2.1, you see a very clear design: nodes at the top are masters, nodes at the bottom are slaves, masters cannot talk to masters, slaves cannot talk to slaves, and only devices with bus-master support can be masters. Very simple, totally useless.

Ok, what specifically do I mean by message passing? I mean, very specifically, a non-blocking, asynchronous, routable protocol that contains an operation and a data block as an operand (think: microkernels, MPI-3). If you're clever, the operand is self-describing (think: CDF), because that lets you have overloaded functions.
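A minimal sketch of what I mean, as plain pseudocode-ish Python (the names Message, HANDLERS and dispatch are mine, not from MPI-3 or CDF): because the operand carries its own type tag, the receiver can dispatch on the (operation, operand type) pair, which is exactly function overloading.

```python
from dataclasses import dataclass

@dataclass
class Message:
    op: str        # the operation to perform, e.g. "add"
    dtype: str     # self-description of the operand, e.g. "vec_f64"
    payload: list  # the data block

# Overloading falls out of the self-describing operand: the receiving
# node selects a handler based on both the operation and the operand type.
HANDLERS = {
    ("add", "vec_f64"): lambda xs: sum(xs),
    ("add", "str"): lambda xs: "".join(xs),
}

def dispatch(msg: Message):
    return HANDLERS[(msg.op, msg.dtype)](msg.payload)

print(dispatch(Message("add", "vec_f64", [1.0, 2.0, 3.0])))  # 6.0
print(dispatch(Message("add", "str", ["foo", "bar"])))       # foobar
```

Same operation name, two completely different behaviours, and the sender never needed to know which one it would get.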

The CPU is a bit naff, really. At least some operations can be pushed into a Processor-In-Memory, and you have a fancy coprocessor for maths that you're repeatedly (and expensively) calling to recreate the functions that exist, in limited subset, in FFTW, BLAS and LAPACK. Put all three, in optimized form, along with your basic maths operations, into a larger piece of silicon. Voila, massive speed boost.

But now let's totally eliminate the barrier between graphics, sound and all other processors. Instead of limited communications channels and local memory, have distributed shared memory (DSM) and totally free communication between everything. Thus, memory can open a connection to the GPU, the GPU can talk to the disk, and Ethernet cards can write directly to buffers rather than going via software (the RDMA and OpenSockets concepts, just generalized).

You now have a totally open network, closer to Ethernet than PCI or HyperTransport in architecture, but closer to C++ or Java in protocol, since the data type determines the operation.

What room, in such a design, for a CPU? Everything can be outsourced.

Now, move onto Wafer Scale Integration. We can certainly build single wafers that can take this entire design. Memory and compute elements, instead of segregated, are mixed. Add some pipelining and you have an arrangement that could blow most computer designs out the water.

Extrapolate this further. Instead of large chunks of silicon talking to each other, since the protocol is entirely routable, get as close to individual compute elements as you can. Have the router elements take care of heat and congestion issues, rather than compilers. Since packet headers can contain whatever label information you want, you have a notion of processes with independent storage.

It doesn't (or shouldn't) take long to figure out that a true network architecture, rather than a bus architecture, will let you move chunks of the operating system (which is just a virtual machine, anyway) into the physical computer, eliminating the need to run an expensive bit of simulation.

And this is marketspeak? Marketspeak for what? Name me a market that wants to eliminate complexity and abandon planned obsolescence in favour of a schizophrenic cross between a parallel Turing machine, a vector computer and a Beowulf cluster.

Comment Re:Seems simple enough (Score 1) 168

OpenCL is highly specific in application. Likewise, RDMA and Ethernet Offloading are highly specific for networking, SCSI is highly specific for disks, and so on.

But it's all utterly absurd. As soon as you stop thinking in terms of hierarchies and start thinking in terms of heterogeneous networks of specialized nodes, you soon realize that each node probably wants a highly specialized environment tailored to what it does best, but that for the rest, it's just message passing. You don't need masters, you don't need slaves. You need bus switches with a bit more oomph (they'd need to be bidirectional, support windowing, and handle multipath routing where the shortest route may be congested).

Above all, you need message passing that is wholly target-independent since you've no friggin' clue what the target will actually be in a heterogeneous environment.
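For illustration only (pick_route and its data shapes are my invention, not any real switch firmware): the congestion-aware multipath choice I'm describing is nothing exotic. Prefer the shortest path, skip it when its load crosses a threshold, and fall back to the least-loaded path if everything is congested.

```python
def pick_route(routes, load, threshold=0.8):
    """routes: {route_id: hop_count}; load: {route_id: utilization in [0, 1]}.
    Returns the shortest acceptably-loaded route, or, if every route is
    congested, the least-loaded one."""
    usable = [r for r in sorted(routes, key=routes.get)
              if load.get(r, 0.0) < threshold]
    if usable:
        return usable[0]
    return min(routes, key=lambda r: load.get(r, 1.0))

routes = {"A": 2, "B": 3, "C": 5}
print(pick_route(routes, {"A": 0.95, "B": 0.30}))  # B: A is shorter but congested
print(pick_route(routes, {}))                      # A: nothing is congested
```

The point is that this decision lives in the switch, not the compiler: the endpoints just emit target-independent messages and let the fabric worry about where they go.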

Comment Re:Can you fit that in a laptop? (Score 1) 168

Hemp turns out to make a superb battery. Far better than graphene and Li-Ion. I see no problem with developing batteries capable of supporting sub-zero computing needs.

Besides, why shouldn't public transport support mains? There's plenty of space outside for solar panels, plenty of interior room to tap off power from the engine. It's very... antiquarian... to assume something the size of a bus or train couldn't handle 240V at 13 amps (the levels required in civilized countries).

Comment Re:So what's the problem here? (Score 2) 136

Nobody is forcing you to read the Washington Post. Nobody is forcing you to buy anything from Amazon. You can easily avoid both of them, if you want, without any harm or negative effects to yourself. So what's the big deal here?

Just because neither of us hangs out with him doesn't mean I don't get to tell you what a giant douchebag Jeff Bezos is. That's one of the joys of the First Amendment, my friend! Freedom of speech is the freedom to bitch inanely about things that don't directly affect you.

You, of course, are equally free to tell me to shut the fuck up, or to take your own advice and not bitch about something that doesn't interest or affect you....

... But if you do decide to keep talking about the problem, and maybe even about how to address or resolve it, then you see the true glory of Open Public Dialogue - the very thing that makes Slashdot such a lovely place to be. :-)

And no, I am not being in the least bit sarcastic, Sheldon.
