Comment: Tag #WhereIsTheFuckingPaper (Score 4, Informative) 62

by Guppy (#47716859) Attached to: Study: Seals Infected Early Americans With Tuberculosis

Oh, here it is: Pre-Columbian mycobacterial genomes reveal seals as a source of New World human tuberculosis (Paywall -- free Nature summary article here).

Modern strains of Mycobacterium tuberculosis from the Americas are closely related to those from Europe, supporting the assumption that human tuberculosis was introduced post-contact. This notion, however, is incompatible with archaeological evidence of pre-contact tuberculosis in the New World. Comparative genomics of modern isolates suggests that M. tuberculosis attained its worldwide distribution following human dispersals out of Africa during the Pleistocene epoch, although this has yet to be confirmed with ancient calibration points. Here we present three 1,000-year-old mycobacterial genomes from Peruvian human skeletons, revealing that a member of the M. tuberculosis complex caused human disease before contact. The ancient strains are distinct from known human-adapted forms and are most closely related to those adapted to seals and sea lions. Two independent dating approaches suggest a most recent common ancestor for the M. tuberculosis complex less than 6,000 years ago, which supports a Holocene dispersal of the disease. Our results implicate sea mammals as having played a role in transmitting the disease to humans across the ocean.

Comment: Re:Home fecal transplant went wrong (Score 1) 48

by Guppy (#47716225) Attached to: How To Read a Microbiome Study Like a Scientist.

While the clinical picture and timing suggests the possibility, it's far from certain that this was a primary infection stemming from his home fecal transplant. I would have liked to see an analysis of anti-CMV IgM titers, although in this case it's also possible that his case was recognized too long afterwards to determine whether or not it was an actual primary infection.

Comment: Re:100+F or 38+C typical annual high (Score 0) 59

by jd (#47698809) Attached to: The Data Dome: A Server Farm In a Geodesic Dome

Portland is cool, yes. But that's mostly down to the bookshops and tea shops. Temperature-wise, it doesn't get "hot" per se, but it does get very humid. And the air is horribly polluted. I spent the move up there reading about dangerously high levels of mercury in the air, the deadly pollutants in the river, the partially dismantled nuclear reactor and the highly toxic soil (the reactor has since gone; the soil remains drenched in contaminants), the extremely high levels of acid rain due to excessive cars (which are driven by maniacs), and the lethal toxins flowing through the rivers that have been built over to level out the ground.

In short, I landed there a nervous wreck.

Things didn't improve. I saw more dead bodies (yes, dead bodies) in Portland and heard more gunfire in my five years there than I heard in the suburbs of Manchester, England, in 27 years. You will find, if the archives let you get back that far, that I was almost normal before that time.

Comment: Re:Sounds like the data center of the future, cir (Score 3, Interesting) 59

by jd (#47698749) Attached to: The Data Dome: A Server Farm In a Geodesic Dome

1955. The Manchester Computing Centre was designed to be one gigantic heat sink for their computers in the basement, using simple convection currents, ultra-large corridors and strategically-placed doors to regulate the temperature. It worked ok. Not great, but well enough. The computers generated enormous heat all year round, reducing the need for heating in winter. (Manchester winters can be bitingly cold, as the Romans discovered. Less so, now that Global Warming has screwed the weather systems up.)

The design that Oregon is using is several steps up, yes, but it is built on the same principles and uses essentially the same set of tools to achieve the results. Nobody quite knows the thermal properties of the location where Alan Turing built the Manchester Baby, since the laboratory was demolished a long time ago. Bastards. However, we know where his successors worked, because that's the location of the MCC/NCC. A very unpleasant building, ugly as hell, but "functional" for the purpose for which it was designed. Nobody is saying the building never got hot - it did - but the computers didn't generally burst into flames, which they would have done if there had been no cooling at all.

Comment: Re:Insurance rates (Score 1) 238

With fast networks, it's even possible that insurance companies could bid on outcomes as the accident was happening. Theoretically, my insurer could throw my car into a ditch to avoid damage to a BMW coming the other way.

I might get to see the first car get diverted into a schoolbus to avoid a 50-million-dollar superduperhypercar. I'll have to dress for the occasion with my best fingerless gloves and head-worn goggles.

Comment: Re:Seems simple enough (Score 1) 168

by jd (#47690387) Attached to: Processors and the Limits of Physics

Let's start with basics. Message-passing is not master-slave, because it can be instigated in any direction. If you look at PCI Express 2.1, you see a very clear design - nodes at the top are masters, nodes at the bottom are slaves, masters cannot talk to masters, slaves cannot talk to slaves, and only devices with bus-master support can be masters. Very simple, totally useless.

Ok, what specifically do I mean by message passing? I mean, very specifically, a non-blocking, asynchronous, routable protocol that contains an operation and a data block as an operand (think: microkernels, MPI-3). If you're clever, the operand is self-describing (think: CDF), because that lets you have overloaded functions.
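A minimal sketch of that kind of message, assuming illustrative names rather than any real MPI-3 or microkernel API: the operand carries its own type tag, and the receiver dispatches on (operation, type), which is what makes the overloading work.

```python
import queue

# Toy message: an operation plus a self-describing operand
# (the type tag travels with the data, CDF-style).
class Message:
    def __init__(self, op, operand):
        self.op = op                          # requested operation
        self.dtype = type(operand).__name__   # self-describing type tag
        self.operand = operand                # the data block

mailbox = queue.Queue()

def send(msg):
    # Non-blocking, asynchronous: enqueue and return immediately,
    # with no rendezvous between sender and receiver.
    mailbox.put(msg)

# Dispatch on (operation, operand type): same op name, different
# handler depending on the self-described type -- i.e. overloading.
HANDLERS = {
    ("scale", "int"):  lambda x: x * 2,
    ("scale", "list"): lambda xs: [x * 2 for x in xs],
}

def receive():
    msg = mailbox.get()
    return HANDLERS[(msg.op, msg.dtype)](msg.operand)

send(Message("scale", [1, 2, 3]))
print(receive())   # -> [2, 4, 6]
```

The same "scale" operation lands on a different handler purely because the operand describes itself; no central authority decides who may send to whom.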

The CPU is a bit naff, really. I mean, at least some operations can be pushed into a Processor-In-Memory, and you have a fancy coprocessor for maths that you're repeatedly (and expensively) calling to recreate the functions that exist, as a limited subset, in FFTW, BLAS and LAPACK. Put all three, in optimized form, along with your basic maths operations, into a larger piece of silicon. Voila, massive speed boost.

But now let's totally eliminate the barrier between graphics, sound and all other processors. Instead of limited communications channels and local memory, have distributed shared memory (DSM) and totally free communication between everything. Thus, memory can open a connection to the GPU, the GPU can talk to the disk, Ethernet cards can write direct to buffers rather than going via software (RDMA and OpenSockets concepts, just generalized).
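A toy model of that flat fabric, with purely illustrative names (this is a sketch of the idea, not RDMA or OpenSockets as actually specified): every component is a peer with an address, and any peer can write directly into any other peer's buffer without a CPU mediating the copy.

```python
# Every device -- GPU, disk, NIC, memory -- is a peer on one fabric.
class Peer:
    def __init__(self, name, fabric):
        self.name = name
        self.fabric = fabric
        self.buffer = {}
        fabric[name] = self   # register on the shared fabric

    def write(self, target, key, value):
        # Direct remote write into the target's buffer: a generalized
        # RDMA-style transfer with no software copy loop in between.
        self.fabric[target].buffer[key] = value

fabric = {}
gpu  = Peer("gpu", fabric)
disk = Peer("disk", fabric)
nic  = Peer("nic", fabric)

# The NIC writes an incoming frame straight into the GPU's buffer,
# and the GPU streams it onward to the disk -- CPU never involved.
nic.write("gpu", "frame0", b"\x00\x01")
gpu.write("disk", "frame0", gpu.buffer["frame0"])
print(disk.buffer["frame0"])   # -> b'\x00\x01'
```

The point of the sketch is the topology: there is no privileged node, only peers with addressable storage, which is the "memory can open a connection to the GPU" claim in miniature.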

You now have a totally open network, closer to Ethernet than PCI or HyperTransport in architecture, but closer to C++ or Java in protocol, since the data type determines the operation.

What room, in such a design, for a CPU? Everything can be outsourced.

Now, move onto Wafer Scale Integration. We can certainly build single wafers that can take this entire design. Memory and compute elements, instead of being segregated, are mixed. Add some pipelining and you have an arrangement that could blow most computer designs out of the water.

Extrapolate this further. Instead of large chunks of silicon talking to each other, since the protocol is entirely routable, get as close to individual compute elements as you can. Have the router elements take care of heat and congestion issues, rather than compilers. Since packet headers can contain whatever label information you want, you have a notion of processes with independent storage.
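The routing idea above can be sketched as follows, with made-up names and a deliberately tiny topology: packets carry a label in their header, and each router forwards toward the destination via its least-loaded link, so congestion management lives in the fabric rather than in the compiler.

```python
# Toy congestion-aware router fabric. Each router forwards a labelled
# packet to whichever neighbour currently has the lightest link load.
class Router:
    def __init__(self, name):
        self.name = name
        self.neighbours = []   # list of (router, link_load) pairs
        self.delivered = []

    def route(self, packet, dest):
        if self.name == dest:
            self.delivered.append(packet)
            return self.name
        # Pick the least-congested link; the compiler never sees this.
        nxt, _ = min(self.neighbours, key=lambda pair: pair[1])
        return nxt.route(packet, dest)

a, b, c, d = Router("a"), Router("b"), Router("c"), Router("d")
a.neighbours = [(b, 5), (c, 1)]   # the link to c is less congested
b.neighbours = [(d, 0)]
c.neighbours = [(d, 0)]

# The header label gives you the notion of a process with its own
# independent storage, exactly as described above.
packet = {"label": "proc-42", "payload": "op"}
print(a.route(packet, "d"))   # -> d (routed a -> c -> d)
```

Swap the static link loads for live queue depths and the same fabric also steers packets away from hot spots, which is the heat-management half of the claim.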

It doesn't (or shouldn't) take long to figure out that a true network architecture, rather than a bus, will let you move chunks of the operating system (which is just a virtual machine, anyway) into the physical computer, eliminating the need for running an expensive bit of simulation.

And this is marketspeak? Marketspeak for what? Name me a market that wants to eliminate complexity and abandon planned obsolescence in favour of a schizophrenic cross between a parallel Turing machine, a vector computer and a Beowulf cluster.