
Comment Oh, rats... (Score 1) 116

So they finally invented something that would make me seriously consider buying an Xbox. I hope they have it in the stores in time for Christmas. I can think of a dozen great uses for it -- killing flies, drilling holes through would-be burglars' shoes (feet inside), a new way to light the barbecue grill. In fact, who needs the grill? Next summer I can reprogram it to keep the squirrels out of my peaches at the same time it prepares me a tasty laser-grilled, peach-fed squirrel for dinner out of the ones that move too slowly! One wonders if it is sensitive enough to keep a yard cleared of mosquitoes (without putting anyone's eyes out, of course, at the yard party).


Comment Re:Wait, physics doesn't work either? (Score 3, Interesting) 179

"Entanglement" is a philosophically difficult arena. According to quantum theory, there is just one wavefunction for the entire Universe. However, we as observers are part of that wavefunction observing another part of that wavefunction, with a really, really big chunk of the whole wavefunction effectively unobservable but still coupled to the observer (part of the wavefunction), the measuring apparatus (part of the wavefunction), and the "experiment" (yep, part of the wavefunction). Everything is "entangled", but quantum mechanics also predicts that large systems approximated with a random phase condition will behave like classical systems, and the usual rule is that we treat a measurement apparatus as a classical system that breaks the entanglement of a measured system and forces it "unpredictably" into a separable state. But even this is words, not equations, although random phase approximations are indeed equations and are used frequently in field theory.

The only coherent explanation of this that I am aware of is the process of:

a) Starting with a density matrix (or other representation) for "the Universe".

b) Using the Nakajima-Zwanzig approach, split the (fully entangled) density matrix into two parts -- a "system" that you will continue to treat as a quantum system, and a "bath" -- everything else -- which would also include the measuring apparatus if you were trying to describe an experiment. One then accepts the fact that one cannot know or prepare the state of the bath (which is really, really big, being the rest of the Universe and everything) and so one makes a statistical approximation of the bath (taking the trace), which essentially eliminates the pesky entanglement but also breaks useful things like unitarity and, in a sense, conservation laws. One then creates projection-valued operators and transforms the equations for the system into stochastic or semiclassical equations of motion.

c) IIRC your final result is quantum mechanics for the system expressed as a non-Markovian integrodifferential equation that is almost impossible to solve. However, if one makes a Markov approximation (forces it to be time-local, with delta-correlated memory in time) you end up with a decent explanation for things like spontaneous decay as an "exponential" process rather than a punctuated unitary process. Go one way and you can make it into a Langevin equation; go another and you get something more like Fokker-Planck.
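For anyone who wants the equations behind steps (a)-(c): the structure of the Nakajima-Zwanzig result, as I remember it (notation varies by textbook, so treat this as a sketch rather than gospel and check a standard open-quantum-systems reference before trusting the signs):

```latex
% Projection onto the "relevant" (system) part of the Universal density matrix:
%   P\rho = \mathrm{Tr}_B(\rho) \otimes \rho_B, \qquad Q = 1 - P,
% with the full dynamics d\rho/dt = L\rho (L the Liouvillian).
%
% Projecting the Liouville equation and formally solving for Q\rho yields the
% exact (still non-Markovian) generalized master equation:
\frac{d}{dt} P\rho(t)
  = P L P \rho(t)
  + P L\, e^{Q L t}\, Q \rho(0)
  + \int_0^t ds\; P L\, e^{Q L s}\, Q L\, P \rho(t - s)
% The middle (inhomogeneous) term drops for factorized initial conditions,
% Q\rho(0) = 0. The memory kernel under the integral is what makes the
% equation non-Markovian; approximating it as delta-correlated in time
% (the Markov approximation) gives a time-local master equation, from which
% Langevin or Fokker-Planck forms follow depending on the representation.
```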

The lovely thing about this approach is that it renders moot all sorts of nonsense, such as EPR paradoxes and "wavefunction collapse". It is perfectly clear that in the Universal wavefunction no such paradox or collapse can occur. They are simply expressions of our ignorance of phase and state whenever we try to isolate some part of the whole and pretend that it is a standalone "system" that can ever be decoupled from everything else. Schrodinger's cat paradoxes disappear as there is no paradox in the Universal wavefunction, only when we try to project the state of the cat against our ignorance of phase and interaction with the outside Universe. The cat, if you like, cannot be entangled separately from its preexisting entanglement with the rest of the Universe, and we only get into trouble when we have to force it by partitioning the system in order to get a chunk small enough to work with.

Hope this helps. I doubt it will. Very few people seem to be in touch with Nakajima-Zwanzig and the Generalized Master Equation these days, and so few treat problems like this as OPEN quantum systems as opposed to closed systems with a classical measurement apparatus, which is a place you only get to on the far side of the N-Z GME ritual.


Comment Re:Custom firmware (Score 2) 373

That's not that feasible: they use the consumer-area electronics a lot now to allow configuration of the more critical systems, and to read data from them.

It's not feasible to lock my front door, because my house was built with a non-stop conveyor belt running from the mailbox to the kitchen.

The entire point of this ask-slashdot is to identify cars that DON'T integrate entertainment systems and wireless access with the safety critical electronics. Cars that DON'T do the dumb&dangerous stuff you just listed.

Data flow *from* the primary systems *to* entertainment&wireless systems is marginally acceptable, if it's a physically enforced one-way data flow using optocouplers or something.

I seriously want each car manufacturer to have one employee on staff whose sole job is to say "YOU'RE FIRED" every time any idiot engineer wants to permit ANY data flow from entertainment-or-wireless systems into safety-critical systems. I don't care how limited the APIs are, I don't care how encrypted it is, I don't care how cryptographically secure the certificates are. If there's data flow into critical safety systems, it's effectively certain that it's going to be vulnerable. You don't connect safety-critical systems to wireless input, period.


Comment Re:Why hasn't anybody forked Firefox already? (Score 2) 294

I haven't used it much yet, but Pale Moon may be what you're looking for. It's a fork of Firefox. The development design choices favor privacy, user-control, and improving speed&stability by dumping rarely-wanted code. Examples: they removed the Parental Controls code, they're excluding the new Firefox DRM support, they dumped support code for obsolete CPUs, they dumped some of the code for handicap-accessibility, and they're currently removing phone-home code for crash reports and other potentially privacy-violating telemetry.

I haven't seen specific mention of it, but I'm certain there's no way in hell they will implement Mozilla's new policy of *prohibiting* you from loading any extension that hasn't been reviewed&approved&signed by Mozilla.


Comment Re:Tired... (Score 2) 294

In the next release or two, Firefox is going to start blocking you from loading any extension that hasn't been approved and signed by them. People have been SCREAMING on their message boards for a way to disable/override this, but they flat out refuse. The only way to get around it is to install a non-standard browser executable.


Comment Re:The poor and CO2... (Score 1) 248

As for my profession: I got out of a career in Astrophysics because I figured that if I had to put up with egos like yours on a daily basis, I might as well get paid for it. Thanks for reminding me that I made the right choice.


Every day, you probably look in a mirror. Do you pay yourself for the privilege?

Curiously, in my career in physics, even when there has been considerable debate -- and on one occasion I was in a room at a conference where people were almost shouting at each other (over a paper I had written, curiously enough:-) there was still a thread of collegial respect. At the same time, there was and continues to be a lot of resistance to being proven wrong, in no small part because of the grant system.

Yes, physicists often have egos, although a lot of the ones I know and work with don't have huge ones, just ones that are healthy enough to be able to participate in the scientific process, which requires at least some personal investment in the hypotheses you are studying if only because we've created a system that punishes null results in research to an incredible fault (and which produces a matching bias towards getting a positive result, no matter what, for any hypothesis being studied, sigh, and I don't just mean in climate science). I know you think I'm arrogant (exactly what tone SHOULD I take to respectfully disagree with YOUR personal beliefs in a non-arrogant way, or is the mere act of daring not to agree arrogance, I wonder) but in fact one generally needs to have scientists with egos that are sufficiently healthy to be able to state new beliefs that are not in agreement with "accepted" beliefs in science or science would never make any progress. Is this what annoyed you in astrophysics? People invested in beliefs in some specific picture of the big bang, or dark matter, or whatever? Or did somebody -- referees or grant officers, perhaps -- piss on your Post-Toasties?

Don't get me wrong -- this is by no means impossible or even unlikely -- it is a serious question! Science is not perfect, gatekeeping is far more common than people think, as is inbreeding in the grant process where reviewers favorably review grants for research that does not challenge the results or beliefs of the reviewer. My primary colleague over 25 years of research was a grant officer for the ARO, and I got to see a lot of this from both sides -- he was absolutely not in this category, but some of his colleagues were, and he had to be very careful in his selection of outside reviewers. But hey, my own thesis advisor had a world class ego and was disliked by roughly 2/3 of the physics department and would insult people en masse at dinner parties, so I do understand where you are coming from. At the same time, he was really pretty damn smart, quite possibly as smart as he thought he was. Smart and modest exists in physics, but you don't hear about it as much and may well hear from stupid and immodest more. Meditating on why may provide some insight into why science is not an ivory tower -- scientists are human and have to be in conflict for the process to work and yet are subject to normal human frailties such as pride and a need to eat and have job security. I personally think "science" works pretty well given all of this, but it could certainly work better...

I know you like to post a lot on /., and I know that /. isn't known for the maturity of its discussions so that your style is likely adapted to the needs of the medium. All I would suggest is that you consider reducing the snark factor in your replies even when you disagree with something. Obviously, we disagree on this issue. You still seem to wish to impute some sort of fundamental dishonesty to me, and not acknowledge that it is possible for both of us to honestly disagree. I'm sure posting additional links to papers (some of which probably are paywalled, and even though I can get through a paywall via Duke it is a PITA and I hate having to do it to get to research my own tax dollars no doubt paid for) won't change your mind, although I'm tempted to give the tree issue specifically one more try:




I'm sure you will automatically reject the latter as political, but the graphics assert that they are generated as summations from substantial numbers of peer-reviewed papers and, frankly, the curves presented are pretty believable given the known physics and biochemistry. The first two are straight-up results that validate the assertion that trees are growing roughly 50% faster due to the extra 120 ppm CO2, at least in a broad band across the temperate zone in Europe.

Note especially the curve for stressed, resource constrained plants. I would argue that most food crops are not grown under stress or resource constraints, so that response is likely to be somewhere between the lower curve and the upper one, but even the lower curve shows a 10% growth enhancement from 280 ppm to 400 ppm. Which again, makes perfect sense. I am surprised that you are so certain that this sort of a number isn't a reasonable estimator for food production overall, worldwide, given the data.

And before you say it, I too would like a list of the "hundreds" of papers they assert that this curve is built from, and some indication of how it was built. But it isn't crazily out of whack with what I've read in specific papers. C3 plants respond well to increased CO2. C4 plants are less responsive because they already evolved to grow well under conditions of C3 stress with a completely different photosynthesis chain, but there is evidence that their response is still positive. CAM plants are almost certainly increasing in response to CO2, according to satellite evidence, although they are not a significant source of food unless you really like pineapple.

To conclude (again, patiently) it is not unreasonable -- in my opinion, although obviously not in yours -- to think that the increase in atmospheric CO2 has been accompanied by a corresponding increase in the growth rate of all plants. For C3 and C4 crops grown in agriculture, this increase is likely variable and somewhere in between the measured rates for stressed and unstressed growth in laboratory experiments. For trees the increase has been measured and directly connected to CO2 by means of radiometrics applied to growth rings, as well as increases in natural nitrogen fertilization and is almost certainly greater than 50% (acknowledging that there is variability over tree species and probably over resource constraint as well so this is very much an average number). For food plants, the increase is almost certainly at least 10%, as that is roughly the lower bound established by food crops grown under stressed, controlled conditions in the lab and one would hope that worldwide most food crops are not grown under lab conditions defined as "stressed", most of the time.

I do not expect to convince you, since I do not think you will allow yourself to be convinced no matter what the evidence. But I do not think that my position here is far outside of the mainstream in either environmental science or agricultural science. It was first pointed out to me by a soil scientist at Duke, who has no pony in the global warming race. I would even argue that minimizing this possibility, in spite of a good mix of sound scientific reasoning and empirical evidence, rather than accepting that it is plausible (consistent with the science and data as far as we can tell at this time), if not probably true, suggests a lack of scientific objectivity and a desire to maximally support a political agenda that is intolerant of any evidence that increasing CO2 might be at least partially beneficial.

Really? So 280 ppm is the grand optimum for the global biosphere or (not precisely the same thing) the human ethnosphere? Not 300 ppm, not 220 ppm, but 280 ppm, the level we would have if humans hadn't mucked coal out of the ground and burned it creating civilization -- maybe? Or you have some evidential or science supported reasoning to demonstrate that it is some particular value? If so, I'd be very interested in knowing what it is, and where and how you got it.

Comment Re: invalid data (Score 1) 337

Strasser was one of the guys purged from the party.

From Wikipedia:

In what became known as the Night of the Long Knives, selected men of the Schutzstaffel ("Protection Squadron"; SS) arrested and eventually killed at least 85 people from 30 June to 2 July 1934.[20] Among these were Strasser, who had been included in the purge on Hitler's order.[9][20] He was shot once in his main artery from behind in his cell, but did not die immediately. On the orders of SS general Reinhard Heydrich, Strasser was left to bleed to death which took almost an hour.[21] His brother Otto, who had left the NSDAP in July 1930, managed to avoid the Nazi purge and survived World War II.[22][23]

Comment Re:better solution (Score 4, Interesting) 182

Pigs work better and faster. Feed a body to the pigs and you might end up with a few teeth in pigshit, nothing more. Use the pigshit to fertilize some fields, and the body is just gone.

Dissolving tissue in bathtubs, etc, just leaves evidence in the trap, the pipes, (probably) in the bathroom and tub, and "weeks" is a lot of exposure.

All good drug dealers need to invest in a hog farm. It's worth it even when you don't have bodies to dispose of! Tasty bacon!

Comment MIT professor Ju Li (Score 1) 35

The new findings, which use aluminum as the key material for the lithium-ion battery's negative electrode, or anode, are reported in the journal Nature Communications, in a paper by MIT professor Ju Li and six others.

In related news, MIT professor Ju Na has some exciting discoveries in sodium technology.


Comment Donate Slashdot to Archive.org & take tax writ (Score 1) 552

The title says the main idea, but here are some more details in my usual rambling style... :-)

== More details (and should be better/conciser, but headache plus other stuff to do)

When you wake up in the early morning before sunrise, you may feel you need to turn on a lightbulb to get around safely in the dark. That lightbulb seems blindingly bright -- so bright you can't look at it. You need that lightbulb, though. Then, hours later, after the sun is up, you may forget the lightbulb is even on -- it is bright everywhere, and the bulb hardly stands out. Is that the story of Slashdot? As well as the story of many other tech innovations and communities and individuals (perhaps even myself)? These tools, communities, and individuals help bootstrap something greater and then just fade into the background.

As a different analogy, each year seeds produce the next generation of plants that produce more seeds, but generations later, who thinks of (or thanks) the seeds from years ago that made everything possible? It remains important for our own mental health to be thankful for the past generations that made our life possible (an idea very strong in some Native American cultures, and even made into politics by C. H. Douglas and Social Credit) -- but it is perhaps too much to expect direct gratitude for our own contributions. "We do what we must because we can"? :-) Or maybe because we "should". Or even just because it was "fun" (a preference for fun perhaps shaped by millions of years of evolution and selection for survival). Sure, some people get remembered and celebrated because they won some commercial popularity contest (Edison, Gates, Jobs), but most just get mostly forgotten (Steinmetz/Tesla, Kildall, Wozniak/Wayne). Even Doug Engelbart's obituary did not get a full open article on the Slashdot home page, just a title line (and I and others complained about that at the time). People like Peter H. Huyck & Nellie W. Kremenak (who wrote the very insightful "Design & Memory: Computer Programming in the 20th Century" in 1980) get essentially no mention, as does William Kent (who wrote "Data & Reality" in 1978). I'm continually amazed how little mention Smalltalk (Kay, Ingalls, Goldberg, etc.) gets these days -- it seems almost entirely forgotten, even if Java and now JavaScript are step-by-step reinventing most of it (often badly), even as the Smalltalk community struggles on, and when I squint just right, I see the web as a Smalltalk image. Theodore Sturgeon envisioned ubiquitous mobile networked wearable nanotech-built computing in the 1950s in "The Skills of Xanadu", which inspired Ted Nelson and other technologists (myself included), but who remembers him for that?
Chuck Moore (Forth) and James Martin (everything IT) are likewise nearly forgotten at this point, as far as their names coming up much in discussion. Even Clifford Berry and John Vincent Atanasoff are pretty much forgotten, even though their Atanasoff-Berry Computer (ABC) is one of the main reasons digital computing was not burdened with core patents early on. And those are just a few I know of who are public or published figures. And that does not include people like high school teachers Jack Woelfel, Joe Maurer, David Gray, or many others (including my father) who made a big difference in my own personal computing education (but whom few others would ever hear of outside of where I grew up).

I can also look at old computer magazines and catalogs from decades ago and see so much diversity of hardware ideas before the PC monoculture took over. Still, as Manuel De Landa said, uniformity at one level can promote diversity at another. A lot of different software has been built on the Windows/Intel hardware/OS monoculture. Hardware and OS diversity seems also to be going up again with Smartphones and Chromebooks. Although again, with lots of soon to be forgotten developers -- even if it may mean a lot to the developer and their local community and successors that they did what they did. And even if much of this is just playing out the visions of Vannevar Bush, J. C. R. Licklider, Sturgeon, Engelbart, Kay and so on.

So things change. There was a time when people came to Slashdot because of the good infrastructure and good management. September 11th, 2001 was one of Slashdot's finest hours as the Slashdot servers kept going to support discussion when other websites crashed under the load. Now it seems that people come to Slashdot *despite* those things (e.g. Beta, audio ads, whatever) based on the legacy of those times. I can sympathize with the plight of Slashdot's staff given the corporate profit-making emphasis and whatever other internal corporate social power struggles are taking place, and they may be doing the best anyone could under the circumstances, but it remains a sad situation that feels wrong. I had thought the idea of selling job advertisements on Slashdot was a good one (as far as advertising as a revenue stream goes), but I can see, like other posters have suggested, that Stack Overflow or GitHub are better positioned for that in some ways. There remains a tension between a company hiring for skill (SO, GitHub) and hiring for conscience (aspects of Slashdot). Although as one of the earliest SourceForge adopters, I can be even more sad about what happened there, like with bundling stuff into downloaded software.

IMHO, at this point, the biggest value of Slashdot (or, for that matter, SourceForge) is perhaps as an archive representing a time in history relating to the emergence of the internet as a social force. There are also other historical discussions on Slashdot relating to shaping the interaction of technology and society beyond the web. Archive.org seems like a good fit in that sense. DHI Group might even get a nice tax write-off from such a donation. It seems most of Slashdot is not showing up in Google searches? That seems unfortunate to me as another loss of history. Hopefully something that could be fixed in the future especially by Archive.org, either on the Slashdot side or Google side. Archive.org already seems to have much of Slashdot archived, but having the raw archives might help -- the earliest crawl for this page is 2009, for example:

I don't mean to dismiss the value of the current community even if just a shadow of itself. It's still a good community, even if smaller and less engaged. As you say, it's strongest around open source / free software issues. I'd be proud to have started a community that grew to just the size Slashdot is even just now. But with so much competition from other news aggregator and discussion sites (including for FOSS etc.), as well as other internet distractions, it's hard to know what the future is for the Slashdot community. Clay Shirky wrote in 2003 about the interaction of communities and their infrastructure in "A group is its own worst enemy"; Doug Engelbart also wrote even earlier on that, about co-evolution of tools, community, knowledge, and processes. I like the idea of the community buying Slashdot, but with the community slowly fading and likely with DHI Group asking a lot of money even now, I wonder if that is going to happen. Is the best use of a couple million community dollars (maybe a lot less, 200K?) to ransom the Slashdot community from its corporate overlords or perhaps instead to build something new (or just move over to pipedot or Soylent News)? The fact that this article on the sale has so few posts, and that posts to Slashdot these days now tend to be a couple lines (a general trend on the internet), suggests the community is not as strong or engaged as it once was, so I can doubt it would come up with the money -- although there always might be a tech millionaire here who could do it personally. I wish I could buy Slashdot today to give it to Archive.org myself, but I can't.

Also, I feel the future of technological communications is more in the direction of a distributed social semantic desktop. I've worked towards that end myself like with my Pointrel/Twirlip/etc software. So, I also wonder how much a centralized web platform for discussion will last in general, even if web platforms may still have years of life in them. My vision of Slashdot would be to push the community into using a more decentralized system -- but that would be a very radical shift and no doubt attract even more moans than Beta, if such were possible. :-)

Ideally, that would be a platform where people like me could write long rambly posts, and others could, if they want, summarize them, or take pieces of them, or create diagrams of them, (or I could do that myself later). Those derived items could lead to other discussions and so on in some organic way where people did not feel put upon from seeing long essays. One can say such long things should go on a blog, and maybe they should, but then that misses something Slashdot has. Still, maybe, in that sense, a sea of interlinked WordPress blogs is really what Slashdot should become (as a first cut)? Is the free-ranging webforum itself increasingly obsolete? If not, how could it be better (other than Beta)?

As my Slashdot user ID suggests (109597), I've been on the site since near the beginning, around 1997; I started reading it when I was a contractor at IBM Research -- yes, it was mostly work-related research and education. :-) I lurked for a year or so before getting this ID. I have not posted in the last five months (since March 2015) for a few reasons. I've also (almost entirely) avoided reading the site since then. I saw this article on the day it came out because I peeked at Slashdot again looking for comments on Intel's new 3D memory technology, whose announcement was all over the place, and I wanted to see what people here thought about it. I've been thinking about it for the past week or so.

The main reason for my avoiding Slashdot during the last few months was to focus on getting some software written, especially as our finances dipped from cash into credit as our software project dragged on and on. That just shows how powerful an influence Slashdot still has over me -- like an alcoholic trying to stay dry. :-) I've been turning to Inhabitat for my news fixes once a day or so (a mostly optimistic site about green design), although I don't post there and there are only about 100 comments to read -- or at least there were, but now somehow there are 551 -- some weird interface or database glitch?

As a result of abstaining from Slashdot for almost six months, :-), yesterday my wife and I put up this new GitHub project called "NarraFirma". It is a single-page JavaScript application supporting Participatory Narrative Inquiry (PNI) as described in her free book, which we have been working on for about a year:

That application supports multi-user editing using a version of the Pointrel system that supports distributed messaging using a triple store. It's not "done", and has many issues and TODOs, but at least it is useable either as a NodeJS application or a WordPress plugin.

Letting myself post this to Slashdot again right now is sort of my reward for all that progress -- even as it is another lost morning. :-)

A lesser reason to stop posting was this reply by "swell" to a comment I posted related to technological militarism -- which may have struck a nerve with someone who says elsewhere he "was one of the first from the US in Vietnam":

While I discarded his specific hedonistic advice, and I disagree with aspects of the extreme fatalism in his post ("it won't make any difference" vs. perhaps "it won't make *much* of a universal difference, but it will still matter a lot to you and your local community in the near term"), nonetheless his negative feedback crystallized the fact that my postings on Slashdot had been facing diminishing returns. If my Slashdot posts were starting to stand out, that was in part a function of much of the rest of the community fading away. But I won't disagree that I was posting too much on the same topics -- as important as they were: vitamin D deficiency, vegetable phytonutrient deficiency, advanced technology's effect on employment, basic income, the irony of tools of abundance like advanced computers and better materials being used in militarism that assumes scarcity, etc. And to not much good effect for me or the community at that point. True, I linked to my personal site (which has no ads or such), but in order to inform, even if maybe the message was ignored. So, thanks "swell" -- for helping me realize that.

Coincidentally, I also signed up as a Reddit user yesterday, although that was just to post a couple of links to my writings to subreddits on Technostism (seen mentioned at e-catworld) and Basic Income. Just more of the usual, but in a different place. I don't plan to participate much on Reddit, though (I don't have the time). And looking for those items on the main Reddit page, one can see how in seconds they are buried under many other posts on endless different topics -- like "drinking from a firehose". Compared to the web of 1997, it's much harder these days to put a signal through all the noise...

Anyway, I don't know if I'll ever post on Slashdot again regardless of how it gets sold. I had used Slashdot for more than a decade as sort of a mix of a blog for me and a way to interact with a community of technologists -- and maybe hopefully to even help educate the next generation of technologists. I miss Slashdot, but that is maybe also missing a Slashdot that "was", as well as a life situation with enough time to participate in that community, and a certain hope there was still time to make a difference. I can also be worried about reverting to my own old ways and spending too much time here I should be spending in other ways. That can be true even if I now feel increasingly out of touch with things like, say, current fast-moving trends in UNIX system administration (although there are many news sources for that kind of info these days). I've changed; Slashdot has changed; the world has changed -- hard to make sense of all those changes or what they imply. But I can still be very thankful for all that the Slashdot community has given me over the years through many discussions.

So, is this post, "So long, and thanks for all the fish"? Dunno. Maybe.

Here is one last article submission for the road, at least: :-)

In any case, whatever happens to the community, the Slashdot archive remains of historical interest IMHO, and should be preserved by a group like Archive.org. Perhaps the influence of one of my college professors (the late historian Michael S. Mahoney, who wrote on the history of computing) has rubbed off a bit? :-) I'd love to curate such a site and build tools to study it. I'm a trustee of my local historical society. We focus mostly on history of about 150 years ago (such as farming, homesteading, and simple manufacturing) in a couple of museums. Increasingly I'm thinking of computing as history (e.g. my old Commodore VIC-20) -- even if I doubt I could ever sell that idea to the current board, who still see computing as a mostly future thing. :-) Even if the people and artifacts of the social transition from the pre-computing/pre-internet era to what we have now and beyond may be fading as I write... Such is perhaps the nature of history, with several tales of old historical computers just carted off as scrap... Although in this case we're just talking about preserving access to data, and supporting annotating it, ideally in a distributed way, which is probably overall cheaper than running big physical facilities open to the public... Still, no doubt there are privacy issues and so on. Few worthwhile projects seem easy, otherwise they would have been done already. Sure, there is a "Computer History Museum" in CA -- but what would the world be like with just one museum about art or early farming? Still, Archive.org is accessible globally, so it seems like a good potential home for the Slashdot archives, however the Slashdot community/infrastructure transforms in the future.

As always, I'll close with my sig, in case some huge AI reads this someday and maybe gains some insight from it (maybe, like a toddler with nuclear bombs as toys, only after wiping out its physical humanity "parents" either accidentally or on purpose):
"The biggest challenge of the 21st century is the irony of technologies of abundance in the hands of those thinking in terms of scarcity."

Submission + - Why Are There Still So Many Jobs?

Paul Fernhout writes: MIT economist David H. Autor has written an article entitled "Why Are There Still So Many Jobs? The History and Future of Workplace Automation". His article is a good read to understand the best of emerging mainstream economics thinking on technology and employment.

I feel his article leaves out some fundamental political aspects of the situation, like those I brought together in "Beyond a Jobless Recovery: A heterodox perspective on 21st century economics". His article of course assumes consumer demand is infinite (despite Maslow's hierarchy of needs suggesting people move to more low-cost self-actualization activities over time). It assumes that the business benefits of employing a human will always outweigh the costs for many jobs (despite strikes, lawsuits, quality, illness, turnover). It assumes humans will always have special advantages over AIs and robots. It ignores whether some aspects of the economy (like long pipelines to become a professor) are really needed or are just protectionism. It ignores the social impact of rich/poor divides on working conditions and the operation of a capitalist economy itself. It ignores the value to the worker of the intrinsic nature of the work (i.e. some people may just be less happy in service jobs compared to agriculture or manufacturing). It ignores deeper issues of rethinking work as play (like Bob Black wrote about). It also ignores (incidentally, in relation to humans vs. robots) that "comparative advantage" only applies theoretically when you have "full employment". The article jumps between proving some points with numbers and then making other points as "strong hunches" or by quoting suggestions about technological unemployment from fifty years ago (quoting Herbert Simon). His prescription is of course mostly just more "education" -- which is nice job security for a professor. :-) But, within those sorts of limits, it's an excellent article which makes many good points, especially about the dynamics of economic networks as different parts of them are automated. The article has many interesting facts and figures. His point about how jobs are a mix of tasks with different near-term prospects for automation is excellent.
And his point about human jobs changing as people work together with automation is well made. So, his article provides a good base for further study and/or rebuttal of the mainstream position. His article could be a good starting point for anyone writing an economic simulation, to see what really happens to economic networks based on distributing the right to consume based on perceived contribution to production as such networks undergo severe stress from automation.
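For anyone curious what such a simulation might even look like, here is a minimal toy sketch (all numbers and mechanics are my own inventions for illustration, not anything from Autor's article): workers earn income only from tasks not yet automated, so total output can rise while the "right to consume based on perceived contribution" shrinks.

```python
# Toy model of an economic network under automation stress: automated
# tasks produce more output per task, but all wage income (the right
# to consume) flows only from the tasks humans still perform.
def simulate(n_tasks=100, steps=10, automation_per_step=10,
             productivity_gain=1.5):
    """Return a list of (total_output, wage_income) tuples per step."""
    history = []
    automated = 0
    for step in range(steps):
        human_tasks = n_tasks - automated
        output = human_tasks * 1.0 + automated * productivity_gain
        wages = human_tasks * 1.0  # only human tasks pay wages
        history.append((output, wages))
        automated = min(n_tasks, automated + automation_per_step)
    return history

history = simulate()
# Output rises monotonically while wage income falls -- the "severe
# stress from automation" on consumption described above.
```

Even this crude version shows the qualitative tension: by the last step, output is up roughly 45% while wage income has fallen by 90%, under these made-up parameters.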

Comment Re:The poor and CO2... (Score 2) 248

Nothing happens for some plant types, and even the authors of this study said "may". They had good reason to.


To quote from its abstract:

The consensus of many studies of the effects of elevated CO2 on plants is that the CO2 fertilization effect is real (see Kimball, 1983; Acock and Allen, 1985; Cure and Acock, 1986; Allen, 1990; Rozema et al., 1993; Allen, 1994; Allen and Amthor, 1995). However, the CO2 fertilization effect may not be manifested under conditions where some other growth factor is severely limiting, such as low temperature (Long, 1991). Also, plants grown in some conditions, where limitations of rooting volume (Arp, 1991), light, or other factors restrict growth, have not shown a sustained response to elevated CO2 (Kramer, 1981).

Note well that again they use the term "may". This is because -- unlike you -- they seem to recognize that even though the effect is real and will have an impact in many locations and conditions (including those that generally hold in agriculture, where one avoids growing plants in strongly resource-constrained environments), one can certainly suppress the effect, or fail to observe it in the wild, in specific environments. They go even further and note that the effect is differential according to plant type, with some plant types more likely than others to exhibit a stronger response or to be resource limited.

The bulk of this report simply works through specific food crop species and estimates their likely response to a mix of increased CO2 and the imagined climate changes that are predicted, or projected, or prophesied (as you wish) by the GCMs that so far haven't done a very good job of PP or P-ing the climate.

You would obviously like more papers:



(Abstract: Satellite observations reveal a greening of the globe over recent decades. The role in this greening of the “CO2 fertilization” effect—the enhancement of photosynthesis due to rising CO2 levels—is yet to be established. The direct CO2 effect on vegetation should be most clearly expressed in warm, arid environments where water is the dominant limit to vegetation growth. Using gas exchange theory, we predict that the 14% increase in atmospheric CO2 (1982–2010) led to a 5 to 10% increase in green foliage cover in warm, arid environments. Satellite observations, analyzed to remove the effect of variations in precipitation, show that cover across these environments has increased by 11%. Our results confirm that the anticipated CO2 fertilization effect is occurring alongside ongoing anthropogenic perturbations to the carbon cycle and that the fertilization effect is now a significant land surface process.)

Probably the best review article on the effect on trees, in particular, is this:


where in laboratory experiments on trees increasing CO2 by 300 ppm increased growth by 50 to 60%. Idso remarks that the problem with laboratory experiments is the opposite of what you assert -- it is difficult to grow trees in the lab without constraining their roots and access to resources and work he cites (in less abundance as it was ongoing in 1993) suggested that the response in the wild is even higher.

In general, in the mean, increasing ONLY CO2 in the environments of most wild plants does, in fact, increase their biomass and the net biomass of the Earth has almost certainly substantially increased on average, allowing for changes in land use over the last century. The effect is pronounced and relatively enormous in trees (and yes, I can cite papers there too -- a recent study at Exeter found a strong increase in tree growth rates across all of Europe due to increased CO2 fertilization). If you want to address crop species, you really need to look at crop yields where the plants are fertilized and irrigated as needed or grown in environments with consistently adequate rainfall as is the practice over a rather substantial majority of the world's agricultural production. There the benefits are clear, consistent, and large in the lab, and there is little good reason to doubt an average positive response in the field even though measuring the response with all of the confounding factors is difficult. Even without fertilization and irrigation, since nitrogen fixing species themselves respond positively to CO2 and since one response to CO2 is improved respiration efficiency and water retention, one usually gets some positive response, but in drought constrained, fertilized environments one can sometimes see no or even negative responses because CO2 aside it is a bad idea to overfertilize or outgrow your water supply.

Now that we've established that the literature on this -- which you clearly have not read outside of a single google search or the moral equivalent thereof -- is widespread and not limited to a single Nature article on wild grasses (which are one of the least likely species to benefit from increased CO2 because grasslands are often grasslands because they are resource constrained) let's address the effectiveness of your debate style. Do you really think ad hominem improves your argument? I have at no time asserted that I am the "smartest motherfucker in the room", and calling somebody "Mr. Smarty-pants" is the act of a child in a child's argument, not an adult. Asking for references is perfectly reasonable, although they aren't terribly difficult to find on your own, but seriously, slashdot is already way too much like kindergarten as a debate forum to make it even worse. I would respectfully suggest that you respectfully reply, even to individuals that you think are wrong, and that you make an effort not to argue like a five year old.

To conclude: in all of your replies so far, outside of your belittling language and sarcasm, you have made a single categorical statement -- that CO2 will have and has had no effect on plant growth or crop yield or biomass increase because plant growth is "almost never" CO2 limited -- a claim that, I trust, I have just proven to be pure bullshit, at least as far as the vast majority of the literature on the subject is concerned. Again, this isn't really even a matter worthy of contention -- in C3 plants it increases respiration efficiency and water retention (making better use of even constrained water resources, hence its impact at the boundaries of deserts), it increases photosynthesis rates and efficiency, and it increases the rates of nitrogen fixation in nitrogen fixing species (again directly feeding back into a potentially rate limiting resource). It is standard practice to boost CO2 in greenhouses, the positive effect on the biosphere is clearly visible from satellites, the measured effect is huge in trees and is clearly observed in natural tree populations, and the variability of the effect across C3 species (where it is larger for tuberous plants, for example, than thin-rooted plants) is established. The effect is nearly always positive, although of course it is still being studied and can be neutral to negative in certain environments.

In actual agriculture the effect is almost always going to be beneficial and fairly substantial, supporting my claim that it has had a fairly substantial positive impact on global food production as the planet has gone from a CO2-limiting 280 ppm to 400 ppm.

Bear in mind that for most of the geological history of the Earth, CO2 levels have been far higher than they are today, and that during the last glacial period partial pressures dropped to just above the edge that would have caused mass extinction of at least some plant species. Plants in general are well-adapted to use more carbon dioxide than the atmosphere contains right now given "average" levels of water and nitrogen and other natural constraining resources, and this can hardly be a surprise as it makes perfect sense. I realize that this is an inconvenient fact if your only interest is to demonize CO2 and belittle anyone that doesn't agree with you, but as I said in the top comment that you are replying to, opponents of carbon dioxide are happy to count the "death toll" of climate change on the extrapolation of the most tenuous of evidence, such as assertions of an empirically resolvable increase in the frequency or violence of tropical storms or extreme weather events, while ignoring and shouting down any related extrapolation of benefits, even when they are backed by rather solid evidence (such as, almost the entire literature on the subject of the effect of carbon dioxide on plant growth).

I don't know what your profession might be, NeutronCowboy, but if it is a scientific one you might meditate on the importance of maintaining an open and balanced mind when considering even things you feel very strongly about. I'm just sayin'...


Comment Re:The poor and CO2... (Score 1) 248

Plant growth is almost never CO2 limited outside of the lab.

This is, with all due respect, simply not true. It not only isn't true, it is documented in multiple textbooks as not true, backed by many studies of both lab and field data. Not only is it not true, it is differently untrue for different kinds of plants -- some of them are more likely to be carbon limited than others and respond more strongly to the more than 40% increase in CO2 from 280 ppm to 400 ppm. It isn't true in multiple dimensions -- it alters plant respiration and water retention, it alters the quantum efficiency and temperature optimum of photosynthesis, increases the rates of nitrogen fixation (altering directly one of the other rate limiting resources), and increases the nitrogen efficiency of woody and other C3 plants. One of the more interesting ways it is not true is that the pattern of tree ring growth around the world has been altered (increased) by the increased CO2. This is one of many confounding effects in the use of tree rings to infer past temperature, and is sadly often neglected (or breezed on past) simply because it is, by hypothesis, covariant with the temperature (making causality very difficult indeed to disentangle). You might want to talk to some agricultural scientists and biologists before shooting from the hip on that one, cowboy -- it was first pointed out to me by a soil scientist in the Duke School of the Environment as not even something worthy of contention, a simple well-known fact.

It is, however, something that is frequently stated by those that want to completely discount any possible benefit from CO2 in order to demonize it and to enhance their political argument that we have to reduce its production at all costs because it supposedly has no benefits and poses not mere "risks" but certainties of disaster if we don't.


Comment The poor and CO2... (Score 1) 248

Sadly -- and I do mean sadly -- the effect of CO2 on "the poor" is never accurately or fairly tallied.

If it were, the tally would have to begin with the massive amount of greenhouse research on the positive effects of CO2 on plant growth, research that demonstrates (for example) that it is easily cost-beneficial to buy apparatus to maintain a CO2 concentration over 1000 ppm in actual greenhouses. By raising atmospheric CO2 from 280 to 400 ppm, we have in fact raised crop yields worldwide by between 10 and 15%. Close to 1 billion people dined last night on the extra crop yields (all things being equal) produced by the extra carbon dioxide. It is difficult to put a price tag on this generally neglected benefit, simply because it is so enormous. It is a benefit not only to humans -- it pervades the entire biosphere, with very few exceptions. Different types of plants don't all respond equally to increases in CO2, but they all respond positively, and everything from the grass in your front yard to food crops to trees is growing faster and more every year. It also has secondary benefits -- plants raised in a CO2-rich environment tend to be more drought resistant, as the extra carbon dioxide reduces the plant's need to respire, so it retains water longer. There is evidence that this is already impacting deserts by greening their edges.

Then there are the benefits of the electricity produced. So far, the benefits of making electricity (and other products) by burning coal have included things like "building civilization". The lack of the benefits of electricity is one of the fundamental things that make the global poor (the poorest third of the world's population) poor in the first place. Cheap and abundant electricity means clean water, sewage treatment, inexpensive fertilizer, cooking on something other than dried dung or charcoal, light after dark, refrigeration, transportation, jobs and manufacturing, health care, and access to communication, education, information, and entertainment. At the very least, the lack of reliable and affordable sources of electricity means the general lack of most or all of these things. The people reading this post (many of whom will, I'm sure, already be gathering their nuclear device flames :-) would, I would wager, forgo flaming this post if the cost of doing so would be spending one single month living in a mud hut in north India without electricity or any single one of the products electricity enables (such as clean water).

Anything that raises the cost of electricity and imposes barriers to its cost-effective implementation in the world's poorest countries has the direct and immediate effect of hurting the poorest people of the world far more than all of the "climate change" that has thus far been attributed to increased carbon dioxide in the atmosphere, even if you are willing to attribute every single storm or heat wave to climate change instead of acknowledging that the data on storms itself (over a pitifully short interval of accurately recorded history) provides almost no evidence for change, let alone attributable negative impact.

One can easily understand why China and India are investing in coal burning power plants at a ratio of something like two parts new coal generation capacity to one part everything else (including nuclear) put together. Unlike Mr. Gates, they can do the human arithmetic. Even though their coal plants are comparatively dirty and have directly observable negative impacts, those impacts pale beside the benefits of the reliable electricity they produce, and like it or not, wind and solar are thus far neither reliable (in terms of having a high quality of service duty cycle) nor (generally) cost effective when directly compared to the delivered cost of coal generated electricity. If they were, China would invest even more heavily in them as they are doing the math without the saving-the-world sentiment.

With all of that said, it is still absolutely worthwhile to continue to invest in research that can improve the cost-efficiency and reliability of sources of energy other than coal. Coal has value beyond just making electricity -- concrete, steel, chemicals, liquid vehicle fuels -- there are many things that coal is good for (some of which produce CO2), and it is a finite resource worthy of being conserved. It would also be nice to put human civilization on a long-run sustainable basis, which ultimately means thermonuclear fusion, thorium fission, and some mix of solar and wind with the means to store and distribute their energy that is currently lacking. At some point we might advance the technology to where adopting it isn't basically investing at an economic loss, with a subsequent and substantial cost in human economic and social comfort, at which point one won't need to make it a "save the world" issue to ensure its rapid deployment.

But in the meantime, it is pointless to demonize CO2 per se on the grounds that burning coal to make concrete and electricity is somehow hard on the poor. It is not. Quite the opposite. At the moment, coal and the CO2 produced from burning it have so far been so overwhelmingly, civilization-building, poverty-ending positive that it is IMO very difficult to point to one single thing in all of human history that has had such a positive impact on the human species -- perhaps the invention of the microscope, the development of physics, calculus, and the scientific method that gave rise to the current electrical grid (with its predominant dependence on coal), or the invention of "human rights"; little else. Indeed, IMO it is reasonably likely that if mankind were presented with a clear picture of the advantages of a 400 ppm CO2 atmosphere relative to a 280 ppm CO2 atmosphere, including all impacts on climate and the biosphere and the value of the civilization coal built and continues to sustain, they would vote tomorrow to do it all again.

Perhaps -- and it is at most perhaps, with lots of questions and uncertainty from top to bottom -- CO2 is a future risk that will somehow manage to create costs that overwhelm its immediate and obvious cumulative past and net present benefits so far. It is actually a bit difficult to imagine how, given that the benefits so far include building civilization as we know it and feeding a billion people a day, but perhaps. This justifies prudent investment, but it does not justify -- in my opinion -- the in-perpetuity overstatement of colorfully amplified risks and the complete ignoring of its plainly obvious past and present benefits, or the distortion of the certainty of the science upon which predictions of disaster are based. It also doesn't do to ignore the fact that most of the worst-case scenarios used to "sell" draconian and expensive "solutions" to a problem that is nowhere near as clearly defined as it is presented to be are themselves arguably rather unlikely.


Comment Flexibility is a feature (Score 1) 904

Electric car advocates continually make the flawed argument that because an electric car can have a daily range of 200 miles or so, it can replace the gasoline car for most users. This isn't true at all. People pay for gas cars not just to be commuter appliances, but to have transportation flexibility. Flexibility matters to a lot of people; even if they don't use it, it matters. It's nice to know that if I wanted to, I could drive my gas car the 790 miles to my in-laws' house, or 200 miles to my brother's, or 500 miles to my aunts' and uncles'. It may be cheaper for me to take a plane by myself, but add a wife and a couple of kids and my transportation cost for each trip is about $100-$150 in fuel plus my time spent driving.

So, with that in mind, I think the real tipping point for electric vehicles will be total operating time on a charge. That means I want to be reasonably able to drive 10-12 hours on a long road trip with perhaps an hour of total charging time. Once that happens, electric cars will take over for everyone.
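To put rough numbers on that tipping point (the highway speed, consumption figure, and charge-stop logic below are all my own illustrative assumptions -- real vehicles and chargers vary widely):

```python
# Back-of-envelope: what does a 10-12 hour road trip demand of a
# 200-mile EV if total charging must fit in about one hour?
highway_mph = 65
trip_hours = 11
trip_miles = highway_mph * trip_hours       # 715 miles
range_miles = 200
kwh_per_mile = 0.30                          # assumed consumption

# Legs of at most one full range; the first leg starts fully charged.
legs = -(-trip_miles // range_miles)         # ceiling division
stops = legs - 1
energy_to_add_kwh = (trip_miles - range_miles) * kwh_per_mile
avg_charge_kw = energy_to_add_kwh / 1.0      # all stops within 1 hour
print(f"{stops} stops, {energy_to_add_kwh:.0f} kWh to add, "
      f"~{avg_charge_kw:.0f} kW sustained")
```

Under these assumptions the trip needs roughly 150 kW of sustained average charging power across three stops, which is why charging speed, not just range, is the number to watch.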

With that said, for a married family with two vehicles, having one for road trips (an SUV) and an electric daily commuter makes a great deal of sense. But most families are going to have just that "one" vehicle.

"Pok pok pok, P'kok!" -- Superchicken