
Comment: Re:FONC: Fundamentals of New Computing -- Alan Kay (Score 1) 367

by Paul Fernhout (#47522231) Attached to: 'Just Let Me Code!'

I certainly understand the frustration of dealing with overly complex systems, or with a rushed language like JavaScript where variables are global by default. Still, did you know that you can compile almost anything to JavaScript these days and have it run in the browser at near-native speeds (well, in some browsers, but likely more and more)? See: http://techcrunch.com/2013/12/...
"While Google is betting on Native Client to allow web apps to execute native compiled code in the browser, Mozilla is betting on its ability to run JavaScript at near-native speeds, too. While they approach this problem from very different angles, both Google, through Native Client, and Mozilla, through its Emscripten LLVM-to-JavaScript compiler, allow developers to write their code in C or C++ and then run it in the browser."

So, JavaScript is just another platform now in that sense. But it is a platform available almost everywhere that significant human interaction takes place... And installing JavaScript software is often as easy for the end user as surfing to a web page. If people don't actually install your software, what good is it?
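As a rough illustration of why compiled-to-JavaScript code can run at near-native speed: Emscripten-era compilers emit a typed subset of JavaScript (asm.js) whose integer coercions let engines generate efficient machine code. The sketch below is hand-written in that style for illustration, not actual Emscripten output:

```javascript
// Hand-written sketch of asm.js-style code such as an Emscripten-like
// compiler might emit: integer math is pinned down with |0 coercions
// so the engine can compile it ahead of time to fast machine code.
// (Illustrative only; real compiler output is far more involved.)
function makeModule(stdlib, foreign, heap) {
  "use asm";
  function sumSquares(n) {
    n = n | 0;                              // coerce argument to int32
    var i = 0;
    var total = 0;
    for (i = 0; (i | 0) < (n | 0); i = (i + 1) | 0) {
      total = (total + ((i * i) | 0)) | 0;  // all arithmetic stays int32
    }
    return total | 0;
  }
  return { sumSquares: sumSquares };
}

var mod = makeModule({}, {}, null);
console.log(mod.sumSquares(10)); // 0*0 + 1*1 + ... + 9*9 = 285
```

Even engines without a dedicated asm.js optimizer run this as ordinary JavaScript, which is what makes it a portable compilation target.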

Does JavaScript have problems? Yes. Tons. But it also has a lot of merits.

As for gibberish -- Cantonese sounds mostly like gibberish to me, but that is because I never learned to speak it. :-)

Would I like a simpler software stack and simpler but better languages and tools? Yes. I helped fight that battle over a decade ago, with the likes of VisualWorks and Squeak, and we lost to Java, PHP, and JavaScript and their associated tool chains. Squeak showed what was possible, but almost no one would install it (the confusion over Squeak's licensing did not help there either). See also:
http://bitworking.org/news/290...
"Regular readers are quite tired of me pointing to this video, Alan Kay: The Computer Revolution hasn't happend yet. Keynote OOPSLA 1997, but I think it's quite fundamental to understand that Alan Kay had a vision for the web, and though his understanding of the role of HTML in the world of 1996 was flawed, it seems the collective web has spent the last ten years building exactly what he described, with HTML/SVG being the display substrate and JavaScript being the code to drive that display. Ten years later we have the Lively Kernel: ..."

But the past is the past. We have to start from where we are -- and today, people live in their HTML5/CSS3/JavaScript browsers, or will soon (even on phones, and especially on the emerging Firefox OS phones). I'm writing this from a US$250 Chromebook whose user experience is almost entirely just a browser. Any sufficiently sophisticated system could eventually remake the stack underneath it, as native Squeak could. So, saying we could build on JavaScript does not mean endless perpetual complexity. Chrome OS shows how a focus on HTML/CSS/JavaScript can actually simplify things, at least from one perspective, especially the user experience.

There are several issues related to complexity (inherent complexity, accidental complexity, user expectations, installability, standards, etc.). Regardless of technical merit, HTML/CSS/JavaScript/PHP won in key areas. Sure, you can try to create the next Squeak -- good luck with that; it would be a fun project. And/or we can make the best use we can of the current widespread platform. For me, right now, that means minimizing the backend (PHP or whatever) while emphasizing the front-end (JavaScript), and ideally encoding data in forms useful for the long term.

And for those who like their Smalltalk deployed as HTML/CSS/JavaScript, see:
http://amber-lang.net/
"The Amber language is deeply inspired by Smalltalk. It is designed to make client-side development faster and easier. Amber includes a live development environment with a class browser, workspace, unit test runner, transcript, object inspector and debugger. Amber is written in itself, including the compiler, and compiles into efficient JavaScript, mapping one-to-one with the JS equivalent."
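As a concrete (though invented) illustration of what a one-to-one Smalltalk-to-JavaScript mapping might look like -- this is a hand-written sketch in the spirit of Amber, not Amber's actual generated code:

```javascript
// Hypothetical sketch of how a Smalltalk message send could map
// one-to-one onto JavaScript. The naming convention here is invented
// for illustration and is not Amber's real compiler output.
//
// Smalltalk:   3 max: 4.
// might compile to a call on the receiver's class, with the keyword
// selector max: mangled into a plain JS method name like _max_:
var SmallInteger = {
  _max_: function (self, other) {
    return self > other ? self : other;
  }
};

console.log(SmallInteger._max_(3, 4)); // 4
```

The point is only that each Smalltalk construct can have a direct, debuggable JavaScript counterpart, which is what makes a self-hosting compiler like Amber's practical.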

As Manuel De Landa wrote:
http://www.t0.or.at/delanda/me...
"Certain standardizations, say, of electric outlet designs or of data-structures traveling through the Internet, may actually turn out to promote heterogenization at another level, in terms of the appliances that may be designed around the standard outlet, or of the services that a common data-structure may make possible."

And that is what we are seeing with JavaScript now, where you can run anything from a KIM-1 to a Commodore C64 to Linux on it:
http://www.robsayers.com/jskim...
http://www.kingsquare.nl/jsc64
http://bellard.org/jslinux/

Comment: Serval Mesh Networking for Android (Score 3, Informative) 59

by Paul Fernhout (#47521955) Attached to: How the Internet of Things Could Aid Disaster Response

From: http://www.servalproject.org/ and http://developer.servalproject...
---
"Serval Mesh is an Android app that provides highly secure mesh networking, voice calls, text messaging and file sharing between mobile phones using Wi-Fi, without the need for a SIM or any other infrastructure like mobile cell towers, Wi-Fi hotspots or Internet access."
1. Communicate anytime
Mobile phones stop working when cellular infrastructure fails. The Serval Mesh changes this, allowing mobile phones to form impromptu networks consisting only of phones. This allows people nearby to keep communicating when needed most.
2. Communicate anywhere
Cellular networks are not available everywhere. In Australia for example, around 75% of the land area lacks mobile coverage. Letting mobile phones form stand-alone networks provides a cost-effective solution for communities in these remote areas to enjoy mobile communications.
3. Communicate privately
In this modern world private conversation with friends, families and service providers is vital, whether discussing medical issues or other private subjects. The Serval Mesh is built on a foundation engineered to support security. Voice calls and text messages are always end-to-end encrypted using strong 256-bit ECC cryptography. Encrypted calls work even on low-cost Android phones.
4. Communicate with people
The Serval Mesh is about enabling people to communicate with one another, regardless of what circumstances may befall them, or where they live in the world. Because at the end of the day, relationship with one another is what life is all about.
---

Serval was one of the first things I installed on a trio of cheap Android phones I bought for Android development and testing purposes several months ago (the Kyocera Hydro phones themselves ranged from US$35 to $55 each). It still has rough edges, but it is getting there.

The Serval project is also working towards cheap rugged repeaters. "The Serval Mesh Extender is a hardware device that helps other devices to join and participate in a Serval Mesh network. ... Mesh Extenders mesh together over short distances using Ad Hoc Wi-Fi, over longer distances using packet radio on the ISM 915 MHz band"

I suggested related ideas back around 2000 based on two-mile range radios:
"[unrev-II] The DKR hardware I'd like to make..."
http://www.dougengelbart.org/c...

These devices would be very cheap insurance to make sure people can communicate in an emergency; these days one would not cost much more than a decent US$100 "weather radio", even with basic smartphone features...

Comment: Solutions -- require tokens & connection phase (Score 0) 125

by Paul Fernhout (#47521847) Attached to: The Psychology of Phishing

In the past, I used whether an email contained my first name as an indicator (a textual token) of whether the email was legitimate -- a sort of password to gain access to my attention. That stopped being useful several years ago, as many spammers must now have a name database to go with email addresses. That approach also would not work for people whose entire first name is in their email address, as is often corporate practice. Still, the idea of filtering email on a token can make sense, where the token says the sender has been authorized to send you email.

I still have filters for certain keywords, like products I support, as a way of doing some of this filtering. A next step could be to tell people (on a contact web page) that, as a first-time sender, they need to include some token phrase like "swordfish" in any email to you if they want it to get read. Or the token could be a random UUID like "f34f775b-3ccb-45e0-a75e-06f845f0c318". It is relatively easy to make filters in many email clients that would prioritize emails with an expected token. After you get such an email from someone the first time, you can whitelist the sender. Granted, phishing or spam often forges sender email addresses. So, there is a problem here: the validity token ideally should be in every email sent to you, to avoid relying on whitelisted addresses.

Ideally, there could be one unique token per entity (or email address) you want to get emails from. Then you could selectively disable and change the token if spammers got one. These tokens then are specific to an allowed communications channel. That requires more complexity though. For example, when you signed up for a mailing list, you could give the list a token such as the above (or perhaps just accept a random one from the list signup procedure), and the list software would store that token to include in a header when it sends a message to you. You would also tell your email client about the token being associated with the sender somehow (either the email address or the sender name or perhaps some other unique sender identifier like a public key). When your client software receives email, it would check if the email has the expected token for the sender. If the email does not have the token, it would be marked as probably spam or phishing. Email tools would need to have this facility built into them, both for sending and receiving. Public mailing lists might need to filter out such tokens from their public web pages of email archives to prevent spammers from harvesting such data to spam the list.

Still, how can people contact you the first time? One answer is to separate the process of getting emails from a trusted source from the process of requesting a token. For example, when someone new wanted to contact you, they could need to go to a web page (or other means) and get a token for their sending email address (or other identifying information, like a public key). That web page might include some sort of captcha challenge or something requiring computational cost or even direct monetary cost (like a small amount of money required to be spent via Paypal or another service, perhaps as a donation to a favorite charity). A web form to do this might need to send a special email to your client that includes both its own token and the new sender and new token, which would need to be processed by your email client to make the association.

This would be a big difference from now, when the first contact you get from someone new might be directly via a new email which could be the spam or phishing attempt. Tokens could also be valid for a limited time. There could even be general tokens not associated with a specific email address, perhaps time-limited ones, ones that need to be paired with other tokens or perhaps topical key words (like a product name) to be considered valid. This does make it harder for senders to send emails, but it makes it more likely they will be read and not ignored as spam.

One advantage of this system is that it could build on top of the current email architecture. Clients that don't support such tokens would just not work as well, because they would not be able to distinguish spam or phishing as well, and emails they send without such a token would be more likely to be ignored.

There are other approaches, of course, to reduce spam and phishing. There are already mail receivers that will ask a first-time sender to confirm their identity, or that do greylisting. There are domain-oriented solutions too, like DMARC: http://www.dmarc.org/

But a token-based system just seems appealing to me because it sort-of worked for me for a time (based on my first name). I'm sure spammers or phishers would find some new way to get around these channel tokens eventually, but it seems like a next step in the evolutionary arms race of validating wanted communications. If tokens were associated with public keys which represented identities, a next step could be to sign the entire email or at least the token and a timestamp and message ID with the associated private key. This approach relies also on the fact that much email being sent is now encrypted all along the way.

There is at least one big flaw in this approach though. What to do about the CC or BCC list? Ideally, each recipient should get a token specific to the recipient. That seems to imply each recipient will have the email sent directly to him or her or it. But that is not how email servers work now, where one server might dispatch a single email to multiple destinations. So, still things to think through, including how much email servers should get involved with such tokens.

Paul Jones has his #noemail campaign as one potential solution to email woes, but I feel that throws the locally-stored email message center baby out with the spam & tl;dr bathwater. Paul Jones pushes social media and blogs as an alternative to email; these have their merits (see below at end), but they are suffering from more and more spam too. Web solutions also make it harder to have a local copy of correspondence. For example, I have many years of emails in my local email archive, which I can search and review locally, including mail to mailing lists, but I have little history of what I have read or posted via the web. Still, private email has its limits. As with my immediately previous Slashdot post referencing a post to the FONC mailing list, I can find some of my emails to public lists via Google. Using Google to find emails I wrote to public lists can be easier than searching my email archive, which is on a different machine than the one I often post to Slashdot from. And it is great to be able to link to such posts via a URL. I've started saying, "if it does not have a URL, it is broken". Ultimately I feel we need something that combines the best of email and the best of the web -- perhaps a "social semantic desktop" such as I and others have worked towards. That could have a better infrastructure than email, with anti-spam protections built in (including perhaps public key authentication of senders, and perhaps even using DNS records to associate public keys with domains).

Comment: FONC: Fundamentals of New Computing -- Alan Kay (Score 3, Insightful) 367

by Paul Fernhout (#47520095) Attached to: 'Just Let Me Code!'

From: http://vpri.org/fonc_wiki/inde...
---
We are faced with a need for significant action and the odds are stacked against us. Invention receives no attention, and innovation (even when incorrectly understood) receives lip service in the press but no current-day vehicle exists to nurture it. This wiki is an open invitation for talented individuals to pool their energy and collaborate towards fundamentally changing computing.

Over the years many groups have debated how to make progress in computing. There were likely as many opinions as there were people in the debates. Nevertheless personal accounts suggest that initiatives were sometimes reduced to a handful and then pursued with vigour. Consider what could be achieved by following the same pattern today, with the added benefit of doing it as a virtual, distributed team.

Our goal could be to capture the significant ideas and initiatives that we have been exposed to, are aware of, or can discover, distil them into groups, reduce them to a handful of concepts worthy of vigorous exploration, and focus our efforts on these common ideas with the eventual aim of making substantial progress towards finding a common set of fundamentals of new computing.
---

See also: http://vpri.org/fonc_wiki/inde...

A big focus of FONC was on reducing complexity. Smalltalk shows what is possible... But in practice, new languages and new standards often just add more complexity to the mix, and what we often need are better tools for dealing with complexity. Community and trends mean a lot too, as do hireability, ubiquity, and easy installability. So, again, in practice, I'm moving to JavaScript with conceptually simple backends (even in, yikes, PHP) -- inspired in part by Dan Ingalls's own work on the Lively Kernel, which shows what is possible with near-zero-effort-to-install JavaScript apps.

My own thoughts on FONC from 2010:
"[fonc] On inventing the computing microscope/telescope for the dynamic semantic web"
https://www.mail-archive.com/f...
---
Biology made a lot of progress by inventing the microscope -- and that was done way before it invented genetic engineering, and even before it understood there were bacteria around. :-)

What are our computing microscopes now? What are our computing telescopes? Are debuggers crude computing microscopes? Are class hierarchy browsers and package managers and IDEs and web browsers crude computing telescopes?

Maybe we need to reinvent the computing microscope and computing telescope to help in trying to engineer better digital organisms via FONC? :-) Maybe it is more important to do it first? ...

It's taken a while for me to see this, but, with JavaScript, essentially each web page can be seen as something like a Smalltalk ObjectMemory (or a text-based image, like PataPata writes out). While I work towards using the Pointrel System to add triples in a declarative way, in practice the web of calling CGI scripts at URLs is a lot like message passing (just more like the earlier Smalltalk-72 way, without well-defined syntax). So, essentially, a web of HTML pages with JavaScript and CGI on servers is like a Smalltalk system writ large. :-) Just in a very ad hoc and inelegant way. :-)

---

Comment: Look into adequate vitamin D for joint pain (Score 1) 528

And also eating more vegetables and fruits (along the lines of Dr. Joel Fuhrman's or Dr. Andrew Weil's work) to reduce inflammation. You might also be sensitive to some compounds in food, such as those in the nightshade family (like tomatoes), or possibly other things (food additives, etc.).

If you want true alternatives, gold and guns/ammo won't help. All that can be confiscated.

I collected some better solutions at this link and elsewhere on my site:
http://pdfernhout.net/beyond-a...
"This article explores the issue of a "Jobless Recovery" mainly from a heterodox economic perspective. It emphasizes the implications of ideas by Marshall Brain and others that improvements in robotics, automation, design, and voluntary social networks are fundamentally changing the structure of the economic landscape. It outlines towards the end four major alternatives to mainstream economic practice (a basic income, a gift economy, stronger local subsistence economies, and resource-based planning). These alternatives could be used in combination to address what, even as far back as 1964, has been described as a breaking "income-through-jobs link". This link between jobs and income is breaking because of the declining value of most paid human labor relative to capital investments in automation and better design. Or, as is now the case, the value of paid human labor like at some newspapers or universities is also declining relative to the output of voluntary social networks such as for digital content production (like represented by this document). It is suggested that we will need to fundamentally reevaluate our economic theories and practices to adjust to these new realities emerging from exponential trends in technology and society."

Learning more about creating health for yourself falls in part under subsistence production... And also under the gift economy...

Comment: The End of Diabetes by Joel Fuhrman, M.D (Score 1) 253

by Paul Fernhout (#47487407) Attached to: New Treatment Stops Type II Diabetes

Glad to hear about your success story! If you want to take your success to the next level, you may find this of interest:
http://www.drfuhrman.com/shop/...
http://www.amazon.com/The-End-...
"This New York Times best seller offers a scientifically proven, practical program to prevent and reverse diabetes -- without drugs. Diabetes does not have to shorten your life span or result in high blood pressure, heart disease, kidney failure, blindness or other life-threatening ailments. In fact, most type 2 diabetics can get off medication and become 100 percent healthy in just a few simple steps. This book offers no compromises, it is the most aggressive and effective approach to reverse obesity, high blood pressure, high cholesterol, and heart disease; which typically accompany type 2 diabetes. The information about Type 1 diabetes is simply life saving. It is a must read for every diabetic, as well as any nutritionally-aware person wanting to understand the failure of conventional medical care for diabetic treatments and the "no-brainer" of using nutritional excellence, not drugs."

And see:
http://www.drfuhrman.com/disea...

The grandparent poster said quadrupling *vegetables* (many of which are leafy greens, like kale), not "complex carbs"... And there are much healthier things to eat than cancer-implicated processed lunchmeat, if you want to eat meat...

Also, exercise does not help much with weight loss because it stimulates the appetite, even though exercise in general is good for health...

Also, for yet another perspective (on how the recommendations decades ago to avoid fat, on the theory it made people fat, have instead led to an epidemic of obesity and heart disease by leading people to eat too much sugar):
http://healthimpactnews.com/20...

Good luck staying with what is working for you and maybe even going further which might then free up energy for your titanic plans! :-)

Comment: WordPress powers ~20% of web with 257 employees (Score 1) 272

by Paul Fernhout (#47487181) Attached to: Ask Slashdot: How Many Employees Does Microsoft Really Need?

See the chart here: http://automattic.com/work-wit...

Granted there are many people who contribute to the WordPress ecosystem who don't formally work for Automattic given the FOSS nature of WordPress and related plugins. It's just a very different 21st century way of doing business compared to the 20th century Microsoft model, and is doing a better job of bridging the exchange and gift economies (like I talk about on my site).

Automattic, which shepherds the core of WordPress, sounds like a great place to work for people like me who are comfortable working from home. The future for WordPress looks pretty amazing, especially given ever better JSON/AJAX RESTful support for JavaScript-powered frontend apps. See also:
http://inside.envato.com/the-f...
"For those willing to ignore the prevailing opinions in the programming community, Tom Willmot says that WordPress presents developers with incredible opportunities, and a wonderful sense of community: ..."

I've been looking at shifting my own "Pointrel" and "Twirlip" projects, my wife's "Rakontu" and "NarraCat" projects and other similar work (stuff related to participative narrative inquiry, civic sensemaking, public intelligence, social semantic desktop tools, educational simulations, and more) to have JavaScript frontends that use WordPress as an application server backend (rather than have them run stand-alone). That would make it easy for millions of WordPress users who might want such tools to install them as a WordPress plugin with a couple clicks. As Alan Kay said about Squeak, getting people to install anything to try it is hard. Other benefits would include easy authentication support. I expect more and more projects by other people will be moving in that direction. I'm tempted to apply to work at Automattic myself at some point given their FOSS focus. They are also hiring as they got a bunch of venture financing recently. But I would want to make at least a demo of that integration first. I plan on putting such a demo here when it works: http://twirlip.com/
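As a small sketch of the frontend side of that idea: current WordPress versions expose posts over REST at the /wp-json/wp/v2/posts route, so a JavaScript client mostly just needs to build request URLs against it (the site address below is hypothetical):

```javascript
// Sketch of a JavaScript frontend treating a WordPress site as an
// application server via its REST API. /wp-json/wp/v2/posts is the
// standard posts route in current WordPress; twirlip.com here is just
// a placeholder site address.
function postsUrl(siteUrl, { perPage = 10, search = "" } = {}) {
  const url = new URL("/wp-json/wp/v2/posts", siteUrl);
  url.searchParams.set("per_page", String(perPage));
  if (search) url.searchParams.set("search", search);
  return url.toString();
}

console.log(postsUrl("http://twirlip.com", { perPage: 5 }));
// "http://twirlip.com/wp-json/wp/v2/posts?per_page=5"

// In a browser (or Node 18+), a frontend could then fetch and render:
// fetch(postsUrl(site)).then(r => r.json()).then(posts => render(posts));
```

The appeal is that the WordPress backend handles storage and authentication while the plugin's frontend is plain JavaScript talking JSON.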

Of course, JavaScript has problems (globals by default), PHP has problems (such a long list...), and WordPress has problems (no doubt), with many of those problems coming from their historical roots and a need for backward compatibility. But I can't deny all three won some battle for mindshare for whatever reasons (especially ease of initial use), and when you can't beat 'em, join 'em, right? :-) As Manuel De Landa wrote in "Meshworks, Hierarchies, and Interfaces", a uniformity at one level can often in turn support a diversity at the level above it.

See also on the value of having a diversity of programmers of a variety of experience levels in an organization:
http://slashdot.org/comments.p...

What I especially envision is that all those millions of WordPress sites could start talking to each other in interesting ways... See also Theodore Sturgeon's 1950s short story "The Skills of Xanadu" for where it all might lead...
http://slashdot.org/comments.p...
https://archive.org/details/pr...

Or as I reprise here:
http://lists.alioth.debian.org...
"Gold Leader: Pardon me for asking, sir, but what good are semantic wikis and desktops going to be against [that]?
General Dodonna: Well, the Empire doesn't consider a small cgi script on a shared server or desktop to be any threat, or they'd have a tighter defense."

Comment: Cheaper to robo-bulldoze house and reprint it? BI? (Score 1) 508

by Paul Fernhout (#47473469) Attached to: Ask Slashdot: Future-Proof Jobs?

http://slashdot.org/story/07/0...

While I agree with the validity of your points for the next 10 to 20 years, in the longer term better design and better tools will make it cheaper to completely rebuild houses so they are lower-maintenance, more energy-efficient, and easier to clean and maintain by robotics or via modular snap-in replacements, as the grandparent poster suggested. The only reasons not to bulldoze older housing in a world of cheap energy and cheap robotics would be historical preservation or perhaps sentimentality (although virtual reality could address some of the sentimentality aspect).

This is similar to how people are now generally getting rid of old computer equipment (especially cellphones) when a capacitor or battery goes bad, replacing it with something new rather than taking it apart and repairing the component as one might have 50 years ago. "Computers" used to cost millions of dollars and take up whole rooms; now you can put a few in your pocket. I don't know what the equivalent shift for housing is, but we will no doubt find out. Some speculations are VR and pods like in the Matrix or in Marshall Brain's "Manna", or even complete simulation of uploaded humans "living" in silicon RAM instead of air-filled wooden houses.

See also Marshall Brain's "Manna" for a suggestion of how computer-given instructions delivered by wearables could turn almost every profession, even plumbing, into a micromanaged low-wage nightmare before general robotics arrive:
http://marshallbrain.com/manna...
"Depending on how you want to think about it, it was funny or inevitable or symbolic that the robotic takeover did not start at MIT, NASA, Microsoft or Ford. It started at a Burger-G restaurant in Cary, NC on May 17. It seemed like such a simple thing at the time, but May 17 marked a pivotal moment in human history. ... Manna told employees what to do simply by talking to them. Employees each put on a headset when they punched in. Manna had a voice synthesizer, and with its synthesized voice Manna told everyone exactly what to do through their headsets. Constantly. Manna micro-managed minimum wage employees to create perfect performance. The software would speak to the employees individually and tell each one exactly what to do. For example, "Bob, we need to load more patties. Please walk toward the freezer." Or, "Jane, when you are through with this customer, please close your register. Then we will clean the women's restroom." And so on. ... And Manna was starting to move in on some of the white collar work force. The basic idea was to break every job down into a series of steps that Manna could manage. No one had ever realized it before, but just about every job had parts that could be subdivided out. HMOs and hospitals, for example, were starting to put headsets on the doctors and surgeons. It helped lower malpractice problems by making sure that the surgeon followed every step in a surgical procedure. The hospitals could also hyper-specialize the surgeons. For example, one surgeon might do nothing but open the chest for heart surgery. Another would do the arterial grafts. Another would come in to inspect the work and close the patient back up. What this then meant, over time, was that the HMO could train technicians to do the opening and closing procedures at much lower cost. Eventually, every part of the subdivided surgery could be performed by a super-specialized technician. Manna kept every procedure on an exact track that virtually eliminated errors.
Manna would schedule 5 or 10 routine surgeries at a time. Technicians would do everything, with one actual surgeon overseeing things and handling any emergencies. They all wore headsets, and Manna controlled every minute of their working lives. That same hyper-specialization approach could apply to lots of white collar jobs. Lawyers, for example. You could take any routine legal problem and subdivide it -- uncontested divorces, real estate transactions, most standard contracts, and so on. It was surprising where you started to see headsets popping up, and whenever you saw them you knew that the people were locked in, that they were working every minute of every day and that wages were falling. ..."

I hope we get a basic income (BI) and/or other broad economic changes before then. Something I put together on that a few years ago:
http://www.pdfernhout.net/beyo...
"This article explores the issue of a "Jobless Recovery" mainly from a heterodox economic perspective. It emphasizes the implications of ideas by Marshall Brain and others that improvements in robotics, automation, design, and voluntary social networks are fundamentally changing the structure of the economic landscape. It outlines towards the end four major alternatives to mainstream economic practice (a basic income, a gift economy, stronger local subsistence economies, and resource-based planning). These alternatives could be used in combination to address what, even as far back as 1964, has been described as a breaking "income-through-jobs link". This link between jobs and income is breaking because of the declining value of most paid human labor relative to capital investments in automation and better design. Or, as is now the case, the value of paid human labor like at some newspapers or universities is also declining relative to the output of voluntary social networks such as for digital content production (like represented by this document). It is suggested that we will need to fundamentally reevaluate our economic theories and practices to adjust to these new realities emerging from exponential trends in technology and society."

Comment: See also Dr. David Goodstein's 1990s predictions (Score 1) 178

by Paul Fernhout (#47430183) Attached to: Peer Review Ring Broken - 60 Articles Retracted

You make good points. See also: http://www.its.caltech.edu/~dg...
"The public and the scientific community have both been shocked in recent years by an increasing number of cases of fraud committed by scientists. There is little doubt that the perpetrators in these cases felt themselves under intense pressure to compete for scarce resources, even by cheating if necessary. As the pressure increases, this kind of dishonesty is almost sure to become more common.
    Other kinds of dishonesty will also become more common. For example, peer review, one of the crucial pillars of the whole edifice, is in critical danger. Peer review is used by scientific journals to decide what papers to publish, and by granting agencies such as the National Science Foundation to decide what research to support. Journals in most cases, and agencies in some cases operate by sending manuscripts or research proposals to referees who are recognized experts on the scientific issues in question, and whose identity will not be revealed to the authors of the papers or proposals. Obviously, good decisions on what research should be supported and what results should be published are crucial to the proper functioning of science.
    Peer review is usually quite a good way to identify valid science. Of course, a referee will occasionally fail to appreciate a truly visionary or revolutionary idea, but by and large, peer review works pretty well so long as scientific validity is the only issue at stake. However, it is not at all suited to arbitrate an intense competition for research funds or for editorial space in prestigious journals. There are many reasons for this, not the least being the fact that the referees have an obvious conflict of interest, since they are themselves competitors for the same resources. This point seems to be another one of those relativistic anomalies, obvious to any outside observer, but invisible to those of us who are falling into the black hole. It would take impossibly high ethical standards for referees to avoid taking advantage of their privileged anonymity to advance their own interests, but as time goes on, more and more referees have their ethical standards eroded as a consequence of having themselves been victimized by unfair reviews when they were authors. Peer review is thus one among many examples of practices that were well suited to the time of exponential expansion, but will become increasingly dysfunctional in the difficult future we face.
    We must find a radically different social structure to organize research and education in science after The Big Crunch. That is not meant to be an exhortation. It is meant simply to be a statement of a fact known to be true with mathematical certainty, if science is to survive at all. The new structure will come about by evolution rather than design, because, for one thing, neither I nor anyone else has the faintest idea of what it will turn out to be, and for another, even if we did know where we are going to end up, we scientists have never been very good at guiding our own destiny. Only this much is sure: the era of exponential expansion will be replaced by an era of constraint. Because it will be unplanned, the transition is likely to be messy and painful for the participants. In fact, as we have seen, it already is. Ignoring the pain for the moment, however, I would like to look ahead and speculate on some conditions that must be met if science is to have a future as well as a past."

I think a "basic income" for all could be part of the solution, because a BI would make it possible for anyone to live like a graduate student and do independent research if they wanted.

Comment: WebODF seems to use Dojo? (Score 1) 91

by Paul Fernhout (#47379677) Attached to: WebODF: JavaScript Open Document Format Editor Deemed Stable

https://github.com/kogmbh/WebO...

I like Dojo in part because it attempts to make all the core widgets accessible. From:
http://dojotoolkit.org/referen...
"Dojo has made a serious commitment to creating a toolkit that allows the development of accessible Web applications for all users, regardless of physical abilities. The core widget set of Dojo, dijit, is fully accessible since the 1.0 release, making Dojo the only fully accessible open source toolkit for Web 2.0 development. This means that users who require keyboard only navigation, need accommodations for low vision or who use an assistive technology, can interact with the dijit widgets."

Comment: See also Dr. David Goodstein on the Big Crunch (Score 1) 538

by Paul Fernhout (#47356707) Attached to: Teaching College Is No Longer a Middle Class Job

http://www.its.caltech.edu/~dg...
"Although hardly anyone noticed the change at the time, it is difficult to imagine a more dramatic contrast than the decades just before 1970, and the decades since then. Those were the years in which science underwent an irreversible transformation into an entirely new regime. Let's look back at what has happened in those years in light of this historic transition. ...
    We must find a radically different social structure to organize research and education in science after The Big Crunch. That is not meant to be an exhortation. It is meant simply to be a statement of a fact known to be true with mathematical certainty, if science is to survive at all. The new structure will come about by evolution rather than design, because, for one thing, neither I nor anyone else has the faintest idea of what it will turn out to be, and for another, even if we did know where we are going to end up, we scientists have never been very good at guiding our own destiny. Only this much is sure: the era of exponential expansion will be replaced by an era of constraint. Because it will be unplanned, the transition is likely to be messy and painful for the participants. In fact, as we have seen, it already is. Ignoring the pain for the moment, however, I would like to look ahead and speculate on some conditions that must be met if science is to have a future as well as a past.
    It seems to me that there are two essential and clearly linked conditions to consider. One is that there must be a broad political consensus that pure research in basic science is a common good that must be supported from the public purse. The second is that the mining and sorting operation I've described must be discarded and replaced by genuine education in science, not just for the scientific elite, but for all the citizens who must form that broad political consensus. ..."

So, the academics you knew were from before the "Big Crunch". Such people advised me, from their success, and meaning well, to get a PhD. But the world I faced was post-Big-Crunch and so their advice did not actually make much sense (although it took me a long time to figure that out).

More related links:
http://p2pfoundation.net/backu...

Comment: Echoing Greenspun on academia (Score 1) 538

by Paul Fernhout (#47356585) Attached to: Teaching College Is No Longer a Middle Class Job

From: http://philip.greenspun.com/ca...
---
Why does anyone think science is a good job?

The average trajectory for a successful scientist is the following:
age 18-22: paying high tuition fees at an undergraduate college
age 22-30: graduate school, possibly with a bit of work, living on a stipend of $1800 per month
age 30-35: working as a post-doc for $30,000 to $35,000 per year
age 36-43: professor at a good, but not great, university for $65,000 per year
age 44: with (if lucky) young children at home, fired by the university ("denied tenure" is the more polite term for the folks that universities discard), begins searching for a job in a market where employers primarily wish to hire folks in their early 30s

This is how things are likely to go for the smartest kid you sat next to in college. He got into Stanford for graduate school. He got a postdoc at MIT. His experiment worked out and he was therefore fortunate to land a job at University of California, Irvine. But at the end of the day, his research wasn't quite interesting or topical enough that the university wanted to commit to paying him a salary for the rest of his life. He is now 44 years old, with a family to feed, and looking for job with a "second rate has-been" label on his forehead.

Why then, does anyone think that science is a sufficiently good career that people should debate who is privileged enough to work at it? Sample bias. ...

Does this make sense as a career for anyone? Absolutely! Just get out your atlas.

Imagine that you are a smart, but impoverished, young person in China. Your high IQ and hard work got you into one of the best undergraduate programs in China. The $1800 per month graduate stipend at University of Nebraska or University of Wisconsin will afford you a much higher standard of living than any job you could hope for in China. The desperate need for graduate student labor and lack of Americans who are interested in PhD programs in science and engineering means that you'll have no trouble getting a visa. When you finish your degree, a small amount of paperwork will suffice to ensure your continued place in the legal American work force. Science may be one of the lowest paid fields for high IQ people in the U.S., but it pays a lot better than most jobs in China or India.
---

Comment: Example Vitamin D reduces cancer risk study: (Score 1) 51

by Paul Fernhout (#47356495) Attached to: Endorphins Make Tanning Addictive

http://www.ncbi.nlm.nih.gov/pu...
"This was a 4-y, population-based, double-blind, randomized placebo-controlled trial. The primary outcome was fracture incidence, and the principal secondary outcome was cancer incidence."

Eating lots of vegetables, fruits, and mushrooms can also reduce cancer risk (see Dr. Joel Fuhrman's summary works like "Eat To Live", with many references). I've found that by eating more fruits and vegetables, my skin tone has changed from pale to having more color (even in winter). Adequate iodine can also help prevent cancer.

Reducing the risk of incidence is not the same as a cure, though. Sorry to hear about your father getting cancer. Once you get cancer, everything is iffy, so cancer is best avoided preventatively. Fasting may also help in some cancer situations, and it can help with chemotherapy by protecting cells from the toxic chemicals (since fasting seems to cause many normal cells to go into a safe survival mode, but cancer cells generally do not). Eating better may also help prevent recurrence. In general, the human body is always developing cancerous cells, but they are usually dealt with by the immune system. So boosting the immune system could help with some cancers, and there are many ways to do that -- but again, it is all iffy once cancer is established.

See also for other ideas:
http://science-beta.slashdot.o...

I agree supplements and natural sunlight are probably better choices than tanning beds -- although there may still be unknowns about how the skin reacts to sun or tanning beds and produces many compounds, versus taking supplements. I also agree conventional tanning beds are not tuned to produce lots of vitamin D. That is unfortunate, even if they produce some. See also about other tanning choices (and supplement suggestions):
http://www.vitamindcouncil.org...
"If you choose to use a tanning bed, the Vitamin D Council recommends using the same common sense you use in getting sunlight. This includes:
Getting half the amount of exposure that it takes for your skin to turn pink.
Using low-pressure beds that has good amount of UVB light, rather than high-intensity UVA light."

BTW, if you look into chemotherapy for cancer, for many cancers you'll find it is of questionable value relative to the costs, both in money and in suffering, where on average it may add at most a couple of months of life, if that. Chemotherapy can apparently even sometimes make cancer worse:
http://www.nydailynews.com/lif...
"The scientists found that healthy cells damaged by chemotherapy secreted more of a protein called WNT16B which boosts cancer cell survival."

It's hard to know who to trust regarding medical research results or interpretations:
http://www.pdfernhout.net/to-j...
"The problems I've discussed are not limited to psychiatry, although they reach their most florid form there. Similar conflicts of interest and biases exist in virtually every field of medicine, particularly those that rely heavily on drugs or devices. It is simply no longer possible to believe much of the clinical research that is published, or to rely on the judgment of trusted physicians or authoritative medical guidelines. I take no pleasure in this conclusion, which I reached slowly and reluctantly over my two decades as an editor of The New England Journal of Medicine. (Marcia Angell)"

Good luck sorting it all out. I've suggested creating better tools for medical sensemaking, but still haven't had time to work on them...
