
+ - Fearing government surveillance, U.S. journalists are self-censoring

Submitted by binarstu
binarstu (720435) writes "Suzanne Nossel, writing for cnn.com, reports that 'a survey of American writers done in October revealed that nearly one in four has self-censored for fear of government surveillance. They fessed up to curbing their research, not accepting certain assignments, even not discussing certain topics on the phone or via e-mail for fear of being targeted. The subjects they are avoiding are no surprise — mostly matters to do with the Middle East, the military and terrorism.' Yet ordinary Americans, for the most part, seem not to care: 'Surveillance so intrusive it is putting certain subjects out of bounds would seem like cause for alarm in a country that prides itself as the world's most free. Americans have long protested the persecution and constraints on journalists and writers living under repressive regimes abroad, yet many seem ready to accept these new encroachments on their freedom at home.'"

Comment: Re:Spend more, because kids aren't learning more (Score 1) 95

by binarstu (#45570855) Attached to: White House Calls On Kids To Film High-Tech Education

Maybe the kids could do a high tech film about how throwing money at technology doesn't actually improve education.

Exactly what I was thinking.

There is a general feeling in the U.S. that public schools are failing (regardless of whether that opinion is justified). It seems to me that buying more technology is the lazy administrator's way of "doing something about it." Purchasing technology also provides a convenient measure of progress, however dubious. Administrators can brag about how they are providing every student with an iPad, or putting smart boards in every classroom, or whatever the current fad is, and claim that they are improving the school.

Are these purchases usually made with a clear plan for how to use the technology, or solid research-based evidence that the new technology will actually improve students' learning? I would guess that most of the time, the answer is "no" on both counts. The fact that we're now having Bill Nye ask K-12 students, "So, you've got all of this cool technology in your school... how is it actually useful?" suggests I might not be wrong.

Comment: simple answer (Score 3, Interesting) 115

by binarstu (#45450473) Attached to: How MOOC Faculty Exploit People's Desire To Learn

From the original post: "Such behavior is not tolerated in "real" college courses, so why is it tolerated in MOOCs taught by the same faculty?"

TFA answers the question quite nicely: "Despite a couple of years of discussion, the question of monetization remains largely unresolved. MOOCs are about as popular as they were, they still drain resources from the companies hosting them, and they still don’t provide much to those hosts in return." Good or bad, it's an attempt to try to get something useful in return for the effort it takes to create a MOOC course. It's as simple as that, and there's no reason to read anything more sinister into it.

And let's not hyperbolically describe this as "holding the users hostage," okay? Users are free to leave the course whenever they want -- hostage situations don't usually work that way.

+ - NSA wants to reveal its secrets to prevent Snowden from revealing them first

Submitted by binarstu
binarstu (720435) writes "According to a recent report by Tom Gjelten of NPR, 'NSA officials are bracing for more surveillance disclosures from the documents taken by former contractor Edward Snowden — and they want to get out in front of the story. ... With respect to other information held by Snowden and his allies but not yet publicized, the NSA is now considering a proactive release of some of the less sensitive material, to better manage the debate over its surveillance program.'"

+ - Stephen Wolfram Developing New Programming Language->

Submitted by Nerval's Lobster
Nerval's Lobster (2598977) writes "Stephen Wolfram, the chief designer of the Mathematica software platform and the Wolfram Alpha “computational knowledge engine,” has another massive project in the works—although he’s remaining somewhat vague about details for the time being. In simplest terms, the project is a new programming language—which he’s dubbing the “Wolfram Language”—which will allow developers and software engineers to program a wide variety of complex functions in a streamlined fashion, for pretty much every single type of hardware from PCs and smartphones all the way up to datacenters and embedded systems. The language will leverage automation to cut out much of the nitpicking complexity that dominates current programming. “The Wolfram Language does things automatically whenever you want it to,” he wrote in a recent blog posting. “Whether it’s selecting an optimal algorithm for something. Or picking the most aesthetic layout. Or parallelizing a computation efficiently. Or figuring out the semantic meaning of a piece of data. Or, for that matter, predicting what you might want to do next. Or understanding input you’ve given in natural language.” In other words, he’s proposing a general-purpose programming language with a mind-boggling number of functions built right in. At this year’s SXSW, Wolfram alluded to his decades of work coming together in “a very nice way,” and this is clearly what he meant. And while it’s tempting to dismiss anyone who makes sweeping statements about radically changing the existing paradigm, he does have a record of launching very big projects (Wolfram Alpha contains more than 10 trillion pieces of data cultivated from primary sources, along with tens of thousands of algorithms and equations) that function reliably. At many points over the past few years, he’s also expressed a belief that simple equations and programming can converge to create and support enormously complicated systems.
Combine all those factors together, and it’s clear that Wolfram’s pronouncements—no matter how grandiose—can’t simply be dismissed. But it remains to be seen how much of an impact he actually has on programming as an art and science."

Comment: please don't throw Wikipedia into this (Score 5, Insightful) 63

by binarstu (#45380185) Attached to: Could We "Wikify" Scholarly Canons?

From TFA: "When academics have been asked why they do not contribute to Wikipedia, or why they do not make their data more easily available, or why they continue to avoid new “open access” publication venues, one of the most common explanations is “not enough time” [7,8]."

The article gets a lot of things right, but that sentence is not one of them. The reasons that academics do not contribute to Wikipedia have been well documented and discussed here and elsewhere. In brief -- you get no credit for your work, and your contributions can be totally wiped out at the whims of editors. The reason experts don't contribute to Wikipedia is not a lack of time; rather, it's because doing so is perceived (quite reasonably) as a waste of time.

In contrast, most scientists I know are quite receptive to publishing in open access journals. Some are still suspicious of them, but I've never heard "I don't have enough time" given as a reason for not publishing open access. Honestly, that objection wouldn't even make sense.

+ - CIA Pays AT&T Millions to Voluntarily Provide Call Data

Submitted by binarstu
binarstu (720435) writes "The New York Times reports that 'The C.I.A. is paying AT&T more than $10 million a year to assist with overseas counterterrorism investigations by exploiting the company’s vast database of phone records, which includes Americans’ international calls, according to government officials. The cooperation is conducted under a voluntary contract, not under subpoenas or court orders compelling the company to participate, according to the officials.'"

Comment: yet 33% in the House opposed it (Score 4, Insightful) 999

by binarstu (#45149767) Attached to: US Government Shutdown Ends
The bill passed the House, but 144 votes were cast against it -- more than 1/3 of those voting! One can only guess at the careful thought that went into casting those votes. Do these people actually believe that funding "Obamacare" for a few months is worse than letting the federal government default on its loans? There is no acceptable answer to this question. If the answer is "yes," well -- yikes. If the answer is "no," and this is just shameless pandering to the extreme right faction of the GOP/"Tea Party", then -- yikes.

Comment: uninspiring choice (Score 1) 61

by binarstu (#45107285) Attached to: Anti-Chemical Weapon Group Awarded Nobel Peace Prize
While I am not questioning that the OPCW does really great work to make the world a better place, my disappointment with this selection is that it is simply uninspiring. The Nobel committee had nominations for numerous individuals who, at great risk to their own livelihood and safety, did extraordinary things to stand up for what they knew was right and just. Giving the Nobel Peace Prize to such a person would affirm that even today, one individual doing the right thing can make a difference that is felt on a global scale. Regardless of whether the OPCW was most deserving of this year's prize (which is certainly debatable, as attested to by other comments on this story), the choice doesn't really stir much passion.

Comment: Because of the Web? (Score 4, Insightful) 374

Wow -- it has actually been 20 years since Myst came out?? That seems unbelievable. I haven't done any "real" computer gaming in a long time, but I spent many hours working my way through Myst and absolutely loved that game.

I wonder if the popularization of the World Wide Web had something to do with the eventual decline of Myst and games like it. I remember that a big part of the satisfaction of playing Myst and other puzzle-based games, such as the King's Quest series, was that you really needed to struggle through the challenges until you figured them out. For example, a staple of those games was a maze that you had to traverse at some point (remember the little subterranean train thing in Myst?). To solve them, you had to spend considerable time exploring and mapping until you finally figured out how to get where you needed to go. If you were stuck, there wasn't much you could do except try harder until you got it. Sure, the game companies had "hot lines" that you could call for hints, but they charged you for it, and nobody I knew ever used them. As a result, the game was much more rewarding because you had to do it all by yourself. This environment also was conducive to playing the game with others, because two (or more) heads are better than one. My brother and I worked through a number of these games when we were kids, and playing them together added to the fun.

Once the Web became mainstream, the situation changed very quickly. Suddenly, game "walkthroughs" were widely available for free, and much of the mystique that led to these games' success disappeared. You need to solve that maze? Just look it up in the walkthrough and you can be done with it in about two minutes. Once the entire game solution was readily available, the sense of accomplishment from solving the puzzles was greatly diminished, in my opinion.

So, imagine a world where there is no quick, easy way to look up game solutions. It seems terribly quaint now, but that was the environment in which Myst and similar games before it became popular. Once that changed, I think the days were numbered for the puzzle-based games, at least as far as their ability to become blockbusters.

I haven't done any research to compare how well actual market trends correlated with the rise of the Web. This is just my recollection of how the gaming world changed during that time.

Comment: Re:Great idea! (Score 4, Insightful) 95

by binarstu (#44938075) Attached to: Romanian Science Journal Punked By Serbian Academics

If I had any mod points to give, I'd mod the parent up.

The GP states,

There should not be a place for "scientific" journals in modern science. They have no added value whatsoever and in fact harm free sharing of knowledge and information.

Anybody who makes that claim has no real grasp of how science works. Science journals have come under fire for a variety of reasons in recent years, but the peer review process that is central to scientific publishing is why journals are so important. And I am using "journal" in the broadest sense to include open-access, online-only publications. As long as they include quality peer review, they are science journals.

As others have pointed out, the process of taking a paper through peer review often leads to substantial improvements to the original manuscript or reveals shortcomings that must be addressed before the work can be published. And, most of the time, it keeps the really bad work from ever being published at all. Is the process perfect? Of course not. But an anecdotal case of spectacular failure by an obscure metallurgy journal does not mean the whole concept is worthless. It merely means that journal is bad. The peer-review process is the best method we have for ensuring the quality of scientific work, and without it (and the journals that provide the structure for it), scientific progress would be greatly hindered. Until we come up with a better way to filter the good from the bad, journals will remain an essential part of science.

Comment: Re:Why bother at all (Score 2) 308

by binarstu (#44898849) Attached to: To Boldly Go Nowhere, For Now

Both of those things would be easy if anyone cared enough to do them; we've had a permanent presence in Antarctica for decades

Notice the adjective "self-sufficient" in the GP. You think building a self-sufficient settlement in Antarctica is easy, and it's only a problem of nobody wanting to do it? Here's a hint: The "permanent presence in Antarctica" you speak of is nowhere near self-sufficiency. Were it not for a continuing cycle of supplies (food and fuel, primarily) periodically arriving by boat or plane, everyone there would die. So no, not easy at all.

Comment: Re:Human missions are better for long term health (Score 3, Insightful) 308

by binarstu (#44898287) Attached to: To Boldly Go Nowhere, For Now

But why is manned space exploration necessary for any of the progress you describe? To the contrary, it seems to me that if the goal is to create new medical breakthroughs, spending loads of cash on human spaceflight is, at best, a rather inefficient way to achieve that objective. If the goal is to slow aging, preserve vision, or whatever, I can't think of any reason that Earth-based research wouldn't work.

Now, as to your point about the incredible amounts of money we waste on things that ultimately do very little to improve our lives, I wholeheartedly agree!
