Maybe the kids could do a high tech film about how throwing money at technology doesn't actually improve education.
Exactly what I was thinking.
There is a general feeling in the U.S. that public schools are failing (regardless of whether that opinion is justified). It seems to me that buying more technology is the lazy administrator's way of "doing something about it." Purchasing technology also provides a convenient measure of progress, however dubious. Administrators can brag about how they are providing every student with an iPad, or putting smart boards in every classroom, or whatever the current fad is, and claim that they are improving the school.
Are these purchases usually made with a clear plan for how to use the technology, or solid research-based evidence that the new technology will actually improve students' learning? I would guess that most of the time, the answer is "no" on both counts. The fact that we're now having Bill Nye ask K-12 students, "So, you've got all of this cool technology in your school... how is it actually useful?" suggests I might not be wrong.
From the original post: "Such behavior is not tolerated in "real" college courses, so why is it tolerated in MOOCs taught by the same faculty?"
TFA answers the question quite nicely: "Despite a couple of years of discussion, the question of monetization remains largely unresolved. MOOCs are about as popular as they were, they still drain resources from the companies hosting them, and they still don’t provide much to those hosts in return." Good or bad, it's an attempt to try to get something useful in return for the effort it takes to create a MOOC course. It's as simple as that, and there's no reason to read anything more sinister into it.
And let's not hyperbolically describe this as "holding the users hostage," okay? Users are free to leave the course whenever they want -- hostage situations don't usually work that way.
Open-access journals and scientific wikis are failing...
Do you have any evidence to support this claim? In the sciences, at least, open access journals are thriving. Take a look at any of the PLoS journals, for instance. These venues are well-respected and scientists are eager to publish in them.
From TFA: "When academics have been asked why they do not contribute to Wikipedia, or why they do not make their data more easily available, or why they continue to avoid new “open access” publication venues, one of the most common explanations is “not enough time” [7,8]."
The article gets a lot of things right, but that sentence is not one of them. The reasons that academics do not contribute to Wikipedia have been well documented and discussed here and elsewhere. In brief -- you get no credit for your work, and your contributions can be totally wiped out at the whims of editors. The reason experts don't contribute to Wikipedia is not a lack of time; rather, it's because doing so is perceived (quite reasonably) as a waste of time.
In contrast, most scientists I know are quite receptive to publishing in open access journals. Some are still suspicious of them, but I've never heard "I don't have enough time" given as a reason for not publishing open access. Honestly, that objection wouldn't even make sense.
Wow -- it has actually been 20 years since Myst came out?? That seems unbelievable. I haven't done any "real" computer gaming in a long time, but I spent many hours working my way through Myst and absolutely loved that game.
I wonder if the popularization of the World Wide Web had something to do with the eventual decline of Myst and games like it. I remember that a big part of the satisfaction of playing Myst and other puzzle-based games, such as the King's Quest series, was that you really needed to struggle through the challenges until you figured them out. For example, a staple of those games was a maze that you had to traverse at some point (remember the little subterranean train thing in Myst?). To solve them, you had to spend considerable time exploring and mapping until you finally figured out how to get where you needed to go. If you were stuck, there wasn't much you could do except try harder until you got it. Sure, the game companies had "hot lines" that you could call for hints, but they charged you for it, and nobody I knew ever used them. As a result, the game was much more rewarding because you had to do it all by yourself. This environment was also conducive to playing the game with others, because two (or more) heads are better than one. My brother and I worked through a number of these games when we were kids, and playing them together added to the fun.
Once the Web became mainstream, the situation changed very quickly. Suddenly, game "walk throughs" were widely available for free, and much of the mystique that led to these games' success disappeared. You need to solve that maze? Just look it up on the walk through and you can be done with it in about two minutes. Once the entire game solution was readily available, the sense of accomplishment from solving the puzzles was greatly diminished, in my opinion.
So, imagine a world where there is no quick, easy way to look up game solutions. It seems terribly quaint now, but that was the environment in which Myst and similar games before it became popular. Once that changed, I think the days were numbered for the puzzle-based games, at least as far as their ability to become blockbusters.
I haven't done any research to compare how well actual market trends correlated with the rise of the Web. This is just my recollection of how the gaming world changed during that time.
If I had any mod points to give, I'd mod the parent up.
The GP states,
There should not be a place for "scientific" journals in modern science. They have no added value whatsoever and in fact harm free sharing of knowledge and information.
Anybody who makes that claim has no real grasp of how science works. Science journals have come under fire for a variety of reasons in recent years, but the peer review process that is central to scientific publishing is why journals are so important. And I am using "journal" in the broadest sense to include open-access, online-only publications. As long as they include quality peer review, they are science journals.
As others have pointed out, the process of taking a paper through peer review often leads to substantial improvements to the original manuscript or reveals shortcomings that must be addressed before the work can be published. And, most of the time, it keeps the really bad work from ever being published at all. Is the process perfect? Of course not. But an anecdotal case of spectacular failure by an obscure metallurgy journal does not mean the whole concept is worthless. It merely means that journal is bad. The peer-review process is the best method we have for ensuring the quality of scientific work, and without it (and the journals that provide the structure for it), scientific progress would be greatly hindered. Until we come up with a better way to filter the good from the bad, journals will remain an essential part of science.
Both of those things would be easy if anyone cared enough to do them; we've had a permanent presence in Antarctica for decades.
Notice the adjective "self-sufficient" in the GP. You think building a self-sufficient settlement in Antarctica is easy, and it's only a problem of nobody wanting to do it? Here's a hint: The "permanent presence in Antarctica" you speak of is nowhere near self-sufficiency. Were it not for a continuing cycle of supplies (food and fuel, primarily) periodically arriving by boat or plane, everyone there would die. So no, not easy at all.
But why is manned space exploration necessary for any of the progress you describe? To the contrary, it seems to me that if the goal is to create new medical breakthroughs, spending loads of cash on human spaceflight is, at best, a rather inefficient way to achieve that objective. If the goal is to slow aging, preserve vision, or whatever, I can't think of any reason that Earth-based research wouldn't work.
Now, as to your point about the incredible amounts of money we waste on things that ultimately do very little to improve our lives, I wholeheartedly agree!
As the Slate piece points out, the argument about continuing manned (and womanned) space exploration because "we might need to leave Earth in the near future" seems to be quite popular right now, especially with all of the buzz about the Mars One plan to establish a semi-permanent colony on Mars. I was disappointed, though, that the Slate article didn't really address the core of the issue: believing that, if Earth were to actually become uninhabitable, we could simply colonize Mars, or Venus, or any other distant rock, is absolutely preposterous. This idea has been thoroughly discredited.
For an excellent summary of why this is nothing more than magical thinking, I suggest reading physicist Tom Murphy's excellent post on the matter. As he alludes to, if we convince ourselves that we need to spend unfathomable resources on human spaceflight so that we can "save ourselves" some day, we simply avoid fixing the real problems here on Earth, where we are very much stuck for the long haul. Pretending otherwise will only hasten our demise.
This restaurant was advertising breakfast any time. So I ordered french toast in the renaissance. - Steven Wright, comedian