If he is so into giving his stuff away for free, why does his website consist of a couple of minutes of YouTube clips and a link to, you guessed it, iTunes (where they decidedly do not give away stuff for free)?
Either you are exaggerating greatly or your experience as an MCSE is actually harming your ability to learn Win 8. Windows 8's desktop mode is an evolution of Win 7. The new UI is obnoxious, but there is nothing complex about it. For the lay person, Windows 8 will work out of the box, and they'll be able to buy everything they need from the app store (which is also a seamless experience). In the worst case, you face the minor annoyance of switching to desktop mode to get an experience similar to Win 7, which of course was well-regarded by users.

Let's compare this to my experience last week loading Ubuntu 12.04 on my Lenovo laptop. Installation was seamless and the wireless connection was easy to set up. Then, after install, the OS asked to download some updates. Boom. Broken wireless. The solution? Just figure out the exact model of my wireless adapter, download the appropriate drivers on a different machine (a wired connection was not available), load them into Linux, then block the OS from making changes to that driver. Luckily, a solution was easy to locate, because the issue has affected many users of the same (Broadcom) wireless adapter and has been around for years. The last time Windows Update rendered my machine useless? Never (and really, WU hasn't caused significant issues for users on a wide scale in nearly a decade, since the infamous XP service pack issues).

I hope and pray Windows never becomes as "user-friendly" as any Linux flavor I've ever encountered. Give me Linux on my server and my development box, and give me Windows on my recreation box. Both are fine, but Linux really isn't anywhere close to unseating MS in the usability space; I don't think it's any sort of coincidence that the techie crowd has both a high incidence of Asperger's and a high incidence of people who think Linux is as usable as Windows.
This is called chronic media multitasking, and you are not alone (likely a large portion of those calling you a loser and telling you to get over it are avoiding doing something more important). http://health.usnews.com/health-news/family-health/brain-and-behavior/articles/2009/08/24/chronic-media-multi-tasking-makes-it-harder-to A single-tasking environment would be helpful, but at what cost? While it isn't good to read your e-mail and surf the net while you are trying to get something done, it IS often useful to look up that related e-mail or useful reference. You might use some measure to block the websites you abuse the most, but who is to say something else won't take their place? What worked for me was simply to recognize and study the problem. Once you see what a common occurrence it is, and how it affects your ability to function even after the fact, it should make it easier to prioritize fixing it. For me that meant hiding most Skype notifications, closing my e-mail client while I worked, and closing out programs that I didn't need for the current task. Your mileage may vary; this is what worked (very well) for me.
Google Flu has never been used to officially declare a flu outbreak. It's a neat tool, and it has been successfully used in retrospective studies, but until it actually helps us prepare for a flu outbreak in ways above and beyond what traditional surveillance already does, it will continue to just be a neat tool and not a useful one. The same goes for the Twitter flu prediction models. These tools are cool, but unless people actually do things differently to prepare for an outbreak based on their predictions, they don't mean anything. Consider this question: If you were a public health professional and you knew about a flu outbreak 2 weeks earlier, what would you do differently? Encourage people to get vaccinated? Already being done. Shut down schools? You had better be damn sure. Warn local hospitals? You are kidding yourself if you think they are going to start bringing in extra staff in the hopes that your prediction was right. So really, what does that extra week or two get you?
As you might imagine, they've already thought of that.
I'm not sure how you managed to entirely misinterpret everything I said to arrive at this conclusion. Bravo. The title of my original comment could very well apply to your response.
I think you may have missed my greater point by citing only the portion you did. The CFAA might suck (probably does). Revising it doesn't fix what happened to Aaron Swartz. The "Chinese hacker" example was just a hypothetical, meant to force readers to think about this not in terms of the situation with Aaron, but in terms of a scenario where the accused won't get (or deserve) as much sympathy as Swartz. There are countless instances of over-prosecution used to make an example out of someone, both inside and outside of the technology universe. This is one specific instance of a far larger problem, and plugging one hole isn't going to leave us any better off. The CFAA is ill-conceived, but it really isn't the issue. We could look at many of the drug-related prosecutions in the United States as additional instances of this problem; many of those cases deal with an even greater injustice in the form of mandatory minimum sentences. Fixing the CFAA is going to be harmful to this cause. Mark my words: it will be patched and the problem will be declared fixed, despite nothing being truly fixed. It's worse than doing nothing.
If this were a Chinese-American hacker stealing schematics from Raytheon, we'd all be happy to see the harshest threats/penalties applied. The issue here was bullying at the DOJ. You can't fix that with a few tweaks to the law, and if you lower maximum penalties you will find yourself regretting it when someone actually does do something worthy of those maximum penalties. And if you close these holes, aren't they just going to find others? You have issues with behaviors/attitudes at the DOJ that need to be fixed, not just a few sentences in a statute. So, sure, maybe they should tweak the laws a bit; but how does that fix the oversight issues? Seems like a nice way to convince everyone they "did something" without actually fixing the issue.
This represents my field (Biomedical Informatics) so poorly that I actually feel shame reading the CW article, the IU article, and the journal article.

From the CW article: "This is not the first time artificial intelligence has been brought to bear on healthcare." Not a bad statement, until they cite Watson (and an ongoing research project which has still produced nothing of value) as their evidence. How about the fact that the journal that published the IU study is called....wait for it...."The Journal of Artificial Intelligence in Medicine" and has been around for almost 20 years? How about MYCIN, eMYCIN (a generalized form of MYCIN which could be applied to any domain), INTERNIST, DxPLAIN, ISABEL, or the whole host of fully developed diagnostic expert systems which were developed over the past 50 years and which can outperform physicians? Maybe you ought to think about _why_ these aren't used?

From the IU article: "..we believe that the most effective long-term path could be combining artificial intelligence with human clinicians." Holy crap! IF ONLY THERE WAS A 50 YEAR OLD FIELD OF STUDY CALLED "CLINICAL DECISION SUPPORT" PREDICATED ON THIS EXACT IDEA! I'm so glad my colleagues and I have the support of these two geniuses from IU.

Same article: "The framework here easily out-performs the current treatment-as-usual, case-rate/fee-for-service models of health care." That's also fantastically obvious. Replacing the fee-for-service model has been in the works for decades. It is a well-known and well-studied fact that we order a ton of unnecessary tests/procedures, and this is strongly tied to the fact that we compensate healthcare providers based on the number of tests/procedures performed and not based on patient outcomes; this brings me to my next point...

From the actual journal article: "the goal here (i.e. optimality) is defined as maximizing patient improvement while minimizing treatment costs." Nice one. Why didn't we think of it earlier? Oh. That's right.
Ask the Oregon Health Plan how "cost effectiveness" gets translated in the popular media (I'll give you a hint: death panels).

Look, I can keep going all day on this naive research and reporting. Suffice it to say, you can't compare current doctors' performance to any model that assumes zero liability (the AI model does not feel compelled to order tests to cover its ass), no fee-for-service setup (the AI model has no financial incentive to order additional tests and procedures, and in fact has the opposite incentive, since it is measured in terms of unit cost), and an explicit cost-effectiveness evaluation model (something that will earn a flat-out revolt if you actually try it), UNLESS (and this is a BIG unless) you are also going to propose a realistic plan for getting rid of medical legal liability, fee-for-service, and widespread negative public opinion about cost-effectiveness measures in healthcare.

To make it clear: I think the research is useful and should be explored further. I think researchers whoring themselves out for attention, failing to acknowledge limitations in their study, and allowing their institution to print sensationalist crap about their work is despicable.
This is true. Not sure why the parent was rated "insightful" since they clearly didn't understand what you were talking about. On topic, what do you think they should do with advanced courses? If they weight everything the same, the ability to differentiate between students who worked hard and took advanced courses and those who took easier courses is lost. Since this is an academic achievement scale, it seems like we should be able to differentiate between these groups. Do you advocate a penalty for not taking advanced courses (either in the form of a sub-4.0 max for those courses, or by raising the denominator to 4.33, or 5, or something else)?
You wouldn't need to go to the university for anything; you could just ask for the student's class rank, and the absolute GPA wouldn't matter much. Yeah, GPA scales get screwed up when "better-than-perfect" scores are allowed, but I don't think the biggest problem has anything to do with breaking the scale; a far bigger issue is when the existence of weighted classes completely dictates students' schedules. Forget about weight training, stained glass, an extra foreign language, etc... because those aren't special enough to offer extra GPA points. I took quite a few oddball courses in high school, and I never would have done that if I had been partaking in the GPA rat race. My sister, who still managed a top-20 ranking in her high school class, would have been top-5 if she hadn't watered down her GPA with a few "regular A's" in choir and orchestra. There's something a little sick about that status quo.
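The rat-race incentive is easy to see with a quick back-of-the-envelope sketch in Python. The 5.0 weighting for AP courses and the course lists are hypothetical, but typical of weighted-GPA schemes:

```python
# Weighted GPA: AP/honors courses earn a bonus point (an A counts as 5.0, not 4.0).
AP_BONUS = 1.0

def weighted_gpa(courses):
    """courses: list of (grade_points, is_ap) tuples on a 4.0 base scale."""
    total = sum(grade + (AP_BONUS if is_ap else 0.0) for grade, is_ap in courses)
    return total / len(courses)

# Student A plays the GPA rat race: six AP courses, all A's.
rat_race = [(4.0, True)] * 6

# Student B swaps one AP slot for orchestra -- still a straight-A student.
well_rounded = [(4.0, True)] * 5 + [(4.0, False)]

print(weighted_gpa(rat_race))      # 5.0
print(weighted_gpa(well_rounded))  # ~4.83
```

A perfect grade in an unweighted elective drags the average down, so class rank rewards dropping choir and orchestra, which is exactly the scheduling distortion described above.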
"If Aaron Swartz downloaded JSTOR documents without paying for them, it would presumably be considered a crime by the USDOJ. But if U.S. Attorney Carmen Ortiz or U.S. Attorney General Eric Holder did the same? Rather than a crime, it would be considered their entitlement, a perk of an elite education that's paid for by their alma maters."

http://about.jstor.org/alumni#Institutions-in-program

Many universities on that "elite" list of institutions are public universities with very liberal admission policies, and alumni are "entitled" to use JSTOR after working for several years on a degree and presumably providing tens of thousands of dollars in tuition to the university that is paying for the benefit. I know we have a lot of "Occupy" fans around here, but do we have to inject the us-vs-them narrative at every opportunity? Give it a rest.

Of course, the author may just be taking their cue from the manifesto itself, where Mr. Swartz insisted that scientists and students have been "given a privilege" in being able to access research articles. Because, you know, the $23,000 I made last year as a research assistant and PhD student was such a privilege for someone with an engineering degree and a family to feed. I'm also privileged to carry about $50,000 in student loan debt between undergraduate and graduate studies...how wonderful. And when I graduate and hopefully find a good job, the narrative of how privileged and entitled I am will continue.

Sadly, despite poor execution, Aaron had a great cause. Open access is the _only_ way to publish, and publishers like Elsevier that make it wholly impossible to do a decent research project without a large budget or affiliated institution are the devil. However, what "children of the global south" (quoted from his manifesto) really need is material like the Khan Academy videos followed by Coursera/Udacity/etc...
They aren't going to find their way to a better life in the stuffy annals of academic journals, which most college graduates cannot comprehend (I'm working on a dissertation and still can't comprehend many of the articles in my field).

If you want to make a difference, start asking researchers at your local university if they have considered publishing in open-access journals. If not, ask them why. Ask them if they would consider paying the open-access fee to a more prestigious journal so that their research can be read freely. At least ask them to do this for work with broad impact and importance, even if the fee isn't deemed worth it for every paper they generate. Finally, start hitting up the sources of funding and try to get them to include open-access requirements for research they pay for. There is a push to do just this with all federally-funded research, and I think that would be a great start. If you are feeling extra frisky, you might even consider making a donation to an open-access journal, but I think the "children of the global south" would probably appreciate a donation of food, water, or mosquito nets even more.

It seems to me Aaron jumped to illegal activity when there were plenty of legal options left to pursue. That's unfortunate. This wasn't worth a human life, and the parties who blew this out of proportion should be ashamed of themselves. On the other hand, plenty of people have the 'hammer dropped' on them every day and don't commit suicide. So those in Aaron's inner circle also failed him. In the end, we all lose.
"For comparison, the US consumes 1.39 x10^9 [eia.gov] litres of fuel per day. According to Wikipedia, the energy density of petrol is 49.2 x 10^6 J/L [wikipedia.org], so that's 684 x10^12 J of energy per day... or, expressed in Watt-days (86400 seconds in a day), that's 7.91 x10^9 W-days of energy."

Wikipedia actually lists 34.2 MJ/L as the energy density of petrol. Since this supports your case, I'll use it. 1.39 x 10^9 L/day * 34.2 x 10^6 J/L = 47.538 x 10^15 J/day. I'm not sure what you did when you calculated daily energy use, but you were off by a couple orders of magnitude. Converting to watt-days (47.538 x 10^15 J divided by 8.64 x 10^4 seconds per day) gives us 5.502 x 10^11 watt-days. If we then divide this by 7.68 x 10^12 (20 percent of 6 percent of total sunlight energy falling on arable land, in accordance with your figures), we get about 7.2% of all land needed to meet energy needs, which is a far cry from 1% of all land providing 10 times more energy than we need.

Of course, this is all still a fantasy. Fields need fertilizer or must be planted with crops that will naturally replenish the nitrogen in the soil. If the land isn't 'rested' periodically, yields will drop dramatically. Even with proper farming techniques, yields still will not be close to 100% of the maximum possible biomass. All of this assumes that there is plenty of water to go around; since the majority of US farmland suffered from drought in 2012 (http://www.ers.usda.gov/topics/in-the-news/us-drought-2012-farm-and-food-impacts.aspx), and we have known for a long time that aquifer levels are dropping dangerously low, I'm going to suggest that adequate water is not a safe assumption.

Another consideration is that the 7.2% figure (hopelessly optimistic as it is) refers to the total surface area of ground covered by crops. Even if we planted the crops such that they covered 100% of the planted area at maturity, we still have to consider the full life cycle of the plant from seed to maturity.
So, that 6% figure may be correct, but the denominator is much smaller than the field on which the crops are planted.

Also, 4 million square kilometers is way higher than the actual amount of arable land in the United States. You were looking at agricultural land (which includes all farmland, including land suitable for livestock but not crops). Using your same source, arable land is actually 1,617,800 square km. This adjustment alone would push the 7.2% above to 17.8%, and that is without considering the other factors I listed.

Finally, you have only considered gasoline, when it would be appropriate to include ultra-low sulfur diesel (ULSD), which is used primarily for transportation. According to (http://www.api.org/~/media/Files/Oil-and-Natural-Gas/Gasoline/US_gasoline-distillate-update.pdf), ULSD production from 2007-2011 was 3.5 million barrels per day. Since the US exports a lot of diesel, and I don't know what percentage of that is actually used in the United States, I'll just split it down the middle and say that half is exported. This translates to 1.038 x 10^16 J/day, or 1.201 x 10^11 additional watt-days. If we count other types of diesel fuel (I don't know what other types of diesel fuel are used for, so I just played it safe and assumed they could be replaced by grid power) and assume less than 50% is exported, this number could easily double, and it would more than triple if we used more recent data and assumed zero exports.

I could keep going, but I think this is sufficient to show that your calculations were off by at least a few orders of magnitude.
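The corrected arithmetic above is easy to check in a few lines of Python. This is a sketch using the figures quoted in this thread; the diesel energy density (~37.3 MJ/L) and barrel size are my assumptions, not values from the parent post:

```python
SECONDS_PER_DAY = 86_400  # 1 watt-day = 86,400 J

# --- Gasoline ---
litres_per_day = 1.39e9        # US fuel consumption, L/day (parent's figure)
gasoline_j_per_l = 34.2e6      # Wikipedia's energy density of petrol: 34.2 MJ/L

joules_per_day = litres_per_day * gasoline_j_per_l     # ~4.75e16 J/day
gasoline_watt_days = joules_per_day / SECONDS_PER_DAY  # ~5.50e11 W-days

# 20% conversion efficiency of 6% photosynthetic capture, per the parent's
# figures: 7.68e12 W of harvestable power over the claimed land area.
available_watts = 7.68e12
fraction_needed = gasoline_watt_days / available_watts  # ~0.072 (7.2%)

# The parent used agricultural land (4.0e6 km^2); actual US arable land is
# ~1,617,800 km^2, which scales the land requirement up proportionally.
fraction_arable = fraction_needed * (4.0e6 / 1.6178e6)  # ~0.177 (~17.8%)

# --- Diesel (ULSD), assuming half of 3.5M barrels/day is used domestically ---
BARREL_L = 158.987             # litres per oil barrel (assumed)
diesel_j_per_l = 37.3e6        # assumed energy density of diesel, J/L
diesel_joules = 1.75e6 * BARREL_L * diesel_j_per_l     # ~1.04e16 J/day
diesel_watt_days = diesel_joules / SECONDS_PER_DAY     # ~1.20e11 W-days

print(f"gasoline: {gasoline_watt_days:.3e} W-days/day")
print(f"fraction of claimed land: {fraction_needed:.3f}")
print(f"fraction of arable land:  {fraction_arable:.3f}")
print(f"diesel:   {diesel_watt_days:.3e} W-days/day")
```

Running the numbers this way reproduces the 5.502 x 10^11 watt-day, 7.2%, 17.8%, and 1.2 x 10^11 watt-day figures above, and makes it easy to swap in different efficiency or land-area assumptions.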
Actually, while 14% may or may not be the right figure, it is well-accepted that ethanol cannot scale to meet all of our needs, even in the ridiculous scenario where we stopped producing food and only produced ethanol. This article (http://www.todaysengineer.org/2010/Jan/Biofuels-pt3.asp) discusses several of the studies which have shown this. I was looking for a journal article I read a few years back that explicitly considered "next-generation" ethanol crops at their theoretical maximum yields planted on all of the arable land on Earth, but couldn't locate it quickly. And am I the only one who finds it a bit disingenuous to suggest that any research that doesn't support biofuels as "the answer" must have come from Exxon? Do you also believe that the gas companies send agents around the world to assassinate researchers every time they get close to discovering "free energy" or carburetors that will make any car in the world get 100 mpg?
Your information, whether in electronic form or on paper, is already available to health researchers. I just need an informed consent waiver and I can use it for research. If we remove identifiers from it, I can use it and share it freely. There is currently no difference in privacy law between electronically stored health information and paper records, so anyone your doctor can legally send your electronic info to, they can also legally fax your records to. Given that very few health information systems interoperate, but everyone has a fax machine, you are more likely to fall victim to unauthorized sharing of your paper records.