Yep, on your dime. Welcome to modern society, where you don't get to pick and choose what you pay for based on whether you like it or not. If you are worried about dimes, I'll trade you out-of-control military spending for a stronger social safety net and increased funding for basic research and education to create the next generation of businesses and technologies.
I understand it very well. You might not agree that basic health care is a right of every human being, and you may disagree with my belief that it is shameful when living in a first-world country forces people into poverty because of illness. Rights are not just things the government does not interfere with; they are that which government ensures, or should ensure, because we are human. Sometimes by non-interference, sometimes by force of law and regulation. I believe that health is indeed just such a right.
Who pays for it? We all do. Like we all do now. You really think you don't pay for all the free care that hospitals and doctors have to give out because it is required not only by the laws of the land but by the ethics of the profession? That you and your employer don't pay for that in premiums? Here's the reality: you are already paying for it.
Why do I have the right to make a police officer come to my house if my property was stolen or if I was attacked at 2:00 AM? Who pays for that? Or the fireman who comes to a fire? Why should I pay for somebody to work on a road that anybody can drive on? I don't use it as much as that other guy. Government exists for a reason. It's awfully hard to shrink for a reason.
Oh, and the point is that it shouldn't force anybody to lose their home. Yes, people may be paying a bit more in taxes. But the people who are really losing their homes are those forced out of work because of illness and discrimination. And many insurance companies work with employers to make sure that people lose their jobs, then lose their coverage. It happens every day. ObamaCare was a step toward stopping it. Because it's wrong. Not a grey area of capitalism or modern society, but a simple moral failing that is finally, finally being addressed.
If you are worried about the taxes, I'd just ask why we need to have military spending that overshadows every country in the world. Do we really need to pour billions into the F-35 program while we cut food stamps? Here's the thing: we all pay taxes for things we don't like. I've come to accept that my taxes help create ridiculous weapons of mass destruction. But if I ask you to accept that everybody can go to a doctor and get treatment for an illness, I'm clearly unreasonable and perhaps not in my right mind. I don't buy it.
This. People would truly be surprised how useless many of these cheaper plans were. If you got a chronic illness, or an injury with long-lasting effects, you'd get some things paid for, but only if you mounted a massive effort to get the insurance company to pay for what it is legally required to cover, which it will try to avoid by burying you and your providers in paperwork, delaying payments, and pushing deadlines.
Then, when you came up for renewal, you would be given a cost you couldn't afford. So you lose your plan. You can't get another one.
Yes, insurance companies are jacking up prices, but this is panic-driven. What the public will soon learn is that most health insurers can't actually pool risk, and only make money by denying care and pushing people out of the system.
Obamacare is a clear signal: if the health insurance industry can't sustain its business by keeping all of the US healthy, it will be legislated out of existence. It's not a matter of if, but when and how hard it will be. The rest of the world has shown us that. The US will catch up to the idea that every human has the right to health without concern for cost, or it will fail.
So, this modern theory just disregards that there is no center of the universe. Well, that's an improvement.
Beat me to it, but walking an expression tree is a great example.
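For readers who haven't seen it, here's a minimal sketch of the idea (in Python purely for illustration; the node classes and names are my own, and in an ML-family language like F# this would be a discriminated union plus pattern matching):

```python
# Walking an arithmetic expression tree: each node is either a number
# or a binary operation, and eval_expr recurses over the structure.

class Num:
    def __init__(self, value):
        self.value = value

class BinOp:
    def __init__(self, op, left, right):
        self.op, self.left, self.right = op, left, right

def eval_expr(node):
    """Recursively evaluate an expression tree."""
    if isinstance(node, Num):
        return node.value
    if isinstance(node, BinOp):
        left, right = eval_expr(node.left), eval_expr(node.right)
        return {"+": left + right, "-": left - right, "*": left * right}[node.op]
    raise TypeError("unknown node type")

# (1 + 2) * 3
tree = BinOp("*", BinOp("+", Num(1), Num(2)), Num(3))
```

Calling `eval_expr(tree)` walks the whole structure recursively; the same shape of code handles pretty-printing, simplification, or compilation of the tree.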
But what would F# really displace? Haskell's not going away just because F# exists. If anything, it'd probably just add more potential users and contributors to the project. I think Microsoft is just trying to grow the community around these kinds of languages, and there are real signs that F# has a healthy community at this point.
The nice thing about F# is that it has reached the point where you can get past the "well, that's just an academic thing you learned in school" conversation and actually talk about why it makes sense for a given problem. It's a hard sell, but it's not impossible.
I don't see the insight in quoting the old "embrace, extend and extinguish" mantra here (besides playing to the home crowd). I really don't see how it applies in this case. I doubt that Microsoft is trying to take over Haskell, especially given how many researchers at MS Research work on the project as well.
If you had told me more than a decade ago that Microsoft would be supporting a commercial version of a language based on ML, OCaml and Haskell, I'd have shaken my head in complete disbelief. But here we are, and this is great news, as it allows for more engagement from the Haskell and other functional programming communities.
F#, like the other ML-based dialects, is amazing for solving certain problems in an expressive and concise manner. Of course, it's a powerful language that can lead to abuse. And, don't get me wrong, the additional constructs for full
Frankly, if there was local F# work, I'd jump on it in a heartbeat. I've even considered trying to convince a couple of local shops to give it a try for some advanced projects.
For all those yelling "This is clearly bad science": it's not. The summary is not the paper. The paper notes that there is a correlation between a certain pattern of light exposure and BMI in their sample group. The hard part of the paper is the models they used to capture temporal patterns of light exposure, and determining whether those models are valid. The paper does discuss the model in detail and notes the issues it fails to address.
The rest of the analysis is fairly standard sensitivity analysis, which factors in the sample size. Also, the paper notes that other studies in animals have linked light exposure to changes in metabolism, so there is potential for the mechanism to be causative. But the paper clearly notes in the summary that the directionality of the observed relationship can't be determined from this study. In other words, the paper suggests more avenues of research into the links between light exposure, sleep rhythms, and metabolism, and suggests that the temporal aspects of exposure could play a role.
Finally, the intervention that it suggests is fairly harmless. If people start getting more sun in the morning, that's probably okay overall.
I've seen this argument quite a bit: that Computer Science is really just a branch of applied mathematics, that it is unnecessary for programmers, and so on. Sure, it could be viewed that way, but that ignores a lot of the history of how the discipline developed.
The first CS programs always had an applied component. It was not just math and proofs. There was (and still is) math, but there was a lot of engineering from the start. When Ivan Sutherland started the field of computer graphics, it wasn't just mathematical models for shading; researchers were also building frame buffers, actual displays, and software that used them. The lambda calculus was interesting, but researchers created LISP and actually got it working on real machines. The fact is that the field you call Computer Science did indeed build the foundation of modern computers as we currently know and use them, and has done so from the beginning.
Sure, the field has grown and there is specialization. The design of digital circuits has become Computer Engineering in many schools. But, as far as software goes, I'm not sure we are at the point where we can really divorce the theory from the application the way something like Chemistry and Chemical Engineering has. I do believe that programmers need the foundation in theory that most CS programs provide. But one can get those foundations in ways outside a degree.
In short, I think even the Wikipedia definition ignores the real history of the discipline, focusing on a narrower theoretical subset that is not truly representative of the field as a whole.
Given the long notice on Windows XP's end of life, why is this just being considered now? I would expect vendors to announce they have completed, or at least started, their migration to a newer platform. Linux is a very reasonable choice for this, and it was years ago; QNX and VxWorks were as well. It's not like Linux became a reasonable embedded OS just this year, but the companies seem to be thinking exactly that: "Oh, hey, maybe Linux isn't too bad after all." Weird.
And there is Windows 7 Embedded, if you want to upgrade rather than port. I understand being conservative, but this just seems like bad crisis planning at the last minute. Also, with the new card standards coming up, the industry knew there was a need for new systems in plenty of time to create and implement a migration plan.
Part of the problem is that research and development funding in this country is plummeting. Heck, you can't hire more engineers to work on cures for cancer or better healthcare systems if the scientists creating the innovations are fighting for (and losing) grants and jobs. If you want people to do meaningful work, you need to support meaningful research. This whole "academics are useless" refrain is getting old. You know, that useless PhD did prove that its holder is capable of original thought and self-directed exploration. Seriously, the state of computer science and engineering research in the US is appalling, and other fields have the same problems. Industrial research and development is under attack as well, in the few places it still exists. And don't get me started on the long-term threats to "liberal arts" and humanities education.
It's amazing how many people are under the spell of economic gain as the ultimate goal. The ultimate goal is a better, more educated, and more thoughtful society. When the focus is too much on wealth and wealth accumulation, history shows us, time and time again, that it ends badly. Popular uprisings can be very, very unpleasant for all involved.
Refactoring changed the IDE equation. To support refactoring, you need a tool that understands that code is code, not text. Given that, you might as well throw in auto-completion (IntelliSense) for objects, classes, namespaces, etc. Part of the value of the Java, C++ and
One can become too dependent on anything. But that's no reason not to use a better tool. I'm developing in
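Picking up the "code is code, not text" point above, here is a minimal sketch of why refactoring needs a semantic tool. Python and its standard-library `ast` module stand in for any language's tooling here (the same applies to Java or C# IDEs); note that `ast.unparse` needs Python 3.9+:

```python
import ast

# A naive textual rename of the variable `count` clobbers unrelated text:
source = (
    "count = 0\n"
    "counter = count + 1\n"
    "print('count is', count)\n"
)

# str.replace corrupts `counter` (-> `totaler`) and the string literal.
naive = source.replace("count", "total")

# A tool that parses the code can rename only the identifier nodes:
class Rename(ast.NodeTransformer):
    def visit_Name(self, node):
        if node.id == "count":
            node.id = "total"
        return node

safe = ast.unparse(Rename().visit(ast.parse(source)))
# `counter` and the string literal survive; only the variable is renamed.
```

A real IDE refactoring engine also tracks scopes and cross-file references, but even this toy version shows why "find and replace" isn't refactoring.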
But OOXML is (for better or worse) an ECMA and ISO standard, just like ODF. Each has its own set of tradeoffs, advantages and disadvantages.
So, why not a list of accepted common formats? ODF, OOXML, and PDF would be a fine list. Why "one format to rule them all"? Let people use the tools that work best for them, and base the decisions on a real cost-benefit analysis. Or just say we don't want to pay for Microsoft Office and deal with the fallout from that.
Don't wrap it up by picking a favorite format and mandating it by law. That doesn't work out well. There are people making a ton of money writing software to parse X12 messages for US health care institutions because the HIPAA law demanded that specific format. And guess what? Technology moved on, and it's a total mess. There are lots of great things in the HIPAA law, but that was not one of them. You can regulate without making specific technology decisions.
This is an excellent point. It doesn't pick any particular paradigm to be structured around.
Languages like C# and Java have added functional features, but those features are integrated with the languages' primary object-oriented focus. Compare this to Scheme/LISP, whose object-oriented features were integrated with its primary functional focus.
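To make the contrast concrete, here's a sketch of the two integration styles in Python (used only as neutral ground; the `Seq` class is a hypothetical name of my own, meant to echo the method-chaining flavor of C#'s LINQ or Java streams, while the second form echoes the free-function style of Scheme/Lisp):

```python
# Style 1: functional operations hung off an object, so the OO core
# stays primary (in the spirit of C# LINQ / Java streams).
class Seq:
    def __init__(self, items):
        self.items = list(items)
    def map(self, f):
        return Seq(f(x) for x in self.items)
    def filter(self, pred):
        return Seq(x for x in self.items if pred(x))
    def to_list(self):
        return self.items

chained = Seq(range(10)).filter(lambda x: x % 2 == 0).map(lambda x: x * x).to_list()

# Style 2: free functions composed over plain data, so the functional
# core stays primary (in the spirit of Scheme/Lisp).
composed = list(map(lambda x: x * x, filter(lambda x: x % 2 == 0, range(10))))
```

Both compute the same squares of the even numbers; the difference is which paradigm supplies the skeleton and which one is the guest.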