My experience studying Medical Informatics is that the field had no idea how to create an ecosystem. First, it was wrongly insistent on the need for everything to be coded. Take a look at things like SNOMED and LOINC as examples.
HL7 is a completely over-engineered mess, and its standards process is driven by too many doctors and other health professionals and far too few computer scientists. It tries to capture the process of health care as a protocol. Completely wrongheaded. For perspective: I worked on the UML 2.0 standards committee, and I consider UML reasonable by comparison to HL7, which is a major user of UML. Let that sink in.
HIPAA's requirements are also completely outdated and overly complex. It was well intended, but it needs replacement. The law standardized technology rather than requirements, and that's a mistake.
Epic is a total mess. A local hospital system in my state adopted it and (surprise) it went horribly over budget, and there are still issues. And it's legacy code out of the box: it's all built on MUMPS, with bits and pieces hacked on top.
Overall, the main problem is insisting that the problem be solved all at once rather than step by step. Step one: establish an identification system for health providers and patients. That includes a way to establish a patient's identity from known data while providing a high level of confidence that the requester of information is a health provider. Solve this, and then you can start talking about interchange. And start simple. Forget highly coded documents. Exchange vital history, procedure history, problem lists, and notes, along the lines of the sketch below. That's it. Then move forward based on actual user demand.
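To make "start simple" concrete, here's a rough sketch of what a deliberately minimal exchange record might look like. This is purely my own illustration; the type and field names aren't from any standard:

    import Foundation

    // A deliberately minimal, hypothetical exchange record: just the four
    // categories above, kept as plain text and dates. No SNOMED/LOINC
    // coding required to get started.
    struct VitalReading {
        let name: String      // e.g. "blood pressure"
        let value: String     // keep it as entered; don't over-structure yet
        let takenAt: Date
    }

    struct ProcedureRecord {
        let description: String
        let performedAt: Date
    }

    struct PatientSummary {
        let patientID: String       // issued by the identity system in step one
        let vitals: [VitalReading]
        let procedures: [ProcedureRecord]
        let problemList: [String]   // free-text problems
        let notes: [String]         // free-text clinical notes
    }

Plain structures like this could be serialized however the parties agree; the point is that the payload stays simple enough that any shop can implement it quickly, and richer coding can be layered on later if users actually demand it.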
Frankly, Clinton had the right idea with the national health ID. If we could create an ID that everybody had, used only for medical identification, that'd be great. But I doubt that'll happen, so we'll be stuck with a huge data deduplication problem.
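To show why that deduplication problem is so ugly: without a shared ID, matching records comes down to fuzzy heuristics over demographic fields. Here's a toy sketch; the fields, weights, and threshold are invented for illustration, and real systems use far more careful probabilistic record linkage:

    import Foundation

    struct DemographicRecord {
        let lastName: String
        let firstName: String
        let dateOfBirth: Date
        let zipCode: String
    }

    // Toy matcher: normalize the names, then score how many fields agree.
    // This just shows why matching is fuzzy rather than exact when there
    // is no shared identifier.
    func likelySamePatient(_ a: DemographicRecord, _ b: DemographicRecord) -> Bool {
        func norm(_ s: String) -> String {
            s.lowercased().filter { $0.isLetter }
        }
        var score = 0
        if norm(a.lastName) == norm(b.lastName) { score += 2 }
        if norm(a.firstName) == norm(b.firstName) { score += 1 }
        if a.dateOfBirth == b.dateOfBirth { score += 2 }
        if a.zipCode == b.zipCode { score += 1 }
        return score >= 4   // arbitrary threshold for this sketch
    }

Every weight and threshold in there is a guess, which is exactly the problem: with a single national ID, none of that guessing is necessary.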
It's not easy, but it's more doable than people think. And heck, open source as a means of standardization is a fine part of this equation that's being completely ignored.
Seems there was a purpose in letting public education degenerate into nothing more than obedience conditioning.
A fair point. I was thinking the Android price point was more around $69-79. Clearly I haven't been shopping extensively.
Sure, it's fine to be skeptical, but it's easy to verify (or not). You don't think Windows has a big enough market that people will analyze every bit of traffic that comes out of the next OS?
Plenty of programs have had that customer-experience-improvement opt-in for a while. I haven't seen anything suggesting that you can't really opt out of it, i.e., that data is sent anyway. I'm sure that if somebody found evidence of that, we'd hear about it instantly.
Sure, it may be required as part of installing the technical previews (though even that's not clear). How it works in the release, who knows. I agree that the best move would be not to have it at all in the RC or RTM builds, and that outcome isn't impossible, or even unlikely.
This isn't a port. It's streaming the application: it actually runs in their cloud, so you could do the same on Linux, Windows, whatever.
This is just another part of them moving to a cloud-based model. No big deal.
I guess Microsoft's plan to charge nothing for small-screen form factors is having a bit of an effect. Even a $20 license would have a significant impact on that price. At that price point, there'd be enough people trying to see if you can get a Linux distro on it, and it's close enough to cheap Android levels.
For me, it's cool because I'm more versed in Windows development, and since it's full Windows, I can easily install whatever the heck I want on it (no developer unlock, etc.). Save up, get a few, and just have them around the house.
The way the rule is stated and repeated in modern culture is a vast oversimplification, so a critique is fine. As some have noted, the original argument was also about the "ability and drive" to put in the 10,000 hours. Certainly, individual factors do play a role. This only becomes controversial when people try to apply it to certain populations, for which there is no evidence at all (in fact, plenty to the contrary). The article itself notes this.
But it does raise a question: are there skills that require innate abilities to truly master, and if so, what are they and how do they differ from those that don't? There is evidence to suggest the former is true.
This rule is often linked to success, but the studies have all been on skills with no direct link to financial success. Brilliant musicians don't get paid well by default. Chess players aren't sports stars. Artists struggle.
I am curious whether programming is a skill that requires an innate mindset to truly master (I do believe such skills exist), or if it is just a skill that demands disciplined practice. I've seen no evidence either way, so anything would be speculation on my part.
"It involves writing tons of boilerplate code for the GUI"
That statement alone tells any experienced Mac or iOS developer that you have no idea what you're talking about.
But as I've said many times since then, I'll switch when something better comes along. That time has come. Swift is a major improvement over Obj-C, and it was developed to meet Apple's internal needs by engineers who know Obj-C inside and out.
It's kind of a kick being a beginner again. Swift takes some getting used to, but I expect it to give me as much of a productivity improvement over Obj-C as Obj-C gave me over C++.
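For anyone who hasn't tried it yet, a trivial taste of what I mean. This is my own toy example, nothing from Apple's docs, but it shows two of the ergonomics wins over Obj-C: type inference with closures, and optionals that force you to deal with nil:

    // Closures with inferred types replace Obj-C's block syntax.
    let names = ["Ada", "Grace", "Barbara"]
    let shortNames = names.filter { $0.count < 5 }   // ["Ada"]

    // Optionals make "might be nil" explicit. The compiler forces you to
    // handle the failure case, where Obj-C would happily message nil and
    // silently do nothing.
    if let age = Int("42") {
        print("Parsed age: \(age)")
    }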