
Comment: There is an easy workaround (Score 1) 162

by Dr. Spork (#48042589) Attached to: Will Windows 10 Finally Address OS Decay?
When I install Windows, I work hard to set up everything exactly as I like it on install day. Then I make a backup of the OS partition - which holds only programs, no photos, videos, etc. - using Acronis True Image. Then I proceed as normal, and when something gets screwed up, I just restore from the backup. This completely undoes any effects of winrot, and the system immediately feels like it was installed that day. What I usually do then is update my applications and settings and immediately make a new backup. A full restore takes about 4 minutes, and a backup with max compression takes something like 12. I find this so convenient that I use no antivirus: when I start to suspect I may have installed malware, I just restore from a backup, and four minutes later my system is pristine. I've been doing this since the Win2K days, and if this method weren't available to me, I wouldn't be using Windows.

Comment: A Self Imposed Mess... (Score 2) 202

by ndykman (#48039531) Attached to: Back To Faxes: Doctors Can't Exchange Digital Medical Records

My experience in studying Medical Informatics is that they had no idea how to create an ecosystem. Firstly, they were wrongly insistent that everything be coded. Take a look at SNOMED and LOINC as examples.

HL7 is a completely over-engineered mess: a standards process driven by too many doctors and other health professionals and far too few computer scientists. It tries to capture the process of health care as a protocol. Completely wrongheaded. By the way, I worked on the UML 2.0 standards committee, which I think is reasonable by comparison to HL7, and HL7 is a major user of UML. Let that sink in.

HIPAA's requirements are also completely outdated and overly complex. It was well intended, but it needs replacement. The law standardized technology, not requirements, and that's a mistake.

Epic is a total mess. A local hospital system in my state adopted it and (surprise) it ran horribly over budget, and there are still issues. And it's legacy code out of the box: it's all based on MUMPS, with bits and pieces hacked on top.

Overall, the main problem is insisting that the problem be solved all at once, versus step by step. Step one, establish a system of identification for health providers and patients. This includes a way to get an identity for a patient via known data while providing a high level of confidence that the requester of the information is a health provider. Solve this, and then you can start talking about interchange. And start simple. Forget highly coded documents. Exchange vital history, procedure history, problem list, and notes. That's it. Then move forward based on actual user demand.
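The "start simple" payload described above could be sketched as little more than a dictionary. This is purely illustrative; the field names are hypothetical and not taken from HL7 or any real standard:

```python
# Minimal sketch of a "start simple" exchange record, per the
# step-by-step argument above. All field names are hypothetical,
# not drawn from HL7, FHIR, or any other real standard.

def build_exchange_record(patient_id, provider_id, vitals, procedures, problems, notes):
    """Bundle just the four basics: vitals, procedure history, problem list, notes."""
    return {
        "patient_id": patient_id,      # from the shared identification system (step one)
        "provider_id": provider_id,    # verified health-provider identity
        "vital_history": vitals,       # e.g. [{"date": ..., "bp": ..., "pulse": ...}]
        "procedure_history": procedures,
        "problem_list": problems,
        "notes": notes,                # free text, no coding required
    }

record = build_exchange_record(
    "patient-0001", "provider-ABC",
    vitals=[{"date": "2014-09-01", "bp": "120/80", "pulse": 72}],
    procedures=["appendectomy (2010)"],
    problems=["hypertension"],
    notes="Patient doing well on current medication.",
)
```

The point isn't the format; it's that four plainly named fields plus verified identities would already cover most of what gets faxed today.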

Frankly, Clinton had the right idea with the national health ID. If we could create an ID that everybody had and that was used only for medical identification, that'd be great. But I doubt that'll happen, so we will be stuck with a huge data-deduplication problem.
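Without a single medical ID, that deduplication problem becomes record linkage: every system has to guess whether two demographic records are the same patient. A toy sketch of the naive approach (fields and threshold are illustrative only):

```python
from difflib import SequenceMatcher

# Toy sketch of the deduplication problem above: with no national
# health ID, systems must fuzzily match demographics to decide whether
# two records are the same patient. Fields and threshold are illustrative.

def likely_same_patient(a, b, threshold=0.85):
    """Naive record linkage: exact birth date plus fuzzy name similarity."""
    if a["dob"] != b["dob"]:
        return False
    name_score = SequenceMatcher(None, a["name"].lower(), b["name"].lower()).ratio()
    return name_score >= threshold

rec1 = {"name": "John Q. Smith", "dob": "1970-03-14"}
rec2 = {"name": "Jon Q Smith",   "dob": "1970-03-14"}   # typo'd duplicate
rec3 = {"name": "John Q. Smith", "dob": "1971-03-14"}   # different person
```

Real systems use far more fields and probabilistic weighting, and they still produce false merges and missed matches, which is exactly why a dedicated ID would be better.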

It's not easy, but it's more doable than people think. And heck, open source as a means of standardization is a fine part of this equation, and it's completely ignored.

Comment: Easy to verify... (Score 1) 261

by ndykman (#48028799) Attached to: Microsoft's Asimov System To Monitor Users' Machines In Real Time

Sure, it's fine to be skeptical, but it's easy to verify (or not). Windows has a big enough market that people will analyze every bit of traffic that comes out of the next OS.

Plenty of programs have had that customer-experience-improvement opt-in for a while. I haven't seen anything that suggests you can't really opt out of it and that data is sent anyway. I'm sure that if somebody found evidence of that, we'd hear about it instantly.

Sure, it may be required as part of installing the technical previews (though even that's not clear). How it works in the release, who knows. I agree that the best move would be not to have it at all in the RC or RTM builds, and that's not impossible or even unlikely.
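The "analyze every bit of traffic" verification above really does come down to watching outbound destinations in a packet capture. A minimal sketch that tallies destination hosts from tcpdump-style text lines (the sample lines and hostnames here are made up):

```python
import re
from collections import Counter

# Sketch of the verification idea above: given a packet capture in
# tcpdump's one-line text format, tally which hosts the machine talks
# to. Sample lines and hostnames below are invented for illustration.

LINE_RE = re.compile(r"IP \S+ > ([\w.-]+)\.\d+:")  # destination host before the port

def tally_destinations(capture_lines):
    """Count destination hosts appearing in tcpdump-style text output."""
    counts = Counter()
    for line in capture_lines:
        m = LINE_RE.search(line)
        if m:
            counts[m.group(1)] += 1
    return counts

sample = [
    "12:00:01.000 IP mypc.51515 > telemetry.example.com.443: Flags [S]",
    "12:00:02.000 IP mypc.51516 > update.example.com.443: Flags [S]",
    "12:00:03.000 IP mypc.51517 > telemetry.example.com.443: Flags [P.]",
]
counts = tally_destinations(sample)
```

Anyone can run this kind of tally against a capture taken with the opt-in disabled; if telemetry hosts still show up, that's the evidence we'd hear about instantly.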

Comment: That's a reasonable price point... (Score 1) 181

by ndykman (#48028303) Attached to: HP Introduces Sub-$100 Windows Tablet

I guess Microsoft's plan to charge nothing for small-screen form factors is having a bit of an effect. Even a 20-buck license would be a significant chunk of that price. At that price, there'd be enough people trying to get a Linux distro running on it, and it's close enough to cheap Android levels.

For me, it's cool because I'm more versed in Windows development, and since it's full Windows, I can easily install whatever the heck I want on it (no developer unlock, etc., etc.). Save up, get a few, and just have them around the house.

Comment: Worth questioning... (Score 1) 187

by ndykman (#48023351) Attached to: New Research Casts Doubt On the "10,000 Hour Rule" of Expertise

The way the rule is stated and repeated in modern culture is a vast oversimplification, and so a critique is fine. As some have noted, the argument was also about the "ability and drive" to put in the 10,000 hours. Certainly, individual factors do play a role. The only reason this is controversial is when people try to apply it to certain populations, where there is no evidence for that at all (in fact, plenty to the contrary). The article itself notes this.

But it does raise a question: are there skills that require innate abilities to truly master, and if so, what are they and how do they differ from those that don't? There is evidence to suggest the former is true.

This rule is often linked to how to be successful, but the studies have all been on skills with no direct link to financial success. Brilliant musicians don't get paid well by default. Chess players aren't sports stars. Artists struggle.

I am curious whether programming is a skill that requires an innate mindset to truly master (I do believe such skills exist), or whether it's just a skill that demands disciplined practice. I've seen no evidence either way, so anything I said would be speculation.

+ - Microsoft Revives Its Hardware Conference->

Submitted by jfruh
jfruh (300774) writes "Microsoft's Windows Hardware Engineering Conference, or WinHEC, was an annual staple of the '90s and '00s: every year, execs from Redmond would tell OEMs what to expect when it came to Windows servers and PCs. The conference was folded into Build along with the software side in 2009, but now it's being revived to cover not just computers but also the tablets and cell phones Microsoft has found itself in the business of selling and even making. It's also being moved from the U.S. to China, an acknowledgement of where the heart of the tech hardware business now lies."

+ - Micron Launches First SSD Based On 16nm NAND Flash->

Submitted by MojoKid
MojoKid (1002251) writes "Samsung made some waves earlier this year with the introduction of its 850 Pro family of solid state drives and the first commercial use of 3D stacked NAND Flash memory. Micron is striking back today with a smaller manufacturing process geometry in conventional NAND, however, along with a new Flash technology it claims will accelerate performance more effectively than competing solutions. The new Micron M600 family of solid state drives will launch at capacities ranging from 128GB to 1TB across multiple form factors, including 2.5-inch SATA drives, mSATA, and the PCIe-capable M.2 platform. The M600 uses Micron's newest 16nm NAND, which allows the drive to hit a better cost-per-GiB than previous-generation drives. The drives are built around the Marvell 88SS9189 SATA 6Gb/s controller, which has been used by a variety of other SSD manufacturers as well. The M600 family of solid state drives performed relatively well throughout a battery of tests, though it couldn't quite catch Samsung's 850 Pro. Pricing for the M600 reportedly will be competitive at approximately $0.45 to $0.55 per GiB."
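A quick back-of-envelope check on that quoted per-GiB figure (assuming marketing capacities are decimal gigabytes, so they must be converted to GiB before pricing):

```python
# Back-of-envelope check on the quoted $0.45-$0.55 per GiB pricing.
# Assumption: advertised capacities are decimal GB, converted to GiB here.

GIB = 2**30  # bytes per GiB

def price_range(capacity_gb, low=0.45, high=0.55):
    """Return (low, high) price estimates for a drive of capacity_gb decimal gigabytes."""
    gib = capacity_gb * 1e9 / GIB  # decimal GB -> binary GiB
    return round(gib * low, 2), round(gib * high, 2)

# e.g. the 1TB (1000 GB) model, which works out to roughly 931 GiB:
lo, hi = price_range(1000)
```

So at those rates a 1TB M600 would land somewhere in the low-$400s to low-$500s, which is indeed competitive with the 850 Pro's street pricing at the time.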

Comment: Re:Not so.... (Score 2) 491

by Dr. Spork (#48010687) Attached to: Utilities Should Worry; Rooftop Solar Could Soon Cut Their Profit
That's a useful chart, but it shows that the growth of coal in Germany is far larger than the total portion of solar. So if you want to call that growth a "fluctuation," you should call the contribution of solar a "rounding error" or something. What I learned from the chart is that in Germany, the burning of household trash produces twice as much power as solar, and this is growing much faster than solar. I bet it's also costing the customers far less and provides other benefits, like municipal hot water. So yes, Germany is having a bit of a trash burning revolution, and I applaud this. The solar thing though, I don't think that's going so well for Germany.

Comment: Re:Really? (Score 1) 491

by Dr. Spork (#48010587) Attached to: Utilities Should Worry; Rooftop Solar Could Soon Cut Their Profit
Um, you tell me about the coal-burning plants that Germany shut down, and I'll hunt down the links for the 12 ginormous coal-burning power plants that have opened since 2010. The largest of these are designed to burn fucking lignite (brown coal, the dirtiest thing mankind has ever used for power). This is not 1812; this was 2012. CO2 emissions are growing faster in Germany than anywhere else in Western Europe, while US emissions are sinking over the same timespan. Germany acts like it's some model citizen because everyone loves to hear about solar this and that, but most of its power comes from coal. So does most of its new capacity. Every year this decade, even the proportion of German power coming from coal has increased. Yeah, coal. For this I hardly think they deserve any congratulations.

Comment: I love Obj-C. I've used it since 1989. (Score 1) 313

by jcr (#48008177) Attached to: Ask Slashdot: Swift Or Objective-C As New iOS Developer's 1st Language?

But as I've said many times since then, I'll switch when something better comes along. That time has come. Swift is a major improvement over Obj-C, and it was developed to meet Apple's internal needs, by engineers who know Obj-C inside out.

It's kind of a kick being a beginner again. Swift takes some getting used to, but I expect it to give me as much of a productivity improvement over Obj-C as Obj-C gave me over C++.


Competence, like truth, beauty, and contact lenses, is in the eye of the beholder. -- Dr. Laurence J. Peter