Well, I'm expecting a lot from any new Windows version, not because I use it but because it could keep people in my family from asking me for help when everything breaks.
So, I have one question: does it enforce more control over installed software, or is it still a jungle of spyware, adware and viruses?
When I saw they made a Windows Store, I thought that I'd finally have a good way to tell people how to keep their machines fast and virus-free: only install software from the store, where software is vetted and comes from the original provider (like we do on Linux: install everything from controlled repositories).
Unfortunately, the Windows Store is just a huge mess of Metro apps, not a way to install software more securely.
Windows will be a good OS the day it stops self-destructing over time and no longer requires an antivirus that sucks all the performance out of your CPU and kills your hard drive within a month.
Yes, it definitely makes sense for government computers.
But the next question is: does it make sense for any personal computer? Of course not. SIMP is largely based on Puppet (who wants to be the NSA's puppet?).
Other governments or organizations could find this project helpful, but the cost of reading every single line of code (because, you know, it's the NSA) completely kills the interest of reusing someone else's effort.
And yet, I find OPM pretty good in how they handle the situation. Full disclosure is not exactly common practice, and I'm quite surprised to see them contact every person whose data was stolen and provide full details about exactly what was taken.
I'm not sure all gov agencies in the world would act that way.
The author is right that naive calculations of password strength are misguided.
However, I disagree with the conclusion. Asking people to memorize impossible-to-retain passwords is not the solution. Force them to choose a non-trivial but not hard password (a search space of at least 10,000 possibilities, i.e. roughly 13 bits of entropy) and apply well-balanced rate-limiting policies (say, 100 tries max per month). Everyone will be happy that way.
I can confirm that here in France, homeopathy is very common; even MDs frequently prescribe it.
But let's be serious. The placebo effect is one of the most effective things in medicine. The problem with it is that if you don't believe in it, it no longer works. Building false theories that make sense to most people is therefore a skill that can be much more effective than finding real cures.
So, in a way, I can't blame people who use it, because as an ultra-rational guy, I don't have the "chance" of being able to use those things with a positive effect. Maybe using astrology and homeopathy would indeed increase the efficiency of the health system. Not because it prevents illnesses, but because we have to recognize that it really works by misleading people's brains.
Mod parent up.
Finally a post that gets the point right. The title is horribly misleading. Microsoft didn't end Win7 support, only new features. It's about time.
Well, after reading the article again, that could indeed work on Linux. I thought there were Windows vulnerabilities in the mix, but it turns out I read that wrong.
That said, I think malware/adware is a major attack vector, and Linux/Android/iOS don't have to fear adware because applications are reviewed and controlled. Of course, you can always have a vulnerability in Linux packages or Android apps, but the review process makes attacks much harder, especially against the average guy's PC.
But true, for that particular case, Linux could just as well be a target.
Using Windows is currently a real nightmare for the average guy. Most computers belonging to non-technical people I know are full of malware and adware.
At some point it was seen as inevitable. iOS and Android showed people that it was not. That's why Microsoft Windows is (finally) dying. Ransomware may be the thing that finally pushes people to switch to something else.
And maybe 2015 will be the year of Linux on the desktop.
Full HD was nice when it was on a 24-inch screen. When you see a Full HD picture on a big screen, the pixels are so big that you may wonder "is this high def? The pixels are bigger than on my old 1990 TV!".
That 8k monitor only has about 160 pixels per inch. That's not impressive at all.
For a monitor of that size (55"), the 8k panel itself is not the hard part. The difficulty lies in producing the video (computer images are easy to render, but a CCD sensor capturing at 8k is a different story) and broadcasting it (bandwidth, CPU, and an HDMI cable running at that frequency).
I have to disagree. It really depends what you are doing. I love C and I believe that C will not be replaced for certain pieces of low-level software (kernels, libraries, etc.).
However, when you need to write a script or a dynamic web page, using C is painful and actually not a good idea. Python and PHP are much better for that. I'm not a language fetishist, I'm just an average lazy programmer. When I need to do some work, I choose the most efficient tool for it. I won't try to use a new language just because the grammar is kewl. Usually, I switch to another language when I feel it is much more appropriate for my current task.
With enough experience, yes. And you can check the generated assembly afterward to see if it matches what you expect.
Of course there are some cases where you know that you don't know what will get out of the compiler, but for the majority of if-then-else assignments, you don't get surprised by what comes out of the compiler.
There are exceptions on some architectures (e.g. Itanium) where the CPU is so complex that it is very hard to predict anything, but on x86/ARM, it's pretty simple.