Almost all marketplaces are broken. Getting eyes on your website, getting users to download your app, getting people to watch your commercial: none of these is a meritocracy. That's why whole categories of professions exist to handle them (advertising, SEO, etc.). Everyone who makes products knows that if you want to make a ton of money, you don't put your money into making a better product; you put it into advertising your currently crappy product.
I got ripped apart a few days ago for commenting that programming is currently at a maturity equivalent to medicine back in the bloodletting days. This is more proof that we haven't created adequate solutions for common problems like search yet. Sure, Google was better than everyone before them, and there has been a lot of advancement, but we still have a very long way to go.
That's why software developers shouldn't insist on using the title Engineer. This kind of accountability is expected of an engineer; it's not an anomaly. When programming matures to the point where bugs are rare, then we will deserve the title.
I write software for a living, and I'm well aware that if we compare computer science to medical science, the current era is roughly equivalent to the bloodletting-and-leeches era. I can't wait for our penicillin to come around.
I don't see devs being hurt by this at all. Sure, Microsoft has changed what it is pushing, but their support of deprecated technology is still excellent. Not only is WCF still supported, but their SOAP stuff still continues to work just fine (and to be fully supported by Visual Studio), even though it hasn't been pushed for over ten years.
Also, the other technology supported for app store apps is XAML with a limited subset of the API. That's essentially what Silverlight was, minus the stupid browser-plugin concept. So Silverlight developers weren't left out in the cold: 95% of their skill set is still useful for app store development.
The matter came from somewhere. The antimatter also.
Matter and antimatter both spontaneously come from energy. We've seen it happen in supercollider experiments. Current big bang physics posits that all matter spontaneously formed from nothing but energy in processes known as leptogenesis and baryogenesis. The big mystery is that according to the physics we've observed, the matter and antimatter should have mostly turned back into energy. However, none of our experiments come close to the energy levels of leptogenesis and baryogenesis, so nothing has been disproven yet.
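For concreteness, the energy-to-matter conversion described above is governed by mass-energy equivalence; the standard textbook illustration (my addition, not part of the original comment) is electron-positron pair production and its reverse, annihilation:

```latex
% A photon pair can convert into an electron-positron pair, and vice versa:
\gamma + \gamma \;\leftrightarrow\; e^{-} + e^{+},
\qquad
% possible only when the available energy covers the rest mass of the pair:
E_{\min} = 2\, m_e c^2 \approx 2 \times 0.511~\mathrm{MeV} = 1.022~\mathrm{MeV}
```

Colliders routinely observe both directions of this reaction; the puzzle described above is why the annihilation direction didn't consume essentially all of the matter in the early universe.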
On the other hand, the universe coming into being with matter already in it, or matter somehow being moved into it, would both be huge deviations from current scientific thinking. More importantly, we have a pretty good explanation for how things are without resorting to external forces; there are just a few gaps to fill (like the one that is the topic of this thread). There's no good reason to open the Pandora's box of outside interference, as it makes meaningful discussion almost impossible.
If we stick to the current research path of assuming the universe is a closed system, we'll eventually find out whether that's true. But if we start with the assumption that the universe isn't a closed system, then it becomes impossible to get answers to the hard questions. Any question without a readily available answer (like "Why is there matter and not antimatter?") will simply be dismissed with "It was there all along," and no one will really learn anything.
Theorizing the state of things before the big bang is for philosophers. Besides, if there was matter in the space our universe occupies before the big bang, it wouldn't have survived intact through the first few milliseconds due to the incredibly high energy density, not to mention surviving whatever process caused the big bang in the first place.
So, when cops wear cameras, reported incidents of police using force drop by half. I take that to mean that 50% of those uses of force were unwarranted or unnecessary; otherwise, why would they have stopped?
This sounds like pretty clear evidence that police think they can get away with bending the law as long as no one (except the victim) sees them.
Well, no, it's not an outdated attitude. Corporate security is about mitigating risk, not eliminating it, and part of that mitigation is preventing unmanaged devices from connecting to the "trusted" corporate network through NAC policies. If your device doesn't pass the NAC check, it's not getting on the network: either let IT manage your device, or connect to the guest network.
Corporate security may be about mitigating risk, but IT is about providing services. It shouldn't be security's call to remove a service from the portfolio because they don't want the risk. Your job is to provide the service with as little risk as possible and to provide guidance to the rest of IT. Banning BYOD in the name of security is like wiping everyone's hard drive in the name of security: sure, you've reduced risk, but you've also crippled the system.
Most companies already treat insiders as threats, so BYOD on the corporate network isn't any additional risk. If you don't, then that's the outdated attitude I was referring to.
I know an AUP isn't security. I brought it up to say that they only require an AUP, meaning that no additional security precautions are taken.
The "hold you responsible" comment wasn't very clear, sorry about that. What I really meant was that if you are denying functionality, there had better be an associated benefit. The eventual end of that logic is that if you take the extreme position of "all devices on the network must be controlled by me," then you should be held to an equally extreme consequence: "everything is your fault, not professionally, personally." If you want to bear only professional responsibility, you should have stopped at "here is what it would cost to secure a BYOD environment" and not progressed to "no BYOD here."
That's a bit of an outdated attitude. Any "secure corporate network" has dozens or even hundreds of compromised client devices on it at any moment (and possibly a compromised employee or two). Not allowing personal devices doesn't increase security all that much. On the other hand, the benefits of BYOD are accepted by most companies that employ knowledge workers. Most places I've worked (some were really big corporations) simply require an employee to sign an acceptable use policy before connecting.
Let me turn that attitude around: are you willing to be held personally responsible when a client is compromised by a zero-day? Control is an illusion in the twenty-first century; it's way past time to start building networks that can function properly even with untrusted devices on them.