Neither am I willing to take the word of some random dude on the Internet. Barring any more proof, I don't think we should be putting any stock in this.
They can put them in the room that used to hold all of Hotmail's servers. Plenty of space there.
You need to think about that. So: it doesn't prevent gun suicides. Setting aside the fact that someone can commit suicide with something else, the person doing it would be an authorized user of the gun. So no help there.
It doesn't prevent gun homicides. Again, these are done by authorized users of the gun, or by people who have time to modify the gun. Remember, for all the clever electronics, in the end guns are mechanical devices. So ultimately the electronics have to drive something that mechanically disables the gun, like a standard mechanical safety: a trigger disconnect, a firing pin block, that kind of thing. Ya, well, those are dead simple to bypass. So no help for stolen guns; the criminals would just remove the safety.
It doesn't prevent accidental shooting by any authorized user of the gun. Since they are authorized, it will fire. So any drunken games, etc, are still just as dangerous as they were before.
Already here we have, by far, most of the shootings that happen.
It may not prevent shootings where a gun is taken away from someone. Depends on how it works. If it has some way of reading the fingerprint when the trigger is depressed, then OK, it could work. However, if it works like a safety that you disengage when you grab the gun, it'll still be disengaged if someone takes it away.
It would prevent accidental shootings where an unauthorized user gets their hands on the gun, like a kid coming across it.
Ok well, that doesn't seem very useful to me. The correct answer to the problem of kids is to lock up your guns. That is much more secure, particularly since something like this would only be effective if you didn't authorize your kids to use it, or remembered to remove their authorization when they were done at the range. Having the guns secured in a safe fixes the problem nicely. Likewise, that provides pretty good protection against theft.
So I really don't see what this will solve, and it will make things more expensive and complicated. It just doesn't strike me as very useful.
Maybe 6-10 hours of staff time. What I mean is you have to factor what your people cost you. If someone costs $50/hour when you count in salary + ERE (meaning payroll tax, benefits, insurance and all other expenses) then 6 hours of their time costs $300. So, if your transition wastes more than 6 hours of their time, it is a net loss.
You always have to keep that cost in mind when you talk about anything: What does it cost your employees to do? This is the same deal with old hardware. It can actually cost you more money, because it takes more IT time to support. Like if you have an IT guy whose salary + ERE is $30/hour and you have them spend 20 hours a year repairing and maintaining an old P4 system that keeps failing, well that is a huge waste as that $600 could have easily bought a new system that would work better and take up little, if any, of their time.
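The old-hardware math above can be sketched as a quick break-even check. (The $30/hour rate, 20 hours/year, and $600 replacement price are the hypothetical numbers from the paragraph, not real figures.)

```python
# Break-even sketch: staff time spent maintaining old hardware
# vs. the price of just replacing it.

def maintenance_cost(hourly_rate: float, hours_per_year: float) -> float:
    """Annual cost of IT time spent keeping an old machine alive."""
    return hourly_rate * hours_per_year

annual_upkeep = maintenance_cost(30.0, 20.0)  # $600/year in IT time
replacement_price = 600.0                     # a basic new desktop

# If one year of upkeep already covers the replacement, the old
# machine is a net loss even before counting user downtime.
print(annual_upkeep >= replacement_price)  # True
```

The same comparison applies to any transition: multiply hours by the loaded hourly rate, and weigh it against what the change saves.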
That is a reason commercial software wins out in some cases. It isn't that you cannot do something without it, just that it saves more staff time than it costs. That's why places will pay for things like iDRAC or other lights-out management, remote KVMs, and so on. They cost a lot but the time they save in maintenance can easily exceed their cost.
Just remember that unless employees are paid very poorly, $300 isn't a lot of time. So you want to analyze how much time your new system will cost (all new systems will cost some time in transition if nothing else) and make sure it is worth it.
Then you've never worked in an enterprise environment that uses it. You'll have a ton of tech support and maintenance costs with Linux. You not only have all the regular user shit (people who can't figure out how to use their computer, administrative stuff, etc.), but I've also observed that a good bit of the stuff in Linux requires a lot of sysadmin work, scripting and such. We do Linux and Windows in our environment, and we certainly make Linux work on a large enterprise scale, but our Linux lead spends an awful lot of time messing with Puppet, shell scripts, and so on to make it all happen. A lot more than we spend with AD and Group Policy to make similar things happen in Windows.
Licensing savings are certainly something you can count, but you aren't getting out of support and maintenance. That is just part of running an enterprise. The question is what those costs would be compared to Windows, and that is likely to vary per environment.
If you aren't a known developer, people want to see some evidence that you have the ability to make good on your plans. Game development isn't simple, and many people are not prepared for what they are going to have to do to bring a successful game to market.
So Double Fine or inXile can get a good bit of funding with nothing but a design doc for a game, because people have faith they'll be able to deliver since they are experienced game devs. New crews are going to have to show something to get people to trust them.
Particularly in light of past KS failures in that regard. I've backed a number of games on KS and two of them I knew were fairly high risk: They were being done by an individual who hadn't done a game before, and there wasn't any sort of demo up front, just some basic concepts. I decided to take a risk on it, but fully understood that failure was likely.
Sure enough, both are floundering/failing. One hasn't had any updates in months, the other does update periodically but it is still extremely rudimentary, despite being way past the planned launch date, and it is pretty clear the dev just doesn't have a good idea how to proceed from here.
On the flip side, the games by established studios have either delivered or are well on track (Shadowrun Returns was brilliant, Wasteland 2 ships next Friday, Pillars of Eternity is in beta, etc). Likewise, the indie titles that had a demo and were a good bit along in development have delivered, like FTL.
So no surprise many people aren't willing to take the risk. They want a better chance of return so they stick with established devs or with things that have some proof.
People vastly misunderestimate the riskiness of various occupations in the US.
The writers of the US Constitution had the foundational documents of Sparta available to them. They deliberately chose to go in the other direction. This point seems to be one that is conveniently ignored by self-styled originalists.
For one thing, that is a work of fiction, and there is plenty in there to give it away as something that will never be reality. Then there's also the fact that Minecraft is a poorly optimized Java game with graphics from the 1990s, not the foundation for some worldwide universe.
But funny thing about money, people always want more.
Off the top of my head, DirectX, Visio, Internet Explorer, Bungie, and Winternals all worked out very well for them. Not saying you personally like all the products there, but they were all commercially successful in a number of ways.
I mean seriously, why would you want Mojang? Minecraft itself has already made most of its money. You'd never make $2 billion on it going forward; its big sales have already happened. So you'd be buying the talent/IP for future games... ya, about that. Mojang seems to have little or nothing in the pipe to speak of. 0x10c has gone all of nowhere, Scrolls has very little interest anymore, and that's about it.
When you look at Minecraft, particularly what it started as, where it came from (Infiniminer), and how much has come from community contribution, it is fairly apparent that Notch is not some genius game designer; he just had the right idea at the right time, and got lucky that it went viral. Minecraft was not some amazing feat of design, it was a digital Lego game that struck a chord with people. Fair enough, and he deserves his success, but that isn't the kind of thing worth buying into, particularly given 0x10c's complete lack of development.
I can't see what MS hopes to gain. Maybe the Minecraft name? I guess, in theory, that is worth some money but I don't really think so. I think people will happily play a good builder game, regardless of title.
Just seems like a bad use of money to me.
6mbps is about as good as it gets. That's what Youtube and Netflix use for 1080p stuff. So that is the standard you need to worry about for streaming in general. Yes, I know that Blu-ray is higher bitrate, but little if anything streams at that rate. For the web, 6mbps is "high quality". You might not care for that definition, but it is what it is.
If you asked them not to change the definition because "broadband" technically refers to how data is transferred (10gbit ethernet is not broadband, despite the speed, it is baseband) then ok, you can be cpt pedantic.
However this is just you lying. 4mbps is not "enough" for the modern Internet. Currently I find the breakpoint to be about 20mbps. That is the point after which normal users won't notice much, if any, improvement. As such, that is my baseline for recommendation to people. 10mbps is serviceable I guess, but is a pain for video streaming. 4mbps would be a real issue, even low bandwidth streams wouldn't work well.
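A quick way to see why 4mbps falls short is to count how many 6mbps 1080p streams a connection can carry at once (6mbps per stream is the YouTube/Netflix figure cited above; treating the whole link as available for video is a simplifying assumption, since real households have other traffic too).

```python
# How many 1080p streams fit on a given connection?
STREAM_MBPS = 6  # per-stream bitrate for 1080p web video

def concurrent_streams(connection_mbps: int) -> int:
    """Number of full-rate 1080p streams a link can carry at once."""
    return connection_mbps // STREAM_MBPS

for speed in (4, 10, 20):
    print(f"{speed}mbps connection: {concurrent_streams(speed)} stream(s)")
```

A 4mbps line can't sustain even one full-rate 1080p stream, a 10mbps line handles exactly one with little headroom, and around 20mbps a household can run multiple streams plus other traffic, which fits the breakpoint described above.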
The minimum needs to keep rising. We keep finding more to do with our net connections. These companies are just whiny because they don't want to have to roll out FTTH, they want to keep doing DSL and pretending like that works.
The way they design their CPUs, it is easy to have pretty much any even number of cores. It isn't a big deal to make a chip with some particular amount more or less. So then it comes down to power, thermal, and die size limits.
Apparently 18 cores is where they cap out this time around. I'm sure you'll be able to get chips with 16 cores and fewer; 18 is just the most they could stuff in there before exceeding whatever design limits they'd set.