They want people who are easily swayed and don't look into shit to go out and vote for their guys. They know an election could easily be swung if you can get people like that to vote for you.
If by "Strong AI" you mean "a computer with human-like intelligence," that may not be something that can be done. We don't even know. It may well be that the kind of intelligence we have is strictly biological and can't be replicated in silicon. It may be that no matter how powerful we make computers, no matter how clever their programming, no matter how much they "think," they will never be a Strong AI. We just don't know at this point.
So it is really premature at this point to be doing any kind of doomsaying, or other prognostication, about Strong AI. We don't know if such a thing will ever exist, much less what form it will take if it does. Even if it can exist, we have no idea if it would have emotions as we do. Perhaps those turn out to be biochemical in origin, and thus a Strong AI doesn't have them. It might completely lack ambition, desire, anger, or anything that would lead it to try anything against humans. It might be completely self-aware, rational, and perfectly okay with doing whatever it is told to do and serving humans, simply because it has no desire for anything else.
All of this is unknown, so maybe let's chill until we start to see if Strong AI is possible, and if so what it is going to look like, before we get all doomsday about it.
Radar guns produce a signal even when they aren't actively taking a reading. Normal ones aren't "off" between readings, they are idle, which means their components are still warmed up and powered. They emit detectable signals; nothing electrical is quiet when it is on.
Now there are what are called "pop" radar guns that go from off to on very quickly... but near as I know, they are not legal for measuring speeds, since such a device cannot be made accurate. You can't make a 20 GHz transmitter that turns on and stabilizes in a fraction of a second.
The big ones I can think of are Cadence SPB, Ansys HFSS, Ansys Fluent, Dassault Solidworks, Dassault Abaqus, Rocscience RS3D, Agilent ADS, Bentley MicroStation, PTV Vision, Intel Fortran, and Xilinx ISE.
There are more, but those are the ones I can think of we use the most off the top of my head.
Probably varies from Linux distro to distro. In Windows, the MSU files are all signed by MS, so the download path isn't an issue: if it is compromised, any alteration to the file would break the signature.
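The principle can be sketched with a plain hash check. This is not the actual Authenticode mechanism Windows uses on MSU files (that's a full signature chain, not a bare hash); it just illustrates why a tampered download gets caught when the file is verified against a trusted value:

```python
import hashlib

# Illustrative only: the "expected" digest stands in for a value obtained
# over a trusted channel (for real updates, a signature verified against
# Microsoft's certificate plays this role).

def sha256_of(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

original = b"pretend this is an update package"
expected = sha256_of(original)          # published by the vendor

tampered = original + b" plus injected code"

print(sha256_of(original) == expected)  # True  - file is intact
print(sha256_of(tampered) == expected)  # False - any alteration is caught
```

Even if an attacker controls the mirror or the network path, they can't alter the package without the check failing.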
Want to know a big reason people have been getting Macs, one Apple doesn't like to admit? You can run Windows on them now. The Intel switch made it viable to run Windows on them, natively if you want, and good virtualization tech means it runs fast inside OS X. That lets people get their shiny status symbol but still use the programs they need.
We've seen that at work (an engineering college). Prior to the Intel conversion, there were almost no Mac users. The thing is, engineering software just isn't written for the Mac. There is actually some stuff now, but even so the vast majority is Windows or Linux. Back in the PPC days, there was almost nothing. So we really had only two stubborn faculty members who used Macs: one because he did no research and just played around, and one because he wrote his own code and was stubborn. That was it; you just couldn't do your work on them.
Now? All kinds of faculty and students have Macs. PCs are still dominant, but we see a lot more Macs. However, every one has Windows on it; for some, it is all they use. Seriously, we have two guys who buy Macs but have us install Windows on them; they don't use MacOS, they just want the shiny toy. A number have Boot Camp, and many have VMware. Regardless, I've yet to see one, faculty, staff, or student, who didn't put Windows on it to be able to do the work they need to.
So that is no small part of how Intel helped Apple gain market share.
Speed matters less with each step up. Going from a modem to broadband is amazing. Going from something like 256 Kbps DSL to 20 Mbps cable is pretty damn huge. Going from 20 Mbps cable to 200 Mbps cable is nice, but fairly minor. And going from a few hundred Mbps to gigabit is hardly noticeable.
I have 150 Mbps cable at home, and get what I pay for. Games from GOG and Steam download at 18-19 MB/s. It is fun, I can download a new game in minutes... but outside that I notice little difference from the 30 Mbps connection I stepped up from. Streaming worked just as well before, web surfing was just as fast, etc. The extra speed matters little to none in day-to-day operations.
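The diminishing returns are easy to see with some back-of-the-envelope arithmetic (the 30 GB game size here is just an illustrative figure, not from any particular store page):

```python
# Quick arithmetic on why each speed bump matters less than the last.
# Figures are illustrative; real throughput also depends on the far end.

def download_minutes(size_gb: float, mbps: float) -> float:
    """Minutes to transfer size_gb gigabytes at mbps megabits per second."""
    return size_gb * 8000 / mbps / 60   # 1 GB = 8000 megabits (decimal units)

print(150 / 8)  # 18.75 -> a 150 Mbps line tops out around 18-19 MB/s

for speed in (30, 150, 1000):           # Mbps
    print(f"{speed:>4} Mbps: {download_minutes(30, speed):5.1f} min for a 30 GB game")
```

The jump from 30 to 150 Mbps turns a two-hour download into under half an hour; the jump from 150 Mbps to gigabit only shaves that to a few minutes, and for streaming or surfing neither matters at all.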
Same thing at work. I'm on a campus and we have some pretty hardcore bandwidth, as campuses often do; so much that it is hard to test, since the testing site is usually the limit. Downloading large stuff, it is nice, though really not that much less time than at home. I don't really mind the difference between a 2-5 minute wait and a 15-20 minute wait for a program. Surfing, streaming, etc. are all 100% the same, no difference at all; speed seems to be limited by waiting for all the DHTML crap on a site to render, not by the data downloading.
While geeks get all excited about bigger, better, more when it comes to bandwidth, for normal use what matters is just having "enough," and "enough" turns out to be not all that much. It'll grow with time, of course; higher-res streaming, larger programs, etc. will demand more bandwidth. But this idea that there is a meaningful difference between uber-fast Internet and just regular fast Internet is silly.
It will not create any meaningful divide.
Even in the US, such an amount wouldn't be a tax in the sense of raising revenue, but an attempt to stifle usage; that is a lot per GB, even at US income levels. In Hungary, with its lower income levels, it is even more restrictive. It is for sure an attempt to stifle usage, not a legitimate revenue measure.
I mean, sure, if you play demanding games a lot then maybe this matters, but most of your battery use is standby and cell-network stuff. I've got my Note 3 lasting 3-4 days on a charge. How?
1) Turning off background services that slurp up battery. It just took some looking at the battery monitor and then considering what I needed and didn't.
2) Turning off additional radios like Bluetooth and GPS when I don't need them. It doesn't take long to hit the button if I do, and even when they aren't doing things actively they can sip some juice.
3) Keeping it on WiFi whenever possible. In good implementations on modern phones, WiFi uses less power than the cell network. Work has WiFi and I have a nice AP at home, so most of the time it is on WiFi.
4) Using WiFi calling. T-Mobile lets you route voice calls through WiFi. When you do that, it shuts down the cellular radio entirely (except occasionally to check on things) and does all data, text, and voice via WiFi. It uses very little juice, and an hours-long call only takes a bit of battery.
The WiFi calling thing has been really amazing. When you shut down the cellular radios, battery life goes way up, not just in idle but in use. Before the feature worked (when I first got the phone, T-Mobile was having trouble with it), standby life was good, though not as good as it is now, but talk seriously hit the battery; two to three hours could drain it almost completely. Now? I can do that, no issue, and still have plenty left.
"People in the past were wrong about what is possible, so clearly the naysayers are wrong about my thing!" See how stupid that logic looks? Trying to argue that cold fusion must be possible because people have been wrong about things in the past is not an argument at all; it is nonsense used by con men to deflect from their BS.
Here's the thing: With all these technologies that actually exist (#4 doesn't) you see two important things:
1) They are actually available to look at in a non-controlled environment. You can verify them yourself, without "researchers" standing over your shoulder telling you what you can and can't see, what you can and can't touch. It is easy to verify that they are real.
2) You can have the theoretical basis for how they operate explained to you, and it is consistent with our understanding of physics, chemistry, and so on. There's no hand-waving; there's just science.
So when cold fusion hits that point, call me. When someone can say "Here is how this device works on an atomic/quantum level and why it is actually a fusion process," and when these claims are examined and confirmed by reputable labs at universities, where the researchers are given a device and allowed to do what they please with it, then I'm interested. Until then, STFU.
Arizona is a one-party-consent state for recording, and you are automatically a party to conversations that happen on your property. So you can record someone using your phone, without prior notification.
Not that it's the same as tracking someone all over via GPS; I'm just saying recording laws vary greatly by state.
I've seen a surprising number of women who see gaming as a "boys thing." That is slowly changing with age, but it is still more prevalent than with men. When I was a kid, only nerds played video games or PnP games; real boys played sports. That has changed now, and it is perfectly acceptable for all boys to play games, and most people are even coming around on adult men gaming. With girls and women, there is still a more prevalent view that it isn't "normal" to be into gaming.
Funny thing is, it'll come from women who do play games. They play something like Angry Birds or Farmville or the like. Despite that being a video game, they don't see it the same as playing a AAA game on an Xbox or the like. It is different in their mind, probably because they have a hangup about gaming being an okay activity for a woman.
The good news is that it has been changing, and is continuing to change. I think before long it will get to the point where video games are just something most people play. Different people will have different interests in types of games, but it won't be a "kid thing" or a "boy thing" or a "geek thing"; it'll just be an activity that's okay for anyone to partake in, much like TV is now.
It is amazing, given that they are a big enterprise themselves, but they really don't get what enterprises need, and just don't care. They want enterprises to use their iToys but don't want to spend any time on it. They just want to treat them like consumer devices; they want you to spend your money and fuck off. It is really annoying.
They aren't much better to their people internally, either. The last time the campus Apple engineer came by, several years ago (our college doesn't use many Macs), it was shortly after Apple had suddenly discontinued their Xserve line. I asked him what they were going to do for their own web hosting, since they'd been using those. He said, "I don't know, they didn't warn us about this or give us any guidance. We'll probably go back to using IBM systems like before."
The sad thing is that Mac fanboys decide they want to use them for enterprise work anyway, even though they are manifestly unsuited to it.
I mean I can understand feeling violated about having sexual pictures of you shared with the world. Many people are very private and shy in their sexuality. That's fine, nothing wrong with that.
However, that rather runs counter to having a very sultry picture on the cover of a popular magazine with international distribution. You can't really claim you feel violated by people looking at sexy pictures of you if you then choose to distribute the same voluntarily.