Hierarchical start menu.
If this or a similar technology ends up in guns (and assuming it can actually be made to work), we end up with a computer in the gun that knows who fired the gun. It is not a technical stretch to add time and location detection circuitry and end up with a record of the when, where, and who of each firing.
While I am generally opposed to such technology in guns, I can see one positive aspect: we could prove what we have known all along — Han shot first.
This is either a strong positive or negative depending on which side of the "gun issue" you are on, but I haven't seen much discussion on what the tech could lead to (and its ramifications to each side of the debate). There are many interesting potential ramifications:
- Use of the log as evidence
- Static Geo-Fencing (prevention of gun use in predefined locations)
- Dynamic Geo-Fencing (on demand prevention of gun use in dynamically added locations)
- Firmware updates
- Taxes or fees per round fired
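A static geo-fence check could be as simple as testing the gun's GPS fix against a list of prohibited zones before the trigger is allowed to engage. A minimal sketch of the idea (the zone list, function names, and the haversine distance check are all hypothetical, not taken from any real smart-gun system):

```python
import math

# Hypothetical prohibited zones: (latitude, longitude, radius in meters)
PROHIBITED_ZONES = [
    (40.7484, -73.9857, 500.0),  # e.g. a school or courthouse
]

def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points in meters (haversine)."""
    r = 6_371_000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def firing_allowed(lat, lon):
    """Static geo-fence: deny firing inside any prohibited zone."""
    return all(distance_m(lat, lon, zlat, zlon) > radius
               for zlat, zlon, radius in PROHIBITED_ZONES)
```

Dynamic geo-fencing would be the same check against a zone list that can be updated over the air — which is exactly where the firmware-update and on-demand-disable questions get interesting.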
Are you suggesting that said *pacemaker* is storing location information without any method to nondestructively access it? If so, I call bullshit. If not, the cops need only use the same interface to extract the information without killing you.
I am not talking about the technical ability to extract data from the fictional future device; I am talking about the legality. My point is that if some future medically necessary device did, for some reason, store historical location information, such data should be covered by the same laws that protect a person from self-incrimination. If I don't have to tell the cops where I was last Thursday, then a medically necessary device that I can't live without, and whose data collection I can't control, should likewise not be available to the cops as a way to extract where I was last Thursday.
Why does it matter if the device is physically inside you or necessary to live? Why is a futuristic pacemaker any different than a cell phone?
It is about choice. In my opinion, it is different because such a device would not be carried by choice, nor would it hold data that you voluntarily placed on it. A cell phone or other computer you carry by choice. Data you put on your cell phone (pictures, email, GPS tracks, etc.) you put there by choice. With a pacemaker (or other medically necessary device), you don't really have a choice about having it with you (unless you choose to die), and over the operational data such a device might gather, you have no practical control.
While fingerprints or left behind DNA can indicate that you were somewhere, they don't on their own give a history of the places you have been. You can't take a fingerprint or DNA sample from a person and get a history of all the places they have been. With an embedded device that keeps location history, you could theoretically extract the history of the locations you have been (without having to go to those places to collect evidence).
... There are no new legal questions created by putting electronics inside people rather than simply keeping them detached.
Maybe, maybe not. Let's say that you have some sort of future pacemaker or other medical device implanted that you need to stay alive. For whatever reason this device as part of its normal function also happens to have historical location information in it. Perhaps the device optimizes or alters its operation depending on your altitude or location. This device would be a part of you and having it wouldn't really be a choice. Would forcefully extracting information from such a device be any different than compelling a person to testify against their will?
Then I guess all of the folks of Oregon will just have to grow cannabis and self medicate till this thing blows over.
We have to wait until November to decide if this is a legal option or not. Of course there is a segment of the population not willing to wait...
Under the leadership of Code.org, explained the ACM, it joined CSTA, NCWIT, NSF, Microsoft and Google in an effort "to reshape the U.S. education system," including passing a federal law making Computer Science a "core subject" in schools.
There are lots of comments here expressing concern about mass-producing coders and driving wages down. It is important to distinguish between Computer Science and coding. "Coding", the act of taking a specification or design and translating it into the syntax of a given computer language, likely is or could become a commodity skill or vocational-level activity. "Computer Science", formally the study and theory of how computers and software work, and informally the development of algorithms and solutions using computers (the architecture and design of a specific solution), is a different animal. Computer Science is unlikely to become a commodity skill, as it requires advanced training and experience, and a level of insight or art that not everyone has or can achieve.
How do you plan to handle 300 cars all trying to pull over and stop at the same time, because they have no idea what to do?
The same way you would handle a traffic jam when everyone ends up parked on the freeway. Presumably, as the car becomes less sure of what to do, it would begin to slow down (never violating the primary rule of not running into what is in front of you). As it becomes less and less sure, it would eventually stop. Pulling over is a bonus, but not always required.

It is really the same way I, as a human driver, deal with unsure or unclear driving situations. If there is a wall of stopped cars in front of me, I slow down and don't run into them; if it starts raining or snowing so hard that I can't see very far ahead, I slow down to ensure that I can stop the car within the distance I can actually see; if I think driving conditions are overly unsafe, I pull over at the first opportunity, potentially creeping along until I find somewhere to pull over. As a human driver, if I need to stop in the road or drive significantly slower than other drivers would expect, I do things like tap the brakes multiple times (to flash the brake lights) and turn on the hazard flashers. I would assume that an automated driver would take similar actions to alert other cars (both automated and human-driven) that unusual operation is about to take place or is taking place.
In your scenario, with lots of cars all becoming "unsure" of what to do, everyone just stops. Once everyone stops, each car can either start back up (assuming that the car understands "traffic jam") or the human drivers (who have now had time to become aware of what is happening) can take over.
As long as there is a pretense of handing back to the driver in the event of an emergency, this is a glorified cruise control, and I'll bloody well drive myself.
If I'm ultimately responsible for the vehicle, I'll stay in control of the vehicle. Because if there's a 10 second lag between when the computer throws up its hands and says "I have no idea" and when the user is actually aware enough and in control, that is the window where Really Bad Things will happen.
I would agree if the human is expected to be able to take over at any time. But what if automation were to the point that, when the computer found conditions too complicated, it would pull over and stop the vehicle? Once stopped by the computer, manual controls would be used to drive in those "complicated" situations. You could have the option to interrupt the "safe stop" process and assume control if you felt comfortable doing so, but if the logic included an unattended safe stop, would it be good enough? (I am not saying that we have the ability to build a system that could always achieve an unattended safe stop, but if we could - or at least could build a system that achieves an unattended safe stop with a provably better chance of success than humans achieve an attended stop - would it be good enough?)
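The "unattended safe stop with optional human override" idea amounts to a small state machine. A sketch of one possible shape (the states and transitions are my invention, not any real autopilot design):

```python
# Hypothetical driving modes for an automated driver that can safe-stop itself
AUTO, SAFE_STOPPING, STOPPED, MANUAL = "auto", "safe_stopping", "stopped", "manual"

def next_state(state, too_complicated=False, human_takes_over=False, at_rest=False):
    """Advance the driving-mode state machine by one step."""
    if human_takes_over:
        return MANUAL          # the driver may interrupt the safe stop at any point
    if state == AUTO and too_complicated:
        return SAFE_STOPPING   # begin slowing toward an unattended stop
    if state == SAFE_STOPPING and at_rest:
        return STOPPED         # parked; manual controls now required to proceed
    return state
```

The key property is that the machine never demands an instant handover: if the human does nothing, the worst case is a stopped car, not an uncontrolled one.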
For each fiber, you need an amplifier every 50 (?) km. You may run into a weight limit where the amplifier pack becomes too heavy to be suspended by the cable during cable laying.
And those amplifiers require power, which is hard to transmit over a cable at those distances. (Well maybe not "hard", but the length imposes practical limits.)
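To get a feel for the scale, take the 50 km spacing questioned above and a roughly transpacific run. A back-of-the-envelope sketch (every number here is an illustrative assumption, not a spec of any real cable system):

```python
cable_length_km = 10_000     # rough transpacific distance (assumed)
repeater_spacing_km = 50     # the "(?)" figure from the comment above
watts_per_repeater = 50      # assumed per-repeater power draw

repeaters = cable_length_km // repeater_spacing_km   # 200 repeaters in series
total_power_w = repeaters * watts_per_repeater       # 10,000 W fed from the shore ends
print(repeaters, total_power_w)
```

Hundreds of series-powered repeaters is why shore stations feed these cables at very high DC voltage: the practical limit is less the wattage than pushing it through one thin conductor over thousands of kilometers.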
I'm not sure that this is still true, but don't you go blind for a few minutes while the procedure is going on? That's what frightens me - the thought that I might go blind and not have my sight come back.
Yes you do (but it is seconds, not minutes). The part of the procedure they don't really tell you about in advance is that they basically use a vacuum cleaner to suck your eyeball out of your head while they do the procedure. Actually, they use suction to slightly pull on your eyeball and hold it still while the laser is doing its work; while this is happening, you can't see out of that eye -- it all goes dark. This part of the procedure (which really only lasts a few seconds per eye) is fairly unpleasant and is probably the reason they give you Valium.
Very specially written? You mean any piece of PowerShell, any
or any executable (and yes, if that executable doesn't understand the PowerShell object pipeline, you can just hand it plain old text on standard input).