Comment: Re:Solution (Score 1) 273

If everyone just pays a flat sales tax rate, the poor bear most of the economic burden.

No. If everyone pays a flat sales tax rate, the people who spend more will bear most of the economic burden. The poor would pay a smaller part of the overall economic burden (because the poor spend less money in the overall economy than the rich do).

It is possible, and even likely, that the burden on the poor would be greater in the sense that the percentage of their money going to taxes could be higher, but it is inaccurate to say that the poor would pay more of the overall burden to the economy, a.k.a. "the economic burden".

Please do not confuse "the economic burden" with "the burden on the poor". It is an important distinction. They are both important issues, but they are different issues, and should have different arguments and conversations behind them.
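
A made-up numerical illustration of the distinction: suppose a poor household earns $20,000 and spends all of it, while a rich household earns $200,000 and spends $150,000 of it. With a 10% flat sales tax, the poor household pays $2,000 and the rich household pays $15,000. The rich household carries the larger share of the total tax collected ($15,000 out of $17,000, roughly 88%), yet the poor household's tax works out to 10% of its income while the rich household's works out to 7.5% of its income. Both statements are true at the same time; they answer different questions.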

Comment: Re:Next steps (Score 1) 599

by Nkwe (#47903435) Attached to: High School Student Builds Gun That Unlocks With Your Fingerprint

If this or a similar technology ends up in guns (and assuming it can actually be made to work), we end up with a computer in the gun that knows who fired the gun. It is not a technical stretch to add time and location detection circuitry and end up with a record of the when, where, and who of each firing.

While I am generally opposed to such technology in guns, I can see one positive aspect: we could finally prove what we have known all along. Han shot first.

Comment: Next steps (Score 1) 599

by Nkwe (#47902707) Attached to: High School Student Builds Gun That Unlocks With Your Fingerprint
If this or a similar technology ends up in guns (and assuming it can actually be made to work), we end up with a computer in the gun that knows who fired the gun. It is not a technical stretch to add time and location detection circuitry and end up with a record of the when, where, and who of each firing.

This is either a strong positive or negative depending on which side of the "gun issue" you are on, but I haven't seen much discussion of what the tech could lead to and its ramifications for each side of the debate. There are many interesting potential ramifications (a purely hypothetical sketch of such a log and geo-fence check follows the list):
  • Privacy
  • Use of the log as evidence
  • Static Geo-Fencing (prevention of gun use in predefined locations)
  • Dynamic Geo-Fencing (on demand prevention of gun use in dynamically added locations)
  • Firmware updates
  • Taxes or fees per round fired
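
To make the list concrete, here is a minimal, purely hypothetical sketch (in Python) of the kind of record such a gun could log and how a geo-fence check might work. Every name, field, and number below is invented for illustration; this is not a description of the student's project or any real product.

    # Hypothetical sketch only: the kind of data a "smart gun" could log and
    # a simple geo-fence check. All names, fields, and coordinates are invented.
    import math
    from dataclasses import dataclass
    from datetime import datetime, timezone

    @dataclass
    class FiringRecord:
        user_id: str        # whoever's fingerprint unlocked the gun
        fired_at: datetime  # when the shot happened
        latitude: float     # where the shot happened
        longitude: float

    # Static geo-fence: circles (lat, lon, radius in meters) where firing is
    # disabled. A dynamic geo-fence would simply be this list updated on demand.
    RESTRICTED_ZONES = [
        (47.6062, -122.3321, 500.0),  # arbitrary example coordinates
    ]

    def inside_zone(lat, lon, zone):
        zlat, zlon, radius = zone
        # Rough local distance approximation; good enough for a sketch.
        meters_per_degree = 111_320.0
        dx = (lon - zlon) * meters_per_degree * math.cos(math.radians(zlat))
        dy = (lat - zlat) * meters_per_degree
        return math.hypot(dx, dy) <= radius

    def firing_allowed(lat, lon):
        return not any(inside_zone(lat, lon, z) for z in RESTRICTED_ZONES)

    def log_shot(log, user_id, lat, lon):
        log.append(FiringRecord(user_id, datetime.now(timezone.utc), lat, lon))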

Comment: Re:This is no different. (Score 1) 206

by Nkwe (#47839903) Attached to: Should Cyborgs Have the Same Privacy Rights As Humans?

Are you suggesting that said *pacemaker* is storing location information without any method to nondestructively access it? If so, I call bullshit. If not, the cops need only use the same interface to extract the information without killing you.

I am not talking about the technical ability to extract data from the fictional future device; I am talking about the legality. My point is that if some future medically necessary device did, for some reason, store historical location information, such data should be covered by the same laws that protect a person from self-incrimination. If I don't have to tell the cops where I was last Thursday, then the cops should not be able to pull the answer out of a medically necessary device that I can't live without and whose data collection I don't control.

Comment: Re:This is no different. (Score 1) 206

by Nkwe (#47839889) Attached to: Should Cyborgs Have the Same Privacy Rights As Humans?

Why does it matter if the device is physically inside you or necessary to live? Why is a futuristic pacemaker any different than a cell phone?

It is about choice. In my opinion, it is different because such a device would not be carried by choice, nor would it hold data that you voluntarily placed on it. A cell phone or other computer you carry by choice. Data you put on your cell phone (pictures, email, GPS tracks, etc.) you put there by choice. With a pacemaker or other medically necessary device, you really don't have a choice about having it with you (unless you choose to die), and you have no practical control over whatever operational data it might gather.

While fingerprints or left-behind DNA can indicate that you were somewhere, they don't on their own give a history of the places you have been. You can't take a fingerprint or DNA sample from a person and get a history of all the places they have been. With an embedded device that keeps location history, you could theoretically extract the history of the places a person has been without having to go to those places to collect evidence.

Comment: Re:This is no different. (Score 1) 206

by Nkwe (#47839199) Attached to: Should Cyborgs Have the Same Privacy Rights As Humans?

... There are no new legal questions created by putting electronics inside people rather than simply keeping them detached.

Maybe, maybe not. Let's say that you have some sort of future pacemaker or other medical device implanted that you need to stay alive. For whatever reason, this device, as part of its normal function, also happens to keep historical location information. Perhaps the device optimizes or alters its operation depending on your altitude or location. This device would be a part of you, and having it wouldn't really be a choice. Would forcefully extracting information from such a device be any different from compelling a person to testify against their will?

Comment: Computer Science or Coding? (Score 2) 59

by Nkwe (#47807807) Attached to: Code.org Discloses Top Donors

"Under the leadership of Code.org, explained the ACM, it joined CSTA, NCWIT, NSF, Microsoft and Google in an effort "to reshape the U.S. education system," including passing a federal law making Computer Science a "core subject" in schools.

There are lots of comments here that show concern about mass-producing coders and driving wages down. It is important to distinguish between Computer Science and Coding. "Coding", the act of taking a specification or design and translating it into the syntax of a given computer language, likely is or could become a commodity skill or vocational-level activity. "Computer Science", formally the study and theory of how computers and software work, and informally the development of algorithms and solutions using computers (the architecture and design of a specific solution), is a different animal. Computer Science is unlikely to become a commodity skill, as it requires advanced skills, training and experience, and a level of insight or art that not everyone has or can achieve.

Comment: Can stuff this small work in the real world? (Score 1) 49

by Nkwe (#47767399) Attached to: Scientists Craft Seamless 2D Semiconductor Junctions
I am curious whether a conductor that is only a couple of atoms "thick" can be practical in the real world. Normal conductors can withstand all sorts of abuse because they have a large number of atoms and can afford to have a significant percentage of those atoms moved, removed, or converted (reacted with). If you have a conductor that is only three atoms thick, each atom is going to count. How do you prevent just one of those atoms from being dislodged by mechanical stress, chemical interaction, cosmic rays, or whatever? Does this require that these conductors be sealed at an atomic level in a vacuum or other inert container, and is that feasible?

Comment: Re:Never gonna work ... (Score 1) 506

by Nkwe (#47760771) Attached to: California DMV Told Google Cars Still Need Steering Wheels

How do you plan to handle 300 cars all trying to pull over and stop at the same time, because they have no idea what to do?

The same way you would handle a traffic jam where everyone ends up parked on the freeway. Presumably, as the car becomes less sure of what to do, it would begin to slow down (never violating the primary rule of not running into what is in front of you). As it becomes less and less sure, it would eventually stop. Pulling over is a bonus, but not always required. This is really the same way I, as a human driver, deal with unsure or unclear driving situations. For example, if there is a wall of stopped cars in front of me, I slow down and don't run into them; if it starts raining or snowing hard and I can't see very far ahead, I slow down to ensure that I can stop within the distance I can actually see; if I think driving conditions are overly unsafe, I pull over at the first opportunity, potentially creeping along until I find somewhere to pull over. As a human driver, if I need to stop in the road or drive significantly slower than other drivers would expect, I do things like tap the brakes multiple times (to flash the brake lights) and turn on the hazard flashers. I would assume an automated driver would take similar actions to alert other cars (both automated and human-driven) that unusual operation is about to take place or is taking place.
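
A rough sketch of the behavior I am describing, with made-up thresholds (this is my own illustration, not how any actual self-driving stack works): the less confident the system is, the lower its speed cap, down to a full stop, and it starts warning other drivers well before that.

    # Illustrative sketch only: map the planner's confidence (0.0 to 1.0) to a
    # maximum allowed speed, stopping entirely when confidence is low. All
    # thresholds are made up.
    def speed_cap_mph(confidence, normal_limit_mph):
        STOP_BELOW = 0.3        # at or below this, come to a stop
        FULL_SPEED_ABOVE = 0.9  # at or above this, drive normally
        if confidence <= STOP_BELOW:
            return 0.0
        if confidence >= FULL_SPEED_ABOVE:
            return normal_limit_mph
        # Scale linearly between "stopped" and "normal speed".
        fraction = (confidence - STOP_BELOW) / (FULL_SPEED_ABOVE - STOP_BELOW)
        return normal_limit_mph * fraction

    def hazards_on(confidence):
        # The equivalent of flashing the brake lights and turning on the
        # hazard flashers: warn others well before actually stopping.
        return confidence < 0.5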

In your scenario, with lots of cars all becoming "unsure" of what to do, everyone just stops. Once everyone stops, each car can either start back up (assuming that the car understands "traffic jam") or the human drivers (who have now had time to become aware of what is happening) can take over.

Comment: Re:Never gonna work ... (Score 1) 506

by Nkwe (#47758669) Attached to: California DMV Told Google Cars Still Need Steering Wheels

As long as there is a pretense of handing back to the driver in even of an emergency, this is a glorified cruise control, and I'll bloody well drive myself.

If I'm ultimately responsible for the vehicle, I'll stay in control of the vehicle. Because if there's a 10 second lag between when the computer throws up its hands and says "I have no idea" and when the user is actually aware enough and in control, that is the window where Really Bad Things will happen.

I would agree if the human is expected to be able to take over at any time. But what if automation were to the point that, if the computer found conditions too complicated, it would pull over and stop the vehicle? Once stopped by the computer, manual controls would be used to drive in those "complicated" situations. The human driver could have the option to interrupt the "safe stop" process and assume control if they felt comfortable doing so, but if the logic included an unattended safe stop, would it be good enough? (I am not saying we currently have the ability to build a system that could always achieve an unattended safe stop, but if we could - or at least build a system that could achieve an unattended safe stop with a provably better chance than humans can achieve an attended stop - would it be good enough?)
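
A sketch of the hand-off I have in mind, with made-up state names; the point is only that the default path is an unattended safe stop and that the human may take over by explicitly interrupting it, not that they must.

    # Hypothetical hand-off logic, not any vendor's actual design.
    AUTO, SAFE_STOP, MANUAL, STOPPED = "auto", "safe_stop", "manual", "stopped"

    def next_state(state, too_complicated, driver_takes_over, vehicle_stopped):
        if state == AUTO and too_complicated:
            return SAFE_STOP   # begin slowing / pulling over on its own
        if state == SAFE_STOP and driver_takes_over:
            return MANUAL      # optional human override, never required
        if state == SAFE_STOP and vehicle_stopped:
            return STOPPED     # from here, manual controls are used
        return state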

Comment: Re:Only 6 pairs? (Score 1) 135

For each fiber, you need an amplifier every 50 (?) km. You may run into a weight limit where the amplifier pack becomes too heavy to be suspended by the cable during cable laying.

And those amplifiers require power, which is hard to transmit over a cable at those distances. (Well maybe not "hard", but the length imposes practical limits.)
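
As a back-of-the-envelope illustration (all numbers invented): the voltage lost along a conductor is V = I × R × L, so pushing 1 A through a conductor with 1 ohm of resistance per km over a 10,000 km run costs about 10,000 V of drop and dissipates about 10 kW in the cable itself before the amplifiers get anything. That is why, as I understand it, long submarine systems feed high-voltage, low-current DC from the shore ends with the amplifiers wired in series.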

Comment: Re:Uncertainty/fear? (Score 4, Informative) 550

by Nkwe (#47524805) Attached to: Laser Eye Surgery, Revisited 10 Years Later

I'm not sure that this is still true, but don't you go blind for a few minutes while the procedure is going on? That's what frightens me - the thought that I might go blind and not have my sight come back.

Yes, you do (but it is seconds, not minutes). The part of the procedure they don't really tell you about in advance is that they basically use a vacuum cleaner to suck your eyeball out of your head while they do the procedure. Actually, they use suction to slightly pull on your eyeball and hold it still while the laser is doing its work; while this is happening, you can't see out of that eye -- it all goes dark. This part of the procedure (which really only lasts a few seconds on each eye) is fairly unpleasant and is probably the reason they give you Valium.
