Professor Pamela Samuelson has previously commented (PDF) on the implications if SCOTUS declined to hear the appeal.
More details at The Verge.
... there was no excuse for what they did. All engineers have to make trade-off decisions, but the fucking deluxe fix was $11, that's it. They could have built that into the car price with virtually no impact. TFA picked one terrible example...
The problem is that there were probably hundreds or even thousands of $11 fixes to the car that would have made it incrementally safer. At some point the engineer has to prioritize which to implement and which not to.
Cars with autonomous freeway driving will be out in just a couple of years, according to automotive manufacturers. Nearly all the major players are predicting fully autonomous cars will be a solved problem sometime between 2020 and 2025.
Why is every cool technology always exactly ten years away?
AI will not write books, do programming, etc. Strong AI is a myth.
Unless human brains have some magical powers (like a soul blessed by God), there is no logical reason that machines shouldn't eventually be smarter than humans. The only question is how far off it is.
Not necessarily. It's entirely possible that if we build smarter machines, they will, in turn, make us smarter. If we start implanting computer chips in our brains and nano-electronic optics in our eyes, humans themselves can change and advance, and it's entirely possible that we can "beat" computers indefinitely.
Because claim 14 lacks adequate structural support for some of the means-plus-function limitations, it is not amenable to construction. And without ascertaining the breadth of claim 14, we cannot undertake the necessary factual inquiry for evaluating obviousness with respect to differences between the claimed subject matter and the prior art.
If SpaceX wants to move forward against Blue Origin, this opinion bodes well for them, but they will need to take their case before a different court.
I want killing to be as hard on people as possible, so they think before they do it.
Making something difficult is neither necessary nor sufficient to make people think about it.
He didn't dodge it. He said, "We're not worried about lawlessness. Our job is to make the most secure product we can. Our job is not to help enforce laws." It's a rejection of the premise of the question, sure. But it's not a dodge. It's a clearly articulated moral stance.
Your paraphrase would be a moral stance, but he didn't actually say that. His answer ignores that Tor is used for evil; it doesn't come out and say that any evil enabled by Tor is a necessary byproduct of the good it creates.
There must be more to life than having everything. -- Maurice Sendak