The right way to level things (in all court dealings) would be to have both parties pay into a legal fund that compensates the lawyers for both sides. One side having more money should not entitle them to more power in court. If either side wants to contribute more so that the lawyers on both sides are better, that's great - go ahead. But the practice of buying a verdict by outspending your opponent on lawyer power should not be allowed.
It's happened with arson experts too. I remember reading a horrible story of a guy convicted of burning his family to death because all the experts described these "pour patterns" in the burnt floor, signifying liquid accelerant. After he was put to death, they figured out it was just carpet glue patterns.
Between police feeling free to shoot fleeing, non-dangerous suspects these days, planting evidence in full view of other officers, lying on the stand to get convictions, and labs and experts from every field falsifying results, I'd say our legal system is a disaster.
And why isn't being obligated to serve on a jury silly? It's actually very much like voting - you are required to offer your opinion for the benefit of society, whether you feel like it or not.
From a practical perspective, required voting takes some of the impact of emotion out of elections, which is a good thing. It also overcomes the various ways that people are obstructed from voting. These benefits outweigh the unavoidable problem of people casting random (i.e. self-cancelling) votes or pulling other shenanigans.
I guess you haven't read "The Wisdom of Crowds".
One of the things they talk about in there is how the random noise of idiocy tends to cancel out, allowing for a good result - but only if the sampling is done correctly. Required voting is one means to achieve that. Letting people decide whether they want to vote or not skews things toward the irrationally emotional, which is fairly obviously what has happened in the US.
I've been on reddit so long it took me a minute to realize I can't upvote you. Maybe not a lot of people here will agree with you, but you've nailed it. I work IT in environments with lots of regular folk and the power and flexibility I crave is a) useless to them and b) the source of the vast majority of their problems.
Just played the WebGL version in Chrome. Works fine, looks great. I don't know what you're doing wrong, but it's probably your fault.
So here's a serious question... why can so many other countries do it well? They combine healthcare and government and it's fine. So is the US functionally incapable? I don't think we are, but if this is really the undoable task that half this thread implies, what's wrong with us?
States' rights survived the Civil War. Or maybe you feel they didn't. Either way, that was the point at which we decided the scope of states' rights. Gay marriage is nothing compared to that.
Or maybe... just maybe... it's because it's not as bad as you think.
Of course it's much easier to assume everyone else is wrong than to question your beliefs.
> > "Everyone knew what was in it"
> Prove it.
Prove they didn't.
Oh, is that a stupid reply? Yes it is. You can't prove either thing in a meaningful way but you can look at the situation and draw a reasonably solid conclusion.
1. The basic structure of the law was fleshed out more than a decade earlier by the Heritage Foundation. It was a well-known idea.
2. The basic model was put into effect in Massachusetts years earlier. People knew how it worked in practice.
3. The ACA was discussed for months in Congress and even for hours on live TV, with all the key players on both sides of the aisle in attendance.
4. For the public, there was an easy-to-comprehend, footnoted summary PDF provided by Congress online many months in advance, as well as a nationwide town-hall campaign that completely backfired because of loud-mouthed reactionaries.
5. The people who claim that nobody knows what's in it apparently know more than enough to criticize it.
There was more open public discussion and understanding of the ACA than of any other law I can think of in my whole life, and I ain't young. If you want to make the case that some people were willfully ignorant of the contents (e.g. the "death panels" claim), I'll agree. But that is not the fault of the ACA.
> Electrons move around a nuclei the same way planets move around suns
Not even remotely. This idea was proposed back when humans had no understanding of subatomic behavior, and they were basing assumptions on things they did know, like the solar system. If you want to actually know how electrons and nuclei behave, try to wrap your mind around quantum mechanics. It's almost impossible, as it bears little resemblance to anything else you might be familiar with.
It's an interesting example, though, because it illustrates how whenever humans don't know what they're talking about, they fill in the gaps with things that are familiar. Like chariots carrying fire through the sky and an anthropomorphic God creating the universe.
From there your comment just goes further off the rails. Nobody thinks they're "smarter than everyone else". But observation and reason let us learn about the world, and we've learned over and over that mankind's notion of God is always several steps behind our observational understanding. Everything that has improved in the past two centuries has been at the hands of man. We're slowly figuring out ways to improve our lot in life. God's word was around for thousands of years before the enlightenment and didn't improve anything.
The universe is amazing, and every facet fills me with awe. But that doesn't mean there needs to be a personality behind it. I can take it for what it is without having to project my ideas of meaning onto it.
As evidence of USB being an improvement, I submit the explosion of USB devices, from cameras, to mp3 players, to thumb drives, to midi/digital audio interfaces, to webcams, to wifi adapters, to external CD and DVD drives for laptops, to...
You get the idea. The ubiquity of such devices could never have happened under the old system. The interfaces of yesteryear were a stumbling block for innovation in many ways. Thank god they're gone, and thank god we've got an interconnect that is brain-dead simple from a user's perspective, and cheap and easy enough to implement from a developer's perspective, to have allowed the world to move on.
Remember, you can always roll back to the DIN/COM/LPT/FDC/IDE setup. Plenty of parts still around. Heck, I fire up a C64 emulator once in a while myself.
Sorry, but that's not how it works. If you want to say that there's a consensus to operate out of spec, that's fine, but according to the spec 10ms is the minimum amount of time before you can communicate. Which is another way of saying it's the maximum amount of time you have to get ready for communication. The Intel engineer who claimed it meant they could take longer than that is an idiot - reading the spec that way renders it meaningless.
I think it is probably wise that other systems give a longer delay. In fact it seems a little naive to assume that every device is going to operate on such tight tolerances. There is also a chance, as the previous poster said, that it may not really be giving 10ms, depending on how various signals get queued - if that's the case, then Linux was wrong, as opposed to the devices.
Whatever the case, it doesn't have to be an OS pissing match. Linux seems to have implemented the spec with no wiggle room beyond the spec. Other systems left wiggle room beyond the spec, which may have been wise. It's like browsers and quirks mode - rendering tag soup is kind of wrong, but you're not going to get a popular browser if you fail to render a page every time someone mis-nests their tags.
And I'm on a Mac, so this isn't an ego thing for me. I just like to keep the facts straight.
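The strict-vs-lenient distinction can be sketched in a few lines. This is purely a toy illustration: the constant names and the size of the extra margin are my own assumptions, not values from the USB spec or from any actual driver.

```python
# Toy sketch of "spec with no wiggle room" vs "spec plus margin".
# SPEC_MIN_DELAY_MS is the spec's stated minimum wait; the margin
# is a made-up cushion for devices that run late.
SPEC_MIN_DELAY_MS = 10   # spec: wait at least this long before talking
SAFETY_MARGIN_MS = 10    # hypothetical extra slack (my assumption)

def resume_delay_ms(strict: bool = False) -> int:
    """Delay before communicating with the device."""
    if strict:
        # Strict reading: the spec minimum is all a compliant
        # device is entitled to, so wait exactly that long.
        return SPEC_MIN_DELAY_MS
    # Lenient reading: tolerate slow devices by waiting longer
    # than the spec requires (like browsers rendering tag soup).
    return SPEC_MIN_DELAY_MS + SAFETY_MARGIN_MS
```

Either delay is spec-compliant from the host's side; the only question is whether you tolerate devices that aren't ready at the 10ms mark.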
As someone whose computer experience predates the birth of USB by many years, I find all the criticism of USB to be a hoot. I mean, sure, it's a mess compared with an ideal system, but oh my lord it's so much better than the mess we had before I don't even know what to say. When peripheral interconnects are so good that we resort to complaining about USB, it's a better world than I could have dreamed of 25 years ago.
I'm all for a nice and straightforward means to compare chips, but you're wrong that clock speed was a good way to do that - at least after about 1998 or so. There are just too many other ways in which a chip can be faster or slower: cache size, cache speed, cache prediction, instruction size, data path latency, pipelining, hyperthreading, multiple cores, etc, etc, etc.
The Pentium IV really killed the idea of comparing by clock speed, because Intel actually made it do less work in each cycle so it could run more cycles per second - netting the same amount of work. They intentionally inflated the clock speed just so they could fool people. That's one of many reasons my Core 2 at 2.4 GHz smokes any Pentium IV at 2.4 GHz.
To summarize: the world is complicated and clock speed is a lousy metric. It's fine for comparing chips of the same architecture, or for comparing across chips when the clock speed difference is enormous. But no, returning to clock speed as our speed metric is not the end of stupid marketing, and the fact that you thought it was just means you bought into the previous stupid collection of marketing.
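The arithmetic behind the Pentium IV point is simple: real throughput is roughly work-per-cycle times cycles-per-second. A toy calculation (the IPC figures below are invented for illustration, not real measurements of either chip):

```python
# Toy model: performance ~= instructions-per-cycle (IPC) x clock rate.
# IPC values here are made up to illustrate the point, not benchmarks.
def throughput_mips(clock_mhz: float, ipc: float) -> float:
    """Millions of instructions retired per second."""
    return clock_mhz * ipc

# Two hypothetical chips with the SAME advertised clock:
deep_pipeline = throughput_mips(clock_mhz=2400, ipc=0.8)  # less work/cycle
wide_core     = throughput_mips(clock_mhz=2400, ipc=1.6)  # more work/cycle

ratio = wide_core / deep_pipeline  # 2.0: same "2.4 GHz", double the speed
```

Same number on the box, double the actual throughput - which is exactly why clock speed alone stopped being a useful comparison.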