So here's a serious question... why can so many other countries do it well? They combine healthcare and government and it's fine. So is the US functionally retarded? I don't think we are, but if this is really the undoable task that half this thread implies, what's wrong with us?
States' rights survived the Civil War. Or maybe you feel they didn't. Either way, that was the point at which we decided the scope of states' rights. Gay marriage is nothing compared to that.
Or maybe... just maybe... it's because it's not as bad as you think.
Of course it's much easier to assume everyone else is wrong than to question your beliefs.
> > "Everyone knew what was in it"
> Prove it.
Prove they didn't.
Oh, is that a stupid reply? Yes it is. You can't prove either thing in a meaningful way but you can look at the situation and draw a reasonably solid conclusion.
1. The basic structure of the law was fleshed out more than a decade earlier by the Heritage Foundation. It was a well-known idea.
2. The basic model was put into effect in Massachusetts years earlier. People knew how it worked in practice.
3. The ACA was discussed for months in congress and even hours on live TV, with all the key players on both sides of the aisle in attendance.
4. For the public there was an easy-to-comprehend, footnoted summary PDF provided by Congress online many months in advance, as well as a nationwide town-hall campaign that completely backfired because of loud-mouthed reactionaries.
5. The people who claim that nobody knows what's in it apparently know more than enough to criticize it.
There was more open public discussion and understanding of the ACA than any other law I can think of in my whole life, and I ain't young. If you want to make the case that some people were willfully ignorant of the contents (e.g. death panels), I'll agree. But that is not the fault of the ACA.
> Electrons move around a nuclei the same way planets move around suns
Not even remotely. This idea was proposed back when humans had no understanding of subatomic behavior, and they were drawing assumptions based on things they did know, like the solar system. If you want to actually know how electrons and nuclei behave, try to wrap your mind around quantum mechanics. It's almost impossible, as it bears little resemblance to anything else you might be familiar with.
It's an interesting example, though, because it illustrates how whenever humans don't know what they're talking about, they fill in the gaps with things that are familiar. Like chariots carrying fire through the sky and an anthropomorphic God creating the universe.
From there your comment just goes further off the rails. Nobody thinks they're "smarter than everyone else". But observation and reason let us learn about the world, and we've learned over and over that mankind's notion of God is always several steps behind our observational understanding. Everything that has improved in the past two centuries has been at the hands of man. We're slowly figuring out ways to improve our lot in life. God's word was around for thousands of years before the enlightenment and didn't improve anything.
The universe is amazing, and every facet fills me with awe. But that doesn't mean there needs to be a personality behind it. I can take it for what it is without having to project my ideas of meaning onto it.
As evidence of USB being an improvement I submit the explosion of USB devices, from cameras, to mp3 players, to thumb drives, to midi/digital audio interfaces, to webcams, to wifi adapters, to external CD and DVD for laptops, to...
You get the idea. The ubiquity of such devices could never have happened under the old system. The interfaces of yesteryear were a stumbling block for innovation in many ways. Thank god they're gone, and thank god we've got a brain-dead simple (from a user perspective) interconnect that is simple and cheap enough to implement from a developer perspective to have allowed the world to move on.
Remember, you can always roll back to the DIN/COM/LPT/FDC/IDE setup. Plenty of parts still around. Heck, I fire up a C64 emulator once in a while myself.
Sorry, but that's not how it works. If you want to say that there's a consensus to operate out of spec, that's fine, but according to the spec 10ms is the minimum amount of time before you can communicate. Which is another way of saying it's the maximum amount of time you have to get ready for communication. The Intel engineer who claimed it meant that devices could take longer than that is an idiot - reading the spec that way makes it meaningless.
I think it is probably wise that other systems give a longer delay. In fact it seems a little naive to assume that every device is going to operate on such tight tolerances. There is also a chance, as the previous poster said, that it may not really be giving 10ms, depending on how various signals get queued - if that's the case, then Linux was wrong, as opposed to the devices.
Whatever the case, it doesn't have to be an OS pissing match. Linux seems to have implemented the spec with no wiggle room beyond the spec. Other systems left wiggle room beyond the spec, which may have been wise. It's like browsers and quirks mode - rendering tag soup is kind of wrong, but you're not going to get a popular browser if you fail to render a page every time someone mis-nests their tags.
And I'm on a Mac, so this isn't an ego thing for me. I just like to keep the facts straight.
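The wiggle-room argument above can be sketched in a few lines. Assume the spec's 10ms figure from the post; the margin values, host names, and the slow device's timing are all made-up illustrations, not measurements from any real OS or hardware:

```python
# Sketch of the reset-recovery reasoning: 10 ms is the spec minimum wait
# after reset (from the post above); everything else here is hypothetical.

SPEC_RECOVERY_MS = 10  # spec: host must wait at least this long after reset


def host_wait_ms(margin_ms=0):
    """Time a host actually waits before talking to a freshly reset device."""
    return SPEC_RECOVERY_MS + margin_ms


strict_host = host_wait_ms()      # 10 ms: exactly the spec minimum, no slack
lenient_host = host_wait_ms(40)   # 50 ms: spec minimum plus wiggle room


def device_ready_in_time(device_ready_ms, wait_ms):
    """True if the device is ready before the host starts communicating."""
    return device_ready_ms <= wait_ms


slow_device_ms = 35  # out of spec: takes 35 ms to become ready

print(device_ready_in_time(slow_device_ms, strict_host))   # False
print(device_ready_in_time(slow_device_ms, lenient_host))  # True
```

The out-of-spec device is "wrong" in both cases; only the host with slack happens to mask it, which is exactly the quirks-mode trade-off.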
As someone whose computer experience predates the birth of USB by many years, I find all the criticism of USB to be a hoot. I mean, sure, it's a mess compared with an ideal system, but oh my lord, it's so much better than the mess we had before that I don't even know what to say. When peripheral interconnects are so good that we resort to complaining about USB, it's a better world than I could have dreamed of 25 years ago.
I'm all for a nice and straightforward means to compare chips, but you're wrong that clock speed was a good way to do that - at least after about 1998 or so. There are just way too many other ways in which a chip can be faster or slower. Cache size, cache speed, cache prediction, instruction size, data path latency, pipelining, hyperthreading, multiple cores, etc, etc, etc.
The Pentium 4 really drove a stake through the idea of comparing clock speeds, because Intel actually made it do less work in each cycle so it could run more cycles per second - while doing the same amount of work. They intentionally inflated the clock speed just so they could fool people. That's one of many reasons my 2.4 GHz Core 2 smokes any 2.4 GHz Pentium 4.
To summarize: the world is complicated and clock speed is a lousy metric. It's fine for comparing chips of the same architecture, or for comparing across chips when the clock-speed difference is enormous. But no, returning to clock speed as our speed metric is not the end of stupid marketing, and the fact that you thought it was just means you bought into the previous stupid collection of marketing.
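The point about work-per-cycle can be shown with simple arithmetic. The IPC (instructions per cycle) numbers below are rough illustrative assumptions, not measured figures for either chip:

```python
# Illustrative only: the IPC values are made-up round numbers chosen to
# show why identical clock speeds can mean very different performance.

def effective_perf(clock_ghz, ipc):
    """Rough throughput in billions of instructions per second."""
    return clock_ghz * ipc


# Two hypothetical chips at the same 2.4 GHz clock:
deep_pipeline = effective_perf(2.4, ipc=0.8)    # less work per cycle
short_pipeline = effective_perf(2.4, ipc=2.0)   # more work per cycle

print(deep_pipeline)   # 1.92 billion instructions/sec
print(short_pipeline)  # 4.8 billion instructions/sec
```

Same clock, 2.5x the throughput - which is why clock speed only works as a metric within a single architecture.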
> The religious answer is generally that there is some essential component of you (i.e. a soul) that persists after death
It's true. It's the same place the data in your RAM goes when you power down.
I have no horse in this race (never played Minecraft or Roblox), but why do you assume it was luck that let Minecraft take the lead over Roblox, as opposed to it being better?
And I'm not looking for some subjective response like "Roblox has way better XYZ!" I'm wondering why people flocked to Minecraft. There must be a reason. If the answer is "luck" that probably just means Minecraft is better in ways that are hard to quantify.
After we're done with this thread, we should take Shigeru Miyamoto to task for getting other people to finish his games too.
> it's not feature complete
No software that is sufficiently popular is ever feature complete.
Feature complete basically means EOL.
> I don't publicize what I can't finish.
It's a really good thing the open source software movement did not abide by this thinking.
I agree it doesn't have to be the same way people do it. Could be an entirely different system. I'm just saying it has to be a lot more powerful than language. AI that focuses on language as the bottom layer will always be parlor tricks, not intelligence.
And while you're right that we don't need to understand how people achieve intelligence to make an AI, it would sure help if we at least had a definition of what intelligence is, which we don't. Or rather, every time AI meets the definition we realize that it was a lousy definition. I predict the article's suggested tests will be more of that.
To carry your analogy further - I'd say that if we had so utterly failed to cut wood for so long, and in fact couldn't really even understand how wood was cut, we might want to take a peek at how that crosscut saw works before flailing around too much longer.