At many companies, not giving two weeks' notice will make you ineligible for rehire. While you might not care, future employers might. It's a legal gray area, but one of the questions sometimes asked of former employers is, "Is X eligible for rehire?" as a way to skirt the we-can't-give-references issue. A "no" answer raises questions -- the impression it gives ranges from "Well, that company is just a bunch of jerks to their employees" to "He has a bad attitude and makes it uncomfortable for everyone else" to "He was walking out the door with cash and half of their servers; they just couldn't catch him." If you're up against multiple applicants for a position, this could sink you.
While it might provide fleeting catharsis, not giving notice can't help you. At best it will do nothing; at worst, it will block you from a job you really want later on. Don't do it.
See this page; the Campanile movie is from SIGGRAPH 97. How is Disney's tech different?
I saw similar technology at CMU in around that same timeframe (late 90s).
My memory is obviously hazy here, but the resulting output was much less refined. A simple box-shaped house, for example, ended up with wickedly jagged walls. The technology showed promise, but it was far from realistic.
The Disney folks, while not inventing the tech itself, seem to have taken it a step further. Their key claim -- "Unlike other systems, the algorithm calculates depth for every pixel, proving most effective at the edges of objects" -- certainly jibes with my memory.
... but not due to the results; this is an example of good, solid science coming out of a secondary school with limited resources. Given what I could read of the translation, I don't think this is irresponsible journalism at all -- think of it more as journalism on the state of education, not science.
It is, of course, an extraordinary result, and it will require extraordinary proof. I suspect the claims will not be reproduced; at the same time, I hope these kid-researchers keep up their interest in this experiment regardless of the outcome. From it, they'll learn about experimental errors, uncontrolled factors, and -- most importantly -- how to divorce their egos from their results. That last bit is perhaps the hardest thing for most scientists to achieve.
They're only using the PCIe x8 physical connectors; the electrical signals do not resemble PCIe at all.
Presumably, they're also relocating the actual slot location to avoid stupid errors (like plugging one of these into an actual PCIe x8 slot or vice versa).
Fahrenheit has its upper limit of 96 (not 100) set at body temperature (or what people believed body temperature to be before more accurate measurements), and 32 at the freezing point of water (i.e., an ice bath). These points made for simple calibration of thermometers when they were being hand manufactured: you can split the difference between marks in half by eye to get down to the single-degree markers.
More importantly, you can split the difference using geometric constructions (compass and straightedge), which don't require another calibration source. The change from 96 to 98.6 actually occurred when the boiling point of water was recalibrated to exactly 212F. The actual original calibration points were 0F for the freezing point of a 1:1:1 water/ice/ammonium chloride mixture and 32F for a 1:1 water/ice mixture.
The development of the Fahrenheit scale is quite an interesting read, and it shows why the seemingly arbitrary points weren't arbitrary at all -- they dealt with the limited precision of the tools of the day. Not that this is any excuse to keep using it; we discarded "pieces of eight" centuries ago once we no longer needed to split coins into eight equal pieces for currency exchange, and a decimalized scale is so much more convenient.
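The halving trick is easy to check arithmetically: the span between the 32 and 96 marks is 64 degrees, a power of two, so six successive bisections get you to single-degree resolution. A minimal sketch (the numbers come from the comments above, not a thermometry reference):

```python
# Bisection calibration: with fixed marks at 32 (ice/water) and 96 (body
# temperature), repeated halving -- doable by eye or with compass and
# straightedge -- reaches single-degree marks because the span is a power
# of two.
span = 96 - 32            # 64 degrees between the two calibration marks
bisections = 0
while span > 1:
    span //= 2            # each round of halving doubles the mark count
    bisections += 1
print(bisections)         # 6 rounds of bisection reach 1-degree resolution
```

Had the upper point been 100, the 68-degree span would not divide evenly in half all the way down, which is presumably why 96 was chosen.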
What does Gooood need... with crashing a spaceship?
He was trying to keep us from killing him...
I can see the benefit in doing this for desktops: most cases are non-standard, which means throwing the case out when upgrade time comes around. I've toyed with the idea of making a standard ATX case out of paper pulp.
But servers? Ideally, they would be mostly caseless: think blades, or using the rack as the case; just slap a face on the front (to maintain proper airflow), and you're done.
Now, if we could make circuit boards more recyclable, that would be terrific. FR4 is already fiberglass, though; I suppose it could be dissolved in hydrofluoric acid and the metals recovered, but I have no idea how environmentally (un)friendly that is.
IBM PCs of the era had a similar option: attach the RAM to the ISA bus via an add-on card. Like the Amiga (and most computers of that era), the expansion bus was the processor bus (with a bit of buffering and maybe a tad of glue logic, but not much more).
As processor speeds increased, this became a problem. Many peripherals just weren't designed for the increased speed, so designers divorced the bus speed from the processor speed, making it a fraction of the processor clock (ISA) or going asynchronous (Amiga Zorro III). This became quite pronounced with PCI (66MHz max, even on a 3.0GHz CPU): you can add memory onto the bus, but it will slow you down if you try to use it as main memory.
That doesn't mean it can't be used at all these days. The cluster-computing folks have a concept called NUMA, or non-uniform memory access, in which not all memory is assumed to be equally fast. Or you could treat it like a very fast SATA drive, provided you have some means of keeping it powered through power failures (or use it only as temp or swap space).
From Ted Ts'o's commentary, it's an optimization ("jbd2: don't write superblock when if its empty") gone awry:
The reason why the problem happens rarely is that the effect of the buggy commit is that if the journal's starting block is zero, we fail to truncate the journal when we unmount the file system. This can happen if we mount and then unmount the file system fairly quickly, before the log has a chance to wrap.
Basically, the optimization has the side effect of leaving the journal untruncated in this rare case. You can end up replaying old transactions after new ones, which will scramble metadata blocks. Given the rather unusual conditions needed to hit this one, I'm not going to lose any sleep over servers running without Ted's fix (though I'll certainly apply it once Red Hat releases the patch).
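As a toy illustration of why stale replay scrambles things (my own sketch, not the jbd2 code): recovery blindly reapplies whatever the journal contains, so a stale entry that should have been truncated overwrites newer on-disk state.

```python
# Toy model of journal replay (NOT jbd2): a metadata block on disk is newer
# than a stale journal entry left behind because the journal was never
# truncated at unmount. Replay clobbers the newer value with the old one.
disk = {"block_5": "new metadata (txn 42)"}

# Entry that should have been truncated away at the last unmount:
stale_journal = [("block_5", "old metadata (txn 7)")]

for block, value in stale_journal:   # recovery replays the journal as-is
    disk[block] = value              # old transaction lands on top of new

print(disk["block_5"])               # the newer metadata has been lost
```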
Aereo is doing this for their TV-to-internet service: each user gets his/her own antenna, in the hope that this avoids legal issues. They build stacks of mini antenna arrays and set them up somewhere in Brooklyn. The wavelength for broadcast TV is 30 cm to 5 m, depending on the channel; either end of that range is much larger than the dime-sized antenna shown there.
How exactly this works, I can't say. Although I am an electrical engineer, I have to admit that antenna design has always been out of my league.
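For a rough sanity check on those wavelength figures, free-space wavelength is λ = c/f. Plugging in approximate US broadcast-TV band edges (54 MHz and 700 MHz are my assumed values; the comment doesn't give frequencies):

```python
# Free-space wavelength lambda = c / f, evaluated at rough US broadcast-TV
# band edges (54 MHz VHF low end, ~700 MHz UHF high end -- assumed values).
C = 299_792_458  # speed of light, m/s

def wavelength_m(f_hz: float) -> float:
    return C / f_hz

print(round(wavelength_m(54e6), 2))   # ~5.55 m at the VHF low edge
print(round(wavelength_m(700e6), 2))  # ~0.43 m at the UHF high edge
```

That lands close to the 30 cm to 5 m range quoted above, and both ends dwarf a dime's ~18 mm diameter.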