I've know a lot of really food engineering managers.
Obviously you meant "good" here, but it made me pause: is there a correlation between food and good managers? I've been reading more than a handful of materials (e.g. Peopleware) which mention eating together as helping to build strong teams (arguably the most important job of a manager). A number of companies have caught on, from the big (like Google) to startups (one of my favorites, The Omni Group here in Seattle, even has a full-time kitchen staff who are listed by name on their about-us page).
Obviously, it's not a catch-all solution; heck, I suspect it's more correlation (that is, the managers who get their teams to eat together are more likely to care about their teams) than causation. But it still gave me pause.
Japanese companies [...] genuinely want to know how to make the business better by finding out how people actually work.
Their website actually bears this out. Used well, this technology could map out how well teams are communicating (which can sometimes make or break a project). We say that inter-team communication is good, and sometimes hold meetings to that effect; this could show whether the company is practicing what it preaches.
Alas, I share the same concerns as the naysayers. I highly doubt this would be used by any American company in a way other than to penalize individual workers for brief moments of inactivity.
Does p-doping Indium Gallium Nitride seem like a trivial process?
It's trivial with the right equipment and materials. Figuring out that you need to p-dope InGaN to make an LED, on the other hand...
Reminds me of the story about the fancy car which, no matter what the shop mechanics tried, wouldn't start. So they call in an old mechanic buddy who retired a few years ago to come take a look. He studies the engine carefully, noting the various fluid levels and temperatures as well as the sounds the engine makes. After going at it for a few minutes, he takes a bit of chalk, marks a spot on the engine, and then hits it with his hammer. With that, the engine roars to life.
He then hands the customer a bill for $100. Aghast, the customer replies, "I'm not paying you $100 for hitting the engine with a hammer!"
The old mechanic replies, "Hitting the engine was free. Knowing where to hit it is $100."
This guy managed to get it into 145 bytes (142 on his website, but he printed "Hi World" instead of "Hello world") with no external dependencies.
The smallest ELF executable I've seen is this 45 byte example. It doesn't print anything and it violates the ELF standard, but Linux (or at least his version) is still willing to execute it.
That said, there isn't much point in optimizing away libc except as an academic exercise. Yes, it's a few megabytes in size, but it's shared across every running userspace program (likely including init). Sluggish and bloated programs, in my experience, are almost always the result of poorly thought out algorithms, data structures, and use cases. (That said, the analysis on how to achieve the 45 byte ELF program is very interesting and educational.)
At many companies, not giving two weeks notice will make you ineligible to be rehired. While you might not care, future employers might. It's a legal gray area, but one of the questions sometimes asked of former employers is, "Is X eligible for rehire?" as a way to skirt the we-can't-give-references issue. A "no" answer raises questions -- the impression it gives ranges from "Well, that company is just a bunch of jerks to their employees" to "He has a bad attitude and makes it uncomfortable for everyone else" to "He was walking out the door with cash and half of their servers; they just couldn't catch him." If you're up for a position with multiple applicants, this could sink you.
While it might provide fleeting catharsis, not giving notice can't help you. At best it will do nothing; at worst, it will block you from a job you really want later on. Don't do it.
See this page; the Campanile movie is from SIGGRAPH 97. How is Disney's tech different?
I saw similar technology at CMU in around that same timeframe (late 90s).
My memory is obviously hazy here, but the resulting output was much less refined. A simple box-shaped house, for example, ended up with wickedly jagged walls. The technology showed promise, but it was far from realistic.
The Disney folks, while not inventing the tech itself, seem to have taken it a step further. Their key claim -- "Unlike other systems, the algorithm calculates depth for every pixel, proving most effective at the edges of objects" -- certainly jibes with my memory.
... but not due to the results; this is an example of good, solid science coming out of a secondary school with limited resources. Given what I could read of the translation, I don't think this is irresponsible journalism at all -- think of it more as journalism on the state of education, not science.
It is, of course, an extraordinary result, and will require extraordinary proof. I suspect the claims will not be reproduced; at the same time, I hope these kid-researchers stay interested in this experiment regardless of the outcome. From this, they'll learn about experimental errors, uncontrolled factors, and -- most importantly -- how to divorce their ego from their results. That last bit is perhaps the hardest for most scientists to achieve.
They're only using the PCIe x8 physical connectors; the electrical signals do not resemble PCIe at all.
Presumably, they're also relocating the actual slot location to avoid stupid errors (like plugging one of these into an actual PCIe x8 slot or vice-versa).
Fahrenheit has its upper calibration point of 96 (not 100) set at body temperature (or what people believed it was before more accurate measurements), and 32 at the freezing point of water (i.e. an ice bath). This made for simple calibration of hand-manufactured thermometers: you can repeatedly split the difference between marks in half by eye to get down to the single-degree markers.
More importantly, you can split the difference using geometric constructions (compass and straightedge), which don't require another calibration source. The change from 96 to 98.6 for body temperature actually occurred when the boiling point was recalibrated to exactly 212F. The actual original calibration points were 0F for the freezing point of a 1:1:1 water/ice/ammonium chloride mixture and 32F for a 1:1 water/ice mixture.
The development of the Fahrenheit scale is quite an interesting read, and it shows why the seemingly arbitrary points weren't arbitrary at all but dealt with the limited precision of the tools of the day. Not that this is any excuse to keep using it; we have no need to split coins into eight equal pieces for currency exchange and discarded "pieces of eight" centuries ago, and a decimalized scale is so much more convenient.
What does Gooood need... with crashing a spaceship?
He was trying to keep us from killing him...
I can see the benefit in doing this for desktops: most cases are non-standard, which means throwing them out when upgrade time comes around. I've toyed with the idea of making a standard ATX case out of paper pulp.
But servers? Ideally, they would be mostly caseless: think blades, or using the rack as the case; just slap a face on the front (to maintain proper airflow), and you're done.
Now, if we could make circuit boards more recyclable, that would be terrific. FR4 is already fiberglass, though; I suppose it could be dissolved in hydrofluoric acid and the metals recovered, but I have no idea how environmentally (un)friendly that is.