Makes you wonder what type of constraints they were working under to come up with a solution like [the power switch rod]...
If memory serves, this was to meet UL certification rules. For some reason, line voltage was not allowed to cross the case to the switch. That said, my first PC was a whitebox clone that completely violated these rules, so don't be surprised if your no-name PC from that era also lacked the Rube Goldberg rod linkage.
The ATX form factor solved this by using a low voltage signal to control the power supply -- the wires crisscrossing the case for this carry no more than 5V (with a large series resistance). Shorting that to ground turns the power supply on; this (plus a 5V standby signal powering a small supervisor microcontroller) is how your motherboard can control the power to the system.
It's called an Antenna....
Increasingly, no. Many sports are switching to subscription channels (ESPN, Fox Sports, Root Sports, CNBC/MSNBC, etc.) with limited or no legal options for streaming. MLB is there today and has a decent product; Olympics are ok; NFL is pretty much absent. I'm not sure about the NBA or NHL.
Most of the channels I can receive over the air (the networks) don't carry much of interest to anyone in my household. But we watched so little TV that we went ahead and got rid of cable anyway. I did subscribe to MLB.tv, though, and I'm much happier with it than with basic cable, since I can now watch the teams I'm actually interested in.
I suspect cable companies will be in trouble should the NFL decide to start streaming their games. I doubt this will happen, though; they're getting a ton of money from ESPN to stay right where they are (and most folks are resigned to just keep paying for cable to watch NFL games).
However, if I reach that limit I'm pretty sure I can pick it up like every other programming or markup language that I've needed.
Unfortunately, this is only sort of true. The basic syntax is easily learnable and readable -- certainly easier than mentally parsing most regular expressions.
But, oh god, does CSS have a ton of implicit modes. Are your sizes content box or border box? Is this div we're positioning being displayed as a block, inline, or inline-block element? Is there a float active? Has it been cleared? Did we duplicate the appropriate styles with -webkit- and -ms-? Why is it working in Firefox but not Chrome?
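The box-sizing question alone is a classic trap: the same declared width produces two different on-screen sizes depending on the mode. A toy sketch of the arithmetic (illustrative only; `rendered_width` is a made-up helper, not a real API, but the formulas follow the CSS spec's definition of each model):

```python
# Toy arithmetic for the two CSS box models (not a layout engine).
def rendered_width(css_width, padding, border, box_sizing="content-box"):
    """On-screen width of a box given its CSS `width` property."""
    if box_sizing == "content-box":
        # `width` sets only the content area; padding and border are added on.
        return css_width + 2 * (padding + border)
    # border-box: `width` is the whole box; padding and border eat into it.
    return css_width

print(rendered_width(200, 10, 2))                # content-box: 224
print(rendered_width(200, 10, 2, "border-box"))  # border-box: 200
```

Same stylesheet numbers, a 24-pixel difference; multiply that across a grid of divs and your layout breaks in one browser's default mode but not another's.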
Layouts that would be a simple command in Tk (button
The purists then snootily point out, "Well, your problem is you're trying to build a GUI from a markup language." Fine, then: Give me a freaking proper GUI toolkit already. I'm reminded of Jamie Zawinski's quote (though he was referring to XWindows): Using these toolkits is like trying to make a bookshelf out of mashed potatoes.
So when you plug in a cable, the logo on the top is always correct. When it is a sideways plug, you are on your own.
I have a few cables which violate this spec (despite the USB spec being quite clear on this point). I'm not sure if it's a manufacturing error (cable assemblies sent to the molding process upside-down) or the manufacturer just being egotistical ("We want our logo to be visible to the user"). Western Digital, I'm looking at you...
I really ought to toss them (along with my collection of USB 1.1 cables and hubs).
I have an FTA system that's half set up, cobbled together from some spare parts plus a new receiver and LNBF.
The terrain near my house has proven to be unfriendly. I live on the west side of Puget Sound, so the satellites are already fairly close to the horizon. We're in an old-growth forest area; most of the trees around my house are around the 100' mark. We're also just on the other side of a few hills which block antenna reception from any of the local networks, hence my tinkering with FTA equipment.
Actual link to the source appears to be http://www.computerhistory.org/atchm/microsoft-ms-dos-early-source-code, but it's throwing 503s at me right now.
Why not DOS 6.22? They're not making a bundle on that, either.
Distributing the source code to a proprietary product has a number of potential legal hurdles. If there are parts of the source which were licensed from another company (as would be the case with MS-DOS and SCP, IBM, Stac, and possibly others), those agreements need to be revisited and you may need to get permission from that company (or its successors) to do so. (I include IBM because, I believe, they took over much of the development for the 4.x series.)
MS-DOS 2.x might be the latest version they (currently) feel confident in being able to release free of these restrictions.
This brings a different kind of problem: it creates a whole new layer of management work to keep the two groups in sync. Otherwise, the "Canonical Labs" group might run off and do all kinds of great things that never get integrated into the main project.
But PARC was so successful! Oh, wait...
Your point is well taken. I believe it's a problem they already have, though: the Mir slip, shipping Unity before it was really ready, etc. Reorganizing -- even if it's done purely in Shuttleworth's mind and not on paper -- would bring these issues to the forefront.
I've found (as a rule of thumb) that, when asking a grad student "How much time do you think you have left before you can write up your thesis?", if the answer is two or more years out then it really means "I don't know." The student honestly believes this answer, but in reality he/she doesn't know how much he/she doesn't know.
I'm starting to feel about the same with Mir and Canonical here. Shuttleworth is the tenured but aloof professor who casually coaxes his students (employees) toward completing milestones but without too much urgency. Money's not plentiful, but the professor has enough contacts and contracts to keep his lab going and give a stipend to his students. They put out a few papers (releases) each year, and each time the students think this grand project is "almost done"... only to discover that there's still more left to do.
There's tremendous value in this kind of exploratory research. I'm just not sure it makes sense to package it up for end users.
If I were Mark Shuttleworth's technical advisor, I'd suggest examining RedHat's Fedora model. Create a small group called Canonical Labs where stuff like Mir and Unity can flourish, with continuous releases and without the artificial constraint of a set release date. (If this makes the environment too lackadaisical and development isn't progressing fast enough, find some other way to instill discipline and/or motivation; don't make it the threat of moving alpha code to end-users.) When it's stabilized (no longer shuffling menus and window icons around, for example), then integrate it with the main Ubuntu branch. Something a bit more edgy and up-to-date than Debian Stable or RHEL, but not so much that it constantly upends your users.
I've know a lot of really food engineering managers.
Obviously you meant "good" here, but it made me pause: is there a correlation between food and good managers? I've read more than a handful of books (e.g. Peopleware) that mention eating together as helping to build strong teams (arguably the most important job of a manager). A number of companies have caught on, from big players like Google to startups (one of my favorites, The Omni Group here in Seattle, even has a full-time kitchen staff listed by name on their about-us page).
Obviously, it's not a catch-all solution; heck, I suspect it's more correlation (that is, managers who get their teams to eat together are more likely to care about their teams) than causation. But it still gave me pause.
Japanese companies [...] genuinely want to know how to make the business better by finding out how people actually work.
Their website actually bears this out. A good use of this technology would be mapping out how well teams are communicating (which can sometimes make or break a project). We say inter-team communication matters, and sometimes hold meetings to that effect; this could show whether the company is practicing what it preaches.
Alas, I share the same concerns as the naysayers. I highly doubt this would be used by any American company in a way other than to penalize individual workers for brief moments of inactivity.
Does p-doping Indium Gallium Nitride seem like a trivial process?
It's trivial with the right equipment and materials. Figuring out that you need to p-dope InGaN to make an LED, on the other hand...
Reminds me of the story about the fancy car that, no matter what the shop mechanics tried, wouldn't start. So they called in an old mechanic buddy who had retired a few years earlier to come take a look. He studied the engine carefully, noting the various fluid levels and temperatures as well as the sounds the engine made. After a few minutes, he took a bit of chalk, marked a spot on the engine, and then hit it with his hammer. With that, the engine roared to life.
He then handed the customer a bill for $100. Aghast, the customer replied, "I'm not paying you $100 for hitting the engine with a hammer!"
The old mechanic replies, "Hitting the engine was free. Knowing where to hit it is $100."
This guy managed to get it into 145 bytes (142 on his website, but he printed "Hi World" instead of "Hello world") with no external dependencies.
The smallest ELF executable I've seen is this 45 byte example. It doesn't print anything and it violates the ELF standard, but Linux (or at least his version) is still willing to execute it.
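To put that 45 bytes in perspective: played by the book, a 32-bit ELF header alone is 52 bytes and a program header entry adds another 32, so a "legal" minimal executable starts at 84 bytes before any code. The 45-byte trick works by overlapping the two headers. A quick sanity check of those spec sizes, computed from the field layouts with Python's struct module:

```python
import struct

# ELF32 header: e_ident[16], then 2 halfwords, 5 words, 6 halfwords
# (e_type, e_machine, e_version, e_entry, e_phoff, e_shoff, e_flags,
#  e_ehsize, e_phentsize, e_phnum, e_shentsize, e_shnum, e_shstrndx).
ELF32_EHDR = struct.calcsize("<16sHHIIIIIHHHHHH")

# ELF32 program header: 8 words (p_type, p_offset, p_vaddr, p_paddr,
# p_filesz, p_memsz, p_flags, p_align).
ELF32_PHDR = struct.calcsize("<IIIIIIII")

print(ELF32_EHDR)               # 52 bytes
print(ELF32_PHDR)               # 32 bytes
print(ELF32_EHDR + ELF32_PHDR)  # 84 bytes before a single instruction
```

Since 45 < 84, the file cannot contain both headers laid out separately, which is exactly why the kernel-tolerated-but-nonconforming overlap is required.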
That said, there isn't much point in optimizing away libc except as an academic exercise. Yes, it's a few megabytes in size, but it's shared across every running userspace program (likely including init). Sluggish and bloated programs, in my experience, are almost always the result of poorly thought out algorithms, data structures, and use cases. (That said, the analysis on how to achieve the 45 byte ELF program is very interesting and educational.)
Promising costs nothing, it's the delivering that kills you.