No, I can imagine how electronics can be designed to handle audio in a PC environment. But I realize that it is often not done well. Yamaha made a very nice series of audio cards in the 1990s that were clean and well designed, for instance. I think in the end it comes down to the price/performance tradeoff - there is no need to provide top-notch audio on a motherboard because most people would not appreciate it enough to pay for it. It's simple enough to provide the digital chips, but the analog part costs a bit more, in both space and money. And my top-notch preamps that I use for some performances occupy boards almost as large as a typical small-form-factor motherboard all by themselves.
Integrated audio isn't good enough, isn't great, and isn't for me. I have a pro-level sound studio, and there's no way you're going to tell me that the noisy environment that is the system motherboard is going to give me results I can be proud of. Not even for gaming, thanks.
Discrete card? Ok, maybe, but generally you need to jump up to RME or some such before you can really call it good. I have an RME RayDAT - this means that all my AD and DA happens somewhere else, and not in the computer. It all goes digital over ADAT to my mixer (a Yamaha DM2000) where the conversion happens. Or it goes digital over Ethernet (Audinate Dante) to an X32, again where the conversion happens.
There are a ton of good external boxes to handle sound - some quite reasonable. Stay away from the onboard and cheap USB sound dongles. If you have the speakers to handle it, then why put up with bad sound?
I'm not an expert in functional programming, but I am an expert in other (object-oriented, etc.) styles. While I appreciate the functional toolbox in languages such as Scala (which I use every day), I don't really see a way to do my day-to-day job in a purely functional way. Others have mentioned the I/O dilemma, but I think it goes deeper than that. Functional != Efficient for many of the tasks I perform, which are rather iterative. For many of my tasks, the functional structures required are either much more memory-intensive or impose a run-time overhead that isn't acceptable. In the end, when what I have to do is move 300 fields from one data structure to another with edits, COBOL would be sufficient...
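To make the trade-off concrete, here's a minimal Scala sketch of the "move fields with edits" job done both ways. The record types and the edit (trim, uppercase, double the amount) are my own invented stand-ins, not anything from an actual system; the point is only that the functional version allocates a fresh record per row while the imperative one preallocates and fills in place.

```scala
// Hypothetical record types standing in for the "300 fields" case.
final case class Source(id: Int, name: String, amount: Double)
final case class Target(id: Int, name: String, amount: Double)

object FieldMapping {
  // Functional style: a pure map; each row produces a newly
  // allocated Target, and the List itself is rebuilt.
  def mapFunctional(rows: List[Source]): List[Target] =
    rows.map(s => Target(s.id, s.name.trim.toUpperCase, s.amount * 2.0))

  // Imperative style: preallocate the output array and fill it with a
  // while loop - closer to the "COBOL would be sufficient" approach,
  // with no intermediate collection allocations.
  def mapImperative(rows: Array[Source]): Array[Target] = {
    val out = new Array[Target](rows.length)
    var i = 0
    while (i < rows.length) {
      val s = rows(i)
      out(i) = Target(s.id, s.name.trim.toUpperCase, s.amount * 2.0)
      i += 1
    }
    out
  }
}
```

Both produce the same results; whether the extra allocation in the first version matters is exactly the kind of thing that depends on data volume and GC pressure.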
In related news, Mr. Greenspan has no clue about inequity in stratified markets. If you push on the top, you just compress the layers into smaller layers, with the bottom filling until it can absorb no more. Then you get slums, riots, and chaos. The only way the market works is with a strong middle class with buying potential. Without that there is no market, and hence no profits or growth. Once that contract is broken, it's not a long way to the bottom for most.
FAIL. Try Interplanetary.
Listen to the Beastie Boys...
Interstellar would be a cool trip also, and more likely to turn up life than looking under 100 miles of ice on Europa. Of course there is that extra mileage charge on the rental, and the roaming fees would bankrupt you...
I was a big fan, and a game developer for the C64. Those were the days that a machine could be fully understood by an untrained person with a knack for programming. When the C128 came out, I was interested, especially in the 80-column screen and CP/M software compilers. But there were too many limits on the machine (no hard drive easily added, no real OS, etc.) and it didn't feel like enough of an advancement over the C64. My grandfather did buy one, and I had some time with his, but that never really sparked much either. My next machine would be the Amiga, and as soon as that became somewhat affordable by a college student (the A500), I never looked back.
This is all well and fine, until they herd us all into some kind of processing center and then hook us up like some kind of "D" cell in series to power the mastermind machine...
Richmond Science Museum, on the E&S DigiStar projector - we could play a space-war variant on the dome. No color, of course, but the resolution was pretty good if my memory serves me right. Plus the dials of the control panel were just about perfect for controls.
Drain the fuel, set it upright, patch it up, tow it to Atlantic City - Profit!
(Drop it Lake Mead - Profit!)
(Park it outside Boston - Lawsuit!)
Among my customer base? Yes, it's used internally. A lot of them are IT shops dealing with very old equipment, like 10-year-old PCs. Some of them have internal intranet apps that only work on IE6. It will be a while before those move.
I'll pop the cork when my customers get off IE6. Until then I need to sink development resources into maintaining and testing on IE6, no matter how painful it is.
Unfortunately my customers' IT departments are slow-moving and not motivated to move quickly off XP and IE6. Most of them are understaffed and underfunded and dealing with PCs that are sometimes more than 10 years old. I suppose they have more pressing problems, given that...
I think the fallacy in this argument is not that quality doesn't win out, but that quality isn't always important.
The problem is that the determination process is flawed.
I might make the decision that I need lesser quality (whatever that means) for an internal time-keeping application than I do for something customer-facing, such as my sales portal. The article is of course arguing that I shouldn't be making that decision based on initial cost but on longer-term factors; on the management side of things, though, as long as I've got a fixed budget rooted in the short term, I can't weigh those factors equally. As in many financial equations, X dollars today vs. X dollars tomorrow is in play.
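The "X dollars today vs. X dollars tomorrow" point is just present-value discounting. Here's a tiny Scala sketch with made-up numbers of my own (the article gives none): at a 10% discount rate, nominal maintenance savings three years out are worth noticeably less than their face value today, which is the bar an up-front quality investment actually has to clear.

```scala
object PresentValue {
  // Discount a future cash amount back to today's dollars
  // at annual rate r, compounded over the given number of years.
  def pv(amount: Double, r: Double, years: Int): Double =
    amount / math.pow(1.0 + r, years)
}

// Illustrative only: $10,000 of savings arriving in 3 years,
// discounted at 10%, is worth about $7,513 today.
val todayValue = PresentValue.pv(10000.0, 0.10, 3)
```

A fixed short-term budget effectively forces the discount rate even higher, which is why the long-term argument loses at decision time.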
There are two types of fools:
1. The fools who trust in the optimization skills of the compiler/JIT compiler
2. The fools who trust in their own optimization skills
Yeah, but there's rules for them:
1. Don't optimize.
2. Don't optimize YET.
Rule 1 is for type 1 - and is generally the best case. Then you can come along and after rule 2 has expired, make the improvement where it matters. Type 2 fools skip both rules and make a mess.
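The two rules can be sketched concretely. The example below is my own illustration, not from the post: keep the obvious version until measurement says otherwise, then apply a targeted fix only where the profiler points - here, the classic string-concatenation hot spot.

```scala
object Concat {
  // Rule 1/2 territory: the obvious version. Quadratic copying as the
  // string grows, but perfectly fine until profiling proves it matters.
  def naive(parts: Seq[String]): String = {
    var s = ""
    for (p <- parts) s = s + p
    s
  }

  // After rule 2 "expires" (a profiler flagged this spot): one
  // StringBuilder, linear time, same result. A targeted improvement,
  // not a speculative rewrite of everything.
  def tuned(parts: Seq[String]): String = {
    val sb = new StringBuilder
    for (p <- parts) sb.append(p)
    sb.toString
  }
}
```

The type-2 fool writes `tuned` everywhere on day one, including the places the profiler would never have flagged.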
Jack was a charismatic person with an infectious personality. He was always genuine, and had a passion for teaching astronomy. I was traveling and visiting various planetariums up and down the East Coast, with a final stop in Miami to visit the Space Transit Planetarium. Jack made me feel very welcome and gave me a ton of his time explaining what made his planetarium special. Eventually I came to know that it wasn't the equipment (although that draws the public in initially), but the people that make these programs successful. Jack Horkheimer brought the wonder of the universe down to earth for many people, and I'm glad to have known him, even if only for a short while.
... that THAT didn't go on for too long and they got 'em in a timely manner - I mean if that had kept up, millions of machines could have been compromised! I say, good thing they had LOTS of people investigating so we could catch these crooks before the damage was done.
(Yes, for the impaired, that's sarcasm!)
Two years to track this down?! Give me a break...