Didn't bother me then. Doesn't bother me now. In future WaPo articles, I expect my eyes to glaze right past them as if they didn't exist.
I'm not having a serious problem with this.
I hate today's commercials so much that I mute them when I can't fast-forward them, am almost forced to watch only DVR'ed content, and tend to avoid live TV now. I run AdBlock. When it's a site I visit frequently, I whitelist it, then quickly block it again the moment I see an ad that does popups, auto-plays audio/video, or otherwise detracts from my reading.
I would go nuts if a "buy it now" button popped up while I was reading fiction, but this is a newspaper article. I don't find the button intrusive, because I'm not trying to follow artistic nuance in a newspaper article. The buttons don't really take over the screen, and they're placed next to the actual products being sold, namely "Charlie and the Chocolate Factory" and "The Great Gatsby".
It seems to me no more intrusive than a banner ad, and I'm much more annoyed at large rectangular ads that break up article paragraphs. So what am I missing here?
The problem is that game design is art, while game engines follow hardware developments. It's pretty difficult to predict what the prevailing technology will be more than a year out; thus the choice of which game engine to adopt becomes a crapshoot as well. It gets even harder when a game studio "knows" the engine it's using will be out of date in two years, and has to cobble its own hacks into the old engine to have some feature available by then. What happens when the "new" engine they've been anticipating doesn't get released "on time", or when it arrives and they find out it's utterly incompatible with their previous year's work?
Telling indie companies to force their delivery schedules into one-year increments makes it really difficult to put out eye-catching, large-scale games. It relegates them to a niche financial ghetto, even if it may be the "safest" and most predictable way to "ensure" a successful game launch.
We've known for at least a decade now that the Pluto/Charon barycenter lies outside the body of Pluto. That was one of many arguments used to delist Pluto from the Solar System's planets. Those same "Pluto is a planet" fossils probably would have demanded Ceres be restored to planetary status, had they lived two hundred years ago.
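The barycenter claim is easy to sanity-check with a back-of-the-envelope computation. The masses, separation, and radius below are my own rough published-value ballparks, not figures from the comment:

```python
# Rough check that the Pluto/Charon barycenter lies outside Pluto.
# All constants are approximate values I'm supplying for illustration.
M_PLUTO_KG = 1.303e22
M_CHARON_KG = 1.586e21
SEPARATION_KM = 19_570   # mean Pluto-Charon center-to-center distance
PLUTO_RADIUS_KM = 1_188

# Two-body barycenter distance from Pluto's center: r = d * m2 / (m1 + m2)
r_bary_km = SEPARATION_KM * M_CHARON_KG / (M_PLUTO_KG + M_CHARON_KG)

print(round(r_bary_km))                 # roughly 2,100 km from Pluto's center
print(r_bary_km > PLUTO_RADIUS_KM)      # True: well outside Pluto's ~1,188 km radius
```

With these numbers the barycenter sits almost a full Pluto radius above the surface, which is why Pluto/Charon is often called a binary system.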
I'd say there's currently a performance/pricing window that the box stores haven't matched. If you want something cookie-cutter for office work, then yeah, you'd be nuts not to buy from a box store or mail order. But right now, the ones that will optionally include an SSD, a serious graphics card, a Blu-ray burner, etc. charge a premium for them.
It's a plus (right now) to select a good motherboard to deal with things like UEFI/GPT, legacy BIOS, and chipset issues like memory bandwidth. You don't get to select your mobo from the box stores, and that may limit what you can do, or how well the box performs even if you buy a base box and then add premium components. And then there's that ugly question: do I pay a premium for Windows 7, or get a Windows 8.1 that "I don't like"?
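As a quick aside on the UEFI vs. legacy-BIOS point: on a Linux box you can usually tell which firmware mode you actually booted in by checking for the `/sys/firmware/efi` directory, which the kernel exposes only on UEFI boots. A minimal sketch (a common rule of thumb, not something from the original comment):

```shell
#!/bin/sh
# On Linux, /sys/firmware/efi exists only when the kernel was booted via UEFI.
if [ -d /sys/firmware/efi ]; then
    echo "UEFI boot"
else
    echo "Legacy BIOS boot"
fi
```

This matters when mixing motherboards and OS installs: a disk partitioned as GPT with an EFI system partition generally expects UEFI boot, while old MBR installs expect legacy/CSM mode.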
The timid should never build their own PC, unless they want to overcome their timidity and/or desire an excuse to be cognizant of new hardware minutiae. Some people are better off being consumers.
Technically speaking, computer science was being conducted long before the existence of a computer.
George Boole published his (Boolean) algebra studies in 1854. Logic circuits started as mathematical expressions, not as computing machines. Charles Babbage had no computer to work with when he designed the Difference Engine, followed by the Analytical Engine (neither of which was ever completed in his lifetime). Turing defined the computer in 1936, before one actually existed. There was quite a body of algorithmic and binary research before the first (modern) computing machine (say, the Atanasoff-Berry prototype in 1939).
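To illustrate the point that logic circuits are just Boolean algebra: a half adder (the basic building block of binary arithmetic) can be written entirely as Boolean expressions, no hardware involved. This is my own toy sketch, not anything from the comment:

```python
# A half adder expressed purely in Boole's algebra:
#   sum   = A XOR B   (built here from AND/OR/NOT)
#   carry = A AND B
def half_adder(a: bool, b: bool) -> tuple:
    s = (a and not b) or (not a and b)  # exclusive-or from primitive connectives
    carry = a and b
    return s, carry

# Enumerate the truth table, exactly as Boole could have on paper in 1854.
for a in (False, True):
    for b in (False, True):
        print(int(a), int(b), half_adder(a, b))
```

Chain two of these (plus an OR gate) and you have a full adder; chain full adders and you have a binary ALU — all derivable decades before any machine existed to run it.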
I happily put together my latest desktop a year ago, returning to the build-it-yourself route after abandoning it for many years. I had figured, back then, that it was no longer possible to extract significant price or performance advantages, given Intel's Core 2 hardware supremacy and the box stores' relentless commodification of the PC. So my machines after that point were purchased desktops, and then laptops replacing the desktop.
Nowadays, I believe the motivating benefit is the performance advantage in selectively purchased hardware: SSDs, multi-terabyte drives, cutting-edge graphics cards, and i7/Xeon CPUs. Yeah, sure, you can spec those features into a pre-ordered box, but it doesn't result in price savings or "optimal" hardware.
As for putting together computers, it's "same as it ever was." It's even less idiosyncratic than eight years ago. But you're still stuck learning the new part connectors, BIOS/UEFI details, hardware trends, etc. Along with the great suggestions like pcper.com, I'll throw in pcpartpicker.com. I recommend it not so much for the construction articles as for the database of consumer computer components. It gives you an idea of the prevailing prices of specific parts, plus a handy personal page of your parts purchases. You can also sift through other people's build sheets and compare your purchases.
Yeah, I envy your plunge into workstations. It's a significantly more expensive hobby, though, much more so than gaming machines.
seL4 is probably a functional subset of MACH. It wouldn't be an insurmountable problem to port HURD to run on top of seL4. What might be exceptionally difficult would be rewriting HURD to take advantage of seL4's design, to produce a more "correct" version of a microkernel-based OS.
IIRC, the HURD effort to replace MACH with L4 had nothing to do with difficulty salvaging HURD code to run on top of L4. It had to do with known security flaws in the inter-process communication of MACH and the original L4 implementation. There was a grad student looking to replace MACH with a prototype secure kernel called Coyotos, which was eventually abandoned.
Fuck HURD. HURD was a failure. HURD was a vanity project Richard Stallman wanted implemented to undercut the popularity of the fledgling Linux OS. He abandoned his cheerleading for it over a decade ago. (I doubt Stallman even contributed code to the original HURD implementation.) Since then, it's been whored out to every grad student looking to use it as a platform for a thesis. The whole academic drive toward microkernel OSes is obsolete research, like using Prolog to implement AI systems. Microkernels have been supplanted by hypervisors and secure IPC implementations. Really, if HURD worked, what would it do that would make it uniquely valuable compared to all current operating systems?
Personally, I wish I could avert my eyes from this collision between two behemoth machines trapped in an event horizon.
Sorry, patents close to expiration.
The Russians can't pull off the same crap the Chinese might still be able to. They aren't the Soviet Union anymore, with an encapsulated economy. They operate as a capitalist nation, and they're not going to be able to "copy/clone" hardware like they used to; the WTO would shut them down economically. They're going to have to leverage outdated designs whose patents are close to expiration. What they should do is partner on some level with the Chinese, so they at least gain access to modern fabrication facilities and techniques.
And no one will even speak the true threat the NSA poses to the world.
No one rational thinks that Merkel represents a credible ally of al-Qaeda. It's all about finding out what Merkel is doing, in order to surreptitiously or politically thwart German political or financial actions the NSA disapproves of. The NSA will undermine the democratically elected will of any nation, all in the name of US "security." It's not the first time the US has tried this. Just ask Iran and Chile.
There is no jury trial in a FISA court.
Do you even think it's possible to field a jury with the security clearances required to expose them to the information presented at such a trial?
See, the problem is that you think only in binary terms: Ballmer was either a failure or a success. I don't think he did as well as Jobs, especially when it came to a larger strategic vision. Jobs had the vision. Ballmer was just managing things.
Incorrect. I'm pointing out that not everyone can perform like Jobs, and it's obvious he was an industry anomaly. Most CEOs are more like Ballmer: managers, not pioneers, and they don't get credit when they do a good job, or when they steer to the best possible result in a losing cause. I use the basketball analogy to point out that a player can be a superstar and still not match the performance of a Michael Jordan. That doesn't make them failures, and it doesn't make Ballmer a failure for not matching Jobs' performance.
Its unfair to compare Ballmer's tenure to Microsoft's peak in market cap, [...] Where's the contempt for Sam Palmisano
And what did Jobs start with? By any measure, Apple was nowhere near the same position as MS when both men took over their companies.
Who cares? We've already established that Jobs is a uniquely successful CEO. Jobs is the anomaly, not the standard expectation of a successful CEO.
No, it's not, especially if you're old enough to have seen the history firsthand.
There certainly was sort of an odd chauvinism among the mainframe/minicomputer group against personal computing. Luckily, the newbs to the industry didn't even bother listening to the "professionals".
Ironically, I'd have to agree with the Beeb on this one, even if it may be a propaganda newscast. They're absolutely right: if you don't know enough about 3D-printing materials and basic firearms, you could end up blinding yourself or damaging your hand. I could definitely see a teenager trying this with their parents', brother's, or friend's 3D printer.
But we all know the march of technology is inexorable.
Don't you also think it's unfair to conclude Ballmer was a failure based on what Jobs was able to do? It's like saying Patrick Ewing was a failure because Michael Jordan was always in the way. (Except when MJ wasn't, and Ewing still failed, against the Rockets and the Pacers. Ewing was still a player to be reckoned with for most of his career, even if I rate Hakeem Olajuwon higher, and have a higher opinion of Charles Barkley, Reggie Miller, and Shaq.)
It's unfair to compare Ballmer's tenure to Microsoft's peak in market cap, because that peak occurred before 2000 and crashed hard when the bubble burst. Between when Ballmer took over and when he left, Microsoft lost only a fraction of its market cap. Basically, Ballmer guided Microsoft into mediocrity, not really growing its value. But that's not the mark of a failure, considering that Sun Microsystems, Palm, etc. don't even exist today. IBM increased its market cap over the past 10 years, but it was a huge player in tech a decade ago and is now kind of a market afterthought. Where's the contempt for Sam Palmisano?
Let's face it, not every company can have a Steve Jobs for CEO. Does that make every tech company that didn't quadruple its market cap a failure?