You're presenting that a bit disingenuously; it reads more that he was writing the e-mail and used the opportunity to review the data, which is what caused him to change his mind. It's not like he arbitrarily decided from one paragraph to the next that “well, I guess minimize and maximize are going away today!”
This seems more like the GNOME folks don't have a solid idea how to integrate minimization into their new UI paradigm, so rather than saddle users with an implementation that seems poor (i.e. the complaint would probably have been “minimization in GNOME doesn't make sense any more”), they just took it out. I'm not quite sure why the maximize button was removed, although perhaps because they felt it would be odd to have only “maximize” and “close.” The functionality remains, however, and I will say that in Windows 7, I actually always maximize windows using the mouse gesture (or keyboard shortcut) and not the button.
There's always 2/2/2222...
So? Men do the same to me, and I am a man. That's how men communicate. Is it rude? Yes, but that's how men are: constantly interrupting one another. It's not because you're a woman but because the men are treating you like any other man. You need to learn to interrupt them too, if you want to be heard.
That has nothing to do with sex. If you're constantly interrupting and talking over people, you're a rude asshole, and it's definitely not just “how men communicate.” It's perfectly possible for men to have good manners and follow appropriate turn-taking when having a conversation.
However, I will say that there are many assholes out there who have not mastered this basic form of courtesy, so I can see how you might get the impression that it's the norm. I've also known some chauvinists who would be more likely to talk over a woman than a man, so I can empathize with the grandparent poster.
I agree, which is why I'm viewing this page in Web Browser 11.4 on Operating System 7.9. Now if you'll excuse me, I need to get back to E-mail Client 4.0 and Text Editor 22.3.
Apple announced its partnership with Cingular in January 2007, but Wikipedia lists the acquisition by AT&T as occurring in December 2006.
It's likely that Apple was negotiating with Cingular before the AT&T deal was finalized; however, they were probably aware that the merger was going through, one way or another.
While I admit that it is indeed par for the course for US cellphone carriers, the pricing is a little funny.
Although you say the real, unsubsidized price of the iPhone is $400 for the 16GB version, if I want to upgrade the phone on my plan (I have an iPhone 3G) for that price, AT&T indicates they'll sign me up for another 2 year contract. Additionally, the prorated ETF for my contract after 12 months is $115; it would actually be cheaper for me to break contract, pay the $115, then sign up as a "new" customer to buy the phone at $200. Their pricing scheme would make a lot more sense if they just charged you the new contract price plus the prorated ETF (and it would grant the "they subsidized you, so suck it up!" argument a little more weight).
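The arithmetic above can be sketched in a few lines (all dollar figures are the ones quoted in this comment, circa the iPhone 3GS launch, not verified current AT&T pricing):

```python
# Prices as quoted in the comment above (assumptions, not verified):
upgrade_price = 400        # AT&T's in-contract upgrade price for the 16GB model
etf_after_12_months = 115  # prorated early-termination fee one year in
new_customer_price = 200   # subsidized price when signing up as a "new" customer

# Path 1: just upgrade in-contract.
# Path 2: break the contract, pay the ETF, rejoin as a new customer.
break_and_rejoin = etf_after_12_months + new_customer_price  # 315

savings = upgrade_price - break_and_rejoin
print(f"Break-and-rejoin costs ${break_and_rejoin}, saving ${savings}")  # saves $85
```

Which is why the pricing scheme looks so odd: the "loyal customer" path is $85 more expensive than gaming the system.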
All that being said, none of the features are really compelling enough to upgrade for $400. I do wish I had a bit more space (I only have the 8GB model), but I'll just live without any lossless encoding on my iPhone until next year.
The socket AM3 Phenom 955 is a good budget choice for a completely new computer compared to the i7 920, but it still requires a new motherboard and RAM for an upgrade, and then it loses on price/performance.
The Phenom 955 is backwards compatible with socket AM2+, so you don't necessarily have to purchase a new motherboard and RAM to use it. You do miss out on some features (DDR3 RAM, I think HyperTransport is slower, that sort of thing), but I believe they have a fairly minimal effect on the workload of average users.
Although AMD is having a lot of difficulty competing with Core i7, one does have to admit that they've taken a lot of care in providing a nice upgrade path for older PCs.
I don't think OS 9 was a UNIX system. Ten years ago, that was Apple's operating system.
Mac OS X Server 1.0 was released on March 16, 1999. Naturally, it wasn't a desktop OS; however, it was still NeXTSTEP-based (and therefore UNIX-based), and Apple did have it.
They were only prepared for dismal sales. They said the server initially ran 'less well' with tens of thousands of people online at once, and they sold 18,000 copies. All of those people would want to be online at once at the start, so they weren't even really prepared for the sales they actually got.
Saying they were only prepared for "dismal sales" is a bit misleading. They expected that the initial launch wouldn't have a huge number of people—it is an independently published game in a niche market with almost no advertising budget, after all—but that the numbers would continue to grow as word-of-mouth spread. This would've given them plenty of time to worry about scaling issues as they started to appear.
It's true that this scaling would've needed to happen eventually if the game took off, but there's no disputing that the launch has suffered a bit from the unexpected popularity. If they were all paying customers, that would be one thing; however, as it stands, the developers had to go out of their way to support a bunch of freeloaders and deal with criticism saying they're unprepared for a launch. It's a pretty rotten way to treat a company that's been very customer-friendly and supportive in the past.
Not only that, they also have to put up with these absurd justifications. "The website didn't tell me enough, I don't trust reviews, and there's no demo—piracy is my only option!" "The pirates helped them identify their scaling issues!" "If only they'd had a serial code then we would've respected their rights!"
I don't mean to single you out—the first quote there isn't even something you said—but we really don't need more people trying to spin piracy as "not so bad" or whatever. 100,000 people are assholes who probably weren't customers anyway, and there is no romantic "sticking it to the man" tale to be had here. I hope that this doesn't discourage Stardock and Gas Powered Games from making PC games in the future.
Blizzard always does that. It's just part of their, uh, charm, I guess.
Although it has worked swimmingly for Blizzard and WoW.
That's largely because their competition, at the time, didn't even do that. To take FFXI as an example, the crabs (yes, seriously) you fight at level 60 are visually identical to the crabs you fight at level 1. It was actually a nice improvement to pick up WoW and see differently colored trolls and things.
Modern MMOGs tend to have substantially more art assets, but then of course they have the difficulty of competing with nearly five years of live game content development. Large game publishers lust after Blizzard's subscriber numbers, but how do you break into that market with a new game and survive for longer than 6-12 months?
Although Spear of Destiny is the prequel to Wolfenstein 3D, it was released about six months afterward. Not that Wolfenstein 3D itself was particularly serious, what with mecha-Hitler, the zombies, etc.
The series has always had a pretty fantastical bent to it. It's actually worked rather well, in my opinion, and I just hope that they have something similar to Enemy Territory (or original RTCW) multiplayer included.
Technical sloppiness is really the last problem I think Ludlum fans would have with the Bourne movies.
The plot of the films is almost entirely different from that of the books, with the exception of some character names and the fact that the main character suffers from amnesia for a time. By the time you get to worrying about technical details, you've already accepted that it's just another Hollywood action vehicle, so who cares?
I'd rather just believe that it's done by little elves running around.