At least, they have to be able to play very well at around 1680x1050 or 1440x900 on one of those lower-power-draw cards (e.g. an Nvidia 650 or AMD 7850).
I'm not sure what your desktop resolution is (I'm guessing it's around there). That feels like a bit much to expect from a computer specced to run a desktop operating system (where the 3D hardware is only doing basic texturing/compositing) being asked to run a modern 3D game at full resolution. Commodity desktop computers have always lagged behind even mainstream modern games. Quake 1 required a floating-point math co-processor I didn't have, then games started requiring 3D cards. There was usually a transition (software rendering modes while 3D cards were still optional, PCI 3D cards when AGP started getting popular), but generally those games had to be played at a significantly lower resolution and frame rate, if they could be played at all.
If this model wasn't profitable, they wouldn't be doing it. While there's obviously a market for games like The Sims and Myst (they're some of the top-selling PC games of all time), that's not the same market Crysis 3 is going after.
Chrome supports 10.5
Firefox supports 10.5
Safari supports 10.5
I did read an article the other week saying Chrome is considering dropping 10.5 support in a few months.
Is this a PPC Mac? I just looked it up, and that model was announced 7 years ago. It looks like they were selling them until Aug 06 (6 years ago), but if you purchased one in that time, you can't really expect newer software to work.
All of the banks I've used require 4-5 days, verification, and an email notice (sometimes a postcard) when you're transferring to an account for the first time. I'm not saying people should be flippant about their bank login info, but banks seem to do a decent job of notifying users about risky changes to their account (as long as your email isn't compromised, too).
I have LibreOffice installed, but only use it once every few months... so I haven't followed development too closely (or really cared too much about how efficient it is). But I thought one of the first things the LibreOffice team planned to do was remove the Java dependency everyone had been complaining about for years for causing bloat and slowing things down.
That's why Apple swapped their monolithic data blobs (for iPhoto, Aperture, etc.) for collections of smaller files when Time Machine was released (sparse bundle disk images, for example). Since the data is banded across multiple files, when some data changes, you only need to back up the affected bands. I believe they marketed this as "Time Machine Aware" and advised third-party apps to adopt the approach. I thought Time Machine's design was clever given the constraints of a traditional file system (in lieu of using something like ZFS).
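The banding idea can be sketched like this (a toy illustration, not Apple's actual on-disk format; the band size and function names are made up): split a blob into fixed-size bands, hash each band, and only copy the bands whose hash changed since the last backup.

```python
import hashlib

BAND_SIZE = 8 * 1024 * 1024  # hypothetical band size, roughly in sparse-bundle territory

def band_hashes(blob: bytes, band_size: int = BAND_SIZE) -> list[str]:
    """Hash each fixed-size band of a data blob."""
    return [hashlib.sha256(blob[i:i + band_size]).hexdigest()
            for i in range(0, len(blob), band_size)]

def changed_bands(old_hashes: list[str], new_hashes: list[str]) -> list[int]:
    """Indices of bands that need to be backed up again."""
    return [i for i, h in enumerate(new_hashes)
            if i >= len(old_hashes) or old_hashes[i] != h]
```

Editing one photo inside a multi-gigabyte library only dirties the bands that photo's bytes live in, so the backup copies a few megabytes instead of the whole monolithic file.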
*embarrassed* I read them over but missed the data options mixed in with the voice plans. About 6 months ago I went over all of the options I knew about and came away frustrated and disappointed.
Thanks for reading more carefully than I did. I'm looking to sign up with one of these guys after work today.
Maybe it's just me being picky and not a competition issue... but I rarely get close to 400 min/mo. That's the smallest plan offered (unless I'm a senior) and I pay $39. I have a smartphone but no data plan (wifi only), and would love to have one. I think a data plan would cost another $20-30 (about $70 total before taxes).
None of those plans would really fit my needs. What I'd prefer is similar to what I saw in London: a pay-as-you-go talk-and-data system. Nobody in the US offers pay-as-you-go data, and the pay-as-you-go talk options have weird rules where minutes expire at the end of the month or you get charged a dollar a day.
I'm willing to believe that I'm an outlier and one of the few who can look past that impulse to pay nothing up front in exchange for a contract and high monthly bill.
I can't remember the specific article I read, but hopefully the Loudness War is coming to an end because of the rise of digital streaming (and its normalization and other volume-adjusting algorithms). But who knows, that could just start a Dynamics War.
Wikipedia has some info on it:
http://en.wikipedia.org/wiki/Loudness_war#2010s
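The normalization idea is roughly this (a toy sketch; real streaming services measure LUFS loudness per spec rather than raw RMS, and the -14 dB target is just an illustrative number): measure each track's average level and scale it to a common target, which removes the advantage of mastering everything as hot as possible.

```python
import math

def rms_db(samples: list[float]) -> float:
    """RMS level of a signal in dBFS (samples in [-1, 1])."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return 20 * math.log10(rms)

def normalize(samples: list[float], target_db: float = -14.0) -> list[float]:
    """Scale a track so its RMS level lands on the target."""
    gain = 10 ** ((target_db - rms_db(samples)) / 20)
    return [s * gain for s in samples]
```

A track mastered loud and a track mastered quiet both come out at the same playback level, so crushing the dynamics buys nothing.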
Is this the one you type in on the lock screen? I just found and read the article, and it's unclear. If so, I thought the iPhone makes you wait longer and longer after consecutive failed attempts, which would slow down a brute-force attack quite a bit. Also, I can't remember if it was an Exchange policy or a feature of the iPhone (or of Android), but I thought I remembered a setting that would wipe the phone after 10 consecutive failed attempts.
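The escalating-delay idea can be sketched like this (the delay schedule below is made up; I don't know Apple's actual timings): each consecutive failure forces a longer wait, which turns an exhaustive search of a 4-digit PIN space into a matter of years.

```python
def lockout_delay(failed_attempts: int) -> int:
    """Seconds to wait before the next try (hypothetical schedule)."""
    schedule = {0: 0, 1: 0, 2: 0, 3: 0, 4: 0, 5: 60, 6: 300, 7: 900, 8: 3600}
    return schedule.get(failed_attempts, 3600)

def brute_force_seconds(pin_space: int = 10_000) -> int:
    """Worst-case wall-clock time to try every 4-digit PIN under the schedule."""
    return sum(lockout_delay(n) for n in range(pin_space))
```

And with the wipe-after-10-failures setting enabled, the attacker only ever gets 10 guesses anyway.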
Steve Jobs said it at the WWDC keynote when it was announced in June of last year:
http://www.youtube.com/watch?v=HP37O0horpY#t=6m44s
"We're going to the standards bodies starting tomorrow and we're going to make FaceTime an open industry standard."
Sorry for the YouTube link; I couldn't load the QuickTime stream here.
http://www.apple.com/apple-events/wwdc-2010/
For the larger and more critical config files, we'd usually validate them and keep them under revision control. Unless a GUI writes out an ASCII config file, it's hard to diff or work with example configs. It would be a major pain to take a screenshot of every tab if the config is as verbose/complex as you described (and even harder to validate from a screenshot of an old config).
The GUI can have niceties like drop-downs for enumerated lists or auto-formatting (such as for phone numbers).
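For the validate-and-version-control workflow, a plain-text config makes a pre-commit check trivial. A sketch using Python's configparser (the sections and keys here are a made-up schema, not from any real product):

```python
import configparser

# Hypothetical schema: which sections and keys must exist.
REQUIRED = {"server": ["host", "port"], "logging": ["level"]}

def validate(text: str) -> list[str]:
    """Return a list of problems with an INI-style config (empty list = valid)."""
    cfg = configparser.ConfigParser()
    cfg.read_string(text)
    problems = []
    for section, keys in REQUIRED.items():
        if not cfg.has_section(section):
            problems.append(f"missing section [{section}]")
            continue
        for key in keys:
            if not cfg.has_option(section, key):
                problems.append(f"missing {section}.{key}")
    if cfg.has_option("server", "port") and not cfg.get("server", "port").isdigit():
        problems.append("server.port must be a number")
    return problems
```

Run that from a pre-commit hook and every bad config gets caught before it lands in revision control, and `diff` shows exactly which key changed between revisions.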
I find it kind of funny that HDR means the opposite thing in photography versus video games.
http://img194.imageshack.us/img194/7391/1244894383293.jpg (pulled from some old digg post)
Traditionally, games render the world with values kept between 0 and 1 (0 being black/completely dark and 1 being white). HDR rendering computes values above and below that range and then clips, so things that are blown out (like reflections and highlights) read as super-white. I think an update to Half-Life 2 was the first commercial game to do this.
In photography, they take multiple exposures and combine them into an HDR image. Then they use tone mapping to convert it to an 8-bit viewable image. Tone-mapped images are generally called HDR, even though that's a misnomer.
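The photography-side tone mapping can be sketched with the classic Reinhard operator (one of many tone-mapping curves, not specific to any particular app): scene luminance can be arbitrarily large, and L/(1+L) squeezes the whole range into [0, 1) so it fits an 8-bit display.

```python
def reinhard(luminance: float) -> float:
    """Map an HDR luminance value (>= 0) into [0, 1)."""
    return luminance / (1.0 + luminance)

def to_8bit(luminance: float) -> int:
    """Tone-map, then quantize to a displayable 8-bit value."""
    return round(reinhard(luminance) * 255)
```

A highlight a thousand times brighter than mid-grey still lands inside the 8-bit range instead of clipping, which is exactly the opposite of the game-rendering approach of computing past 1 and clamping.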
Roger Ebert asked the same thing (on page 4):
http://www.newsweek.com/2010/04/30/why-i-hate-3-d-and-you-should-too.html
I think there are a couple of reasons. The first, and probably most significant, is nostalgia among filmmakers. They love the motion blur of 24fps; it helps evoke the "feeling" of film. Every film student I know either wants to shoot at 24fps or convert their footage to it. There is a noticeable difference. When you increase the resolution and frame rate, you lose motion blur and it starts to look like home video or video games (which generally don't compute motion blur at all).
Another big issue is the amount of light. When you cram more frames into a second, each frame has less time to gather light. It's a big issue with high-speed filming. Sensors that are more light-sensitive are a fairly recent development (combined with advanced noise reduction) and will keep getting better.
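The light loss is simple arithmetic. Assuming the common 180-degree shutter (each frame exposed for half its duration), doubling the frame rate halves the per-frame exposure time, which costs a full stop of light:

```python
import math

def exposure_seconds(fps: float, shutter_angle: float = 180.0) -> float:
    """Per-frame exposure time for a given frame rate and shutter angle."""
    return (shutter_angle / 360.0) / fps

def stops_lost(base_fps: float, new_fps: float) -> float:
    """Stops of light given up by raising the frame rate."""
    return math.log2(exposure_seconds(base_fps) / exposure_seconds(new_fps))
```

Going from 24fps (1/48s per frame) to 48fps costs one stop, and to 96fps costs two, so either the lights get brighter or the sensor has to make up the difference.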
The stuttering is something cinematographers keep in mind when shooting (or at least, they should). I read an article about shooting IMAX, and they said the biggest problem was the stuttering. They also use 24fps, but the screens are much larger, so when you pan, an object can jump 2 to 3 feet per frame. They intentionally used slower pans to compensate. You noticing this is probably a side effect of larger theatrical screens and larger TVs at home.
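The per-frame jump is easy to estimate (the screen width and pan speed below are made-up illustrative numbers): at 24fps, an object carried across a 70-foot screen in a one-second pan moves nearly 3 feet between consecutive frames.

```python
def jump_per_frame(screen_width_ft: float, pan_seconds: float, fps: float = 24.0) -> float:
    """Feet an on-screen object moves between consecutive frames during a full-width pan."""
    return screen_width_ft / (pan_seconds * fps)
```

Halving the pan speed halves the jump, which is exactly why they slowed the pans down.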
Only through hard work and perseverance can one truly suffer.