sells the Steam box, as most non-console games can be driven at any arbitrary resolution supported by your display. Sure, textures may be crappy at that scale, but texture filtering and upsampling can go a decent way there.
This fact is especially interesting when you look at the V8 source code, a lot of which makes use of SIMD via either inline asm or separate asm files for things like JPEG rendering. A lot of this is pulled from libjpeg-turbo and other libraries, but there is a fair bit of that going on.
Solaris "solved" the 32-/64-bit issue the same way OS X did: a 64-bit kernel with an entirely 32-bit userland. OS X doesn't do this anymore, but OpenSolaris did for quite a while; I haven't checked whether this is still the case in the most recent OpenIndiana releases. So the seamlessness you are referring to is simply a matter of adopting one approach as opposed to the other. It also caps the total address space of any given process at 4 GB, which can be problematic for something like a CAD or 3D rendering package, where consuming 5 or 6 GB, even with just memory-mapped IO, is the norm.
And for what it's worth, to all the people saying that Chrome is 32-bit only: that is likely the case only on Windows:
Even with all of the asm included in the Chromium source code, it can still be built cleanly as 64-bit. Of course, the same can be said about the Linux versions of Firefox, I suppose. I too am baffled why 64-bit Windows support is this complicated. When developing for Windows I find all sorts of weirdness between their 64-bit and 32-bit compilers. I found the 64-bit C compiler strict about where string buffers are declared (they had to be at the beginning of the block or it wouldn't compile), while the 32-bit compiler of the same version of Visual C compiled and ran the code just fine.
Yeah, and somehow products officially supported only via Red Hat RPMs don't work on Slackware and the like...
Ubuntu is close enough; ABI compatibility and packaged shared libs are all we really need to ensure it works on another distro.
Why do they need more revenue; isn't Mark Shuttleworth still a multimillionaire? I thought he was about philanthropy, not profits. Why couldn't they just generate ad revenue from their web page, or from the built-in Ubuntu One music store, or something? This just seems wrong.
Following up, here's just one example of one of his comments from 1997:
I consider linux the second most important platform after win32 for id. From a biz standpoint it would be ludicrous to place it even on par with mac or os/2, but for our types of games that are designed to be hacked, linux has a big plus: the highest hacker to user ratio of any os. I don't personally develop on linux, because I do my unixy things with NEXTSTEP, but I have a lot of technical respect for it.
What happened to you, John?
for me to see my childhood hero throw FUD about market viability for my platform. John Carmack was once an open-minded individual who cared about technical feats and versatility in the engine (read some of his former
John Carmack used to be a man of principle who didn't cater to tempestuous marketing. With all of his influence, he now says this garbage, which has the potential to destroy the momentum Valve has been generating toward a formerly unsuccessful effort? Even if developing games for Linux isn't a marketable success, it will be a technical success and a step forward for games. When software development firms can work this closely with hardware developers and inspect EVERY piece of the stack, games have the potential for more efficient hardware utilization and smoother effects.
For day-to-day operations, does it really matter? I do not find that a person's dress code reflects their abilities at all, and I am damn sure not distracted by what someone wears. Then again, I am looking at code all day, not people.
Asus doesn't inexpensively license the technology to other board OEMs. Not sure how much of this is software and how much is hardware, but if there is a special USB-SCSI command set separate from plain SCSI, then they will need to be open and supportive on that front for all OSes as well.
You're just now coming to this conclusion?
considering the absurd amount of compression applied to the video during upload. It seems like it was transcoded between at least two different lossy codecs after already applying pretty conservative compression parameters.
Correction: that should be "write", not "right".
Premature optimization does not refer to hacked, sloppy solutions so much as it does illegible, counterintuitive code practices that account for negligible gain. You should only be trying to squeeze out a few microseconds when the program calls for it. It is a widely known rule of thumb to first write things clearly, perhaps even naively, then profile, and only optimize when performance is not within an acceptable range.
spit out plain and simple bribed legislation, I don't know what does.
Tape has become an increasingly attractive solution for my backup needs, and certainly better than BD-Rs. However, until I am in a scenario where I need to perform a full restoration from tape, I will wait this one out. I have done backup zpools and simple rsyncs, but I have no experience dealing with all the potential mechanical and electromagnetic mishaps of tape. I have read that some of the main criticisms of that backup medium are failures during reads and writes.