Comment OK, here goes (Score 1) 62
<3 <3 <3 <3 <3 <3 <3 <3 <3 <3 <3 <3 <3 <3 <3 <3 <3 <3 <3 <3 <3 <3 <3 <3 <3 <3 <3 <3 <3 <3
(virtual hearts for cardiac patients)
While I'm not sure getting rid of the big notes would solve anything, I thought not minting 1c and 2c coins was brilliant. Just a trip to another Euro nation reminds you why, as your wallet gets stuffed with them.
(The system works thusly: prices can be whatever, but when you pay the total it is rounded to the nearest 5c - it might be just a bit more or just a bit less than the total of your purchases, but it'll even out in the long run. If you're picky about paying the exact amount, use your card.)
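The rounding rule described above (cash totals rounded to the nearest 5c at the till) can be sketched in Python; the function name and the integer-cents representation are my own illustration, not something from the comment:

```python
def round_to_nearest_5c(total_cents: int) -> int:
    """Round a cash total, given in cents, to the nearest 5 cents.

    Integer arithmetic avoids floating-point surprises: remainders of
    1-2c round down, 3-4c round up, matching the till behaviour
    described (sometimes a bit more, sometimes a bit less).
    """
    return (total_cents + 2) // 5 * 5

# A 12.97 EUR cash total becomes 12.95 EUR; 12.98 EUR becomes
# 13.00 EUR. Card payments are charged the exact amount.
print(round_to_nearest_5c(1297))  # 1295
print(round_to_nearest_5c(1298))  # 1300
```

Over many purchases the roundings in each direction even out, which is why the rule is acceptable at the till.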
What happens when you try to make a cutting edge pretty 2014 game run 60FPS at 1080p on a four year old graphics card?
This. But granted, that's a remake of a 2013 game.
OK, so you work for Ubisoft, on to the topic at hand - Unity is not going to be on PS3 or Xbox 360. When we are talking about the current consoles (PS4 and Xbox One), they are basically the same - the difference being one has a slightly higher clocked CPU and a massively inferior GPU, along with much less memory bandwidth (yes, there's the "massive" ESRAM totaling a whopping 32MB), versus a slightly lower clocked CPU, much better GPU and a memory bandwidth in a totally different ballpark.
Given these specs, how can you justify the parity? And in before the PC master race comes to claim their superiority: we bow to thee, not questioning that, I'd just like an explanation from an Ubi developer WRT current consoles. The backtracking the official PR has given so far has been amusing, but I'd like to hear if there's an actual technical reason.
I wouldn't use it on a production server, it's too bleeding edge for that and there is no such thing as LTS on Arch. But it's awesome as a dev system and as a general state-of-the-art desktop/laptop.
Very much this. We use some quite rapidly-developing technologies like node.js at work - the production servers are obviously updated very conservatively (and don't run Arch), but as a developer I can check out the latest version pretty much immediately after it is released and see if the updates cause any issues, or if there are new features that would benefit us in the future. And as said, it works very well as a desktop.
Also, I was pleasantly surprised a while back when this laptop had to be sent in for repair - I had an older laptop that I hadn't used or updated in over six months, and thought getting it up to date would cause a lot of pain (Arch had moved to systemd during that time - no, not getting into that debate here). All that was required in addition to a regular pacman -Syu was to alter my boot line a bit to use systemd.
Machines have less problems. I'd like to be a machine. -- Andy Warhol