
Comment: Re:It's not the absolute values that matter (Score 2) 104

by marsu_k (#48193551) Attached to: Which Android Devices Sacrifice Battery-Life For Performance?
But I'm really not sure what they're measuring with the arbitrary battery life score. As an anecdotal example, I have an Xperia Z2. I get a full day of heavy use out of it, or 2 to 2.5 days of moderate use. According to pretty much every review, the Z3 has slightly better battery life. So, assuming the scale is linear (and my Z2 would thus score slightly lower than a Z3), the OnePlus One should have a 4-5 day battery life? Which I really don't think it has.
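The "linear scale" extrapolation above can be sketched in a couple of lines. All the scores below are purely illustrative placeholders, not numbers from any actual review; the point is only how the arithmetic would work if the score really scaled linearly with battery life:

```python
# Back-of-envelope check of the linear-score assumption.
# All numbers here are hypothetical, not taken from a real review.
z2_days = 1.0        # observed: one full day of heavy use on the Z2
z2_score = 7.0       # hypothetical Z2 battery score
opo_score = 30.0     # hypothetical OnePlus One score on the same scale

# If score were proportional to battery life, the OnePlus One would last:
opo_days = z2_days * (opo_score / z2_score)
print(opo_days)      # ≈ 4.3 days, i.e. the implausible 4-5 day figure
```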

Comment: Re:Methinks there's more (Score 2) 314

While I'm not sure getting rid of the big notes would solve anything, I thought not minting 1c and 2c coins was brilliant. A single trip to another Euro nation reminds you why, as your wallet ends up stuffed with them.

(The system works thusly: prices can be whatever, but when you pay in cash the total is rounded to the nearest 5c - it might be just a bit more or just a bit less than the sum of your purchases, but it evens out in the long run. If you're picky about paying the exact amount, use your card.)
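The rounding rule above fits in one line; a minimal sketch (the function name is mine, and it assumes totals are handled as integer cents):

```python
def round_cash_total(total_cents: int) -> int:
    """Round a cash total (in integer cents) to the nearest 5 cents.

    Totals ending in 1 or 2 cents round down, 3 or 4 round up.
    Since total_cents is an integer, total_cents / 5 never lands
    exactly on .5, so Python's round() has no tie to break.
    """
    return 5 * round(total_cents / 5)

print(round_cash_total(122))  # → 120 (you pay a bit less)
print(round_cash_total(123))  # → 125 (you pay a bit more)
```

Card payments simply skip the rounding and charge the exact amount, which is why it all evens out.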

Comment: Re:There's a lot of horse shit in this summary (Score 1) 338

...and you're missing the fact that Unity, on consoles, will only be on the Xbox One / PS4. So yes, while in theory the One can compute a bit more (given its slightly higher clocked CPU, if raw compute were all that mattered), it falls behind due to a severely gimped GPU and memory bandwidth. Yet the consoles are at parity. Funny thing, that.

Comment: Re:Cell (Score 2) 338

OK, so you work for Ubisoft; on to the topic at hand - Unity is not going to be on the PS3 or Xbox 360. When we are talking about the current consoles (PS4 and Xbox One), they are basically the same - the difference being that one has a slightly higher clocked CPU, a massively inferior GPU and much less memory bandwidth (yes, there's the "massive" ESRAM, totaling a whopping 32MB), versus a slightly lower clocked CPU, a much better GPU and memory bandwidth in a totally different ballpark.

Given these specs, how can you justify the parity? And in before the PC master race comes to claim their superiority: we bow to thee, that is not in question; I'd just like an explanation from an Ubi developer WRT the current consoles. The backtracking the official PR has done so far has been amusing, but I'd like to hear whether there's an actual technical reason.

Comment: Re:Dislike Arch (Score 3, Insightful) 303

by marsu_k (#48100667) Attached to: What's Been the Best Linux Distro of 2014?

I wouldn't use it on a production server - it's too bleeding edge for that, and there is no such thing as LTS on Arch. But it's awesome as a dev system and as a general state-of-the-art desktop/laptop.

Very much this. We use some quite rapidly-developing technologies like node.js at work - the production servers are obviously updated very conservatively (and don't run Arch), but as a developer I can check out the latest version pretty much immediately after it is released and see whether the update causes any issues, or whether there are new features that would benefit us in the future. And as said, it works very well as a desktop.

Also, I was pleasantly surprised a while back when this laptop had to go in for repairs - I had an older laptop that I hadn't used or updated in over six months, and I thought getting it up to date would cause a lot of pain (Arch had moved to systemd during that time - no, not getting into that debate here). All that was required, in addition to a regular pacman -Syu, was altering my boot line a bit to use systemd.
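For reference, the whole migration boiled down to something like this; the exact kernel-line path is from memory of that era's Arch setup, so treat it as an assumption rather than gospel:

```shell
# 1. One full sync brings even a six-months-stale rolling-release
#    system up to date:
pacman -Syu

# 2. Boot systemd instead of the old initscripts by appending this to
#    the kernel command line in the bootloader config (path as it was
#    on Arch at the time of the transition):
#    init=/usr/lib/systemd/systemd
```

Once systemd became the default init, the extra kernel parameter was no longer needed.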
