The resolution matters more than the 4:2:0 chroma. Try it with A/B testing.
On a 4K display, antialiasing is already pointless. It uses a huge amount of rendering horsepower for a blurring effect that is impossible to notice without A/B comparisons. Competent system testers use it in benchmarks only to put more load on the systems, and incompetent ones to "prove" that SLI/CF builds are inadequate for 4K. Popular but incompetent review sites like IGN do the latter regularly, which is really counterproductive because it only spreads FUD and slows down 4K adoption.
Personally, I've been running games in 4K without antialiasing since 2013. First with an overclocked GTX 660, which required lowering the fidelity settings of new games. Then with a single GTX 980, which could run every game at maxed settings. About 6 months ago I built a GTX 980 SLI rig that could handle some useless antialiasing too, but instead I put the cards in power-save mode, which keeps the rig quiet while gaming.
Disclaimer: 4K, particularly at maxed settings, will require you to forget about a "stable 60fps", because high-end graphics settings like that cause framerate drops unrelated to raw GPU performance.
Having used a 55" 4K 60Hz panel (Sony 55X9005A) as my gaming display since 2013, I can say that high resolution gaming is pretty much the same thing as high refresh rate gaming or VR gaming: you won't "get it" until you try it.
Furthermore, in my time I've observed three primary types of gamers:
If you're not in category 2, then I'm afraid you'll never "get" these very expensive high-end products and builds, nor do you need to.
People who are a balanced combination of categories 2 and 3 are the large target audience of Asus, MSI, Alienware, etc., and those people keep those companies afloat - not the high-end customers who are interested in dual-GPU setups or 4K at this time.
If you don't mind my asking, what's the difference between QA and preprod for you?
It runs on some people's laptops (not even gaming laptops) at reasonable quality settings and resolutions like 1080p. That makes it easier than most games to run at high resolutions on a desktop gaming PC, and it comes down to two things: firstly, it's obviously well optimized; secondly, it sometimes looks like crap for a 2015 game. That's because the base game is from... 2013? And for consoles. The best example is during the tutorial, when you get in a car and it looks like nothing has moved on since GTA: San Andreas. Overall, the game is a mixed bag of great high-poly models, average models, and terrible eyesore models.
On my PC it's the easiest recent game to run at 4K. It runs smoothly on GTX 980 SLI without making the GPUs sweat. But it has some very strange frame dropping happening occasionally, which I can't pinpoint but would assume is the content streaming tech working (badly). In terms of system resources it might be VRAM running out and having to be repopulated, since the GTX 980 is light on VRAM (only 4 GB). The rest of the hardware should be fine (i7 5930K at 4.2 GHz, 32 GB of DDR4, 1 TB SSD with more than a third of it free).
The 4K experience in GTA V isn't as incredible as all the hype makes it out to be. It's nice for detail in certain scenarios (cutscenes with close-ups of people, flying, offroading, looking into the distance). In general, though, 4K works much better in open-world environments with lush foliage and high detail, like the latest Dragon Age, Skyrim, Tomb Raider and so on: 1080p just can't resolve the details of foliage, and that's what makes the 4K experience so amazing. GTA, on the other hand, is mostly cityscapes, desert and ocean, and while it's nice at 4K, it's not mindblowing - that kind of scenery is old hat by 2015.
I have a couple of screenshots on my OneDrive if you want to have a look. They're at 4K, almost maxed settings - yes, even the Ultra settings which some people have missed. IIRC one of the advanced sliders didn't go all the way up because the VRAM meter very helpfully prevented it. Anyway, compare the graphical fidelity to Inquisition, for example, and judge for yourself.
Like any smartwatch, its purpose is to get notifications on your wrist. It's super useful when your hands are full, when you're driving, when you don't want to start digging for your phone, when you want to know whether a notification is actionable, and so on.
It's also well implemented. I tried the beta with a friend a few days before the final release. It has a much shorter delay than Twitch, only about 8-10 seconds (last time I broadcast with Twitch, the delay was something like double that). It was also very stable and bandwidth-efficient, both for the broadcaster and the viewer. It didn't stop to buffer even once during our test stream, which was at full quality (I think about 3000 kbps - a very nice quality 1080p gamestream). Both of us were on quite normal broadband connections, and quite regularly suffer unstable streams with Twitch. I think the only criticism from the broadcasting side was that it caused some microstuttering in some games, like Skyrim; in others it doesn't. I also doubt it's ever going to be as light a streaming mechanism as ShadowPlay, but I hope I'll be proven wrong on that one.
What Steam reviews are actually filled with is information about the games... exactly what you should be interested in, as opposed to a score or a conclusion of some kind.
The aggregate score in the style of "very positive" etc. can be useful in filtering out the genuinely terrible games, but outside of that, not so much. What's needed for decisionmaking is a lot of information, a search engine, and your own thinking. Steam provides descriptions, tags, and now reviews, and for me anyway it's been incredibly easy lately to figure out whether I want to buy a particular game, or at least investigate it closer elsewhere.
Scores are almost completely worthless, no matter what kind they are (Metacritic, user review averages, magazine review scores). Steam has already done enough for the scoring system - what is there to fix? IMHO they should concentrate on important things like search, the GUI and customer service, all of which are pretty terrible for 2014.
I don't think they mentioned official battery capacity or battery life numbers, but they did say "very easy to charge at night". That tells me it has 1 day of battery just like the Moto 360.
Honestly, the battery is the worst part of smartwatches currently. It ruined the Moto 360 for me, and it comes close to ruining the Apple Watch, if it really is only 1 day.
I would settle for 3 days; my Sony SW2 goes 4 days without charging. I was expecting the same from Apple, given the criticism of the Android Wear watches, which is all focused on their 1-2 day battery life. I don't want to charge a watch every night! I get it - it has a nice screen, it's slim, and it's running a lot of sensors and wireless traffic - but still... just awful battery life.
I hope the phone and watch will work better than their webcast. It was terrible. Worst I've seen in years. You'd think with Apple's resources they could manage a big webstream without dropped connections, website going down, audio tracks on top of one another, constant buffering, etc.
To be fair, it's not quite so dire. There are plenty of shooters that do things differently. Shooters with RPG elements, shooters with stealth elements, shooters with puzzle elements... To ignore those is unfair because your ideas will probably fall into the same category - shooters with a twist (or many twists) to make them a little different than (most of) the shooters that came before.
My favourite shooters over the last few years have been "shooters with a twist". I've still got a backlog of them. There are more coming out all the time, just some are more polished than others, and some fit my tastes better than others. In fact, taking everything into account, I honestly think now is the most exciting time period ever to be a gamer. Powerful gaming hardware readily available, really deep games being made and being successful, big companies taking gaming very seriously, VR finally maturing, DRM as an annoyance has been reduced in a major way since the 2000s, indies are blossoming, PC games are really cheap really fast after release... the list goes on.
90% of any industry is crap, and software is no exception. It's so easy to make a buck selling promises in software - games included - that a lot of companies do it.
I would be more worried about console hardware limitations, ridiculous budgets and the fact that a lot of shooters have super-dark or gross worldbuilding lately. It's bad enough that the real world is not doing great, now suddenly games have to have grim stories and apocalyptic worlds too. Also, gaming as a hobby is just as uncool as ever.
You either get rid of advertising and pay to watch each video, or you put up with advertising.
I choose the former, 100%. Now how do I do that on Youtube?
Have a look at how a properly engineered gullwing door is designed.
When the door is open, there is a large drain around the actual doorway to direct water from the roof to the ground.
Also, when the door is open, its far end hovers outside the range where water could drip inside the car.
In addition, unlike a traditional car door, the open door hovers above the gap and acts as a roof, so rain doesn't get inside the car either.
Since Youtube was taken over by Google, server speed has increased immensely, they've moved to HD, they've removed time limits on videos, they've allowed live streaming of shows, and they've given away hundreds of millions of dollars through the partnership program they introduced (including to many shows that are simply vlogs)... Et cetera.
I have a rather more cynical view of that. Better latency, HD, longer videos and live streaming are basically all effects of one good thing: better servers. That's probably the only good thing Google has given Youtube.
I don't count the partnership program as good. It radically influences channel content for the worse: either it introduces money, which ruins everything*, or it introduces the need for legal cover against U.S. entities, which makes for some pretty bland content**.
* I'm a viewer of some channels that thrive on the partnership program (Drive Network and TotalBiscuit, for example) and they all do worse and worse the more money is involved. Instead of being fueled by passion, they are fueled by ratings and money - which is what utterly killed Hollywood and television for me and got me into Youtube in the first place. Examples from the Drive Network: the three-minute car reviews, which blatantly don't belong on that channel, and the product placement (like Pirelli) / advertisement videos that pop up every once in a while. Examples from TB: where to even begin. He makes videos based solely on what rates highest, to the point where it comes close to ruining his personal life, and he adjusts video content and kills off series based on how much revenue they bring in rather than what he enjoys playing/shooting, which blatantly shows in his commentary.
** Partnership channels are pretty strictly regulated with regard to what they can show in their videos. They are trying to dodge takedown requests like this one and copyright strikes, which could stop their cash intake. So anything remotely inflammatory or controversial that could in any way be interpreted as slander, copyright infringement, etc. just won't appear on a channel like that any more.
Also: the GP was right about the ridiculous "updates". They're almost all terrible: the layout changes, the default-setting changes, the player changes... The facts that buffering still doesn't work, quality settings were broken for months, and subscriptions break all the time are all great examples of the incompetence of the devs, or of the misguided priorities over there. Youtube is constantly becoming more corporate, better at generating revenue, and worse for the users. Users hate it more all the time and only use it because hardly anyone could ever afford to build a better Youtube clone.
The greatest thing Youtube has introduced lately is HTML5 compatibility, and I have complete confidence Youtube could've and would've implemented that without Google's "help".
I've actually got a Sony X9005A as a desktop display for my PC, and no, the 29Hz refresh rate does not make it "unimpressive". If you're looking to be impressed, the resolution will vastly overpower the refresh rate. When you have a window-like view into your games, photos, etc., you just instinctively ignore the slow refresh.
The worst thing is probably the input lag introduced by the low refresh rate. The set has one of the lowest input-lag scores on the market, but the slow refresh still makes cursor input really laggy - not the kind of lag you see, but the kind you feel. It's gone if you switch to 1080p, but you won't do that if you have a 4K panel, will you?
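A rough sketch of where that "feel" comes from: at a lower refresh rate, an input change simply waits longer, on average, before it can reach the screen. Assuming (as a simplification) that input lands at a uniformly random point in the refresh cycle, the mean extra wait is half a frame period:

```python
# Mean wait before an input change can reach the screen, assuming the change
# lands at a uniformly random point in the refresh cycle (a simplification:
# real pipelines add buffering and scanout delays on top of this).
def avg_refresh_wait_ms(hz: float) -> float:
    frame_ms = 1000.0 / hz    # duration of one refresh cycle
    return frame_ms / 2.0     # uniform arrival -> mean wait is half a frame

for hz in (30, 60):
    print(f"{hz} Hz: frame {1000.0 / hz:.1f} ms, avg wait {avg_refresh_wait_ms(hz):.1f} ms")
# 30 Hz: frame 33.3 ms, avg wait 16.7 ms
# 60 Hz: frame 16.7 ms, avg wait 8.3 ms
```

So dropping from 60Hz to 30Hz adds roughly 8 ms of average baseline wait on top of everything else - too small to see in a still image, but enough to feel under the cursor.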
FWIW the Sony supports HDMI 2.0 and thus 4K@60fps, but good luck finding a GPU that outputs it. I'm stuck waiting for the eventual NV GTX 800 series, which probably will - though NVIDIA haven't even confirmed it yet.
On the topic of Youtube, I thought they'd supported 4K since 2010. In fact, 4K vids on Youtube were among the first things I tested my panel with. They stream fine over 24 Mbps ADSL2, but the bitrate is not great (the vids are noisy).
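The stream-vs-line fit is easy to sanity-check with back-of-envelope arithmetic. A minimal sketch, assuming an illustrative ~16 Mbps 4K stream bitrate (my assumption, not Youtube's actual figure):

```python
# Back-of-envelope check: does a stream of a given bitrate fit a line,
# leaving some headroom for protocol overhead and other traffic?
# The 16 Mbps "4K" figure below is an assumption for illustration only.
def stream_fits(stream_mbps: float, line_mbps: float, headroom: float = 0.2) -> bool:
    return stream_mbps <= line_mbps * (1.0 - headroom)

print(stream_fits(16.0, 24.0))  # True: 16 <= 24 * 0.8 = 19.2
print(stream_fits(16.0, 16.0))  # False: no headroom left
```

Of course, a stream that fits with headroom can still look noisy - fitting the pipe says nothing about whether the bitrate is actually enough bits for 4K.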
"Ignorance is the soil in which belief in miracles grows." -- Robert G. Ingersoll