
Comment Re:Power efficiency is good in some places, not al (Score 1) 313

Here's the thing, though. Even if chips remain equally powerful or 10% slower, if they could fit a 40-core Xeon into a 10-watt Atom power profile, that would be a MASSIVE performance increase in mobile. I'm relatively satisfied with CPU performance these days on a dual Xeon; if it meant I could get a current workstation in a mobile form factor, great! However, I'm assuming that GPUs keep improving and that we finally see openings for specialized chips for physics and raytracing--the last two areas that would really benefit from dedicated hardware. Neither has ever caught on, because Intel keeps improving quickly enough that a small specialized-chip vendor can't get to market before Intel outpaces it.

Comment Modularity (Unix "Do One Thing" philosophy) (Score 1) 233

The only option to achieve that through process rather than physical security is to write everything modularly enough that every module is untrusted and interfaces only through documented APIs. This can actually be a good requirement, since it should make updating any one feature relatively easy. I know of at least one large Fortune 500 company that is rewriting everything on the assumption that the network is publicly accessible. This has the nice side effect that you can actually make it publicly accessible to mobile employees.

If every developer is assigned a specific module with a very narrow set of goals, then all they need to do is take in data of format XYZ and output data of format UVW. At some point you'll need someone in-house to architect which modules need to be developed, and an integrator to handle the bits you seem to be paranoid about exposing to developers, but it would limit the potential exposure from any one developer going AWOL.
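As a hypothetical sketch of what such a boundary looks like (the names and formats here are invented for illustration, not from any real system):

```python
from dataclasses import dataclass

# Hypothetical module boundary: the contractor only sees documented
# input/output types ("format XYZ in, format UVW out"), never the
# rest of the system.

@dataclass
class RecordXYZ:            # the input format the module consumes
    customer_id: str
    amount_cents: int

@dataclass
class SummaryUVW:           # the output format the module produces
    customer_id: str
    amount_dollars: float

def summarize(record: RecordXYZ) -> SummaryUVW:
    """The module's entire public surface: one documented function."""
    return SummaryUVW(record.customer_id, record.amount_cents / 100)
```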

The downside, of course, is that depending on the task it can get very difficult to break a large project up into discrete chunks/interfaces.

Comment Re:vote with your wallet (Score 1) 302

Apple will be the gatekeeper for music, and Netflix for video.

Apple is so far from being the gatekeeper for music these days that it's laughable. Ironically, it's the Zune subscription model that has succeeded; it just took Spotify to make it happen, thanks to its cross-platform nature.

Microsoft saw the future and then tried to sell the past (MP3 players) as a bundle, and failed. Spotify did the exact same thing (minus the songs you get to keep) for $5 less and did gangbusters.

Now Apple is trying to play catch-up in the music realm.

Another point on this: when Apple started locking in the music industry, the labels also wanted to avoid too strong a monopoly. So the Zune music service could pretty much just say "Give us what you're giving Apple" and build out its library. Spotify wouldn't have stood a chance in hell of negotiating those contracts if iTunes weren't already getting sweet digital rates. Netflix wrenched open the door, and now Amazon often has the same films, because the studios want a bidding war and also aren't going to spend too much time negotiating most of their catalogue.

Comment Re:Why would anyone tolerate this bullshit!? (Score 1) 720

Hearing these conspiracy theories is like listening to right-wing nuts go on about how everybody is eventually going to be driven into a FEMA camp.

And as to $0.99 to change your theme color? Well, I guess that's not paranoid, since that already happened back in 1995 with Microsoft Plus! for Windows 95, which mostly just added themes for a few bucks. So nothing new there. But "Want to install a third-party browser?" 1. They would get sued out of existence. 2. It would only cost something if the third-party browser charged a price in the store, which none do on any other platform. You don't see Chrome charging money, so no, it won't cost anything.

Comment Re:Oh good. (Score 1) 154

Microsoft already allows Win32 in store apps -- you do have to rebuild, but create a UWP app and bring in the Desktop Extensions SDK and that gives you Win32.

That's only a very small subset of Win32. Project Centennial http://www.brianmadden.com/blo... is in the works, though, to allow full Win32 apps to run in the WinRT sandbox.

As to battery life: if you can play games on your phone, you can run a Win32 app without worry.

Comment I've converted millions to Windows apparently. (Score 1) 165

I've worked on ads that were seen by tens of millions of people; who knew that in the process they became Windows users! I should apparently be hired as a Microsoft Evangelist. /s

Windows was also used by Weta on those films. Every film has at least one Windows license doing something. By that logic, every Ubuntu user is simultaneously a Windows and OS X user.

Comment Re:HDR (Score 1) 37

Narrow bandwidth is exactly what you want from a display's color primaries. The purer the primaries, the more saturated they are and the wider the color gamut. If someone could really build an LCD with a perfect 630 nm red, 550 nm green, and 450 nm blue, without any other frequencies, it would be a fantastic display. The ultimate displays available today are laser-based, because lasers naturally produce extremely narrow frequency bands exclusively. The second best are probably quantum-dot LCDs, which are already almost achieving the full Rec2020 gamut.
http://static1.squarespace.com...
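For a rough sense of how much purer primaries buy you, here's a quick Python sketch comparing the CIE 1931 xy gamut-triangle areas of Rec709 and Rec2020 (Rec2020's primaries are monochromatic 630/532/467 nm; the chromaticity coordinates below are the published ones for each standard):

```python
# CIE 1931 xy chromaticities of each standard's R, G, B primaries.
REC709  = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
REC2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]

def triangle_area(p):
    """Area of the gamut triangle via the shoelace formula."""
    (x1, y1), (x2, y2), (x3, y3) = p
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

ratio = triangle_area(REC2020) / triangle_area(REC709)
print(f"Rec2020 covers {ratio:.2f}x the xy area of Rec709")  # ~1.89x
```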

Comment Re:What the? (Score 1) 37

Full LED displays (NOT LCDs with LED backlights) can easily reproduce a wider range of colors if they use the right LEDs (some displays have added an extra color like yellow, some simply move the RGB LEDs further apart on the spectrum). Other display types can do this as well, but it's not as simple as with LEDs.

Full LED displays are only used in things like ballpark scoreboards and billboards, and their color is terrible. Unless you mean OLED, in which case just call it OLED. Almost every LCD today uses a white LED backlight and an array of filters.

Most "Full LED" displays are just 6500K white LEDs behind a colored filter, so a white LED + LCD filter array is pretty much the same as a white LED + filter coating. "Regular" LCDs are also just as good as OLED at color gamut, if not better: take a UV LED, use quantum-dot emission as your color mask, and you get an incredibly pure color output.

Purity, by the way, is what you want out of your red, green, and blue primaries, not moving them "further apart" (further apart to where? Red toward infrared and blue toward ultraviolet, maybe, but it's not clear where green is supposed to move to). Rec2020's green isn't "further apart on the spectrum" from blue compared to Rec709's green; it's just a narrower, more saturated green. It's a question of purity, not distance. Saturation is a product of the width of the spectrum emitted, not its wavelength.

Comment Re:What the? (Score 1) 37

It is impossible to separate "HDR" photography from "HDR" displays because they both do the same thing - fuck with contrast in different areas of an image differently in order to overcome limitations of the resolution/gamut of the format/display.

Wow, ignorance and attitude. What a lovely combination.

Neither inherently "fucks with contrast". HDR photography is just capturing a high dynamic range of values. SDR would be roughly 0.01 nits to 100 nits; HDR would be 0.01 nits to 1,000 nits or more. Almost every decent camera today can capture at least that much dynamic range.
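Put in photographic stops (each stop is a doubling), the back-of-the-envelope math looks like this:

```python
import math

# Dynamic range in stops = log2(max luminance / min luminance).
sdr_stops = math.log2(100 / 0.01)    # ~13.3 stops (0.01 -> 100 nits)
hdr_stops = math.log2(1000 / 0.01)   # ~16.6 stops (0.01 -> 1,000 nits)
print(f"SDR: {sdr_stops:.1f} stops, HDR: {hdr_stops:.1f} stops")
```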

The display you used as an example is "HDR" via the "Peak Illuminator" feature. It's just dimming the LED array, as all "HDR" displays are.

Nope. HDR is overdriving the LED array, not dimming it. Yes, most HDR displays do use local dimming, but OLED doesn't; it just drives each pixel by itself, and it can get up to 600+ nits, which isn't as bright as LED-backlit LCDs, but every single pixel can go from true black to 600 nits. Yes, if you don't have enough backlight resolution there can be some localized tone-mapping errors, but on really good displays with 120+ zones, the haloing of bright objects on dark backgrounds is pretty subtle, comparable to the flare in a good DLP rear-projection display.
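As a toy illustration of why zone count matters, here's a sketch of zone-based local dimming; the grid size, peak level, and LC contrast here are made-up numbers for illustration, not any real panel's specs:

```python
import numpy as np

# Toy zone-based local dimming: each backlight zone is driven to its
# brightest pixel and the LCD layer attenuates per pixel -- but the LC
# layer can't block light perfectly, so one bright pixel lifts the
# blacks of its whole zone (the halo).

def local_dimming(target_nits, grid=(8, 15), peak=1000.0, leak=0.001):
    h, w = target_nits.shape
    zh, zw = h // grid[0], w // grid[1]
    out = np.empty_like(target_nits, dtype=float)
    for i in range(grid[0]):
        for j in range(grid[1]):
            ys = slice(i * zh, (i + 1) * zh)
            xs = slice(j * zw, (j + 1) * zw)
            block = target_nits[ys, xs]
            backlight = min(block.max(), peak)   # zone LED drive level
            trans = np.clip(block / max(backlight, 1e-6), leak, 1.0)
            out[ys, xs] = trans * backlight      # what the viewer sees
    return out

# One 1,000-nit dot on a black frame: with an 8x15 (120-zone) grid the
# dot's zone leaks ~1 nit of "black" while the rest stays truly black.
frame = np.zeros((1080, 1920))
frame[540, 960] = 1000.0
shown = local_dimming(frame)
print(shown.max(), shown[0, 0], shown[541, 961])  # 1000.0, 0.0, ~1.0
```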

There is definitely a benefit to HDR. Just as looking at a photograph of a street lamp is different from looking at the street lamp itself, having real peak values up in the thousands of nits gives your eyes and brain the impression of something "more real", because the display isn't trying to trick you into thinking you're looking at a bright object... the object actually is bright.

Comment Re:Sync stability (Score 1) 37

Nooooope. I have a Vizio UHD TV, and apparently fullscreen Netflix actually changes your display settings; even over HDMI 2.0, when I maximize or minimize the Netflix window I lose input, and sometimes the TV completely loses sync and says "No Input" until you switch to a different HDMI port and back. Sadly, it seems to have gotten way worse.

Comment Re:What the? (Score 1) 37

No, HDR is about fucking with contrast in one part of the image and fucking with contrast differently in another part of the image.
It looks terrible every fucking time, and it's less accurate than just linearly plotting everything after setting your curves once for the whole image.
It's absolutely retarded to have a curve that is different over different parts of the same image.

Effectively every sentence you wrote is completely wrong. What you are calling "HDR" is actually localized tone mapping: a filter effect, like adjusting the histogram; it's not HDR. Your reaction is like someone looking at a red/green anaglyph stereo image (http://www.designcommunity.com/scrapbook/images/125.jpg) without glasses on and saying "This 3D business is AWFUL, the colors are all weird and the image is doubled. It doesn't look dimensional at all!"

HDR just means "High Dynamic Range". HDR displays are about accurately reproducing the world as captured. If you film a bright flashlight, the bulb could theoretically be so bright on screen that you'd have to squint or shield your eyes. That would probably be a poor aesthetic choice, but the point isn't localized tone mapping; it's reproducing the world as it really exists slightly better.

The trick is that they want to deliver 16+ stops of luminance data without breaking HDMI or Blu-ray, so they heavily compress roughly 20 stops of dynamic range into only 10 bits. That means the transfer curve is non-linear, and that's nothing new: almost every image in existence is already gamma-encoded with either Rec709's or sRGB's curve, because storing an SDR (Standard Dynamic Range) image like a JPEG in linear would waste a lot of bits on precision the human eye can't see.
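For the curious, the non-linear curve HDR10 actually uses is SMPTE ST 2084, the "PQ" curve. Here's a minimal Python sketch of the encode side, using the constants from the published standard (the sample luminances are just illustrative):

```python
# SMPTE ST 2084 (PQ) constants, as published in the standard.
M1, M2 = 2610 / 16384, 2523 / 32
C1, C2, C3 = 3424 / 4096, 2413 / 128, 2392 / 128

def pq_encode(nits: float) -> int:
    """Absolute luminance in nits -> 10-bit PQ code value."""
    y = min(max(nits / 10000.0, 0.0), 1.0)   # normalize to 10,000-nit max
    v = ((C1 + C2 * y**M1) / (1 + C3 * y**M1)) ** M2
    return round(v * 1023)

# 20 stops of luminance squeezed perceptually into 0..1023:
for nits in (0.01, 1, 100, 1000, 10000):
    print(f"{nits:>8} nits -> code {pq_encode(nits)}")
```

Note how roughly half the code values land below 100 nits, where the eye is most sensitive; that's the whole point of the non-linear curve.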

HDR photography does not require multiple exposures. HDR photography is just high-dynamic-range capture. A modern sensor like an Alexa or a RED can capture around 18 stops of dynamic range; that's on the low end of what HDR formats can carry, but a single exposure from one of those is already HDR.

HDR is not shit. HDR is exactly what you want: a wider range of contrast. Instead of displaying black to gray, HDR can display black to blinding white. It's arguably, and in my experience, far more impressive than higher resolution.

Comment Re:Say what (Score 1) 37

Brightness is important to color rendition. If you have a pure, saturated red screen but it's at 50 nits, it's going to look like an unsaturated, drab screen. Crank that same pure red screen up to 5,000 nits and the color will be perceived as eye-searing fire. If you can only ever reproduce a hue at 100 nits, you're missing out on all the colors that are very bright. Take a simple gradient: http://onlineteachingtoolkit.c...

If your display could only handle the bottom half of that luminance range, you could never reproduce the top "blue" blue; it would always be a deep blue.

Brightness is an additional dimension to the vibrance of color, separate from the wider gamut and more saturated primaries of Rec2020. That's why we can express three dimensions for every color: hue and saturation (the Rec2020 primaries) and value (HDR).
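As a trivial illustration of value as its own axis, here's the same fully saturated red hue at three brightness levels, using Python's standard colorsys module:

```python
import colorsys

# Same hue (red) and saturation (100%), three different values:
# the low-value swatch reads as drab even though its chromaticity
# is identical -- only the brightness axis changed.
for value in (0.05, 0.5, 1.0):
    r, g, b = colorsys.hsv_to_rgb(0.0, 1.0, value)
    print(f"value={value:4}: RGB = ({r:.2f}, {g:.2f}, {b:.2f})")
```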

Also, HDR brings 10-bit color with it almost by necessity, as much as Rec2020's larger gamut does, so HDR material is at least 10-bit.
