Ultra HDTV on Display for the First Time 314
fdiskne1 writes "According to a story by the BBC, the successor to HDTV is already out there. The resolution? 7680 x 4320 pixels. Despite the 'wow' factor, the only screens capable of using Ultra High Definition Television are large movie screens, and no television channel has the bandwidth needed for this image. Some experts, in fact, say the technology is only a novelty. Until the rest of the necessary technology catches up, the only foreseen use for Ultra HDTV is in movie theatres and museum video archives." From the article: "Dr. Masaru Kanazawa, one of NHK's senior research engineers, helped develop the technology. He told the BBC News website: 'When we designed HDTV 40 years ago our target was to make people feel like they were watching the real object. Our target now is to make people feel that they are in the scene.' As well as the higher picture resolution, the Ultra HD standard incorporates an advanced version of surround sound that uses 24 loudspeakers. "
Goddamnit... (Score:5, Funny)
Backlash? There's a cycle for this stuff (Score:5, Insightful)
Digital photography was pre-announced. Looked great, even at megapixel resolutions. Kodak scoffed, and so did Fuji. Both hedged their bets, and it's a great thing they did or they'd be in Chapter 7. It took about the same time from pre-announcement to mass-market approval. Now you can go to Brookstone and get a 640x320 matchbox-sized camera for $50, and digital 'disposables' are arriving.
The 'cool it' attitude is anti-consumption. Do we need television AT ALL? That's a question still to be answered. I'm all in favor of advancing technology, especially if it feeds the poor and gives quality of life a boost. While an Ultra HD TV might have only speculative value, it pushes the boundary, and that's what humanity is all about.
So fie on your 'fringe' technology. PCs were 'fringe' when I was soldering together and wire-wrapping motherboards in the pre-IBM and pre-Kaypro days. What we did, goofy as it sounds, is the reason you can post on
Re:Backlash? There's a cycle for this stuff (Score:5, Funny)
Can I get off your lawn now?
k thanx
Re: (Score:3, Interesting)
Re: (Score:2)
Re: (Score:2)
Well, and seeing how it really is NOT being adopted by the general public as a 'have to have' new technology, I really doubt that Ultra HD will be in much demand in 30-40 years.
There just isn't that much demand for it yet... and I gotta admit, I LOVE new tech toys. I like to play with the newest and neatest, but I'm frankly not in THAT much of a hurry for HD. I've recently gotten a DLP projector that can do 720p resolutions...and I'm curr
Re: (Score:2)
I also agree with your opinion about qu
Oh good! (Score:2)
Now I won't have to lose geek credibility when I say SD is "good enough."
The device (Score:5, Funny)
Re: (Score:2, Insightful)
Re: (Score:2)
The outline of the story of nine movies was written before any of them were shot. Lucas picked the middle of the story partly because he felt it was the only portion that could be carried off successfully with the technology of the day, and partly because he felt it would be the most palatable.
This isn't to say that the
Re: (Score:2)
The truth is, he wrote a long movie which started in the middle to feel like a Saturday serial, and upon realizing it was too long to shoot, he took the first third of his idea and created Star Wars.
Empire and Jedi did not follow, to the letter, the remaining script ideas he had written. For example, the scenes with the Ewoks were orig
Re: (Score:3, Informative)
Han Solo was a late addition who, up until shooting, was supposed to be killed by Jabba. Star Wars was supposed to be much darker. He used WWII films like Dam Busters as inspiration, as well as Kurosawa's Hidden Fortress. Both of those he credits. But he also lifted heavily from Dune, which he doesn't credit.
The early drafts focus more on the spice trade
Re: (Score:2, Informative)
Re: (Score:2)
Re: (Score:3, Insightful)
Does anyone actually believe this anymore? It became patently obvious by Return of the Jedi that Lucas was making up the story as he went along. What other reason would there be to reuse the plot of the first film? Blow up the Death Star? Where have I seen that before?
Or perhaps he actually meant, all along, for Luke and Leia to be siblings and for Anakin to have built C-3PO. That thing about Leia remembering her moth
Re: (Score:2, Funny)
Re:The device (Score:5, Funny)
[Spoken to the RIAA at the door]: "These are not the pirated copies of Star Wars you are looking for..." (waves hand)
Anyone have a video link of the demonstration? (Score:5, Funny)
640 (Score:2, Funny)
hrmph! (Score:3, Funny)
Atleast books will always have a higher (mental) resolution, it's to bad nobody reads anymore.
Re:hrmph! (Score:5, Funny)
It's a shame that writing skills are on the decline, too.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Atleast books will always have a higher (mental) resolution, it's to bad nobody reads anymore. I'm sorry, I can't understand the strange symbols you've used in your post. Mind posting a summary of your comments to YouTube?
Re: (Score:3, Funny)
I would but I couldn't find any videos about it.
Ahead of their time (Score:2)
Whoah,
Re: (Score:2, Offtopic)
Re: (Score:2)
The final resolution jump? (Score:5, Interesting)
I wonder: can the human eye even see such high resolution? Does it even matter at that point? I mean,
According to this page [72.14.205.104] it would appear that each human eye is a 15 megapixel camera.
If my maths are correctish then 7680 x 4320 is 33 million pixels.
So then, the question is - does this mean that by adding both eyes together, at best humans have 30 megapixel resolution vision?
Could this be considered "full human" resolution?
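For what it's worth, the arithmetic holds up (a quick Python sketch, taking the linked page's 15 MP/eye figure at face value):

    uhd = 7680 * 4320
    print(uhd)              # 33,177,600 -- about 33 million pixels
    print(uhd > 2 * 15e6)   # True: one UHD frame exceeds even the naive 30 MP total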
Re: (Score:2)
Not all at once, of course. The point of having higher resolution is so that when you attend to one part of the scene, you won't perceive a reduction in image quality.
Re: (Score:3, Insightful)
Re: (Score:2)
Rather like the way scientists increased the speed of light in Futurama.
Re:The final resolution jump? (Score:5, Informative)
IAAVN (I Am A Visual Neuroscientist). The answer to your question is, "no." The article pointed to, the one claiming 15 million pixels, specifically states that the pixels are variable resolution. The photoreceptors in the central part of human (and primate) vision are packed at a much, much higher density than those at the periphery. The standard resolution in central vision for people with 20/20 vision is about 3 minutes of arc; at 3 degrees away from the fovea, this drops to 1/2 that figure; and at only 20 degrees eccentric (about two fist widths held at arm's length), it's at 1/10. (If you've never heard that vision is variable resolution, try this trick: open a book or newspaper and stare at a single word in the middle of a paragraph; then, without moving your eyes, see how far to the left, right, up and down you can read. You will find that the limits are astonishingly narrow. Evenly sampled high-resolution vision is a powerful illusion based on the extreme resolution we have in the central part of vision, the ability to move our eyes, and some incredible circuitry in our brains.)
More importantly, saying you have N by M pixels alone doesn't give visual resolution; it gives object resolution: it is not possible to resolve individual pixels in an 8x10 photo printed at VGA resolution and held 10 meters away, despite the relatively low resolution of the image. You need to know not only the resolution of the image but the viewing distance as well to be able to say whether the combination approaches the limits of human vision.
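To make the distance point concrete, here's a rough Python sketch (assuming ~1 arcminute of acuity for central vision; as noted above, the real figure varies with eccentricity):

    import math

    def pixel_angle_arcmin(pixel_m, distance_m):
        # Angle one pixel subtends at the eye, in arcminutes.
        return math.degrees(math.atan2(pixel_m, distance_m)) * 60

    pitch = 10 * 0.0254 / 640                # 8x10" print at VGA: ~0.4 mm per pixel
    print(pixel_angle_arcmin(pitch, 0.3))    # ~4.5' at 30 cm: pixels plainly visible
    print(pixel_angle_arcmin(pitch, 10.0))   # ~0.14' at 10 m: far below ~1' acuity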
Re: (Score:3, Funny)
This is Slashdot, where a legitimate reply involves the word "binoculars."
How about a simple Variable Resolution spec? (Score:2)
Am I missing something here? I can play 1080i MP4 files on a 720p screen. Ignoring the lack of source material for a minute, I could hypothetically have a file @ 2880 * 1620 and still play it on my current setup. Further, if
Re: (Score:3, Informative)
It's not that simple, mainly because the human eye's resolution isn't uniform. Basically, because of the fovea [wikipedia.org], in the center of our vision we have an area about 2 large (4 times the apparent diameter of the moon) offering us a resolution of about 28" (seconds of arc), the resolution outside this area being lower. Since it was projected on a 7 x 4 meter screen [cdfreaks.com], each pixel is about 0.9 mm x 0.9 mm.
Which means that if I got my maths right, you would have to be 6.94 meters (almost 23 feet) away
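Redoing that calculation in Python (the small difference from the 6.94 m figure comes down to how the pixel pitch is rounded):

    import math

    pixel_m = 7.0 / 7680                   # ~0.91 mm pixel pitch on a 7 m wide screen
    acuity_rad = math.radians(28 / 3600)   # 28 arcseconds of foveal resolution
    print(pixel_m / acuity_rad)            # ~6.7 m: where one pixel subtends 28 arcsec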
Re: (Score:2)
we have an area about 2 large
Snap! I meant 2 degrees of arc but the degree symbol didn't come out right...
Re: (Score:2)
60" HDTV at 8':
Resolution: 1920×1080
Eye limit: 1.17M pixels
Display limit: 2.07M pixels
Eye & display limit: 1.06M pixels
Why the difference? Because you have
Re: (Score:2)
Re: (Score:2)
A very simple reason: distance. When was the last time you looked at your photo album from 10-50 feet away? Or how about watching your 50" TV or 30' cinema screen from 2 feet away?
Re: (Score:2)
Re:The final resolution jump? (Score:5, Interesting)
I don't know the exact numbers, but we'll use the figure of 15 megapixels per eye... just because a single eye is 15MP doesn't mean that both eyes working in tandem give you 30MP. In astronomy, you can drastically increase the resolution of a picture by taking a dozen slightly offset exposures of the same field. If they're taken at the same time, you can interpolate the missing data and produce a *really* high-resolution picture. I'd be surprised if we aren't subconsciously doing the same thing with our eyes.
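A toy 1-D version of that idea in Python (a sketch of the dithering trick, not a claim about how the eye actually does it): two coarse exposures of the same scene, offset by half a sample, interleave into a record with twice the sampling rate of either detector alone.

    import numpy as np

    signal = lambda x: np.sin(2 * np.pi * 7 * x)   # the "scene" being imaged

    n = 20                   # samples per exposure
    xs_a = np.arange(n) / n  # exposure A sample positions
    xs_b = xs_a + 0.5 / n    # exposure B, shifted half a "pixel"

    # Interleave the two exposures: the combined record samples the scene
    # at twice the rate of either exposure on its own.
    xs, combined = np.empty(2 * n), np.empty(2 * n)
    xs[0::2], xs[1::2] = xs_a, xs_b
    combined[0::2], combined[1::2] = signal(xs_a), signal(xs_b)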
Re: (Score:2)
Re: (Score:2)
Film a perfect movie scene on 35mm film at 24fps and on an HDTV camera at 60fps.
Show both side by side and the viewer will prefer the film, because the mushiness and slower framerate make it more dreamy and therefore more immersive to the viewer.
The razor-sharp crispness and super-fast framerate of 1080p destroy quite a bit of the illusion and therefore make viewers not like it as much.
This is why films are still at 24fps instead of the higher rate. The digita
Re: (Score:3, Interesting)
I was also under the impression that theaters originally cut down the frame rate to help shrink the size of the film reels. 24FPS was the slowest (and thus cheapest and smallest) they could go before drastically reducing quality and making the image look choppy. In my experience people prefer the digital theaters to their film counterparts for
Re: (Score:3, Informative)
I sense you've misunderstood high-resolution imaging in astronomy here.
(1) The resolution of an image is primarily determined by the optics, not the detector.
Eyes don't always work in tandem (Score:4, Interesting)
To emulate how I do it, just close one of your eyes and do things that way. I can see out of the other eye, of course, but the brain treats it as peripheral vision unless I'm using it to focus on an object -- I can swap which eye I use to focus at will.
[1] I was born with one of my eye muscles screwed up, so I was the opposite of cross-eyed.
Doesn't one eye make it harder to focus initially? (Score:2)
25 years sounds about right (Score:3, Insightful)
Re: (Score:2, Informative)
Re: (Score:2)
TVs are getting cheap (what you paid for a "big screen" TV back in the 1980s would get you a nice HDTV these days, IIRC), content is starting to become available (as equipment etc. becomes much cheaper), bandwidth usage remains si
Re:25 years sounds about right (Score:5, Informative)
I'm probably the only one here who 1) is old enough to remember, and 2) was actually paying attention to the HDTV fiasco from 1985 onwards.
Analog HDTV was rolled out in Japan in the 1980s. A bit stung, the American television manufacturers and the networks hammered together a proposal to broadcast 1080p in the following way: standard def over the usual VHF channels, while the HD component would be broadcast over unused channels. Thus, Channel 2 CBS would go out as normal, while an HDTV set would take that signal and add information broadcast over channel, say, 3. All analog. All broadcast. The rollout would have been around 1990 or so.
A funny thing happened: digital video. The broadcasters saw what digital compression could do for them. Why use just one channel, with all that bandwidth, when we can now use the same two channels and broadcast four programs simultaneously? We promise that sometimes we'll broadcast in HD; just, most of the time, we'd like to make more money with four low-def channels. And they demanded, and got, 1080i, to halve the signal and thereby enable more channels on the side.
And their wish was granted. These were the years of no-regulation, after all. The issue of public ownership of the airwaves was going bye-bye, and the government wanted to auction off those frequencies anyway, which leads us to
Cable. Since so much programming was going over cable, the Gov decided that public regulation of public airwaves was silly and undermined competition. So long, Fairness Doctrine; so long, limits on corporate ownership and monopoly control. And additionally, why force public airwaves to go digital when cable could deliver it so much better?
And network TV didn't really want to pay to upgrade, either, so that slowed it down a lot. Delay after delay...
THEN the kicker. The "content owners" saw that in the digital age they had a chance to lock down signals and force people to pay each time they accessed their "property". They wanted taping to go away as well -- they hated VCRs and almost killed the tech in 1984. They could win this one, and so was born the Broadcast Flag, a digital lock on transmissions that controlled the use of the program. Cue a big delay as HDMI, HDCP and all the other locks were developed and approved by the "content" industry.
Now... it's the 21st century, almost 20 years late, and we've got crappy 1080i signals going over the air, infomercials clogging all those channels we can access for free, and we can't record the standard 1080i signal.
Remember, the public airwaves are supposed to belong to We the People, and the broadcasters and producers are supposed to dance to our tune. Somehow they are now the masters, and we the ones begging for mercy.
no. (Score:3, Interesting)
No, there was never a 1080p analog broadcast standard in the US. There was never any serious attention paid to delivering HDTV over the air in the US until digital compression came around. This is because it was expected to take five regular channels to send one HD channel. At that point it became a war between compressed 720p and compressed 1080i.
Both were considered the best that coul
Typical (Score:4, Funny)
What I hate about TV is how the specs are so hardware-dependent: all kinds of numbers and letters, and if yours differs by one character, your thousands of dollars might have been wasted.
IMO it should be more like computers: you basically have a processor that determines your data processing and a display device that determines your viewable resolution. Almost everything else is software, and thus improvements are continuous and ongoing. It's a much better model than upgrading every couple of decades, with a half-decade period when your TV is too good for the signal.
Once TV is based on more Internet-like digital technologies, this will hopefully happen.
Re: (Score:2)
And then we will have movies starting with the message "This movie is best enjoyed on a 1024x768 TV." And if your TV has a higher resolution, the movie will be shown in a small rectangle in the middle.
At least that would be the analogue of many of today's web pages.
Re: (Score:2)
With computers you can handle any source your CPU is up to, and you can always click full-screen.
And you have source options; ever watched a trailer from Apple? You get the choice of small, medium, large and HD res.
With TVs it's either HD or not, and HD versions are often completely different channels. It's lame, just like those "+1 hour" channels are lame - if you had decent scaling/shifting abilities in the first place you could do 10x as much, 10x as easily.
Pity (Score:3, Insightful)
As technology matures there's a race for bigger, faster, and finer. But this race is not eternal: within a few years the sweet spot is hit and people stop caring about higher resolutions.
With TV resolution this sweet spot is already somewhere between DVD and EDTV, way below 1080p. So yeah, don't expect "technology to catch up" in that respect, as the summary suggests, since no one cares for it to catch up in this way.
Re: (Score:2)
40 years ago!? (Score:3, Interesting)
Whoa! 40 years ago!? Amazing! It's crazy how long it took to go public/mainstream. I guess it's one thing to design something and quite another to build upon it.
bandwidth (Score:2, Insightful)
x * y * bytes per pixel * frames per second gives bytes per second
7680*4320*3*25/1024/1024/1024 = 2.3174 gigabytes per second
That's quite a chunk for streaming video. Of course, there will be compression techniques and other tricks, but it's still pretty impressive.
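Reproducing the arithmetic in Python (uncompressed, assuming 3 bytes per pixel and 25 fps as above):

    def raw_rate_gib(width, height, bytes_per_pixel, fps):
        # Uncompressed video bandwidth in GiB/s.
        return width * height * bytes_per_pixel * fps / 1024**3

    print(raw_rate_gib(7680, 4320, 3, 25))   # ~2.32 GiB/s, the figure above
    print(raw_rate_gib(1920, 1080, 3, 25))   # ~0.14 GiB/s for 1080p, for comparison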
Re: (Score:2, Informative)
So 7680 x 4320 x 5 at 60fps = 9.3GB/s.
Another comment said that this was 25 years away, although I wouldn't be surprised if it was only 15 years away the way things are progressing. 9.3GB/s is offered on even low-end graphics cards these days, but the bandwidth problem is between the player and the display, i.e., the HDMI eq
Moon movies (Score:2)
Yay!!!! (Score:2)
1080i/p pixels = a 4x4 block of real physical pixels, 720p pixels = a 6x6 block of physical pixels, 480i/p pixels = a 9x9 block of physical pixels.
1080i PiP is still 1080i at 1/4 of the screen. No downscaling.
THIS folks is what I've been waiting for ever since HDTV was announced.
If only I could afford to be an early adopter on this technol
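For what it's worth, the block arithmetic above checks out (a quick Python sketch, dividing the panel's 4320 lines by each format's line count):

    panel_lines = 4320
    for name, lines in (("1080i/p", 1080), ("720p", 720), ("480i/p", 480)):
        s = panel_lines // lines
        print(f"{name}: each source pixel -> a {s}x{s} physical block")   # 4, 6, 9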
Re: (Score:2)
Ooooh flat! (Score:2)
Give me gyroscopes and holograms! It doesn't matter if that Klingon bat'leth is UHD resolution...all you'll see is a low-res blur before your very high res intestines spill out before you!
(Of course, those aren't really your intestines, but this holodeck goes for intensity in imagery.)
Re: (Score:2)
Since the safety protocols have no doubt broken/been bypassed/been shut off/been overridden by a rogue AI inside the holodeck program itself, those are, in fact, really your intestines.
Sony vs Microsoft (Score:3, Funny)
Meanwhile, CmdrTaco [deceased] writes on how the PlayStation's "neural implant connect-kinetic extremity dongle [N.I.C.K.E.D.]" was "actually just a rehash of the Wiiiiiiiiiiiiiiiiiii's controller."
That's 33 Megapixels! Camera optics ready? (Score:2)
What's more, while all the el
Re: (Score:2)
*This statement is to be taken strictly in the context of improving on current optics for higher resolutions. Obviously, the
Re: (Score:2)
Also... (Score:2, Funny)
But why? (Score:2)
I guess I'm just iggernant (Score:2)
How is this different from me defining a video spec that operates at 1048576 x 589824 pixels x 120 fps, non-interlaced? Is it just that they spent the mo
WHUXGA (7680 x 4800 pixels) (Score:2, Informative)
From http://en.wikipedia.org/wiki/HUXGA [wikipedia.org]
WHUXGA: 7680×4800, 16:10, 37 Mpixels
WHUXGA, an abbreviation for Wide Hex[adecatuple] Ultra Extended Graphics Array, is a display standard that can support a resolution of up to 7680 x 4800 pixels, assuming a 16:10 aspect ratio. The name comes from the fact that it has sixteen (hexadecatuple) times as many pixels as a WUXGA display. As of 2005, one would need 12 such displays to render certain single-shot digital pictures, for
The Law of Diminishing Returns (Score:2)
Many people, when shown a 60" screen at a reasonable viewing distance, can't tell the difference between 720p and 1080p. The added resolution of UHDTV would only be of benefit on a large movie screen from a close viewing distance. But movies implemented the ideal screen resolution decades ago... It's called film.
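That squares with a bit of acuity arithmetic. A rough Python sketch, assuming ~1 arcminute of acuity and a 16:9 panel; beyond these distances the extra pixels can't be resolved at all:

    import math

    ACUITY_RAD = math.radians(1 / 60)   # ~1 arcminute, typical 20/20 acuity

    def max_useful_distance_m(diag_in, h_res):
        # Distance beyond which individual pixels can no longer be resolved.
        width_m = diag_in * 0.0254 * 16 / math.hypot(16, 9)
        return (width_m / h_res) / ACUITY_RAD

    print(max_useful_distance_m(60, 1280))   # ~3.6 m: past this, 720p pixels vanish
    print(max_useful_distance_m(60, 1920))   # ~2.4 m: past this, 1080p buys nothing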
Re: (Score:2)
ONLY theaters? (Score:2)
Film (Score:3, Informative)
Triptychs (Score:2)
I'd expect this kind of rig to already be stand
It _is_ useful for normal screens... (Score:2)
Comment removed (Score:4, Informative)
This is old... Prototype was out in 2003... (Score:2)
See the announcement from 2003 [cdfreaks.com].
Totally, utterly pointless... (Score:2)
It's only when you want to examine small details of the picture that you need more pixels, which is why landscape photographers use all the megapixels they can get. But for TV or projection, it is just plain silly.
The coolness factor is high, though.
Re: (Score:3, Informative)
BTW, it's not true that you get it with unlimited resolution. There are several limits to the resolution you get. First is the wavelength of light. Red light has a wavelength of about 700 nm, so you can't resolve anything finer than that in red. Violet light is about 400 nm, so you have nearly twice the resolution there, but it's still limited.
The second limit is in
Re:Great... (Score:4, Informative)
This is pure nonsense, because our brain doesn't work in pixels. It works in concepts, and what you think you're seeing is actually constructed in your brain from a combination of what your optic nerve feeds to your brain, and what you remember about seeing similar things before. YOU DO NOT PERCEIVE REALITY. You perceive your brain's model of reality. This is the most important thing to remember about your senses, and most people have never heard it or are all too willing to forget and pretend that yes, they are directly connected to reality.
Do some research on saccades [wikipedia.org]... but here's the meaty part of the wikipedia page:
In other words, you have no idea what you're talking about.
Re:Great... (Score:4, Insightful)
Where in my whole post did I speak about the brain?
Your brain usually doesn't say "pixel" even when you look at a screen with pixels large enough to see the difference. Just like your brain doesn't say "low frame rate", but "flicker".
And your quote from Wikipedia doesn't change anything from what I said: Your retina determines the resolution you get. The fact that this resolution is not constant throughout the visual field doesn't change that basic fact. Nor does the fact that you unknowingly move your eyes around in order to get a larger area in high resolution.
You simply don't get more information through your eyes than your retina gives you. The fact that your brain manipulates this information by filtering, adding from memory, and even modifying due to expectations in no way alters that fact, any more than it alters the fact that your TV has a limited resolution (despite your brain telling you there are people and things moving on the screen of your TV, instead of a rectangular array of colored dots).
No, you are the one who has no idea what I'm talking about.
Re: (Score:3, Informative)
My point, which you handily missed, is that you cannot talk about vision without talking about the brain. Vision doesn't live in the eyes, or even in the optic nerve. That's simply where the data used for vision comes from, and where the preprocessing occurs. Vision exists in the brain, and your brain composites data from your eyes and from memory to produce an internal representation of your surroundings that you perceive as visual data.
As such, talk
Re:Great... (Score:5, Funny)
No, see you're missing the point. I don't want REAL LIFE. I want LIFELIKE. Because let's face it, no matter what happens in real life, I doubt I'm ever gonna have the opportunity to bend Elisha Cuthbert over the closest piece of furniture and give her the worst 30 seconds of her life.
But if we can make screens mimic reality, then we're one step closer to every twisted geek's fantasy - the Holodeck. And I guarantee you, Holodeck-Elisha is more open to experimentation. One just has to hope that Real-Holographic-Simulated-Evil-Lincoln doesn't spring to life and go on a rampage, wrecking the ambience.
Re: (Score:2)
Re: (Score:3, Funny)
You remind me of something local journalists in my country have started using way too much in news reports, which is odd given that it's nonsense.
They like to say that some actual event that happened in our actual world is "like a real reality show"...
"Driving on the roads with your car is like a real reality show".
There should honestly be minimum intelligence requirements for becoming a reporter, I think.
Re: (Score:2, Funny)
Re: (Score:2)
There was a BBC study that suggested 1280x720 was good enough for most people, based on studies of visual acuity. The study suggested that viewers would continue to watch television from a long way off. (The 16:9 aspect ratio was in part selected to encourage people to sit closer to the screen.)
Replacing IMAX? (Score:3, Informative)
Re: (Score:3, Funny)
You think that is something. I'd like to see if they could transfer over the "Stones at the MAX" they did of the Rolling Stones concert (Steel Wheels?) in IMAX. I swear,
Re: (Score:3, Insightful)
And still, it's completely irrelevant. Yes, our eye may be able to sense very small amounts of light, but that has nothing to do with resolution; the eye must be able to pinpoint the location where the photon landed, and that is limited by the 6 million or so cones we have, and a lot of parallel/serial pr
Re: (Score:2)
Re: (Score:2)
Sa
Re: (Score:2)
Gah (Score:2)