Comment: Re:Still a hack, but way better than nothing. (Score 1) 53

by Arkh89 (#49763509) Attached to: Software Patch Fixes Mars Curiosity Rover's Auto-focus Glitch

You could also avoid having to do it at all by using an EDOF (extended depth of field) system (such as the one shown in this demo). It's just not a software-only solution: it has to be designed from the start with the lens and the camera (you voluntarily insert aberrations that make the system blurry "the same way" over a much larger range, but this blur is easily inverted by a simple Wiener deconvolution of the image).
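As a rough sketch of that last step: assuming the point spread function (PSF) of the optics is known, Wiener deconvolution is just a regularized division in the frequency domain (the constant `nsr` below is a stand-in for a real noise-to-signal estimate):

```python
import numpy as np

def wiener_deconvolve(blurred, psf, nsr=0.01):
    """Deblur an image given a known PSF via Wiener deconvolution.

    nsr is an (assumed constant) noise-to-signal power ratio; a real
    system would estimate it from the data.
    """
    H = np.fft.fft2(psf, s=blurred.shape)   # transfer function of the blur
    G = np.fft.fft2(blurred)                # spectrum of the blurred image
    # Wiener filter: conj(H) / (|H|^2 + NSR) -- stable even where H is small
    W = np.conj(H) / (np.abs(H) ** 2 + nsr)
    return np.real(np.fft.ifft2(W * G))
```

The whole point of the engineered aberrations is to keep |H| well away from zero across the focus range, so this division stays well-conditioned.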

+ - Oculus Founder Hit With Lawsuit->

Submitted by Anonymous Coward
An anonymous reader writes: Palmer Luckey, founder of VR headset-maker Oculus, has been sued by a company accusing him of taking its confidential information and passing it off as his own. Total Recall Technologies, based in Hawaii, claims it hired Luckey in 2011 to build a head-mounted display. Part of that employment involved Luckey signing a confidentiality agreement. In August 2012, Luckey launched a Kickstarter campaign for the Oculus Rift headset, and Facebook bought his company last year for $2 billion. TRT is seeking compensatory and punitive damages (PDF).
Link to Original Source

+ - YouTube Live Streams Now Support HTML5 Playback And 60fps Video

Submitted by Anonymous Coward
An anonymous reader writes: YouTube today announced it is enabling HTML5 playback for live streams. At the same time, live streams can now be viewed at 60 frames per second (fps). A few puzzle pieces had to come together to make this possible. On October 29, YouTube quietly turned on 60fps support for videos uploaded on that date and later. While clips uploaded before that date remain at 30fps, new videos shot at 60fps suddenly started playing back at their proper framerate.

Comment: Re:Not Holograms (Score 1) 99

by Arkh89 (#49588623) Attached to: Microsoft Announces Windows Holographic Platform

I can construct a 2D image that has proper DOF cues.

Yes, you can construct it, but your eye is still focusing on a plane, and the depth cues you are feeding it do not match that plane. If you "force" your vision onto something rendered blurry, the device had better have some way to detect this and tell the rendering engine. Retinal tracking lets the rendering engine know which part of the scene is being observed (at the center of the FoV), which helps it find the actual depth and focusing parameters.

Yet, whatever tricks you use, the crystalline lens will always return to the same position to form the in-focus image, while you perceive a change in depth of field. This is another mismatch that some people will perceive, and there is no way to correct for it in stereo vision. LFDs might be slightly better at this, but holography is the ultimate solution here.

I don't think that a stereo-based device you can use, with discomfort, for maybe an hour before getting really nauseous will have good commercial potential.

Comment: Re:Not Holograms (Score 1) 99

by Arkh89 (#49587811) Attached to: Microsoft Announces Windows Holographic Platform

Each retina collects photons on a surface and with a single eye you get a 2D image*. Your brain combines the images from your eyes in very complex ways to create a 3D internal model, but as far as what needs to get shined into your eyes, it's just the 2D image constructed on your retina that matters.

That is incorrect. There are numerous 3D depth perception cues, among which are stereo vision, depth of field (things far from what you are looking at appear blurry) and prior knowledge of object sizes (knowing the average size of a car, you know that if you see it "small", it must be far away). With only one eye, the last two remain perfectly valid. The very last one is simple to reproduce, but depth of field is far from trivial to implement. For a VR headset such as the Oculus, you would need retinal tracking, a mapping to the depth of the observed object, and a way to adapt the rendering of the whole scene to that depth of field (with, of course, very low latency); see http://3dvis.optics.arizona.edu/research/research.html. Mismatched cues in a system can cause serious discomfort to a large portion of the population.
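To illustrate why the depth-of-field cue matters even with one eye, here is a rough thin-lens sketch of the retinal blur circle for an object away from the focus plane. The ~17 mm focal length and ~4 mm pupil are ballpark assumptions for the eye, not measured values:

```python
def blur_diameter_mm(focus_dist_m, obj_dist_m, focal_mm=17.0, pupil_mm=4.0):
    """Approximate diameter of the retinal blur circle for an object at
    obj_dist when the eye (modeled as a thin lens) is focused at focus_dist.
    All distances are measured from the lens."""
    f = focal_mm / 1000
    aperture = pupil_mm / 1000
    # thin-lens image distances for the focus plane and for the object
    v_focus = 1 / (1 / f - 1 / focus_dist_m)
    v_obj = 1 / (1 / f - 1 / obj_dist_m)
    # geometric blur from similar triangles between the two image planes
    return abs(aperture * (v_obj - v_focus) / v_obj) * 1000

# focused at 0.5 m, looking past it at an object 5 m away:
print(f"{blur_diameter_mm(0.5, 5.0):.3f} mm")  # roughly a tenth of a millimeter
```

A flat stereo display renders everything sharp at the screen's focal plane, so this blur (and the accommodation that drives it) never happens, which is exactly the cue mismatch described above.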

Comment: Re:Not Holograms (Score 1) 99

by Arkh89 (#49586781) Attached to: Microsoft Announces Windows Holographic Platform

but the chances are it's not actually holographic.

You can be certain of it: there is no real-time holographic display as of today. For LFDs (light field displays), Nvidia had a prototype a few years back, and it is reasonable to think that Magic Leap is pursuing something similar. Yet I don't think the technology is mature enough to generate the dense light fields needed for high-quality scene rendering.

Comment: Re:Quick note (Score 4, Informative) 37

by Arkh89 (#49540751) Attached to: Virtual Telescope Readied To Image Black Hole's 'Ring of Fire'

Nope, you need the reference phase to still be coherent (temporally and spatially) with the light from the observed object, so interference is only possible between two parts of the same wave of light, separated in space (think of two pinholes in a plane through which you collect the light) and/or in time (think of a delay line: let one part of the light you collected run a longer distance). The first is the famous Young's double-slit experiment, the second is the Michelson interferometer.

Also, for reference: the frequency of visible EM fields is on the order of hundreds of THz (hundreds of thousands of GHz).
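That figure is just f = c / lambda; a quick check across the visible band:

```python
C = 299_792_458  # speed of light in vacuum, m/s

def wavelength_to_thz(wavelength_nm):
    """Convert a wavelength in nm to a frequency in THz (f = c / lambda)."""
    return C / (wavelength_nm * 1e-9) / 1e12

for nm in (700, 550, 400):  # red, green, violet
    print(f"{nm} nm -> {wavelength_to_thz(nm):.0f} THz")
# 700 nm -> 428 THz
# 550 nm -> 545 THz
# 400 nm -> 749 THz
```

No electronics can sample a field oscillating that fast, which is why visible-light interferometry has to combine the beams optically rather than record them.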

Comment: Quick note (Score 5, Informative) 37

by Arkh89 (#49539791) Attached to: Virtual Telescope Readied To Image Black Hole's 'Ring of Fire'

This telescope operates in the radio bands (sub-millimeter), not in the visible. That's why it is easy to do interferometry over very long baselines. In the visible domain this is very tricky to achieve beyond a couple of hundred meters (as with the VLTI).

You can think of it as filling in, piece by piece, the Fourier transform of the image you want to observe. Every pair of telescopes gives you a measurement in the so-called UV plane (spatial frequencies). The farther apart the telescopes are, the smaller the details you can resolve. Except this is only valid if you can measure the amplitude and phase of the electromagnetic radiation (or find a way to reconstruct it somehow). This is easy in the radio bands. But the oscillation is just too fast at visible wavelengths, so we cannot record and adjust offline; we have to interfere the waves right away...
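The resolution claim is just the diffraction limit lambda/B applied to the longest baseline B. A quick back-of-the-envelope sketch (the 1.3 mm observing wavelength and Earth-scale baseline below are illustrative assumptions, roughly matching a millimeter-band VLBI array):

```python
import math

def baseline_resolution_uas(baseline_m, wavelength_m):
    """Angular resolution lambda/B of a single interferometer baseline,
    converted from radians to microarcseconds."""
    theta_rad = wavelength_m / baseline_m
    return math.degrees(theta_rad) * 3600 * 1e6

# Earth-scale baseline (10,000 km) at a 1.3 mm observing wavelength
print(f"{baseline_resolution_uas(10_000_000, 1.3e-3):.0f} microarcseconds")
# 27 microarcseconds
```

Tens of microarcseconds is what it takes to resolve the shadow of a supermassive black hole, and no single dish could ever get close to that.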

+ - iOS WiFi Bug Allows Remote Reboot Of All Devices In Area 2

Submitted by BronsCon
BronsCon writes: A recently disclosed flaw in iOS 8, dubbed "No iOS Zone", allows an attacker to create a WiFi hotspot that will cause iOS devices to become unstable, crash, and reboot, even when in offline mode. Adi Sharabani and Yair Amit of Skycure are working with Apple on a fix; for now, the only workaround is simply not to be in range of such a malicious network.
