Ah... You mean you are a founder of Crowdtilt, not Kickstarter.
There are two ways that modern projector-based planetariums work. The easy way is one projector with a fisheye lens. The lenses tend to run about $100k, and the single projector has to be very bright because its light is spread across the entire dome. The hard (but arguably better) way is mapping multiple projectors together. This allows a much brighter image, because the brightest single projectors available today are about 40k lumens; eight 20k-lumen projectors are obviously much brighter.
It takes quite a bit of work to map a dome like this. I spent close to 48 hours straight mapping a 90' dome for a party for Putin, and I am considered very fast in the industry. Basically you project a grid and twist the points till they line up correctly, allowing for about 20% overlap between the projections. You can use a modeler like Gmax or the custom warping programs that most professional media servers ship with these days... We use Coolux Pandoras Box.
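Just to illustrate the blend math behind that 20% overlap (a rough sketch, not how Pandoras Box actually does it; a linear ramp for clarity, where real blends usually use a gamma-corrected curve, and the function name is made up):

```python
def blend_weight(x, width, overlap):
    """Brightness multiplier for pixel column x (0..width-1) of one projector.

    Adjacent projectors share `overlap` columns on each side; opposing
    linear ramps in the shared zone keep the combined brightness flat.
    (Outermost edges of the whole array would skip the ramp.)
    """
    if x < overlap:                  # left overlap zone: ramp up
        return x / overlap
    if x >= width - overlap:         # right overlap zone: ramp down
        return (width - x) / overlap
    return 1.0                       # centre: full brightness
```

The point is that wherever two projectors overlap, one projector's ramp-down plus its neighbour's ramp-up always sums to 1.0, so the seam doesn't show as a bright band.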
Ideally all the warping was already done for these guys, and all they had to do was plug their system into a live input (capture) card and route it through the planetarium's media servers. More likely they had to re-map it. They did an OK job, but you can definitely see distortion as the image moves between projectors. The bigger problem they are having is sync. This is always a really difficult issue across multiple systems, and one of the main reasons to use a quality media server. You can clearly see the computers are wildly out of sync at the end of the video. Even 1-2 frames of sync loss is clearly evident in a projector blend.
Either way, the project is really cool. If anyone is interested, there is a free open source media server out there capable of mapping domes and other 3D objects called vvvv (although it is a bitch compared to the commercial solutions). Pure Data is also worth looking at; it is an open source alternative to Max/MSP that does related interactive video things.
For my friend's bachelor party we hired a stripper to come play LAN games with us. She lost.
This is completely true. But it is fun telling them that it is illegal and encrypted. I am certainly not going to build a box to get around copy protection for some client that does not have their content together in time.
The real problem is EDID. DVI and HDMI are always a fight with EDID. It never works the way it should, especially once you bring in DVI Detectives and fiber connections. It is a monstrous pain in the ass. HD-SDI would be so much better. Copy protection is someone else's problem when it comes to professional AV. I have no problem drawing a line there, as large corporations dropping hundreds of thousands of dollars on an event should be paying for content.
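For anyone wondering what EDID actually is under the hood: each base block is 128 bytes starting with a fixed 8-byte header, and all 128 bytes must sum to zero mod 256. A minimal sanity check looks something like this (a sketch only; real validation also parses timings, extension blocks, etc., and the function name is made up):

```python
# Fixed header that opens every EDID 1.x base block
EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def edid_looks_valid(block: bytes) -> bool:
    """Minimal structural check on a 128-byte EDID base block."""
    return (
        len(block) == 128
        and block[:8] == EDID_HEADER
        and sum(block) % 256 == 0    # last byte is a checksum
    )
```

Devices like the DVI Detective basically capture one known-good block like this and replay it forever, so the source always thinks a display is attached.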
There are a few devices that convert HDMI/DVI to HD-SDI for a reasonable amount of money. Blackmagic makes one, I think it's dubbed a DVI Extender. The problem is sync and reliability, neither of which Blackmagic is known for.
The ultimate professional solution is an ImagePRO. They run about $8k and work perfectly. They add about two frames of latency, which sucks for live music events and lip sync, but they are reliable.
We have talked to engineers at NVIDIA for years, trying to convince them to make a decent HD-SDI card. There is just not a big enough market. The one they have now is terrible: it costs as much as an ImagePRO, has the same latency, and lacks the useful options the ImagePRO comes with. Also, as graphics cards get better, you can always carry your ImagePRO along to the new card.
Hopefully one day there will be a serious push for HD-SDI. It runs huge lengths, has no stupid EDID issues, and the connector locks in place.
I work in the high end media server/ video event industry.
I have seen a difference in color space from NVIDIA to ATI. To notice it I had to use a $2 million LED wall with two media servers controlling different parts of the same image through two different graphics cards. They were both high-end cards, but they tend to have slightly different color spaces and sync rates. LED will show off differences in color more than any other medium.
Sometimes you can notice degradation with VGA vs. DVI. A lot of it depends on the quality of the VGA cable and the length of the run. We regularly use 300' VGA cables and the image still looks fine... The other options at those distances are DVI over fiber or over Ethernet cable (DVI -> Ethernet -> DVI converters). VGA is almost always our first choice if the output is within 300' of the computer. It looks fine unless you like studying pixels with a magnifying glass.
During Katrina the power went out throughout New Orleans fairly early on. I still had cell service for about three hours after the city went black, if I remember correctly. The POTS system worked, albeit somewhat sporadically, the entire time. The phone at the house we were staying at surprised the hell out of us by ringing during the peak of the storm, when everything else had been dead for hours. I also remember seeing lines halfway down the block for pay phones the day after.
I remember hearing somewhere that the cell towers were on a battery backup. I am not sure if this is a normal thing or something installed because everyone knew they would lose power when Katrina hit.
I assume that both systems have some sort of backup, but a cell tower takes too much power to run off batteries for long.
Apple is a very safe platform, but the safest software in the world can't protect against Stupid.
Like my boss always says: we try to build things idiot-proof, but they keep building better idiots.
Line up as many pixels as you like. For video playback at least, we are still unfortunately limited by video cards, hard drive access time, and bus speed. We cannot yet play back a video at that resolution. Most video cards cap out at 2560x1600 per output; hard drives can be put in a RAID and solid state drives are damn fast, but the file sizes for video at that resolution are absurd. Bus speed is always an issue: all the information has to be passed from hard drive to processor, and it has to wait its turn.
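Back-of-the-envelope, here is why the bandwidth is the killer. Raw data rate, ignoring compression and blanking intervals (the formula is standard, the function name is made up):

```python
def uncompressed_rate_mb_s(width, height, fps, bytes_per_pixel=3):
    """Raw video data rate in MB/s: pixels/frame * bytes/pixel * frames/s."""
    return width * height * bytes_per_pixel * fps / 1e6

# A single maxed-out 2560x1600 output at 60 fps, 24-bit colour:
print(uncompressed_rate_mb_s(2560, 1600, 60))      # 737.28 MB/s

# A 4K (4096x2160) file at a modest 30 fps:
print(uncompressed_rate_mb_s(4096, 2160, 30))      # ~796 MB/s
```

Real playback uses compressed codecs, so the disk rates are far lower, but the decoded frames still have to cross the bus at something like these rates.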
I work for one of the top media server companies in the world. We max out at a single file at 4K resolution, and even that is a compromise between quality and speed. Anything bigger is actually multiple video files broken up and synced across several systems to several displays (blended projectors, multiple LCDs, whatever).
No doubt, one day we will be there. But the bottleneck is not the displays.
With higher-quality scaling algorithms it is a good workaround till the technology catches up. I know the article said this trick works because of its 8-bit nature, but I still hope for advancements in real-time scaling algorithms. At least for my industry it would be a great help.
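For reference, the cheapest real-time scaler is nearest-neighbour, which is also why blocky 8-bit sources hold up so well under it: every output pixel is an exact copy of some input pixel, so hard edges stay hard. A toy sketch (row-major pixel list, integer scale factor, names made up):

```python
def upscale_nn(pixels, w, h, factor):
    """Nearest-neighbour integer upscale of a w*h row-major pixel list."""
    out = []
    for y in range(h * factor):
        src_row = pixels[(y // factor) * w:(y // factor) * w + w]
        for x in range(w * factor):
            out.append(src_row[x // factor])  # copy nearest source pixel
    return out
```

Fancier filters (bilinear, Lanczos, the specialised pixel-art scalers) trade that exactness for smoothness, and doing them at high resolution in real time is exactly where the hardware still struggles.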
I am curious about the displays you refer to, though. Those are some damn small pixels at 7-10".
A human corpse is considered an especially sacred inanimate object in our society. It is also a health issue. The debate over burning a flag provides another possible example - although still protected at this time.
Who knows? Maybe one day we will provide legal protection for flags, religious symbols, excessively cute robots, and pictures of talk show hosts... but I hope not. And I really hope we never hold game systems to the same standard.
The Samsung Galaxy S comes with Eclair. It can be upgraded to Froyo if you have the patience to work with Kies... Maybe you are thinking of the Nexus S.
Correct. This has little or nothing to do with free speech. And the death of 'the old model' may strengthen free speech.
But it will kill investigative journalism...
They are both dear to me, and I would not want to lose either. The solution cannot be litigation, or it risks harming the 1st Amendment.
If the problem is linking, isn't there a fairly simple technological solution (HTML? PHP?) for commercial news media that don't want their text directly linked? Can't they simply use a generated page name that goes missing when copied and linked back to? Sites that host images sometimes do this to avoid being used as an image server. When the link is followed, one simply finds oneself at the New York Times or Washington Post homepage.
When we write programs that "learn", it turns out we do and they don't.