Sony's HDV 1080i Consumer Camcorder 223
An anonymous reader writes "Sony has just announced a high-definition video camcorder that records in 1080i. A site was just created with a lot of information about the camcorder. The camcorder uses the HDV spec, which records to standard MiniDV tapes. It includes 3 CCDs, and along with the announcement it appears Apple and Adobe are now supporting the HDV standard. The camcorder carries a steep price of $3,700, though. See the original press release as well, though it doesn't contain much information."
Actually, not THAT expensive (Score:5, Informative)
Re:Actually, not THAT expensive (Score:4, Informative)
Uh, aside from the minor fact that the XL1 records 720x480 SDTV resolution while the Sony records 1920x1080 HDTV resolution!
Re:Actually, not THAT expensive (Score:4, Informative)
Re:Actually, not THAT expensive (Score:5, Informative)
Squeezed 16:9 is not that unusual, considering the price of the camera. If you want native 16:9 resolution, you might want to get this one [sony.com] instead of the new camera.
Re:Actually, not THAT expensive (Score:2)
Re:Actually, not THAT expensive (Score:2)
HD for distribution, like the WMV DVD-ROM titles, also typically uses 1440-wide anamorphic to reduce decode CPU load. Although this will be less of an issue with Windows Media Player 10 out, which can offload a fair amount of decode onto a hig
Re:Actually, not THAT expensive (Score:2)
Re:Actually, not THAT expensive (Score:3, Insightful)
Re:Actually, not THAT expensive (Score:2, Insightful)
The thing is, there's no disc replacing DVD to promote HD, so the future of HD isn't that clear yet.
For example, NBC broadcasts the previous day's Olympics coverage on its HD signal, instead of the current day's coverage that airs on the non-HD channel.
Data transfer and storage (Score:3, Interesting)
Re:Data transfer and storage (Score:2)
Re:Data transfer and storage (Score:2, Informative)
I suppose it's limited to the speed of the tape reader...
Still, considering how long transitions, wipes, and other effects will take to render even on a 2xCPU G5, importing speeds will be the least of your worries
(BTW, AFAICR the standard digital cinema projection size is 1280x1024.. This cam will beat that, and with post processing the results should look pretty damn good.. Hopefully prosumer 3D modeling pkgs will keep up too
JVC did it first... (Score:4, Informative)
Re:JVC did it first... (Score:5, Informative)
The JVC:
doesn't support 1080i (argue as you may the merits of 720p vs 1080i, the generally accepted wisdom is that progressive is better for shooting sports events and interlaced higher res is better for drama)
doesn't support OS X
doesn't have a Zeiss lens
has only 1 CCD
has a 4x3 CCD, not 16x9
The JVC doesn't compare. And this from someone who actively avoids Sony stuff unless it's the best in class (as the 200 DVD changer was in its time).
Re:JVC did it first... (Score:2)
The fact that the CCD on the JVC is 4x3 is irrelevant, as it has more than enough pixels for them to take a 720p centre crop out of it. It's a hybrid CCD with enough resolution for widescreen 1280x720 and 720x480 SD. However, it is one chip, and the image looks like crap as it's over-sharpened.
The Sony camera is 1080i, but only 960 horizontal pixels, which get stretched to make a 16x9 aspect ratio, s
Re:JVC did it first... (Score:3, Funny)
doesn't support OS X
I think you mean OS X doesn't support the JVC! What is it with you Mac folks that makes you spin everything wrong?
Re:JVC did it first... (Score:2)
Re:JVC did it first... (Score:2)
Which is quite a difference
Porn? (Score:2, Funny)
Re:Porn? (Score:2)
I will be impressed when (Score:2, Insightful)
-I get a digital camera that uses the X3 sensor and has a true 8MP CCD, not this 1.5MP x 3 garbage that you see.
Re:I will be impressed when (Score:2, Insightful)
Re:I will be impressed when (Score:2)
Re:I will be impressed when (Score:2)
Go fuck yourself you little bitch. Go pony up $1350 and buy the camera body already if you're so aware of it.
Re:I will be impressed when (Score:2)
I don't understand 1080i (Score:2, Interesting)
Am I missing something?
Re:I don't understand 1080i (Score:2, Insightful)
Cost. Look at the bandwidth requirements of 1080p; until recently, satisfying that on a consumer screen cost more than anyone was willing to pay, given the dearth of HD programming. The cost balance was forged at the beginning of the ATSC deliberations.
Hell, try driving UT2004 at 1920x1080 on your widescreen computer monitor with less than a Geforce FX5900!
I submit that if you use a progressive computer monitor and deinterlace 1080i it'll look OK, but I also submit that ve
Re:I don't understand 1080i (Score:2)
1080i = 540p (Score:2, Interesting)
Basically, 1080i = 540 lines / refresh.
720p has 720 lines per refresh.
The problem with interlacing is that it introduces or exacerbates certain visual artifacts. This is one of the reasons some of the networks are sticking with 720p for their HDTV broadcasts.
Whether this interlaced standard is a carryover from the consumer electronics folks or not, I would stick with 720p until something nicer comes out. Be interesting to know th
Re:1080i = 540p (Score:2)
Re:1080i = 540p (Score:5, Interesting)
You can't really say 1080i = 540p. You are right that 1080i is two 540-line fields interlaced, but those are FIELDS, i.e. offset horizontal lines. Combined, they do produce 1080 lines of resolution. Native 540p is basically just NTSC, and 1080i can easily be said to be amazingly higher quality than NTSC. Most people can also spot the difference between 720p and 1080i. I can tell when watching ABC/ESPN 720p football compared to CBS/HDNet 1080i football. I don't have a native 720p display, though, to be fair, and 1080i does have more motion artifacts. It's generally agreed that 720p is best for fast-moving sports, and 1080i for slow shots, documentaries, 35mm film transfers, etc.
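To make the fields-vs-frames point concrete, here's a toy Python sketch (the "lines" are just labels, not real video data) showing how two 540-line fields weave together into one 1080-line frame:

```python
# Toy illustration of interlaced field weaving: two 540-line fields
# (offset scan lines) combine into one 1080-line frame.

def weave(top_field, bottom_field):
    """Interleave a top field (even scan lines) and a bottom field
    (odd scan lines) into one full-height frame."""
    assert len(top_field) == len(bottom_field)
    frame = []
    for top_line, bottom_line in zip(top_field, bottom_field):
        frame.append(top_line)
        frame.append(bottom_line)
    return frame

# 1080i: two 540-line fields per frame
top = [f"line {2*i}" for i in range(540)]       # even scan lines
bottom = [f"line {2*i+1}" for i in range(540)]  # odd scan lines

frame = weave(top, bottom)
print(len(frame))  # 1080 -- the full vertical resolution
```

The catch, of course, is that the two fields are captured at different instants, which is where the motion artifacts come from.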
Re:1080i = 540p (Score:2, Interesting)
How much $$$? I'm sorry, Costco has spoiled me, I want >50" 1080p for less than $4000...
You can't really say 1080i = 540p.
Close enough for mouthbreathers
Seriously, if you get your hands on a 1080p (like a 23-24" WUXGA screen) display and preprocess 1080i, it should be OK... A good line processor should be able to buffer enough to compensate for jitter, and that kind of thing is getting built into PC vidcards nowadays...
It's generally agreed that
Re:1080i = 540p (Score:2)
Not really, when you consider who came up with all the specs for DTV in the first place. No one in the broadcast industry originally wanted to eliminate interlace and change to progressive. Just like when digital cameras for filming movies came out, the moviemakers wouldn't use them until they supported 24fps instead of only 30fps. The reasons for support don't usually make sense other than it is what the people are comf
Re:1080i = 540p (Score:2)
Re:1080i = 540p (Score:2)
Consider though... (Score:5, Funny)
The de facto HD production format, HDCAM, records something like 140-180 Mbit/sec; the uncompressed signal is something like 996 Mbit/sec.
The most likely market for this camera will be indie filmmakers, documentaries, and industrial/corporate promotional use. The price makes complete sense--and most of the market buying VX2100's and XL1's will probably look seriously at this.
Most broadcast/network HD will still be HDCAM, DVCPRO HD (off the popular Panasonic Varicam) or 35mm transfer.
Calum
Re:Consider though... (Score:2)
Re:Consider though... (Score:2)
Re:Consider though... (Score:2)
I could see using it for our labs (Score:2)
At $4,000, it would be an actual viable option, though expensive. A DVPro unit is just out of the question; no way are we paying $50,000 just to get clear video between two labs.
I'm ju
Re:I could see using it for our labs (Score:2)
The problem is how you'd link it to the projector, and what the native resolution and quality of the projector is.
The projector either needs a Firewire transceiver and the HDV codec
Not full resolution 1080i? (Score:2)
Each CCD measures 960 x 1080 pixels.
1080 is supposed to be the vertical resolution, with horizontal at 1920. This is less than half the horizontal resolution.
-Adam
Re:Not full resolution 1080i? (Score:4, Informative)
Most likely they offset one CCD by half a pixel, which is a common technique in video cameras to improve resolution with small CCDs. That way they can get a good approximation to the full 1920x1080 luminance signal by mixing the signals from the three CCDs... the chroma signal is probably only being recorded at half resolution anyway, so it's less important.
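For the curious, here's a hypothetical 1D Python sketch of the half-pixel-offset idea; the sample values and layout are made up for illustration, not taken from Sony's actual sensor design:

```python
# Hypothetical 1D sketch of the pixel-shift trick: two 960-sample
# sensors, one offset by half a sample pitch, interleave into an
# approximation of a 1920-sample luminance signal.

def interleave_offset_samples(sensor_a, sensor_b):
    """Merge two equally sized sample arrays, where sensor_b is assumed
    to be spatially offset by half a sample pitch from sensor_a."""
    merged = []
    for a, b in zip(sensor_a, sensor_b):
        merged.append(a)  # sample at the integer pixel position
        merged.append(b)  # sample at the half-pixel offset position
    return merged

a = list(range(0, 1920, 2))  # stand-in for one CCD's 960 samples
b = list(range(1, 1920, 2))  # a second CCD, shifted half a pixel
luma = interleave_offset_samples(a, b)
print(len(luma))  # 1920 effective horizontal samples from two 960-wide chips
```

In a real camera the merge involves filtering rather than simple interleaving, but the sampling-density argument is the same.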
Re:Not full resolution 1080i? (Score:2)
Re:Not full resolution 1080i? (Score:3, Informative)
Not if they intend to stick to the HDV standard: I may be wrong, but from what little is available on the web, it appears that the only recording option at 1080i is 1440x1080 anamorphic.
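The anamorphic arithmetic is easy to check in Python, assuming the 4:3 pixel aspect ratio that a 1440x1080 frame needs in order to fill 16:9:

```python
# 1440x1080 anamorphic: the stored frame is 1440 pixels wide, but each
# pixel is displayed 4/3 as wide as it is tall, so the image stretches
# out to a full 1920x1080 (16:9) on playback.

stored_width, stored_height = 1440, 1080
display_width = stored_width * 4 // 3  # apply the 4:3 pixel aspect ratio

print(display_width, stored_height)             # 1920 1080
print(display_width * 9 == stored_height * 16)  # True: exactly 16:9
```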
Re:Not full resolution 1080i? (Score:2)
Apple has been supporting HD for a while now (Score:5, Informative)
It is cool to see a 1080i camera out there, though. Give it a few weeks and there will be a consumer-affordable model.
For now I will stick with my Canon Optura Xi.
Re:Apple has been supporting HD for a while now (Score:2)
That is the consumer affordable model. The Broadcast Non-Consumer model goes for at least 10 times that much.
Re:Apple has been supporting HD for a while now (Score:2)
Very nice (Score:4, Interesting)
On a serious note: I have been thinking about things like this for a while. It's not exactly a highly original thought, but more and more high-end hardware/software/electronics/mechanics is becoming available to the average joe. This has been widely known and considered with apache/linux/mysql/php/etc., but it is happening in many realms other than software.
I think that we are stepping into a creative boom as a result of this. When only large, profit-intensive, single-minded corporations have access to these types of materials, you don't see much creativity in how they are used. However, put that power in the hands of the vast majority of the public, and you are going to have some incredibly original and creative ideas. I am looking forward to the creativity too.... Doggy style is so 20th century.
Re:Very nice (Score:5, Insightful)
The idealist in me wants to agree, but realistically, what we'll see is more crap:
-More angsty rich kids making "indie films" that make no damned sense.
-More HD/DVD wedding videos filled with tacky transition effects and shaky handheld underlit shots.
-More slanted special interest group propaganda, filled with hate, revisionism or evangelism.
Now, all of you are probably sharpening your keyboards, saying, "Who are you to judge?" If publishing a book, presenting a scientific paper, or writing a screenplay twenty years ago had one merit, it was the fact that you had to get it through some sort of editorial process. Someone did judge, and usually it was someone in the know. You couldn't spout hate on digital video and expect it broadcast on community cable. You couldn't make up pseudoscience and have it published to an audience, because real scientists would review your paper. But today, there is no review. You're free to host PDFs of your cracknut theories of science, or stream videos of yourself with a bedsheet over your head burning people at the stake.
Part of me wants to believe that the result of today's technologies (desktop publishing, digital video, the web) means that stories that are underrepresented will be told, that we'll all benefit, but for the time being, I suspect all we'll get is more trash.
Whiny BS (Score:2)
The real question... (Score:2)
My money's on no, but it's still cool to see companies working at getting these products to market. The next few years are gonna be exciting for filmmakers as desktop HD comes online.
Not surprising, except the recording medium (Score:2)
My questions are more about the loss in compression, and how it interacts with existing editing suites. Standard 400Mbps FireWire? When you're capturing from FireWire on a host, and it tries to render the live video stream, is Premiere going to blow up? (Well, Premiere blows up on its own constantly without weird hardware (Premiere XP
HDV - Nice format, weak cameras (Score:4, Insightful)
This is not surprising - I have always found the image and color quality of DV cameras to be much lower than even medium-end pro cameras (such as the elderly S-VHS Panasonic Supercam). The prosumer cameras do not have $3,000 lenses. They do not have the amazing amount of color DSP going on that the pro cameras do.
But at the same time, HDV cameras are better than nothing, and certainly good for "riskier" shots where a $100,000 HDCAM camera being lost would be a problem. You just can't skydive with a full-size camera, for instance...
One other issue is that 25 Mbps is really limiting for MPEG-2 HD (heh, so is 19.4 Mbps, but that is another topic).
If you are into a lot of action with lots of uncorrelated motion vectors, you might be better off with upconverted DV, as 25 Mbps is fine for intraframe-coded DV.
Close, but not quite there. (Score:2, Informative)
Re:Close, but not quite there. (Score:2)
OK, first off: lots of people in the indie film industry are full of crap. Most of the winners at past festivals have been on DVD and NOT FILM. More and mor
What they don't tell you (Score:2)
What this discussion doesn't tell you is how perfectly acceptable (I would say fscking GREAT) regular mini-DV is right now.
Yes, I know, lots of people want high-res, high-def, high frame rates, gorgeous colors, minimal artefacts etc. Some of those even NEED those things.
However, if you are reading this discussion and you don't have experience with what "plain, vanilla" mini-DV can do, then just don't worry about new fancy-pants cameras for now.
Instead, get a mini-DV camera and a Mac (especially) or a PC.
Re:Interesting (Score:2, Informative)
Re:Interesting (Score:2)
Re:Interesting (Score:5, Interesting)
On the other hand... if you think about how much camcorders cost around 20 years ago, and adjust for inflation, this is really not all that expensive. I'm sure I'm not the only one who remembers the separate cameras and recording decks of days long past.
I'd be particularly wary of buying any NTSC/PAL camcorders with the new HD standards that are going to be set in the next few years. I'm hoping that by the time I have kids, there'll be more choices on the market with this kind of recording quality.
Re:Interesting (Score:2, Funny)
I'm hoping that by the time I have kids
What? I thought this was slashdot!
Re:Interesting (Score:3, Informative)
The basic reason they don't is that the amount of information you'd have to transfer is double that of 1080i. And then you have to compress that information. Compressed HDTV runs at 19.8 megabits/second. To do 1080p you would have to run at twice that: 39.6 Mbps. This amounts to one full 4.7GB disc every 15 minutes. Not many devices can record that fast, except RAID arrays. If you wan
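A quick back-of-the-envelope in Python, just sanity-checking the numbers above (disc capacity taken as a flat 4.7 GB):

```python
# Sanity check: double the ~19.8 Mbps compressed HDTV rate for 1080p
# and see how fast that fills a single-layer DVD.

mbps_1080i = 19.8
mbps_1080p = mbps_1080i * 2           # 39.6 Mbps, double the 1080i rate

bytes_per_sec = mbps_1080p * 1e6 / 8  # 4.95 MB/s
dvd_bytes = 4.7e9                     # nominal single-layer DVD capacity
minutes_per_dvd = dvd_bytes / bytes_per_sec / 60

print(round(minutes_per_dvd, 1))      # about 15.8 -- "every 15 minutes" holds up
```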
Re:Interesting (Score:2)
Re:Interesting (Score:2)
True, they are displaying the pixels differently, they aren't turning on half of t
Re:Interesting (Score:2)
Re:Interesting (Score:2)
So, 1080p at the fastest rate of 30Hz: that's 1080 lines/frame x 30 frames/second = 32,400 lines/second.
Compare that to 1080i's 540 lines/field x 60 fields/second = 32,400 lines/second.
Re:Interesting (Score:2)
Also, 40Mbps is what, 5 megabytes per second, which even a 6-year-old hard drive can record quite easily!!! No RAID needed!!
Re:Interesting (Score:2)
Compressed HDTV runs at 19.8 megabits/second. To do 1080p you would have to run at twice that: 39.6 Mbps.
I don't see any reason why you could not take the video in at 30 frames per second progressive. This would have the same bandwidth as 60 fields per second (30 frames per second) interlaced, which is what this camera shoots natively. You would have to alter the front end a little to accommodate this, but 30fps progressive is in the HDTV standards for 1080-line mode. It's called 1080i because it gets
Re:Interesting (Score:2)
You could, but that wouldn't give you the nice "film look" of progressive. The big problem with interlaced for capturing moving objects is combing (see the adverts for "comb filters" in TVs?). The object moves a little between the two halves of the interlaced frame being recorded, and the visual result is a nasty jaggy effect on object borders, which looks a bit like the teeth of a comb.
That's a problem wit
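Here's a toy Python rendering of the combing effect described above: a one-pixel "object" moves between the two field captures, so the woven frame shows its edge zig-zagging line by line. Nothing here models a real codec or sensor.

```python
# Toy model of interlace combing: an object moves between the times
# the two fields are captured, so weaving them together puts its edge
# at alternating positions on alternating scan lines.

WIDTH = 12

def field(object_x, lines):
    """Render `lines` scan lines with a one-pixel 'object' at object_x."""
    return ["".join("#" if x == object_x else "." for x in range(WIDTH))
            for _ in range(lines)]

top = field(4, 3)     # field 1: object at x=4
bottom = field(6, 3)  # field 2, captured a moment later: object at x=6

frame = []
for t, b in zip(top, bottom):
    frame.extend([t, b])  # weave: even lines from field 1, odd from field 2

for line in frame:
    print(line)
# Alternating lines show the object at x=4 and x=6 -- the comb teeth.
```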
Re:Interesting (Score:2)
Right, and that is the reason you capture progressively, then and only then (if needed), interlace it. I believe that the current DV camcorders that sport progressive scan already do exactly that.
Re:Interesting (Score:2)
Re:Interesting (Score:2)
I doubt you will see 1080p24 consumer cameras offered soon, out of fear that they would cut into sales of professional 24p cameras.
The HDV spec involves super-lossy compression so it won't really hold up to th
Re:Interesting (Score:2)
the Canon XL-2
and I consider that an inexpensive Video Camera.
Try shooting with a Sony DVCPro camera with a $30,000.00 Fujinon Lens..
Having $100,000.00 on your shoulder, or flinging around an el-cheapo $5K camera? I'll take the $5K camera that gets almost as good video (to the point that even the networks are switching to the Canon XL line of cameras for news and sports) and is basically disposable compared to the workhorse cameras of yesterday.
giv
Re:Interesting (Score:2)
Re:Interesting (Score:5, Interesting)
It's time we dropped interlacing completely (funnily enough, I was told that was one of the big benefits to digital, that you get 60 discrete full-resolution frames per second, and not 59.94 or 29.97 or some fucked up number). Had I been in charge of the FCC, when CBS threatened to pull their HD over the broadcast flag, I'd have told them, "hahaha, go ahead and pull it, 1080i sucks cock anyway, and so does the broadcast flag".
Plus, devices that are natively 1080i will have to upconvert 720p, which will cause an immediate resolution loss of half the full 1080i pixel array, since you're converting from 60 full frames per second to 60 fields per second. And that's not even figuring in the resizing process from 1280x720 to 1920x1080.
I'd rather see 1080i downconverted to 720p, so that the 720p signal will run at native resolution . 720p is the current sweet spot for quality in HDTV, and people completely miss it because "1080 is bigger, durrrrr".
Interlacing should have NEVER BEEN ALLOWED INTO THE DIGITAL STANDARD AT ALL. Legacy interlaced material running at 59.94 fields/sec can be converted to 480p/29.97fps with absolutely NO loss; the only problem is you get mice teeth (but they could just bob it in the receiver). For material shot and produced for HD, there should NEVER be any interlacing, EVER. Interlacing was only used as a cheap analog way of compressing the signal at a 2:1 ratio. Now that we have the bandwidth, there is no reason we can't have 60 discrete frames per second.
Oh, and don't even get me started on why we are already locked into MPEG-2 for DTT, despite the availability of better compression methods. Or why companies that broadcast on two separate NTSC licenses (commonly known as 'duopolies') are only being given one 19Mbps ATSC license? Due to this, such companies can NOT offer true HD for both stations. If the analog side of your station broadcasts on two 6MHz channels (discounting translators, etc - just the main transmitter), then you should get two 19Mbps ATSC licenses, point blank.
Digital TV sucks. It will be the end of television, as we know it. Mark my words.
Re:Interesting (Score:5, Informative)
Sony HDCAM: 1080i
Panasonic D5: 720p
Half of broadcasters went one way, half went the other. Keep in mind the existing business relationships at networks and stations.
But 1080i really does seem to provide a higher-resolution experience (when watched on a real 1080 monitor...) HDNET went 1080i, and most PBS content is 1080i. But I will admit it is really a religious issue.
I've never heard about the duopoly issue with DTV channel assignments. It is my impression that every analog broadcast channel is entitled to a DTV channel as well during the transition. Do you have a reference on this?
I can assure you that MPEG-2 is the ONLY codec that is broadcast-ready. Certainly when the ATSC specs were defined, the alternatives weren't even a thought.
I've seen the best H.264 and Windows Media live encoders on the planet, and they can barely get the same quality at the same bitrate as the best mid-level MPEG-2 live encoders.
Keep in mind that MPEG-2 encoders have had years to get better. People keep coming up with ways to cut bits, you now have live 2-pass encoders, pre-filtering, etc. MPEG-2 live encoding quality has improved 100% in the last five years in terms of equivalent bitrate quality.
I expect 2-3 years before the live H.264/WMT encoders can catch up with live MPEG-2 encoders.
Re:Interesting (Score:3, Funny)
From your mouth to God's ears. The end of TV as we know can only be a good thing.
Re:Interesting (Score:2)
I have an HDTV, and 1080i looks incredible, with the exception of fast-moving video shot at 1080i, because the 4:2:0 color compression screws it up and creates color banding and unwanted blending. If you watch 1080i-originated material frame by frame in MPEG-2 format, in fast scenes the color information is sometimes a frame ahead of the fast-moving object... quite obvious when you pause the HD TiVo, but hardly noticeable at all at 59.94 fields per second.
The main reason 1080i is used and not
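For the chroma point above, here's a quick Python sketch of what 4:2:0 subsampling means for a 1920x1080 frame; the subsampling factors are the standard ones for 4:2:0, not anything Sony-specific:

```python
# 4:2:0 chroma subsampling: color is stored at half resolution in both
# directions, so a 1920x1080 frame carries only a 960x540 color grid
# that gets stretched back over the full image on playback.

luma_w, luma_h = 1920, 1080
chroma_w, chroma_h = luma_w // 2, luma_h // 2  # halve both dimensions

luma_samples = luma_w * luma_h
chroma_samples = 2 * chroma_w * chroma_h       # two chroma planes (Cb and Cr)

print(chroma_w, chroma_h)             # 960 540
print(chroma_samples / luma_samples)  # 0.5: half as many color samples as luma
```

That coarse color grid is why sharp color edges on fast-moving objects smear and band more than the luma detail does.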
Re:Interesting (Score:2)
I have to agree that the compression sucks (fast action just breaks down), but last I checked, at least Comcast was not doing rate shaping yet (this was early this year). Has that changed? I know they bought some equipment for it in some markets. I think I might need to go to ov
Re:Interesting (Score:2)
I couldn't have said it better
Re:Interesting (Score:2)
Re:beware: sony is too proprietary (Score:3, Informative)
Re:beware: sony is too proprietary (Score:2, Informative)
Re:beware: sony is too proprietary (Score:5, Informative)
"The HDV spec was agreed upon as a standard by Sony, JVC, Canon, and Sharp for new high-definition consumer camcorders last year. Along with the announcement of the new Sony HDV camcorder comes support from major video editing software companies including Apple and Adobe"
Go on the DV boards like 2-pop or Creative Cow and find me all the people who are unable to use Sony's "not recognized and not standard" DV VTRs and cameras. They ARE standard, and any editor that can capture DV can get video from them just as easily as from a JVC, Panasonic or Canon. No drivers necessary.
Re: ACTUALLY sony is a pain to work with (Score:2)
Maybe you're thinking of USB (Score:2, Informative)
However, if you actually go out and buy a 1394 cable, it works in all applications just like any other DV device would.
Don't You Mean MicroDV?? (Score:2)
Re:HMM I wonder... (Score:2)
Re:HMM I wonder... (Score:2)
Re:HMM I wonder... (Score:2)
DV is lossy, but it compresses individual frames, so when you cut DV and output an edited file, you just take the compressed frames and write them to tape in order.
MPEG-2 is lossy and compresses groups of frames, so if you cut MPEG-2, you have to recompress the entire movie when you output an edited file, and every time the video will get worse (minus optimisations to copy over the original data in places whe
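Here's a toy Python model of that difference; the "frames" are just integers, and the generation counter stands in for real recompression loss:

```python
# Toy model: intraframe (DV-style) cutting vs long-GOP (MPEG-2-style)
# cutting. DV frames stand alone, so a cut is a pure copy; long-GOP
# frames depend on their neighbors, so an edit forces a re-encode,
# and each re-encode is another lossy generation.

def cut_intraframe(frames, start, end):
    """DV-style: every frame is self-contained, so cutting is copying."""
    return frames[start:end]  # no re-encode, no added generation loss

def cut_gop(frames, start, end, generations):
    """Long-GOP toy: cutting breaks inter-frame dependencies, forcing
    the edited output to be re-encoded (one more lossy pass)."""
    return frames[start:end], generations + 1

dv = list(range(100))                # 100 stand-in frames
clip = cut_intraframe(dv, 10, 50)
print(len(clip))                     # 40 frames, bit-for-bit untouched

mpeg, gens = cut_gop(list(range(100)), 10, 50, generations=0)
print(len(mpeg), gens)               # 40 frames, plus 1 lossy re-encode
```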
Re:HMM I wonder... (Score:3, Informative)
Re:HMM I wonder... (Score:2)
Re:HMM I wonder... (Score:2)
Ick (Score:2, Funny)
It'll just make the flaws and bad makeup stand out more.
Re:How much is enough? (Score:2, Funny)
Re:HDV recording, saved as MPEG2??? (Score:3, Insightful)
Re:Kill Interlaced Video Now!!! (Score:2)
The frame rate needs to go up, too. 24fps is way too slow. The classic choice between strobing and blur on every pan needs to go away. Digital cinema should be 72fps.
Some video games now have sharper images than Hollywood. This is embarrassing.
Re:get ready to hear more film school talk (Score:2)
Re:Finally.... (Score:5, Insightful)
Unfortunately, porn in high res is not the panacea you imply. Even on DVD I can see waaaaay too much detail. (Think: sores, pimples, rashes, bruises, and surgery scars.)