In-Depth Look At Video Codecs

johnsee writes "Atomicmpc has an incredibly in-depth look at a wide range of video codecs. It looks not only at their inner workings, but also shows the quality produced by each at a variety of settings and situations."
  • by datapharmer ( 1099455 ) on Friday June 08, 2007 @12:22PM (#19439089) Homepage
    Oh wait I'm "new" here.... let me go RTFA. Be right back.
    • by datapharmer ( 1099455 ) on Friday June 08, 2007 @12:29PM (#19439249) Homepage
      Apparently some people have no sense of humor. BTW I am back.

      For those not wanting to read the article:
      Rated best to worst with default settings
      Low Bitrate go with XVID, DIVX, h.264, WMV
      Medium: XVID or h.264 depending on lighting and motion, WMV, DIVX
      High: h.264, WMV, XVID, DIVX
      • by Silverlancer ( 786390 ) on Friday June 08, 2007 @12:31PM (#19439295)
        Xvid is horrible at low bitrates in my experience. For example, I did a test encode of a 1680x1050 FRAPS video at 1000kbps. H.264 was actually quite watchable (!!!) and you could even read the size-12 text in the chat windows. Xvid couldn't even get below 1400kbps or so with every frame at quantizer 31 with max motion search settings and the like. So I'd say Xvid is the opposite--good at higher bitrates, worthless at low ones.
        • I'd like to point out that 1680x1050 is huge. For ref, DVDs are encoded ~9000kbps at 720x480. Granted with an inferior codec, but still.

          I wouldn't expect many codecs to handle that size frame spectacularly well. That any of them did [h.264 in your case] is amazing.
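A quick sketch of the bits-per-pixel gap the parent is pointing at (the frame rates are assumptions for illustration, not taken from the posts above):

```python
def bits_per_pixel(bitrate_kbps, width, height, fps):
    """Average bits the encoder can spend on each pixel of each frame."""
    return bitrate_kbps * 1000 / (width * height * fps)

# DVD: ~9000 kbps MPEG-2 at 720x480, assuming ~30 fps
dvd = bits_per_pixel(9000, 720, 480, 30)

# The grandparent's FRAPS capture: 1000 kbps at 1680x1050, assuming 30 fps
fraps = bits_per_pixel(1000, 1680, 1050, 30)

print(round(dvd, 3), round(fraps, 4))  # the DVD gets roughly 46x more bits per pixel
```

So the 1680x1050 encode had to get by on a tiny fraction of the per-pixel budget a DVD enjoys, which is why any codec surviving it is impressive.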

        • XviD is an H.263, aka MPEG-4 Part 2 "Advanced Simple Profile" (ASP) encoder, no?

          This is quite different from the newer H.264 (MPEG-4 Part 10 "Advanced Video Coding" (AVC)) encoders like x264 (which is part of ffmpeg, at least recently, I believe). H.264 is a much better match for high-definition video that's going to be played back on HD equipment.

          I think it's been known since the AVC codecs arrived on the scene that they pretty much kicked the crap out of the ASP ones; their only major downsides are the processing requirements both to encode and decode, and (more true in the past than now) the limited installed base of people with the codecs.
          • Re: (Score:3, Informative)

            by russ1337 ( 938915 )
            I've had some guys at my work try to tell me that H.263 = MPEG4. It actually got quite nasty.

            I recognize there are similarities, but I do not believe they are the 'exact same'. My evidence is that the video cards we use have MPEG4 hardware encoders, yet will not 'hardware decode' the H.263 stream.

            also, FTFA's reference:

            The resynchronization approach adopted by MPEG-4, referred to as a packet approach, is similar to the Group of Blocks (GOBs) structure utilized by the ITU-T standards of H.261 and H.2

            • by JohnnyLocust ( 855742 ) on Friday June 08, 2007 @01:38PM (#19440487) Homepage

              I've had some guys at my work try to tell me that H.263 = MPEG4. It actually got quite nasty. I recognize there are similarities, but I do not believe they are the 'exact same'.
              H.263 is a part of the entire MPEG4 specification (as is H.264).

              i.e. the following statement is always true:
              H.263 is always MPEG4

              However the following statement is not always true:
              MPEG4 is always H.263

              My evidence is that the video cards we use have Mpeg4 hardware encoders
              Not true at all. There are some hardware MPEG4 encoders on the market, but encoding is, for the most part, not included in modern GPUs. For decoding purposes, portions of H.263 (the IDCT, to be exact) have been implemented in hardware on video cards for quite some time. However, combined with programmable shaders, a good deal of H.263 decoding can be greatly accelerated by most modern GPUs (nVidia's PureVideo DirectShow codec is an example of this). ATI's AVIVO XCode app does use a great deal of shaders to speed up the encoding process for several codecs. Even though it's been shoehorned to work with other GPUs, it was intended to work with their X1x00 line of video cards.
              • Thanks for your response. I really appreciate it. If I've got this straight, you're saying H.263 is a subset of MPEG4. I'm still a little confused though, as I've been reading through the page linked: / mpeg-4.htm [], which says...

                H.263 uses 'Group of Block' (macroblock) resync, and Mpeg 4 uses ' the packet approach'

                H.263 also puts information on the macroblock in the header, the document says MPEG4 uses a 'picture start code'. These appear to be different
                • That page is either oversimplifying, or just wrong. I'm not sure which.

                  MPEG-4 is a group of specifications. It's not just one video format, or one codec. Just talking about "MPEG-4 video" is bad, because it could refer to any one of several video formats, and any of those formats could have been produced with a variety of codecs.

                  Within "MPEG-4," you have multiple "parts" which are actual specifications for video encodings. Wikipedia explains slightly:

                  MPEG-4 is still a developing standard and is divided into

                  • See, now this is why I read Slashdot. Thank you, Kadin and your progenitors for the excellent primer on codecs. If not a full course, it gives me enough to use.
                  • Re: (Score:3, Informative)

                    by NoMaster ( 142776 )
                    Almost right, but the "Part"s are descriptive standards or references, not implementations. H.264 is not "MPEG-4 Part 10" - Part 10 describes a standard which is technically the same as H.264, but does not provide an implementation (beyond pointing at the ITU-T's H.264). It's a subtle difference, but important, and Wikipedia is not always clear on this.

                    It's a bit like reverse-engineering in a cleanroom environment - the various MPEG-4 parts describe exactly how things should work, then you'd pass it off to
            • H.263 is indeed not the same as MPEG-4. H.263 was published in 1995 by the ITU, MPEG-4 (first "official" draft) was published in 1998 by the MPEG. H.263 is based on H.261 and MPEG-2, but was developed without the MPEG's help.

              It uses similar concepts to MPEG-4 (all BMC-based algorithms do), but several details in stream structure are different (which is not to say parts of a MPEG-4 codec can't be used to accelerate H.263 compression or decompression).

              H.264, on the other hand, was developed by the ITU in part
              • Thanks for your comments. After having read your articles that you posted later in this thread it is obvious you know your stuff. Thanks.

                I would be interested if you had a reply to JohnnyLocust's reply above...? 40487 []
                • Not really, beyond telling him to read the H.263 specification...

         -REC-H.263-200501-I!!PDF-E&type=items [] ...where he won't find any reference to the MPEG (let alone MPEG-4, which was published after H.263).

                  MPEG-4 is "H.263 compatible" in the sense that a basic H.263 stream can be correctly decoded by a "complete" MPEG-4 video decoder, but MPEG-4 decoders aren't required to be "complete" (which is not to say that a lot of them don't cover what's requir
            • I'll take a whack at it.

              H.263 baseline is the same bitstream as MPEG-4 pt. 2 short header (and forms the basis of the Flash Spark codec). Both H.263+ and ++ and MPEG-4 pt. 2 Simple Profile and Advanced Simple Profile have further (and different) enhancements to that core bitstream.

              Being based on H.263 proved to be much more of a limit for MPEG-4 pt. 2 development than was originally determined, which led to the development of newer codecs like VC-1 and H.264.
          • Not exactly. H.263 is not the same as the MPEG-4 ASP. XviD is indeed based on the MPEG-4 ASP, and can also produce H.263 files, but the two things are different (albeit similar, especially if you consider H.263 v2).

            Also, bear in mind that video compression standards focus on "what" the encoders must create, not "how". Even if a certain standard supports more advanced features, a "smarter" encoder using a "lesser" standard can produce similar, or even better results.

            As H.264 encoders improve, they should cle
      • Hey,

        Product manager for the DivX codec here :)

        It's always very difficult to run a comprehensive codec comparison because each of the competing codecs offers a wide range of settings, and to test comprehensively over many different clips and bitrates is extraordinarily time-consuming - so kudos to the author of TFA. However, I'd like to offer some brief feedback:
        • It would be my expectation (based on a lot of experience!) that Xvid and DivX should always be grouped tightly together. If this is not the case it
      • by SeaFox ( 739806 )

        For those not wanting to read the article:
        Rated best to worst with default settings
        Low Bitrate go with XVID, DIVX, h.264, WMV

        Wow, thanks for saving me from the time I would have wasted reading this article. As I know it has to be junk to say that XviD is better than h.264 at low bitrates,
  • You don't even have to read the article to guess that H.264 destroyed everything. Even VP7 doesn't stand a chance, though from my experience it's the closest codec there is to H.264.
    • What would be more interesting is, "which H.264 codec performed the best?" They probably answer this question in the article, but I can't read it because it seems to be Slashdotted right now.
      • by Silverlancer ( 786390 ) on Friday June 08, 2007 @12:38PM (#19439421)
        There are a number of comparisons around the internet, and the last ones from 2006 show that x264 and Mainconcept are basically tied as the best, with Mainconcept having a tiny lead. However, x264 now has Adaptive Quantization available, an experimental feature that can help eliminate blocking in dark scenes which is pretty much impossible to avoid without AQ unless you use absurdly high bitrates. This feature alone puts it way over the top, IMO.

        --aq-strength for the win!
        • Note that adaptive quantization isn't something particularly magic in x264. Our released VC-1 (WMV9) codec supports Differential Quantization and Adaptive Deadzone.

          I just wrote up a blog post about using them in a downloadable 1080p version of the Elephant's Dream clip:

 le/ []
          • It is something special about x264, because no other H.264 encoder has it. VC-1 is a joke though; its quality is absolutely laughable compared to H.264. Microsoft should be able to do better.
            • I think our quality is very competitive with H.264! Bear in mind that the majority of HD optical (HD DVD and BD) discs use VC-1. Did you check out my sample encode?

     le/ []

              I've got lots of bandwidth - feel free to suggest a scenario where you feel VC-1 isn't competitive, and I'll see if I can come up with a counterexample.
              • I did an encoding test of WMV9 and WMV9 Advanced Profile (VC-1) against x264-encoded H.264 High Profile using SSIM as the metric. The quality of WMV9 Advanced Profile was about the same as WMV9 but it dropped frames (!!!) and so it broke the SSIM measuring tool after 10-15 seconds of measurement, so I compared WMV9 and H.264 only. Let's just say that H.264 at 2000kbps was much better quality than WMV9 at 4000kbps, by a considerable margin.
                • We've done lots of testing, and those haven't been our results.

                  Can you share what settings you were using? If you're seeing a lot of dropped frames, one possibility is you cranked the Quality slider up to 100. That sets the minimum quality of each frame really high, telling the codec to drop frames in order to maintain data rate. A quality of 90 will give you much less trouble.

                  WMV9-AP is a superset of WMV9, so there shouldn't be any cases where you get fewer frames out of AP.

                  Also, the current codec has som
                  • I just used the Windows Media Encoder with the quality slider maxed (not the one that affects quantizer, the one that affects motion precision and so forth). I should probably try a more fully-featured VC-1 encoder. Are there any VC-1 encoders that support multiple reference frames, mixed references, rate-distortion optimization, and trellis quantization? Those are the main features that I find really help in H.264 over most other codecs (along with CABAC, of course).
                    • Actually, go ahead and use one less than the max complexity. It's actually ever-so-slightly better, and quite a bit faster, in the current version.

                      As for most of the other features, you can get them via that PowerToy tool, or they're already in:

                      Multiple Reference Frames: Not explicitly supported in VC-1 (they're a huge decode complexity hit). Instead, we get the same benefit for stuff like flashes and strobes by using BI frames - an intra-only encoded B-frame, which then lets the frame after the flash re
              • by GeffDE ( 712146 )
                What is laughable, however, is the fact that the H.264 codecs run on anything (multiple OSes and processor architectures) while WMP is restricted solely to... Windows. VC-1 is completely uncompetitive, by default, for anyone not using Windows (and not using the latest version of WMP, which precludes anyone not using Vista or XP SP2 "Genuine").

                That being said, will VC-1 ever make it to a wider audience?
                • by Goaway ( 82658 )
                  ffmpeg already supports VC-1. That's about as broad an audience as you get these days.
                • VC-1 Main and Simple profiles *are* WMV9, and so are compatible with out-of-the-box WMP 9, 10, or 11, which gives compatibility down to Win 98 and 2K. WMV9-AP/VC-1 Advanced Profile, which is really only needed when you're doing native interlaced encoding, is an automatic free download for WMP 9 and 10, and installed with WMP 11.

                  On Mac, Flip4Mac is our recommended solution for playing back WMV files, and we distribute it free via Telestream.

                  For other OS's, many companies have licensed WMV playback, like the K
  • by mrzaph0d ( 25646 ) <> on Friday June 08, 2007 @12:25PM (#19439141) Homepage
    ..any of the codecs the porn..i mean video sites i visit ask me to install before i get to see the videos..
  • OT: Divx Pro is free (Score:3, Informative)

    by reaktor ( 949798 ) on Friday June 08, 2007 @12:30PM (#19439265) []

    Grab it while you can.
    • Re: (Score:2, Informative)

      by Anonymous Coward
      The windows version is free too []

      and everyone gets sent the same serial number:



      Mac OS X

    • DiVX demands your e-mail address to receive "reg key," then immediately sells your e-mail address to douche bag spammers.
      • > DiVX demands your e-mail address to receive "reg key," then immediately sells
        > your e-mail address to
        > douche bag spammers.

        That didn't happen when I got their free codec. It didn't happen when I got their free package with the player and converter. It didn't happen when I bought the Pro version with the VOB conversion extension. It didn't happen when I registered an account with their site. It didn't happen when I changed my registration to add the fact I got a DivX capable home DVD player.
  • by Anonymous Coward on Friday June 08, 2007 @12:31PM (#19439281)
    I've found the best way to highly compress movies on OS X is to use the ASCII Movie Player codec [] to display the video in, capture that to a text file using a pipe, and then zip it all up.
  • Uncompressed Codecs (Score:5, Interesting)

    by Doc Ruby ( 173196 ) on Friday June 08, 2007 @12:50PM (#19439631) Homepage Journal
    The article makes some serious errors of overgeneralization. It says that all codecs have in common that they make bitstreams shorter for transmission. But not all codecs compress (or otherwise reduce) their data. Some codecs transmit uncompressed raw data, increased in size by adding encoding data. For example, HD video monitors connected by HDMI (or DVI) use TMDS [] encoding not for compression, but to increase reliability in transmitting large raw data streams (10.2Gbps) quickly enough (340MHz) over cheap HW.

    And though humans learned stone tools remarkably close to finally learning to load CD-ROMs, the stone tools were paleolithic ("old stone"), while the CDs were at worst neolithic ("new stone"). Someday we'll look at the modern era as a new age, probably "hualic" [], or "glass" age. These silicon chips and glass fibers have changed us as much as we've changed the glass from which we make them.

    Just for kicks, I note that we've encoded the Si atoms into the new tools that define our age.
    • And though humans learned stone tools remarkably close to finally learning to load CD-ROMs, the stone tools were paleolithic ("old stone"), while the CDs were at worst neolithic ("new stone").

      In keeping with the naming of capital-A Ages after prevalent use of materials, I like to refer to the period from 1912 to 2045 as the "Plastic Age" (or possibly the "Polymer Age" or "Polyfantasic! Age"), covering the use of Bakelite on up in consumer goods.

      Your guess as to what happens after 2045 :) (Hint: Ray Kurz

      • In keeping with the naming of capital-A Ages after prevalent use of materials, I like to refer to the period from 1912 to 2045 as the "Plastic Age" (or possibly the "Polymer Age" or "Polyfantasic! Age"), covering the use of Bakelite on up in consumer goods.

        Your guess as to what happens after 2045 :) (Hint: Ray Kurzweil has something to say about that)


        No, it seems like all the materials that are whizzy and new nowadays are "Space Age Materials." Stuff like carbon fiber. So, that makes this the "Space

    • Re: (Score:3, Informative)

      Exactly. I'll copy/paste my response to this highly inaccurate article on Digg:

      "Some other major inaccuracies:

      This site says that motion compensation was introduced in MPEG-4. What? Motion compensation and motion estimation are at the core of every MPEG and most other codecs.

      Also, his understanding of the DCT is way off (and no, you don't need a degree to understand it -- I was building JPEG encoders in 11th grade).

      "During an encode, every number in a series is simply halved and the remainders thrown away."
    • Codec stands for COmpression/DECompression algorithm. There certainly are data formats for video which are uncompressed, but they are not Codecs.
      • No, CODEC stands for [] "coder/decoder", of which de/compression is only one kind of encoding.

        As the article said, as I said, as the TMDS article to which I linked said.
  • Image Algorithms (Score:4, Interesting)

    by Rac3r5 ( 804639 ) on Friday June 08, 2007 @12:52PM (#19439661)
    Does anyone have a similar link to imaging and sound compression algorithms?
    • You hardly need a link: most of the popular low-to-mid bitrate sound algorithms are pretty straightforward to rank by effectiveness. At low bitrates, AAC-HE is the king. Ogg is close though. At higher bitrates AAC and Ogg become more equal, and eventually LAME MP3 catches up at the highest bitrates (192+).
      • Oh, I'm really a big fan of our WMA 10 Professional low bitrate modes (

        And on a practical level, we support rate control modes rarely seen in other audio codec implementations, like 2-pass CBR and 2-pass VBR. This lets us get more bang for the bit compared to 1-pass CBR and VBR modes.
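A toy sketch of why a 2-pass mode gets "more bang for the bit" than 1-pass CBR: pass one measures how hard each frame is, pass two spreads a fixed budget in proportion. The complexity numbers here are made up for illustration; real rate control is far more involved:

```python
def two_pass_allocate(complexities, total_bits):
    """Pass 2 of a 2-pass VBR: split a fixed bit budget across frames
    in proportion to the per-frame complexity measured in pass 1."""
    total = sum(complexities)
    return [total_bits * c / total for c in complexities]

# Pass 1 (simulated): the second frame is four times harder to encode
complexities = [1.0, 4.0, 2.0, 1.0]
alloc = two_pass_allocate(complexities, 8000)
print(alloc)  # hard frames get more bits; 1-pass CBR would give each ~2000
```

The point is that a 1-pass encoder has to guess how hard the rest of the clip will be, while the 2-pass encoder already knows.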
        • Re: (Score:2, Insightful)

          by Movi ( 1005625 )
          I know you're kind of on-topic, but this is the 2nd time I've seen you in this thread waving about what cool codecs you have. That's fine; however, what good is it when I can't hear it/see it, since I use Linux and Mac OS X. I _hate_ it when some doofus encodes something I want to see into wmv9 - this means extra hassle for me - either with mplayer and binary codecs, or WMVPlayer from Flip4Mac. The upside of _all_ other codecs mentioned here is that they're all open source (ok, exclude divx, it's superseded by x
          • Well, the OP was about how good codecs are :).

            Actually, there are more non-Windows playback options than you think.

            First, Flip4Mac can play back all VC-1 flavors and WMA Pro today. It doesn't play back the higher frequencies of WMA Pro, but they continually improve their support every release (full VC-1 Advanced Profile came in 2.1.1 last month). Downloading it seems pretty simple, but it isn't open source. And it nicely integrates with QuickTime, so once it's installed, WMV becomes just another file fo
        • And how does that help me, as a Linux user? Let me know when WMA is portable.
          • The codec has already been licensed by vendors of Linux-based devices. We offer a complete source code implementation as a porting kit.
    • If by "similar" you mean "completely clueless", yes, I know a few, but can't really see the point in propagating them. :-) If you want an article with some theory and a visual comparison of JPEG and JPEG-2000, here: s/viewarticle.jsp?id=109739 []

      (disclaimer: I wrote it, but I don't get any $$$ from page views... unfortunately :-P)

      There are lots of articles about sound codecs, but most of them seem a bit too "mystical" to me (as is typical with all things audiop
  • by Rui del-Negro ( 531098 ) on Friday June 08, 2007 @12:58PM (#19439787) Homepage
    I've just read a bit of the article and the only thing I can think of is to paraphrase Stanislaw Lem: "it always amazes me that people need a license to drive a car but can write and publish all sorts of nonsense without any clue about the subject".

    His descriptions of "temporal compression" and "motion compensation" (to name just two of the fundamental building blocks of modern video codecs) are so wrong they don't even qualify as an error. He confused delta compression with motion compensation, thinks MPEG1 lacked the latter, doesn't understand why the former is virtually useless for video... sigh... even trolled Wikipedia articles manage to be more accurate than that.

    I feel truly sorry for the people who read that and think they've learned something about the subject.
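The delta-compression vs. motion-compensation confusion described above is easy to show concretely. A minimal 1-D sketch (real codecs search 2-D blocks, of course, and the signals here are invented for illustration):

```python
def delta_residual(prev, cur):
    """Plain delta coding: subtract co-located samples, no motion search."""
    return [c - p for p, c in zip(prev, cur)]

def mc_residual(prev, cur, max_shift=3):
    """Motion compensation: find the shift of `prev` that best predicts
    `cur` (edge samples clamped), then subtract that shifted prediction."""
    best = None
    for d in range(-max_shift, max_shift + 1):
        pred = [prev[min(max(i + d, 0), len(prev) - 1)] for i in range(len(cur))]
        res = [c - p for p, c in zip(pred, cur)]
        cost = sum(r * r for r in res)
        if best is None or cost < best[0]:
            best = (cost, d, res)
    return best  # (residual energy, motion vector, residual)

prev = [0, 0, 0, 9, 9, 9, 0, 0, 0, 0]
cur  = [0, 0, 0, 0, 9, 9, 9, 0, 0, 0]   # same edge, shifted right by one

plain = sum(r * r for r in delta_residual(prev, cur))
cost, mv, _ = mc_residual(prev, cur)
print(plain, cost, mv)  # delta residual 162; motion-compensated residual 0
```

A moving object makes plain frame differencing produce large residuals on both edges of the object, while a one-sample motion vector predicts the new frame perfectly - which is why delta compression alone is nearly useless for video.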

    • by Rui del-Negro ( 531098 ) on Friday June 08, 2007 @01:13PM (#19440057) Homepage
      God, I've just read his description of DCT. It's even worse. He seems to think that DCT consists of "dividing numbers by two" (he doesn't even use the word "quantization", that probably has too many syllables). And people complain about Wikipedia...

      Time to shamelessly plug my articles about compression. Some parts are simplified (they're aimed at "end users") but, compared to this Atomic article, anything is flawless:

      Lossless (data, image, audio) s/viewarticle.jsp?id=106309 []

      Lossy + Hybrid (image, audio) s/viewarticle.jsp?id=109739 []

      Video (lossless, lossy) s/viewarticle.jsp?id=125089 []
      • Wow dude. Those are some great, fantastic articles.

      • Time to shamelessly plug my articles about compression.

        I agree with the others here, incredibly good articles!
        A very thorough introduction, and still easy to follow for a lay-person. Recommended!
        You wouldn't happen to have written any other works of similar quality you'd like to share with us? :)

        A question as well: Do you know if this Photoshop plugin (j2k) [] is a good implementation of jpeg2000? Maybe it's a stupid question, but I have no idea how strict the standard is, or if there are differences between

        • Well, in English, for Digital Media Net, I've also written an article about the exciting subject of... alpha channels:

 s/viewarticle.jsp?id=135386 []

          And I'll probably write a couple more, the first of which will be about HDR (high dynamic range) digital imaging. The problem with DMNet is they pay the same for a 3-paragraph article about "how to make your photos sharper in Photoshop" and a 20,000-word article about "how to build a working time machine and fix gl
    • Even worse is the fact they have 5 codecs. In-depth look at video codecs? More like a "really fast look". I've used more codecs when I was converting old tv shows. Divx, Xvid? WMP7? x264 and Cinepak?

      How about these tests....

      Which one is the least hassle to set up?

      Which one is the least hassle to convert from or to?

      Which one is the most accepted format? (let's be honest, APE and FLAC might be good audio codecs, too bad only a handful of players play them but Mp3s play on every player out there that I know o
      • Well, using the "must play anywhere" metric, I'd say MPEG-1 ranks the highest.

        • Divx has been gaining enough support there that it's worth looking at. I'm not looking for stuff to work on antiquated devices, but can it work on the iPod Video? Is it easy to get codecs for use with software? Can you transcode it easily (minimal processor power)? Or even can you convert it easily?

          The biggest problem any audio codec has is getting people to realize that while it might be better than MP3 for what ever reason you want to sell us, the new codec also has to be around the same size, and able
          • If it's wrapped in a common container (e.g., AVI, QuickTime) and if downloading and installing the codec is an option, then pretty much anything can be converted into anything (at least within the same container format), so the point becomes more or less irrelevant. I thought you meant "will play anywhere right 'out of the box' ", that's why I mentioned MPEG-1.

            It's rarely necessary to have faster-than-realtime transcoding, so "how long does it take" is usually less relevant than "can it be done dynamically,
  • by Hoplite3 ( 671379 ) on Friday June 08, 2007 @12:59PM (#19439803)
    I'm a bit skeptical of information in that article after reading the DCT description that described it as a rounding trick. What, is frequency-space too hard of a concept? Doesn't everyone get some Fourier analysis in college these days? You need to know it to be informed about a lot of modern data analysis.
    • If you grabbed 1000 people at random - heck, I'll even give you 1000 people between the ages of 20 and 40 - you'd be hard pressed to find 20 that have ever heard of DCT or Fourier analysis, and phenomenally lucky to find one that could actually describe what it means.

      Oh, and the answer to your question is "yes." Saying "Frequency Space" as part of a description to anyone who is not involved in either said data analysis, compression, or vibrations (my former, and sometimes current, field) is guaranteed to be
      • by Rui del-Negro ( 531098 ) on Friday June 08, 2007 @01:32PM (#19440371) Homepage
        And so you should describe it as "dividing numbers by two and then multiplying them again"...? In other words, a "simple" description is preferable, despite the fact that it's completely wrong...? Hell, "dividing numbers by two" isn't even an accurate description of quantization, let alone of a DCT.

        I think I did a pretty decent job of explaining what "frequency space" is, and why it can be used to improve compression, here: s/viewarticle.jsp?id=109739-2 []

        (scroll down to "The transformers")

        It also explains why DCT isn't a form of compression per se; it simply makes it possible to use quantization in a way that does not affect quality as much as it would in "pixel space".

        Several "non-techies" have read that and, although they realised the transform itself is not something trivial, they understood what it did and what it was used for. Something that you can't really say about the Atomic article (or its author).
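To make the point concrete: a naive, unnormalized 1-D DCT-II fits in a few lines (real codecs use scaled, fast versions, but the principle is the same). The transform just re-expresses samples as frequency coefficients and loses nothing; the loss happens only at quantization, and it's division by a step size followed by rounding, not "halving numbers":

```python
import math

def dct2(block):
    """Naive O(N^2) DCT-II: project the samples onto cosine basis functions."""
    N = len(block)
    return [sum(x * math.cos(math.pi * k * (2 * n + 1) / (2 * N))
                for n, x in enumerate(block))
            for k in range(N)]

def quantize(coeffs, q):
    """The lossy step: divide each coefficient by a step size and round."""
    return [round(c / q) for c in coeffs]

row = [100, 100, 100, 100, 100, 100, 100, 100]  # a flat 8-pixel row
coeffs = dct2(row)
# A flat row has all its energy in the DC coefficient; the AC terms vanish,
# which is exactly why smooth image areas compress so well.
print(quantize(coeffs, 50))  # -> [16, 0, 0, 0, 0, 0, 0, 0]
```

One coefficient instead of eight pixels, and the quality cost is controlled entirely by the quantizer step, not by the transform.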
      • Re: (Score:3, Interesting)

        by bodrell ( 665409 )

        If you grabbed 1000 people at random - heck, I'll even give you 1000 people between the ages of 20 and 40, you'd be hard pressed to find 20 that have ever heard of DCT or Fourier analysis, and phenominally lucky to find one that could actually describe what it means.

        Oh, and the answer to your question is "yes." Saying "Frequency Space" as part of a description to anyone who is not involved in either said data analysis, compression, or vibrations (my former, and sometimes current, field) is guaranteed to

    • by Miseph ( 979059 )
      Nope, only the people who learn how to do modern data analysis. You can pretty much assume that anyone with a liberal arts background (literature, history, law, psychology, etc.) has not.
    • Doesn't everyone get some Fourier analysis in college these days?

      No. The people who study English, Journalism and Education don't have to take any math past finger counting. OK, well, maybe decimals and fractions. It's quite rare to find someone who can understand "technical stuff" and write well enough to explain it.
  • by CaptainPatent ( 1087643 ) on Friday June 08, 2007 @01:13PM (#19440039) Journal
    So I compressed it for ease of reading:

    "Atmcmpc has n n-krdibly n-depth lûk ata wide rng of video codcs. It lûks not nly at ther iner wrkngs, but also shws thá kwality produced by each ata vriety of settngs and situashuns."

    please note a lossy codec was used for paraphrasing

  • They missed 3ivx (Score:5, Informative)

    by azav ( 469988 ) on Friday June 08, 2007 @01:27PM (#19440273) Homepage Journal
    This is not "everything you wanted to know about codecs." In fact, 3ivx just released 3ivx 5.0 for encoding to MPEG4 a few days ago.

    A bit of a bummer that an Australian website missed reviewing an Australian-created codec.

    FYI, here's the press release. And YES! It does do Linux! Tux be praised. []

  • A wide range? (Score:5, Interesting)

    by jd ( 1658 ) <> on Friday June 08, 2007 @01:45PM (#19440607) Homepage Journal
    I've seen more codecs on the back of a postage stamp. Seriously, one "modern", effectively one "old" (DivX and XviD are forks of the same original design), and one "archaic" does not make for much of a range. It doesn't even cover the spectrum of eras, never mind the spectrum of codecs.

    For those who like laundry lists, here are some codecs not listed: Dirac, Theora, Huffyuv, Lempel-Ziv-Oberhumer Codec, MNG, Cell, NV, WaveCodec, Motion JPEG and MSU Lossless Video Codec. The Wikipedia page doesn't list all of these; it took some scouting to find others, and some of the early ones are apparently only listed in the documentation of Open Source videoconferencing software I had back in the early 1990s.

    Are any of these significant, though? Well, Dirac (BBC) damn well should be - we're talking a high-definition TV quality codec from a major broadcaster with offices in most countries; it would be a logical choice for their remote bureaus to use, and a good candidate for competing with digital broadcasters in general.

    Theora - well, it would be the ideal desktop videoconferencing codec in many ways. Those in common use today are heavier than necessary, and the quality you buy with that, at the bandwidth generally available, just isn't worth it.

    Huffyuv is said to be the fastest codec on the planet by some, which is entirely possible. That would make it good for most things where CPU power is expensive but bandwidth is cheap. (Embedded systems would probably fall into that category a lot.)

    MSU's Lossless Codec is probably the slowest codec ever written, but gives by far the best compression. It makes a great reference codec to compare others against, apparently. If you could develop a decent hardware implementation, it might be a serious competitor to HD-DVD and Blu-Ray, as you could pack a comparable volume of material onto a standard DVD and therefore use already-existing commodity disks and players. All you'd need is a patch kit to add the decoder. This would likely appeal far more to consumers, as they wouldn't need to spend as much, but the studios and the manufacturers would hate and despise it for the same reason.

  • by autophile ( 640621 ) on Friday June 08, 2007 @02:00PM (#19440917)

    "Print" does not mean stripping out all graphics and ads, but leaving the number of pages the same.



  • Leaving aside how crap the article is in terms of what it does cover, it didn't even include any Wavelet CODECs (Dirac, Snow, etc.), which outperform the DCT-based methods, both for video and for still images (JPEG 2000).
    • Wavelet-based codecs SUCK for video. REALLY REALLY REALLY suck.

      You can get a bit of a boost by compressing the I-frames with wavelets and doing P/B frames classically, but that ain't giving you much of a benefit, at the cost of having to worry about the convolution of two completely different error sources.

      (and yeah, I know stuff like Snow. Those actually prove that point)
  • The Slashdotting finally eased up enough for me to get to Page 4. Earlier complaints about the complete absence of accurate facts in the technical part were dead-on. But as for the procedure, wow, it's hard to know what the relevance of the tests would be.

    40 FRAME clips? The default GOP length of most of these codecs is longer than that! There's no useful test of rate control in there, or of keyframe popping suppression.

    And as far as compression settings, all they say is "we used the defaults, but set it to highest quality". There isn't just ONE default in these products. We don't know if they're even comparing CBR and VBR, 1-pass or 2-pass. And there are lots of tweaks appropriate to different kinds of content that would be used in practice - one doesn't compress film source like cel animation!

    Sheesh, there's really no useful information here at all. The average reader would probably wind up knowing less about compression after reading it...
  • From TFA:

    an uncompressed two-hour film in digital cinema resolution and quality will clock in at about 12 terabytes,

    That number doesn't sound REMOTELY reasonable.

    4K is 4096 x 2160 = 8,847,360 pixels/frame

    32bpp per pixel == 283,115,520 bits == 34,560 kilobytes/frame

    4K is 24fps == 829,440 kilobytes/sec.

    2 hours == 7,200 seconds

    Which makes 5,971,968,000 Kbytes total or 5.56 Terabytes for 2 hours of uncompressed video.
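Redoing that arithmetic in a few lines (note that 4096 x 2160 is 8,847,360 pixels; 32 bpp and 24 fps are the parent's assumptions, and binary terabytes are used to match the units above):

```python
def uncompressed_size_tb(width, height, bpp, fps, seconds):
    """Raw (uncompressed) video stream size in binary terabytes."""
    bits = width * height * bpp * fps * seconds
    return bits / 8 / 2**40

size = uncompressed_size_tb(4096, 2160, 32, 24, 2 * 3600)
print(round(size, 2))  # ~5.56 TB, still well short of the article's 12 TB
```

Even with the corrected pixel count, the article's 12 TB figure is roughly double what these assumptions give.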

    And I agree with everyone who's already said how useless this article is. If you don't unders
