Graphics Software

What Has Happened To Fractal Image Compression?

Dennis Thrysøe asks: "In 1995/1996, Iterated Systems (Michael Barnsley) made a program that compressed and decompressed images with fractal compression technology. It was for real, pretty fast, and really worked. It was even free, unless you wanted to build your own program that compressed images using their libraries. What on earth happened to this field of technology? You can still find the same old version of 'Fractal Imager 1.1', but has it been developed further since? Has anybody else implemented anything (open/free) that really works? Fractals and iterated function systems are REALLY amazing for compressing images, so why aren't they being used more?"
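
For readers who haven't seen how it works, the core idea behind block-based fractal (PIFS) compression is: partition the image into small "range" blocks, and for each one find a larger "domain" block elsewhere in the same image whose downsampled, brightness/contrast-adjusted copy approximates it; only those mapping parameters are stored, and decoding iterates the mappings from any starting image until they converge onto the encoded picture. The toy sketch below illustrates that idea under those assumptions only -- it is not Iterated Systems' algorithm; real encoders also search rotations and flips, use quadtree partitions, and quantize and entropy-code the coefficients.

```python
# Toy sketch of PIFS (partitioned iterated function system) fractal encoding:
# for each small "range" block, find a larger "domain" block whose downsampled,
# brightness/contrast-adjusted copy approximates it, and store only the mapping.
# Illustrative only -- not Iterated Systems' implementation.
import numpy as np

RANGE = 4    # range block size (pixels)
DOMAIN = 8   # domain block size; downsampled 2x to match the range size

def downsample(block):
    """Average 2x2 pixel groups so a DOMAIN block matches a RANGE block."""
    return block.reshape(RANGE, 2, RANGE, 2).mean(axis=(1, 3))

def fit(domain, rng):
    """Least-squares contrast s and brightness o so that s*domain + o ~= rng."""
    d, r = domain.flatten(), rng.flatten()
    s, o = np.polyfit(d, r, 1)          # linear fit: r ~= s*d + o
    err = np.sum((s * d + o - r) ** 2)  # squared reconstruction error
    return s, o, err

def encode(img, step=8):
    """Return one (domain position, s, o) mapping per range block."""
    h, w = img.shape
    mappings = []
    for ry in range(0, h, RANGE):
        for rx in range(0, w, RANGE):
            rng = img[ry:ry + RANGE, rx:rx + RANGE]
            best = None
            for dy in range(0, h - DOMAIN + 1, step):
                for dx in range(0, w - DOMAIN + 1, step):
                    dom = downsample(img[dy:dy + DOMAIN, dx:dx + DOMAIN])
                    s, o, err = fit(dom, rng)
                    if best is None or err < best[0]:
                        best = (err, dy, dx, s, o)
            mappings.append((ry, rx) + best[1:])
    return mappings

def decode(mappings, shape, iterations=10):
    """Decoding: iterate the mappings from any starting image; contractive maps converge."""
    img = np.zeros(shape)
    for _ in range(iterations):
        new = np.empty(shape)
        for ry, rx, dy, dx, s, o in mappings:
            dom = downsample(img[dy:dy + DOMAIN, dx:dx + DOMAIN])
            new[ry:ry + RANGE, rx:rx + RANGE] = s * dom + o
        img = new
    return img

if __name__ == "__main__":
    test = np.add.outer(np.arange(32), np.arange(32)).astype(float)  # smooth gradient
    maps = encode(test)
    out = decode(maps, test.shape)
    print("mean abs error:", np.abs(out - test).mean())
```

The exhaustive domain search in encode() is also why fractal compression was notoriously slow to encode yet quick to decode, a point several comments below come back to.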
  • by Anonymous Coward
    The main problem is that nobody has come up with a compression algorithm better than GSLIR (Graduate Student Locked In Room).
  • It just doesn't make sense for me (or, I think, for your average computer user) to compress image files, given the currently available hardware.

    That argument makes absolutely no sense. JPG and PNG are two examples of compressed image formats that are used extensively today, in the world of gigahertz-speed processors and tens of gigabytes of storage on your typical machine. People still compress files to transfer them (self-extracting archives, .zip, .rar and .tar.gz), even though their connections are many times faster than before.

    I believe the question the author wanted answered is: what happened to fractal compression? The industry seems to have locked on to wavelet compression (for images, I believe) and still uses LZW and the more "normal" forms of compression for binary files.

    Personally, the only fractal compressor I ever saw was around '95 or so, and all it did was store the "compressed" file as bad sectors on the disk and record, in the "fractally-compressed archive" itself, where they were and how many there were. OLW, I think it was called.

  • I don't know whether that's the acronym or not, but I'm pretty sure I ran across that program. Fractal data compression on the order of 100:1, almost as quick as pkzip, [this was around '93, I think...I was young and naive then :( ] and all of this by way of simply indexing the "compressed archive" to the existing larger file, and padding it with random info. Soo.... if you deleted the old file, poof, no more working archive. And that's, I think, where tzanger got the "bad sectors" thing; rather than giving "hey, you deleted the file I needed to 'restore' your archive" as an error message, it spat some hoopla about file corruption and bad disk sectors.

    Yup; that's the one. Was it really '93? Wow, that kinda dates me hehe... I remember downloading it skeptically (back at 2400 baud...) and running it on a test program. I'm pretty sure it hid the original file (not deleted, as I'd originally stated) and then reported wondrous compression. I viewed the "archive" with good old Norton Commander (I still use this program today, and its Linux counterpart, Midnight (I call it Morton) Commander) and wondered how the hell it could store a 300k program in approximately 27 bytes (I believe all the archives were about 27 bytes long, regardless of the original size), when 15 of those bytes took up the path and filename.

    Wow this is starting to bring back memories... 232 characters a second at a time. :-)
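
For anyone wondering how a hoax like that could report 27-byte "archives" at all, here is a guess at the mechanics, sketched purely from the description above. Everything in it (file names, the .olw extension, the exact messages) is hypothetical; the point is just that the "archive" is nothing but a pointer to the hidden original, so deleting the original leaves nothing to "decompress" except a fake bad-sector error.

```python
# Hypothetical reconstruction of the hoax described above (all names made up):
# "compression" hides the original file and writes an "archive" that contains
# nothing but a reference to it; "decompression" copies the hidden file back,
# or blames the disk if the hidden file is gone.
import os
import shutil

def fake_compress(path):
    hidden = path + ".hid"
    os.rename(path, hidden)            # hide the original instead of compressing it
    archive = path + ".olw"
    with open(archive, "w") as f:
        f.write(hidden)                # the whole "archive" is just a short path string
    print(f"compressed to {os.path.getsize(archive)} bytes!")  # tiny, of course
    return archive

def fake_decompress(archive):
    with open(archive) as f:
        hidden = f.read()
    if not os.path.exists(hidden):     # original deleted? blame the hardware
        raise IOError("archive corrupt: bad sectors detected")
    shutil.copy(hidden, archive[:-len(".olw")])
```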

  • I seem to remember that there's an information theory proof that wavelets are the most efficient way of encoding the information in any given image (although I think there was a caveat that the `ideal' set of wavelets is likely to be very hard to find). It's a while since I've looked at the wavelet literature, and I don't have the reference to hand, but it can probably be turned up somewhere...

    Anyway, I guess this is why most interest is in wavelets now -- and I saw a JPEG2000 implementation listed on freshmeat the other day!
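
To make the wavelet point a bit more concrete, here is a minimal one-dimensional Haar transform, the simplest wavelet there is. It is not the "ideal" basis the parent post mentions, and real codecs like JPEG2000 use 2D wavelets plus quantization and entropy coding, but it shows the basic trick: smooth image data turns into a few large averages plus many tiny detail coefficients that are cheap to encode or drop.

```python
# Minimal sketch: one level of a 1D Haar wavelet transform, to show how wavelets
# concentrate signal energy (illustrative only; not a full image codec).

def haar_step(signal):
    """One level of the Haar transform: pairwise averages (low-pass) and differences (high-pass)."""
    averages = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    details = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return averages, details

def haar_inverse(averages, details):
    """Reconstruct the original signal exactly from averages and details."""
    signal = []
    for a, d in zip(averages, details):
        signal.extend([a + d, a - d])
    return signal

# A smooth 8-sample row of pixel values: the detail coefficients come out tiny,
# so they can be coarsely quantized or dropped -- that is where compression comes from.
row = [10, 12, 14, 14, 15, 15, 16, 18]
avgs, dets = haar_step(row)
print(avgs)                      # [11.0, 14.0, 15.0, 17.0]
print(dets)                      # [-1.0, 0.0, 0.0, -1.0]  <- small, cheap to encode
print(haar_inverse(avgs, dets))  # recovers the original row exactly
```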

  • Simply put, fractal compression was a neat idea that never really worked well. For some quick information, see the comp.compression FAQ, part 1 [mit.edu], and search for "Subject: [17]". For a long history of fractal compression, check the comp.compression FAQ, part 2 [mit.edu], and search for "Subject: [77]".
  • It's because everyone is into wavelet compression now...
  • Before any image format becomes so widespread that you can't help but notice it everywhere, it needs built-in support in Netscape and IE -- at least. Generally it needs that support in at least two full versions before content creators think it's safe to use (i.e., support is included in version X.0 but isn't used until version X+1.0).

    Apart from that, fractal image compression probably is being used all 'round the place, just in internal applications and closed communications that you and I don't see.

  • by Christopher Thomas ( 11717 ) on Sunday September 10, 2000 @05:40AM (#791185)
    I think that the licensing issue *might* be what killed it, or at least stunted its growth.

    I have doubts about that. The license would have been for one specific type of fractal compression; however, the idea was known to enough people that if licensing was a big issue, several other groups would have created their own replacement based on any of a variety of algorithms. I'm sure I'm not the only hobbyist who played with my own compression programs in those days.

    I think that the first poster's explanation is the most plausible. Years ago, fractal compression was the latest toy that programmers played with in their free time. Now it's wavelet compression, or something other than compression entirely.

    There will still be people playing with fractal compression in their free time. If it's useful, we may see results eventually.
  • Wouldn't you know it, I still have a CD full of .fif files somewhere, and nowhere to view or use them. These FIFs sure were nice, but are now nowhere to be seen.
  • "Days of runtime on a Cray" doesn't really tell that much. The Cray is a finicky beast. If your problem doesn't suit the machines architecture (i.e. vector operations), the performance isn't all that spectacular.

  • ...rather than giving "hey, you deleted the file I needed to 'restore' your archive" as an error message, it spat some hoopla about file corruption and bad disk sectors.

    God, that's way too much information on such a tidbit. I .did. get hit with this sucker, though, and it did teach me to trust the "wondrous technology" a little less, and use a little more common sense.

    I just hope it's more informative than it is off-topic ;)
  • Several factors combined to kill it for now.
    1. lots of marketing claims on performance and availability that didn't hold up
    2. not very robust; certain images produced visible artifacts. The algorithm needed more work and less marketing
    3. highly proprietary implementation prevented more people from playing with it or improving it.
    4. complex and difficult mathematics, not taught in very many places, limited the ability to incorporate it into custom hardware
    5. did I mention the heavy BS factor?
  • Altamira [altamira-group.com] produces a Photoshop plugin based on Iterated's FIF.

    I sometimes use it to send high-quality images by email; you just have to be sure the person you are sending to has the same program.

    Not sure what happened to Iterated's own app, although I seem to remember reading on their site way back that they were working with someone else on a streaming video application for it.

    Maybe they sold it, or maybe they just kept quiet about it until they had a final product (and if there are better methods now, it could just have been dropped).

  • I seem to remember reading (maybe even on Slashdot) that the inventor of the fractal compression scheme had got a patent (or several), and decided that he just wanted to sit on that patent and not do anything with it. Hence, nothing has happened recently with fractal compression.

    I also remember that the _compression_ algorithm was, erm, slooooow. So slow that it needed some huge machines chugging for a long time to spit out one fractally-compressed image. But the decompression was pretty quick (although slower than jpeg).

    There was a floppy with a quiz using FIF pictures on one of the UK PC magazines a few years back... shame I still don't have the floppy, as the pictures were _very_ nice quality and incredibly compressed.
  • A fast compression algorithm is not as important as a fast decompression, because compression is only used while creating the image, while decompression is used every time one displays the image. Because of this, the compression algorithm was probably not that much optimized.


    I completely agree, as this is pretty obvious. However, the amount of math involved in generating the compressed images was prohibitive, so the bits saved transmitting or storing the image(s) could easily be outweighed by the additional cost of compressing the image to achieve that saving.

    Besides, at the time the algorithm they were using was supposed to be `highly optimized' by some very clever maths types, but it still needed several days of runtime on a Cray (this was a good few years ago) to compress a single image. I'd call that a prohibitive cost ;-)
  • At that time I was developing multimedia CD-ROMs, and a better codec would have been helpful. I repeatedly told them to wrap their code in a QuickTime codec, so I could compare it side by side with what I was already using, and so I wouldn't have to change the code of my app. They never did, and always wanted you to use their SDK.
    Their image compression claims were very overblown, and their examples were always the same few images that compressed really well (that iguana image being the most prominent).
    The lesson here for new codecs: wrap it in QuickTime, and then the content-holders can try it out with very little effort.
  • "It just dosen't make sense for me (or, I think, for your average computer user) to compress image files, given the currently available hardware."

    You are compressing (or at least decompressing) images. JPEG and GIF and PNG and others are all compressed image formats. If they weren't, you would be downloading more than ten times as much data to view every webpage.

    Example: The Slashdot logo, top left on every page. It's a 3,473 byte GIF. Uncompressed, it's 59,848 bytes.

    Even with today's technology, data compression is a BIG DEAL, for images or anything else.

    I think that the licensing issue *might* be what killed it, or at least stunted its growth. Would we be using GIFs so much had Unisys been a tyrant about its little patent back when GIFs were catching on? (I sure hope Unisys wasn't a tyrant back then, because otherwise my argument is dead...)
  • Many image compression methods are currently in existence, with varying degrees of success. Fractal image compression isn't an incredible storage method, compression-wise. One of the biggest disadvantages is that we're applying 2D functions to a 3D-scene-mapped-to-2D image. Another is the fact that, while fractals create subtle detail, it's generally not the detail you want. Picture applying it to the new FBI fingerprint database - could you imagine incorrect pores and ridges? That's why they went with a wavelet-based compression method (it also scored notably better than JPEG compression, because JPEGs create small tiling artifacts that distorted the prints too much, and had sharp cutoffs at low quality levels).

    - Rei
  • I agree with all the posts here guessing at the shiny-new-toy tendency. I think that plays a part.

    When asking the above question, I actually hoped that someone who knew a lot about the licensing and patents in this field would come forward and fill the rest of us in. I know for a fact that Michael Barnsley from Iterated (formerly Iterated?) has a bunch of patents on this.

    Does that in effect mean that nobody else can implement an open (or commercial for that matter) library based on IFS?

    How to do it is pretty accessible knowledge, but I don't know if it's legal at all.

    -dennis

  • Here's something I found while searching for the Photoshop plug-in "Mr. Sid", which I've seen one of my consulting friends use to really cut down the file size of images for print media... It looked like it was based on fractal technology (and I'm sure it is, somewhat), but it was referred to as the "infant child" of JPEG 2000 - so I guess this won't be anything new.

    It's called Genuine Fractals 2.0, and it's made by the Altamira Group [altamira-group.com]. It's also a Photoshop plug-in. They make some pretty huge claims, but I've seen this stuff work, and if their output is anywhere near as good as Mr. Sid's, it should be sweet. Here's a clip from the site:

    You need only between 15MB and 40MB of RGB data to capture an image for any size output. For example, you could scan a 4" x 5" transparency at 600 dpi to produce a 20MB RGB file, do all your image editing at that scan resolution, then encode the image. Depending on the image, your encoded file will typically be 2MB-10MB. Now, if you need to output the image at 450MB and 60MB, you can generate both resolutions from the same encoded file.


    For smaller print-quality output, you can start with a 4MB-5MB original, encode to less than a megabyte, then render the image easily and beautifully to 20MB.

    For screen-resolution output, you can start even smaller. For example, using Genuine Fractals' 50 Web Graphics options, a 640 x 480-pixel original compresses to between 10KB and 150KB and renders a high-quality image for quick full-screen display on the Web.

  • "C.A.T. compresses, encrypts data, video: Lafe Technologies' C.A.T. (cellular automata transforms) technology enhances data compression and encryption. Sold as QuikCAT software, it is compatible with Unix workstations or DOS/Windows-based PCs." Just found this while looking for porn. Looks very interesting. And can you get any geekier than CAs? P.S. their website is www.lafetech.com, which uses Flash "extensively", though.
  • My 1995 computer:

    66Mhz 486

    8MB Ram

    1GB HD

    3.5" 1.44MB Floppy

    5.25" Floppy

    4x CD-ROM

    Slow (I forget how slow) modem.

    My current Computer:

    533MHz Celeron

    128MB RAM

    13GB Hard-Drive (Two Drives)

    3.5" Floppy

    4x4x32 CD-RW

    T1 Internet Access (College LAN)

    In 1995, I couldn't easily transfer or store large files. The hassle factor of dealing with compressing and decompressing files was negligible compared to the hassle factor of transferring the files to my PC and storing them on removable media. Today, I can transfer large files to my computer by a number of network protocols, and can store them on a CD-R very easily. It just doesn't make sense for me (or, I think, for your average computer user) to compress image files, given the currently available hardware.

  • The differences between a fractal-compressed image and other compression schemes (JPEG, RLE, LZW) are much less significant on modern hardware than on the older hardware. I'm certainly not arguing that I don't need or want any image compression at all. That would be way too inefficient, especially on my college connection, already bogged down with MP3s. Thank god they're compressed!
  • I should have included this in my first post, but what I meant to say was not that no compression is needed, but that the gains of fractal compression over JPEG, etc. are small compared to the ease of use of JPEG, since it is supported by nearly everything.
  • AFAIK, one key advantage of fractal compression over JPEG is that it is resolution independent, i.e., you can zoom in without seeing pixel artifacts. This is very useful if you intend to display an image on devices with vastly different resolutions (like screen and printer). Of course, the picture details have to be there in the first place.
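
That resolution independence falls straight out of what an iterated function system is: the stored data is a handful of contractive maps, and the decoder renders their attractor onto whatever grid size you ask for, regenerating detail rather than interpolating pixels. Here is a tiny illustration with the classic Sierpinski triangle maps (a textbook IFS, not a compressed photograph); note, as the parent says, that the regenerated detail is only as good as what the maps encode.

```python
# Sketch of IFS resolution independence: the "image" is three contractive affine
# maps, and the chaos game renders their attractor onto a grid of any size.
# Zooming in yields more structure, not bigger pixels. (Classic Sierpinski
# triangle maps; purely illustrative, not a fractal photo codec.)
import random

MAPS = [
    lambda x, y: (0.5 * x,        0.5 * y),        # shrink toward corner (0, 0)
    lambda x, y: (0.5 * x + 0.5,  0.5 * y),        # shrink toward corner (1, 0)
    lambda x, y: (0.5 * x + 0.25, 0.5 * y + 0.5),  # shrink toward corner (0.5, 1)
]

def render(size, points=200_000):
    """Render the attractor of MAPS as ASCII art on a size x size grid."""
    grid = [[" "] * size for _ in range(size)]
    x, y = random.random(), random.random()
    for i in range(points):
        x, y = random.choice(MAPS)(x, y)
        if i > 20:  # let the point settle onto the attractor first
            grid[int(y * (size - 1))][int(x * (size - 1))] = "#"
    return "\n".join("".join(row) for row in reversed(grid))

# The same three maps render crisply at 24x24 or 480x480 -- no stored pixels.
print(render(24))
```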
  • ..._compression_ algorithm was, erm, slooooow.

    A fast compression algorithm is not as important as a fast decompression, because compression is only used while creating the image, while decompression is used every time one displays the image. Because of this, the compression algorithm was probably not that much optimized.
  • We (an IT consultancy) used fractal image compression quite a bit in the early days, paid for a compressor board, etc., and it was great. The people saying it produced bad artifacts? I never saw any unless you pushed the compression right to the limit (say, a 150k image down to 3k), at which stage JPEG was just a multi-coloured checkerboard anyway.

    But this was in the DOS days. When we wanted to develop Windows 3 software, there were no decompression libraries, so we had to use something else as a stopgap: JPEG for large images and simple RLE for small images (this was for an image database on CD-ROM project).

    And then when the Windows software finally came out, it was terrible - images that used to decompress in memory in a fraction of a second now took 10, 20 or 30 seconds to decompress! This was no use whatsoever, so we just gave it up as another good technology gone bad, where the later developers they hired after the initial startup just had no idea....

    Tim

"May your future be limited only by your dreams." -- Christa McAuliffe

Working...