Comment Re:Measure sharpness? (Score 1) 291

Merely resaving an image with a higher quality setting won't magically repopulate the high frequency coefficients, so I think this will still work in that case. Of course, if the image has gone through any sort of filtering or modification before being resaved, then all bets are off. But that's also beyond the scope of what the original poster asked for - he wanted a way to detect JPEG compression artifacts, not any extra filtering that might have also been applied.

Comment Look at the DCT coefficients (Score 3, Informative) 291

JPEG works by breaking the image into 8x8 blocks and doing a two-dimensional discrete cosine transform on each of the color planes for each block. At this point, no information is lost (except possibly by some slight inaccuracies converting from RGB to the YCbCr color space JPEG uses). The step where the artifacts are introduced is quantizing the coefficients. High frequency coefficients are considered less important and are quantized more than low frequency coefficients. The level of quantization is raised across the board to increase the level of compression.

Now, how is this useful? The reason heavy quantization results in higher compression is that the coefficients get smaller. In fact, many become zero, which is particularly good for compression - and the high frequency coefficients in particular tend toward zero. So partially decode the images and look at the DCT coefficients. The image with more high frequency coefficients that are zero is likely the lower quality one.
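To make the quantization step concrete, here's a rough Python sketch: a naive textbook 2-D DCT-II, with a made-up flat quantization step instead of a real JPEG quantization table (real encoders use a different step per coefficient, but the effect on high frequencies is the same).

```python
import math

def dct2(block):
    """Naive 2-D DCT-II of an 8x8 block (the transform JPEG uses per block)."""
    n = 8
    out = [[0.0] * n for _ in range(n)]
    for u in range(n):
        for v in range(n):
            s = 0.0
            for x in range(n):
                for y in range(n):
                    s += (block[x][y]
                          * math.cos((2 * x + 1) * u * math.pi / (2 * n))
                          * math.cos((2 * y + 1) * v * math.pi / (2 * n)))
            cu = math.sqrt(1 / n) if u == 0 else math.sqrt(2 / n)
            cv = math.sqrt(1 / n) if v == 0 else math.sqrt(2 / n)
            out[u][v] = cu * cv * s
    return out

def zero_high_freq(block, quant_step):
    """Quantize the DCT and count zeroed high-frequency coefficients (u+v >= 8)."""
    coeffs = dct2(block)
    return sum(1 for u in range(8) for v in range(8)
               if u + v >= 8 and round(coeffs[u][v] / quant_step) == 0)

# Hypothetical textured block of pixel values.
block = [[(x * 7 + y * 13) % 64 for y in range(8)] for x in range(8)]
low_q = zero_high_freq(block, quant_step=50)   # coarse quantization
high_q = zero_high_freq(block, quant_step=2)   # fine quantization
assert low_q >= high_q  # coarser quantization zeroes more high-frequency terms
```

Any coefficient that rounds to zero at a fine step also rounds to zero at a coarse one, so the zero count can only grow as quantization gets heavier - which is exactly the signal to compare between the two files.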

Comment Re:Measure sharpness? (Score 1) 291

No, it doesn't. This method has another problem (see my replies to it), but other than that, it could work. He's suggesting that for each copy of the image, you look at the difference between that copy and a blurred version of it. This will give you an idea of how sharp that copy is. And since JPEG throws out high frequency information first, resulting in blurring, it would appear at first glance that the sharper image should be the higher quality one.

As I said in another comment though, JPEG operates on blocks, and especially at very low qualities, you get sharp edges between each block. So the assumption that sharp image == high quality is not really valid.
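For reference, the parent's heuristic could be sketched like this (hypothetical tiny grayscale images as lists of lists; a simple box blur stands in for whatever blur you'd actually use):

```python
def box_blur(img):
    """3x3 box blur (edges clamped) on a 2-D grayscale list-of-lists."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [img[min(max(y + dy, 0), h - 1)][min(max(x + dx, 0), w - 1)]
                    for dy in (-1, 0, 1) for dx in (-1, 0, 1)]
            out[y][x] = sum(vals) / 9.0
    return out

def sharpness(img):
    """Energy of (image - blurred image): a crude high-frequency estimate."""
    blurred = box_blur(img)
    return sum(abs(img[y][x] - blurred[y][x])
               for y in range(len(img)) for x in range(len(img[0])))

# Hypothetical test pattern: a hard vertical edge vs. a pre-blurred copy of it.
sharp_img = [[0 if x < 4 else 255 for x in range(8)] for _ in range(8)]
soft_img = box_blur(sharp_img)
assert sharpness(sharp_img) > sharpness(soft_img)
```

But note the failure mode above: JPEG's 8x8 block boundaries introduce artificial sharp edges at low quality, and this metric would count those as "detail" too.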

Comment Re:Measure sharpness? (Score 1) 291

Also, JPEG works on blocks. While it's true that JPEG gets rid of high frequency details first (and thus results in blurring), this is only useful within each block. You can have high contrast areas at the edge of each block, and this is actually often some of the most annoying artifacting in images compressed at very low quality. So just because it has sharp edges doesn't mean it's high quality.

Comment Re:Measure sharpness? (Score 3, Insightful) 291

Even faster is to look at the DCT coefficients in the file itself. Doesn't even require fully decoding the image - JPEG compression works by quantizing the coefficients more heavily at higher compression rates, particularly the high frequency coefficients. If more high frequency coefficients are zero, it's been quantized more heavily, and it's lower quality.

Now, it's not foolproof. If one copy went through some intermediate processing (color dithering or something) before the final JPEG version was saved, it may have lost quality in ways not accounted for by this method. Comparing the quality of two differently sized images is not straightforward either.
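A related shortcut that needs no entropy decoding at all is to read the quantization tables straight out of the file's DQT (0xFFDB) segments: larger table entries mean coarser quantization, i.e. lower quality. A rough sketch (the byte string at the end is a synthetic fragment made up for illustration, not a complete JPEG):

```python
import struct

def quant_table_sums(jpeg_bytes):
    """Sum each quantization table found in a JPEG's DQT (0xFFDB) segments.
    Larger sums mean coarser quantization, i.e. lower quality."""
    sums = []
    i = 2  # skip the SOI marker (0xFFD8)
    while i + 4 <= len(jpeg_bytes):
        if jpeg_bytes[i] != 0xFF:
            break
        marker = jpeg_bytes[i + 1]
        length = struct.unpack(">H", jpeg_bytes[i + 2:i + 4])[0]
        if marker == 0xDB:  # DQT segment
            payload = jpeg_bytes[i + 4:i + 2 + length]
            j = 0
            while j < len(payload):
                precision = payload[j] >> 4  # 0 = 8-bit entries, 1 = 16-bit
                j += 1
                n = 64 * (2 if precision else 1)
                entries = payload[j:j + n]
                vals = struct.unpack(">64H", entries) if precision else list(entries)
                sums.append(sum(vals))
                j += n
        if marker == 0xDA:  # start of scan: entropy-coded data follows
            break
        i += 2 + length
    return sums

# Synthetic header with one 8-bit DQT table (values 1..64), for illustration.
fake = b"\xff\xd8" + b"\xff\xdb" + struct.pack(">H", 67) + b"\x00" + bytes(range(1, 65))
assert quant_table_sums(fake) == [sum(range(1, 65))]
```

This still has the same caveats as counting zeroed coefficients: it says how heavily the last save quantized the data, not what was lost in any processing before that save.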

Comment Re:What are we trying to achieve? (Score 4, Informative) 427

Off by one. Linux deprecated OSS3, and OSS4 is now open source.

And not only does it work better (in my admittedly little experience with it), it's also more in keeping with the UNIX philosophy of treating devices just like any other file. Sure, with ALSA you do have device files, but you pretty much have to use alsalib to use them AFAIK. With OSS, you get to use the standard UNIX file APIs.
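As a sketch of what that buys you (hypothetical Python using nothing but the standard file APIs; real OSS code would also call ioctl() to set the sample rate and format before writing):

```python
import math
import os

def sine_samples(freq=440, rate=8000, secs=1):
    """One second of 8-bit unsigned mono sine samples."""
    return bytes(
        128 + int(127 * math.sin(2 * math.pi * freq * t / rate))
        for t in range(rate * secs)
    )

def play_tone_oss(path="/dev/dsp"):
    """Write a tone to an OSS device with plain open(2)/write(2) - the
    device really is just a file, no sound library required."""
    fd = os.open(path, os.O_WRONLY)
    try:
        os.write(fd, sine_samples())
    finally:
        os.close(fd)

if os.path.exists("/dev/dsp"):  # only present when an OSS driver is loaded
    play_tone_oss()
```

Compare that with ALSA, where you'd pull in alsalib and its own API just to push samples at the hardware.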

Comment Re:NOT amd64 (Score 1) 251

I had heard that V8 was 32-bit only right now, which is why I was surprised that there was an amd64 package. But everything I've seen online (in the admittedly small amount of searching I've done) indicates that 64-bit support is a low priority. I even saw it mentioned somewhere that the code makes various non-portable assumptions such as sizeof(int)==sizeof(void*), which, if true, means they really weren't planning for 64-bit support when they started. I hope they add proper 64-bit support soon, but I'm not holding my breath.
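That assumption is exactly what breaks on LP64 platforms (typical 64-bit Linux), where int stays 4 bytes while pointers grow to 8. You can see the mismatch from Python via ctypes:

```python
import ctypes
import struct

# sizeof(int) == sizeof(void*) holds on 32-bit targets, but on LP64
# (typical 64-bit Linux) int is 4 bytes while pointers are 8.
int_size = ctypes.sizeof(ctypes.c_int)
ptr_size = ctypes.sizeof(ctypes.c_void_p)
print(f"sizeof(int) = {int_size}, sizeof(void*) = {ptr_size}")

# struct's "P" format reports the same native pointer size.
assert ptr_size == struct.calcsize("P")
```

Code that stuffs a pointer into an int truncates the top 32 bits on such a platform, which is why the assumption rules out a straightforward 64-bit port.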

Comment Re:NOT amd64 (Score 1) 251

In the sense that it doesn't pull in the 32-bit dependencies, yes. But rather than fix it, I'll just uninstall. Chrome should hopefully support 64-bit properly at some point, and I'd rather wait until then to try it out than install a bunch of 32-bit libraries for the sole purpose of running a very alpha browser that I'll likely play around with for all of five minutes.

Comment NOT amd64 (Score 4, Informative) 251

A friend wrote up a Gentoo ebuild for it, which I went and installed (the amd64 version - I run an almost entirely 64-bit system). I tried to run it and got this message:

/opt/google/chrome/chrome: error while loading shared libraries: libgconf-2.so.4: cannot open shared object file: No such file or directory

That's odd ... double check ... yes, /usr/lib64/libgconf-2.so.4 exists ... No ... they couldn't have ...

$ file /opt/google/chrome/chrome
/opt/google/chrome/chrome: ELF 32-bit LSB executable, Intel 80386, version 1 (SYSV), dynamically linked (uses shared libs), for GNU/Linux 2.6.8, stripped

*facepalm*

The 64-bit Chrome is *NOT* 64-bit, and will not run on 64-bit systems which are missing a number of 32-bit libraries.
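For what it's worth, the check `file` does here can be reproduced in a few lines: the fifth byte of an ELF header (EI_CLASS) says whether the binary is 32- or 64-bit. A small sketch:

```python
def elf_class(header):
    """Return 32 or 64 from the first bytes of an ELF file (EI_CLASS byte)."""
    if header[:4] != b"\x7fELF":
        raise ValueError("not an ELF file")
    return {1: 32, 2: 64}[header[4]]

# e.g.: with open("/opt/google/chrome/chrome", "rb") as f: elf_class(f.read(5))
assert elf_class(b"\x7fELF\x01...") == 32  # an i386 binary reports 32
```

A 32-bit result on an amd64 package is exactly the facepalm above.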

Comment Re:What? (Score 4, Informative) 313

Main Concept being the best overall.

Oh? This (and this follow-up post) seems to indicate that it's not so clear-cut. It looks like x264 beat MainConcept in most tests, and the major tests it lost were rather unrealistic.

But in the interest of full disclosure, Dark Shikari is one of the main developers on x264, so he's got an obvious bias. Doesn't necessarily make him wrong though.

Comment Re:Using an iPhone makes you look pretty lame? (Score 1) 884

The inclusion of a real web browser isn't really that important in the Japanese market. In Japan, probably more people browse the web on their cellphone than on a computer. This means that, by and large, Japanese websites are made with the limited browsers in mind in the first place, though many sites will check the user agent or similar to serve separate versions for computers and cellphones. Because there was demand for it, the mobile web already worked rather well in Japan, and throwing Safari onto a cellphone there doesn't really change things much.

Comment Re:Elasticity of Demmand (Score 1) 763

Obviously, you can only cut the price so far because you need to make some profit per unit

That doesn't really apply to most videogames. The actual cost per unit is the cost of the disc and packaging, so almost nothing. All the development, production, testing, etc. are fixed costs no matter how many units you sell, so their "cost per unit" is really a very fuzzy concept, depending on many different factors. Valve's little experiment here is a perfect example of how lowering the price can even lower the cost per unit.
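To put toy numbers on that (all figures hypothetical): if development is a fixed cost and the marginal cost per copy is roughly the disc, then a price cut that sells enough extra copies drives the average cost per unit down.

```python
def cost_per_unit(fixed_cost, marginal_cost, units_sold):
    """Average cost per unit when development is a fixed cost."""
    return fixed_cost / units_sold + marginal_cost

# Hypothetical: a $10M game with ~$1 marginal cost (disc + packaging).
full_price = cost_per_unit(10_000_000, 1.0, units_sold=100_000)    # $101/unit
discounted = cost_per_unit(10_000_000, 1.0, units_sold=1_000_000)  # $11/unit
assert discounted < full_price
```

Which is why "you need to make some profit per unit" is the wrong frame: the question is whether total revenue at the lower price covers the fixed cost, not whether each unit clears some fixed per-unit number.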
