Ask Slashdot: Is There An Open Source Tool Measuring The Sharpness of Streaming Video? 101
dryriver asks:
Is there an open source video analysis tool available that can take a folder full of video captures (e.g. news, sports, movies, music videos, TV shows), analyze the video frames in those captures, and put a hard number on how optically sharp, on average, the digital video provided by any given digital TV or streaming service is?
If such a tool exists, it could be of great use in shaming paid video content delivery services that promise proper "1080 HD" or "4K UHD" quality content, but deliver video that is actually YouTube quality or worse. With such a tool, people could channel-hop across their digital TV service's various offerings for an hour or so, capture the video stream to hard disk, and then have an "average optical sharpness score" calculated for that service that can be shared with others and published online, possibly shaming the content provider -- satellite TV providers in particular -- into upping their bitrate if the score turns out to be atrociously low for that service....
People in many countries -- particularly developing countries -- cough up hard cash to sign up for various satellite TV, digital TV, streaming video and similar services, only to then find that the bitrate, compression quality and optical sharpness of the video content delivered isn't too great at all. At a time when 4K UHD content is available in some countries, many satellite TV and streaming video services in many different countries do not even deliver properly sharp and well-defined 1080 HD video to their customers, even though the content quality advertised before signing up is very much "crystal clear 1080 HD High-Definition".
What's the solution? Leave your thoughts and suggestions in the comments.
And is there an open source tool measuring the sharpness of streaming video?
Re: What the hell? (Score:2, Insightful)
If you need a tool to tell, does it really matter?
Re: (Score:2)
Re: (Score:2)
Re: (Score:2, Insightful)
sounds pretty simple to me: submitter has amassed a ton of pirated media with many duplicates. now they want an automated means of choosing the highest quality files (bit rate and/or resolution alone won't necessarily do that) from those dupes so they can trash the rest.
Re: (Score:2)
It may seem that way, but it is a real issue. Watching a football (US) game on Comcast, live, it is easy to see the lack of visual quality as the camera moves and the field just becomes a blur of green, which returns to detail once the image stops moving. Movies from Netflix/Amazon are often blocky (low-res image) even with a 1080p image choice. How can we hold providers to account and/or make an informed choice as to which service provider is better? We don't watch the same instance of a game on two prov
Copy protection? (Score:2)
You'll probably run into copy-protection issues with many media sources. If you can get at the bits to analyze them, you can dump them out to create a copy. I'm not even sure sharpness is going to be easy to measure, if a soft/low-res source has been artificially sharpened to give the appearance of quality. Sports broadcasts seem to be the worst for it, in my experience, with glowing halos around high-contrast edges to make them stand out even more.
Re: (Score:2)
Re: (Score:3)
Theoretically, assuming your hardware is capable of doing it properly, a progressive-scan (p) video would need to have nearly twice the framerate of an interlaced (i) video, in order to achieve the same motion smoothness.
Re: (Score:2)
Yeah, any tool that was capable of doing this on DRM'd sources would be sued to oblivion and back.
Re: (Score:2)
The technical solution to this would be to
a) split the video signal between an HDCP compliant screen and a capture card (such as black magic or preferably dektec)
b) Capture the frames and perform full frame FFTs to identify frequency distribution. A large area DCT would work extremely well. Another option is to use wavelets and identify act
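The frequency-distribution idea above can be sketched with a toy block DCT and a high-frequency energy ratio. This is a pure-Python illustration only: the naive O(N^4) transform, 8x8 block size, and `cutoff` value are arbitrary assumptions for the demo, and a real tool would use an FFT/DCT library over full frames.

```python
import math

def dct2(block):
    """Naive 2-D DCT-II of a square block (O(N^4); fine for 8x8 demos)."""
    n = len(block)
    out = [[0.0] * n for _ in range(n)]
    for u in range(n):
        for v in range(n):
            s = 0.0
            for x in range(n):
                for y in range(n):
                    s += (block[x][y]
                          * math.cos(math.pi * (2 * x + 1) * u / (2 * n))
                          * math.cos(math.pi * (2 * y + 1) * v / (2 * n)))
            out[u][v] = s
    return out

def high_freq_ratio(block, cutoff=2):
    """Fraction of spectral energy in coefficients with u + v >= cutoff.

    A heavily blurred or low-resolution-upscaled frame concentrates its
    energy in the low-frequency coefficients, so this ratio drops.
    """
    coeffs = dct2(block)
    n = len(coeffs)
    total = sum(c * c for row in coeffs for c in row)
    high = sum(coeffs[u][v] ** 2 for u in range(n) for v in range(n)
               if u + v >= cutoff)
    return high / total if total else 0.0

n = 8
# A hard vertical edge vs. a smooth ramp covering the same brightness range:
edge = [[0.0] * 4 + [1.0] * 4 for _ in range(n)]
ramp = [[y / (n - 1) for y in range(n)] for _ in range(n)]
# The crisp edge carries far more high-frequency energy than the ramp.
assert high_freq_ratio(edge) > high_freq_ratio(ramp)
```

Note this measures frequency content, not "quality": as other comments point out, artificial sharpening would inflate exactly this kind of score.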
Re: (Score:1)
That will still be a problem, as any properly HDCP-compliant capture card will not (or should not!) give you access to the data either. Every device should check the downstream devices are compliant before passing any data on, and not provide any way to intercept it. Any that knowingly do are breaking the HDCP licensing agreement and are likely to be sued.
Sharpness is an illusion (Score:5, Informative)
But your brain has special cells which recognize transitions between light and dark, and identifies them as an edge. When the exaggerated transition from an unsharp masked edge hits those brain cells, they get more excited and signal the rest of your brain that this is a really strong edge. Thus creating the illusion that you are seeing a sharper image, when objectively it's a degraded image.
So any algorithm which detects "sharpness" as interpreted by the brain would actually rate inferior (heavily processed) video streams higher than video streams conveying the maximum amount of information possible. It's why unsharp masking is typically added by the TV or video player, rather than incorporated into the original video stream. (A slight amount of sharpening is done to counteract the blurring caused by the Bayer filter used in camera sensors, but that's another story.)
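The exaggerated transition described above can be demonstrated with a minimal 1-D unsharp-mask sketch (pure Python; the 3-tap box blur here is a stand-in assumption for the Gaussian blur real sharpening filters typically use):

```python
def box_blur(signal):
    """Blur with a 3-tap box filter, clamping at the edges."""
    n = len(signal)
    return [
        (signal[max(i - 1, 0)] + signal[i] + signal[min(i + 1, n - 1)]) / 3
        for i in range(n)
    ]

def unsharp_mask(signal, amount=1.0):
    """Classic unsharp mask: sharpened = original + amount * (original - blurred)."""
    blurred = box_blur(signal)
    return [s + amount * (s - b) for s, b in zip(signal, blurred)]

step_edge = [0.0] * 5 + [1.0] * 5
sharpened = unsharp_mask(step_edge, amount=1.0)
# The filter overshoots on both sides of the step: output values exceed the
# original 0..1 range. That overshoot is the exaggerated light/dark
# transition the brain reads as "sharper", even though it is new, false data.
assert max(sharpened) > 1.0 and min(sharpened) < 0.0
```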
Re: (Score:2)
How does it lose information if it changes the information it finds?
That sounds pretty odd
Re:Sharpness is an illusion (Score:4, Insightful)
Sharpening the image is an inherently lossy process. You're replacing a more detailed image containing a soft edge with something that has a crisp edge with less gradation. You can't then reverse the process to get back to the original image, since there'd be no way to tell the difference between things that actually had a sharp edge in the original image and things that had merely had their edges sharpened after the fact. They'd both look the same.
Frankly, sharpness is a really lousy measure for determining how good an image is. It doesn't make images look better, in and of itself, and it's something that the user can already introduce themselves if that's where their preferences lie, simply by bumping up the sharpness on their TV. Of course, if your goal is to have the highest fidelity image, you shouldn't set your TV's sharpness to anything over 0. Back in the day of analog CRTs it used to be the case that sharpness might improve the fidelity, but ever since the switch to digital LCDs anything more than 0 adds sharpness that wasn't there in the original image, thus decreasing the visual fidelity of the resulting image.
Re: (Score:2)
“Looks worse” is a matter of subjective preference, which I specifically addressed, but when it comes to objective measures (e.g. fidelity), yes, it’s definitively worse. Feel free to sharpen things on your end if that’s your preference, but don’t foist your (bad) taste on everyone else.
Re: (Score:3)
Because a sharpened image over-writes some parts of the image with false information, see a good example here:
https://www.youtube.com/watch?... [youtube.com]
Re: (Score:2)
The questioner is hopelessly confused anyway. For example, YouTube has some of the best streaming quality, especially with 4k videos. What makes YouTube videos look bad is when the person editing and uploading the video doesn't know what they are doing.
It certainly would be possible to create some kind of measure of video quality, but it could be gamed and would be of limited use. All you can do is verify that the specs that the streaming services claim are true (e.g. 1080p frames are being sent) and look a
quality vs practicality (Score:1)
Video quality is a function of codec and bitrate. It will always be a compromise between image quality, file size and decoding power. Even if the service wanted to serve users super-sharp, high-preset H265 video + FLAC audio, most users would be unable to use it - file sizes would eat their download caps and their netbooks/phones would be unable to cope with decoding. Obviously on the service side they also don't want to pay for storage and transfer of those huge files, especially if they would be hardly watched
Video metrics (Score:4, Informative)
The only thing you can more or less do on the broadcast video is identify encoding artifacts like macroblocks, mosquito noise and deinterlacing side effects. MPEG2 streams are easy to identify; H264 and H265 are slightly more difficult. I do not know of a packaged solution for this. Beware of news stations that use low-fi mobile feeds!
Try this (Score:3)
Already a solved problem (Score:2, Informative)
There are already algorithms for doing this, like PSNR, SSIM, and VMAF. Here's a good read: https://medium.com/netflix-techblog/toward-a-practical-perceptual-video-quality-metric-653f208b9652
And there are already commercial solutions that apply these metrics as well as network analysis etc. across a range of video delivery methods, e.g. https://www.telestream.net/iq/
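Of the metrics named above, PSNR is the simplest: just a log-scaled mean squared error against a reference frame. A minimal sketch (pure Python; flat lists of 8-bit pixel values stand in for real decoded frames):

```python
import math

def psnr(reference, degraded, peak=255.0):
    """Peak signal-to-noise ratio in dB between two equal-size frames,
    given as flat lists of pixel values. Higher is closer to the reference."""
    mse = sum((r - d) ** 2 for r, d in zip(reference, degraded)) / len(reference)
    if mse == 0:
        return float("inf")  # identical frames
    return 10.0 * math.log10(peak * peak / mse)

ref = [100, 120, 140, 160]
mild = [101, 119, 141, 159]    # small errors -> high PSNR
harsh = [90, 130, 150, 150]    # larger errors -> lower PSNR
assert psnr(ref, mild) > psnr(ref, harsh)
```

Note the catch the Netflix post discusses: PSNR (like SSIM and VMAF) is a full-reference metric, so it needs the pristine source to compare against, which is exactly what a viewer of a broadcast stream does not have.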
Re: (Score:3)
gotta remember (Score:2)
You need to check with the specific streaming services terms of use to see what they say about their streaming. Most likely most will have a way out if the video is not up to par. Also, sometimes the streaming video is based on the actual speed of your internet at the time of the streaming (Amazon Prime, Netflix, etc). In addition some service providers will throttle the stream - especially if mobile.
IMO, getting a measuring tool would really be a waste of time.
What you really want... (Score:5, Informative)
Is a way to tell not that the video is "sharp" but that it closely matches the original source in fidelity. If all you looked at was sharpness, a scene that had motion blur or an out-of-focus effect would run afoul of the algorithm. Your end goal of name-and-shame is incredibly difficult to do well or even accurately.
There's ways to get video quality measures (SSIM, MSE, etc) but they require a comparison to the source (or a source you consider good enough) as they're relative measures. Interpreting the results is also not obvious.
Besides the challenge of the comparison it's also important to understand the video pipeline from raw source to what you see on screen. The video stream from the provider might be providing a high quality stream but the scaler on your TV might suck and muddies the image upon display or your LCD panel's dithering might be really shitty.
Video, especially streaming video (either Internet streaming or live delivery like cable), is really complicated. There's a lot of different dimensions where you might judge "quality" and even then it's an envelope and not a single scalar value. There's no objective "good" reference for any recorded scene, even the concept of "life-like" is not clear since the recording is entirely dependent on physical properties of the equipment.
Just use bitrate. (Score:5, Insightful)
There's already a metric that basically defines video sharpness: bitrate.
The sharpness of H.264 and H.265 is very well known. Since commercial streaming services use commercial streaming video codecs, it's a pretty safe bet that you can almost directly correlate resolution to bitrate.
There's virtually no incentive for streaming companies to deliver lower resolutions at higher bitrates. It would be a technical challenge to deliver higher resolutions at higher bitrates.
Therefore, bitrate is most likely the simplest and most accurate measure of streaming video sharpness.
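If bitrate is the proxy, no analysis tooling is even needed: the average bitrate of a capture falls straight out of file size and duration. A trivial sketch (assuming the captured file contains only the stream in question, with negligible container overhead):

```python
def average_bitrate_kbps(file_size_bytes, duration_seconds):
    """Average stream bitrate in kilobits per second of a captured file."""
    return file_size_bytes * 8 / duration_seconds / 1000

# A one-hour capture weighing 2.25 GB averages roughly 5 Mbps --
# plausible for a 1080p H.264 broadcast, far too low for real 4K UHD.
kbps = average_bitrate_kbps(2_250_000_000, 3600)
assert 4900 < kbps < 5100
```

As replies below note, this is only comparable between services using the same codec and similar encoder settings.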
Re: (Score:2)
Re: (Score:2)
That's ignoring the point that different algorithms can produce different results at the same given bitrate. If there's a video provider that doesn't really care about the service they're offering, they may ensure everything is sent at a particular bitrate. This can result in inconsistent quality between titles if the algorithms used to encode them are different (and again, this is for a content provider that doesn't care about what they're doing). Also, complex scenes need higher bitrates in order to rend
That's what I was g to say. Compare to same codec (Score:2)
That's what I was going to say. Compare to competitors using the same codec and more or less bitrate = quality. It's not a PERFECT measure, but it's pretty good and dead simple.
Re: That's what I was g to say. Compare to same co (Score:2)
No, it's not. Every encoder offers configurations that trade off speed against quality. If you don't want to spend much on hardware, or turnaround time is important, you'll reduce the quality irrespective of bitrate.
Then live encoding is another issue. x264 is very CPU intensive, and watch the CPU spike during (I'm not sure which, but they could be the same) scene detection or I-frame generation.
There's that, but if you're distributing widely (Score:2)
That's true, one can decide to spend less CPU encoding.
Having said that, when widely comparing services such as Dish Network vs Comcast vs Frontier, that part would be different only if they are very stupid. For a given level of quality, the higher CPU encoding gives lower bandwidth. Some of us may be used to thinking of higher quality if it were the same bandwidth, but the flip side of that is it also means lower bandwidth for whatever quality level they accept.
Since they are going to transfer the stream
Re: (Score:2)
The idea of encoding content once for everyone (in the streaming case) is actually the totally incorrect thing to do. You actually want to encode a single source a number of times with different settings intended for different client environments. If you're Netflix you even encode different segments of each source video to different profiles.
Storage and CPU are much cheaper than bandwidth. HTTP streaming protocols like HLS and MPEG-DASH break video streams up temporally into time-based segments. The playlis
Re: (Score:2)
Of course.
My point is that when 12 million people watched an episode of Game of Thrones, that means it was transferred 12 million times. It wasn't encoded 12 million times. Sure a scene might be encoded 12 times, then each encoded copy was downloaded a million times.
If I watch a 7Mbps stream of GoT, a million other people watched the same 7Mbps stream. That 7Mbps version only had to be encoded once, for a million viewers, so it would be crazy to skimp on a dollar of encoding time.
Re: (Score:2)
I think the primary problem is the provider and in-between boxes lying about the bitrates. Your cable box can output 4K HDR while your provider may be streaming 240p which the box upconverts.
Re: (Score:2)
You can have a 4k size clip go on forever and require very little bitrate so long as nothing much is actually moving; the p-frames will be almost entirely black.
So the bitrate required is a function of the amount of detail/motion necessary to explain it. This makes it heavily scene dependent and the best example of this in recent memory was in Thor: Ragnarok during a fight
Re: (Score:2)
That is total bollocks. You'd have to be a total video encoding noob to believe it.
These video standards have various "profiles" that define the encoding "tools" that can be used, and the characteristics of the target decoders. These can vastly change the bitrate required to achieve a given quality. The quality of the encoder design makes a vast difference too. Finally, more so than anything, the characteristics of the input video affect the required bitrate for a given quality. A football game may need 3x th
Re: (Score:2)
There's already a metric that basically defines video sharpness: bitrate.
Sorry but that is just plain wrong. Even within an identical decoder the same bitrate can produce wildly different results in terms of perceived image quality depending on the many hundreds of settings available, including sharpness. On top of that perceived sharpness can be faked by applying a local contrast increase right at the point of the change in brightness.
There's virtually no incentive for streaming companies to deliver lower resolutions at higher bitrates.
Actually there is. Improving image quality costs time, or hardware which directly translates into actual money. e.g. I could live capture video i
Re:Wait a second... (Score:4, Informative)
A video is a set of images. Sharp images = sharp video.
What the poster is asking for is not a measure of "sharpness" but a measure of quality.
Adaptive bitrate streaming (Score:2)
You are probably not aware of how video streaming works nowadays, and you should read about adaptive bitrate streaming [wikipedia.org].
TLDR, nowadays content providers encode each video at different resolutions and quality levels (and bitrates), each with a different level of "sharpness", and the client selects the one that best fits its available network bandwidth.
This means that playing the same video several times may result in different quality each time. This also implies that the quality of the video you receive may
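A toy sketch of that client-side selection (the rendition ladder and headroom factor here are made-up illustrative values, not any real service's manifest):

```python
# Hypothetical rendition ladder: (resolution, bitrate in kbps), loosely
# modeled on the ladders adaptive-streaming manifests advertise.
LADDER = [("426x240", 400), ("640x360", 800), ("1280x720", 2500),
          ("1920x1080", 5000), ("3840x2160", 16000)]

def pick_rendition(bandwidth_kbps, headroom=0.8):
    """Choose the highest-bitrate rendition that fits within a safety
    margin of the measured bandwidth, as a simple ABR client might."""
    budget = bandwidth_kbps * headroom
    chosen = LADDER[0]  # fall back to the lowest rung
    for rendition in LADDER:
        if rendition[1] <= budget:
            chosen = rendition
    return chosen

assert pick_rendition(4000)[0] == "1280x720"    # 4 Mbps link -> 720p
assert pick_rendition(25000)[0] == "3840x2160"  # fast link -> 4K
```

Which is why any measured "sharpness score" would partly grade the tester's own network conditions, not just the service.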
It's about spatial frequency (Score:2)
Fundamentally, a 2-D FFT describes the dynamics present in any image. Sure, you could compare them between a known source displayed via different streaming methods on different screens. However, these numbers are meaningless on their own without taking the context of the viewer into account.
The FFT must include physical dimensions, starting with the screen's PPI value. So a 4K phone screen is inherently "sharper" than a 4K projector on a 20 meter movie screen. But we all know the movie screen offers a f
Re: (Score:2)
Dear Salty Dude (Score:1)
Dear Salty Dude,
Your picture quality is crap because streaming services dynamically vary the quality to match the available bandwidth. If you've got less than 20Mbps or if your ISP is throttling your 20Mbps connection so that you can only get 2Mbps from Netflix, then you're not going to see 4K(UHD). https://fast.com
Even if you could programatically characterize the quality of the video, your salty rants are not going to shame the likes of NetFlix and Amazon. So don't make a spectacle of yourself or waste yo
Betteridge says 'No'! (Score:2)
And that's the law!
Sharpness (Score:1)
As others here already sort of mentioned: sharpness is the wrong metric.
Another person mentioned "macro blocks".
You don't even know what you're asking (Score:2)
It might out some of the people who rip off others (Score:2)
FFmpeg can do it (Score:2)
FFmpeg can do PSNR calculations. The only problem with what you call 'sharpness' (in technical terms, we call it noise, or signal-to-noise ratio) is that you have to have something to compare it with. Movie and video makers often use blur either as a proxy to indicate motion or to put emphasis on the thing in the story that the viewer should focus on. It's also natural for things that are in the focus of the camera to be sharp and everything else to be blurry (unsharp). So measuring 'sharpness' of an image is
Re: (Score:2)
I think what the poster was asking about is detecting compressed, lower-resolution video being upscaled to a higher resolution (4K@60Hz output from e.g. the cable box while the (often encrypted) transport is lower resolution) at the end point. This process introduces a variety of noise and interpolation which you see as blurriness but the computer sees simply as more complex video.
Asking the impossible, but there is an idea (Score:2)
Measuring sharpness is possible, but that would mean nothing. There are many image sharpening algorithms. Any self respecting editor can play with sharpness any way he wants, and it is even built in some TVs and video players.
What you want is a measurement of quality and it can't be done without a reference. Think about it, if you can tell how close a video is to the reference without the reference, then the same algorithm could be used to reconstruct the original from a degraded video. And guess what, comp
How to do this (Score:2)
Work out what the original material was created in, film, HD, 4K, 8K.
Find the same show globally.
Sort for the bandwidth, compression, chip sets used.
And only selling so much bandwidth to broadcasters?
Are 2nd and 3rd world nations broadcasters trying to fit many shows over a set amount of bandwidth they can use in that part of the world?
Its not a conspiracy to make TV look bad in 2nd and 3
Why analyze the footage? (Score:2)
Since transmission over the last mile is the expensive part, why not just check the received stream's bps? You can assume that they're not going to upsample it and pay more just for the last mile.
I hope not, it'd be abused. (Score:2)
That reminds me of the abhorrent sharpness setting most TVs, even 4K TVs, still have.
Sharpness is not a measurement of quality; quality is a measurement of accurate reproduction -- how closely the stream matches a high-quality version such as a 4K Blu-ray copy, etc.
If you introduce measuring sharpness as a way of measuring quality then the end result is cheating that will artificially make all streams sharper at the expense of picture quality.
So, get it right, sharpness != qual
Why sharpness? (Score:4, Informative)
Image quality is not determined by sharpness. This idea is the reason why those horrible TVs apply a post process sharpness increase to content (the first thing you should turn off when you buy a TV).
Quality covers a large number of metrics including artifacting, posterisation, loss of colour fidelity, and loss of contrast ratio to name a few.