Comment Re:They Don't Care (Score 5, Insightful) 46
No, it's lack of hardware support. I called in to an AOM webinar at the end of October entitled "Is Real-Time AV1 Ready For Prime Time?". Meta, Google and Microsoft, for instance, all complained that there isn't enough hardware support, e.g. only in higher-end phones or some GPUs. Agora didn't mind so much because most of their users were on desktop browsers, which could cope with software decoding on the CPU.
There's also the cost factor. A streaming service isn't going to switch off its AVC or HEVC streams just because it's started to use AV1. It wants to support the long tail of users who can't decode AV1, so adopting a new codec increases its bandwidth, storage and CDN costs, as well as making its encoding and packaging pipelines more complex and expensive. Furthermore, many vendors that have already switched from AVC to HEVC, or added HEVC alongside it, aren't going to move to AV1 because the savings aren't large enough; they're more likely to wait for something better, such as VVC.
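The storage point is easy to see with back-of-the-envelope arithmetic. The sketch below uses entirely hypothetical bitrate ladders (not figures from any real service): even though AV1 rungs can match HEVC quality at lower bitrates, the legacy ladders have to stay online for the long tail, so total storage per title only ever goes up.

```python
# Hypothetical sketch: adding an AV1 ladder on top of existing AVC and HEVC
# ladders grows per-title storage, because the legacy ladders must remain.

GB = 1024 ** 3

def ladder_size_gb(bitrates_kbps, duration_s):
    """Total storage for one codec's bitrate ladder, in gigabytes."""
    return sum(rate * 1000 / 8 * duration_s for rate in bitrates_kbps) / GB

DURATION = 2 * 60 * 60  # a two-hour title, in seconds

# Made-up ladders; AV1 assumed to need roughly HEVC-like bitrates or less.
avc  = [1500, 3000, 6000, 8000]   # kbps per rung
hevc = [1000, 2000, 4000, 5500]
av1  = [ 800, 1600, 3200, 4500]

before = ladder_size_gb(avc, DURATION) + ladder_size_gb(hevc, DURATION)
after  = before + ladder_size_gb(av1, DURATION)  # old ladders stay for fallback

print(f"storage before AV1: {before:.1f} GB per title")
print(f"storage after AV1:  {after:.1f} GB per title ({after / before:.0%})")
```

The same multiplier applies to CDN cache footprint, and the encode farm now runs three codecs per title instead of two, which is the pipeline cost the comment refers to.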
Complaining that usage is not yet widespread is common, but it shows a lack of understanding of how codec adoption works. HEVC usage is still growing, and it was standardised in 2013. AV1 needs a few more years.
Give it more time.