AV1 adoption is steady but there's still lots of holdouts, and I wouldn't deploy it with no fallback today, unless this is an internal corporate environment and you know what client devices are being used.
Consider that AV1 software decoding is very intensive on older hardware, far more so than h264 or vp9 software decoding, and forcing clients without hardware decoder support to do so in software will make for a very bad experience as their CPU usage shoots way up.
However, VP9+opus in a webm container is well supported today thanks to a decade of use by YouTube. VP9 software decoding places only a moderate load on older hardware, and VP9 hardware decoders have been around for nearly a decade, starting with Intel 7th gen (2016), AMD GCN 3 (2015), and Nvidia Pascal (2016) on desktop. VP9 will net you significant video savings, and Opus over AAC will cut your audio track sizes in half (not that they were likely large to begin with...).
YouTube is still serving h264 alongside VP9, but only up to 1080p. 1440p/2160p streams, as well as "1080p Premium", are VP9 only, with videos of any popularity also receiving AV1 encodes.
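To make the fallback idea concrete, here's a minimal sketch of client-side source selection, assuming you serve each video in both formats (the file names and codec strings below are just placeholders):

```ts
// Minimal sketch: prefer the WebM/VP9+Opus source when the browser reports it
// can decode it, otherwise fall back to h264/AAC in MP4. File names and codec
// strings are illustrative placeholders.
const video = document.createElement("video");

const sources = [
  { src: "/videos/talk.webm", type: 'video/webm; codecs="vp9, opus"' },
  { src: "/videos/talk.mp4", type: 'video/mp4; codecs="avc1.42E01E, mp4a.40.2"' },
];

// canPlayType returns "probably", "maybe", or "" (unsupported).
const chosen = sources.find((s) => video.canPlayType(s.type) !== "") ?? sources[sources.length - 1];
video.src = chosen.src;
video.controls = true;
document.body.appendChild(video);
```

In plain HTML you get the same behaviour by listing multiple source elements in preference order; the script version just makes the decision explicit.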
I don't know much about media, but you've just helped diagnose why my 13-year-old laptop uses so much CPU for YouTube videos.
You can install a browser extension like h264ify to force YouTube to only stream h264. You will, of course, lose the "1080p Premium"/1440p/2160p quality settings, but the regular 144p/360p/480p/720p/1080p will be available.
I do the same thing on my 2015 Retina MBP I use sometimes. I miss the higher quality settings (especially since it's got a ~1440p screen), but I'll still take that over the like 50% CPU utilization for VP9 software decode.
That is exactly what I did, and it seems to make a difference so that I'm not pushing 90 degrees and 90% CPU usage on a regular 720p video.
My laptop is a 2011 MBP which I've put 16GB of RAM and an SSD into. It still chugs along for any regular tasks I do on it, but the lack of software support is starting to show. For example, Firefox has just recently stopped supporting my OS.
This oversimplifies things a lot. Are you sure you're not just saying this because you found out that older hardware means there is no positive answer to your question? ;)
In reality, things are a lot more complex. If the laptop still meets their needs in every other respect, it would be extremely wasteful to replace it just to get better YouTube performance. It also isn't really necessary, since YouTube offers various codecs: they could just choose a lower resolution and be good to go.
Granted, 13 years is fairly old. But it does raise the question of where to put the cutoff. Drawing it for security reasons could be valid; that's what Microsoft did with the introduction of Windows 11. Hardware decoding support for a particular codec feels like a less valid reason to say "just replace it".
Spitefully, I am happy that Windows 11 has a hardware barrier to entry. It means that fewer people will willfully adopt this awful, anti-consumer, data-mining excuse for an OS.
"modern web technology" generally hasn't been progress in most people's opinions. I think a lot of people are saying that because a site that delivers mostly text being dozens of megabytes and...
"modern web technology" generally hasn't been progress in most people's opinions. I think a lot of people are saying that because a site that delivers mostly text being dozens of megabytes and requiring massive, high clock speed proccessors is patently ridiculous.
The argument has always been that excess processor time saves developer time, which is more expensive. As far as I've seen though, modern websites aren't any easier to develop than sites from 30 years ago were. If anything the opposite is true and the barrier to entry for web programming is higher than it's ever been.
That's fair.
Though I'd have to say that even though it is modern web technology, not all of it is a step forward. A lot of websites these days are poorly optimized and extremely heavy. If I can find it, I'll link it, but there is an excellent article out there about how a lot of websites exclude a huge potential visitor base, simply because most developers use top-of-the-line hardware and have a very good internet connection.
For example, the BBC website pulls in roughly 15 MB of data on page load alone. Scrolling down the page with the developer tools open, I saw that double, and I never noticed any of it thanks to my fiber connection and fast hardware.
But a huge percentage of the people out there are on relatively poor connections, not only in bandwidth but also in latency and general packet loss. This is true not only for a lot of developing countries, but also for rural communities in Western countries.
To bring it back to what prompted this conversation: there are likely budget smartphones sold today without AV1 hardware decoding support that, being budget hardware, will also struggle to decode it in software.
It's why in my answer I took care to not only mention old hardware but also lower end hardware.
Consider that AV1 software decoding is very intensive on older hardware, far more so than h264 or vp9 software decoding
To illustrate this, I recently tested ffmpeg software decoding offline (grabbing frames into RAM as fast as possible) for a small app I'm writing. Unfortunately I didn't save the data, so I can't give exact numbers, but AV1 was in the ballpark of 20x slower than h264.
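If anyone wants to repeat the comparison, a rough sketch of that kind of test (not the exact script I used; it assumes ffmpeg is on your PATH, and the file names are placeholders):

```ts
// Rough sketch of a decode-only benchmark: decode each file to the null muxer
// (frames are decoded and then thrown away, nothing is re-encoded) and time it.
// File names are placeholders; ffmpeg is assumed to be on PATH.
import { spawnSync } from "node:child_process";

function timeDecodeSeconds(file: string): number {
  const start = Date.now();
  spawnSync("ffmpeg", ["-i", file, "-f", "null", "-"], { stdio: "ignore" });
  return (Date.now() - start) / 1000;
}

console.log("h264:", timeDecodeSeconds("clip_h264.mp4").toFixed(1), "s");
console.log("av1: ", timeDecodeSeconds("clip_av1.mp4").toFixed(1), "s");
```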
Hardware decoding on older or lower-end hardware is a realistic issue you could be facing. On the PC side of things, I believe AV1 decoding has been supported since the previous generation of GPU hardware. On the mobile side I have no clue which SoCs have baked-in decoding support; what I do know is that a lot of that support is fairly recent.
I suspect that devices without hardware decoding support can fall back to software decoding. But given that these are precisely the older or lower-end devices, that can cause issues of its own, even more so as I believe AV1 software decoding is more CPU-intensive than H264.
The reason you can currently use H264 with no fallback is that even the cheapest, lowest-end devices out there these days come with hardware decoding support for it. I'd say we are still a few years out from AV1 support being as ubiquitous across devices.
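For what it's worth, a page can ask the browser at runtime whether it expects to decode a given AV1 stream smoothly and power-efficiently (the latter is usually a decent proxy for a hardware decoder) via the Media Capabilities API. A minimal sketch, with made-up stream parameters:

```ts
// Minimal sketch: ask the browser whether it expects to decode a given AV1
// stream smoothly and power-efficiently (the latter usually implies a
// hardware decoder). The stream parameters below are made up.
async function checkAv1(): Promise<void> {
  const info = await navigator.mediaCapabilities.decodingInfo({
    type: "file",
    video: {
      contentType: 'video/mp4; codecs="av01.0.08M.08"',
      width: 1920,
      height: 1080,
      bitrate: 4_000_000, // bits per second
      framerate: 30,
    },
  });
  console.log(info.supported, info.smooth, info.powerEfficient);
}
checkAv1();
```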
Not that I work with you, but if I were asked this question at the office:
What's the chance of switching to AV1 and not having to worry about the fallback for the most part?
I'd suggest testing it yourself instead of making a guess -- we don't know your audience, and each one is different. Assuming your clients support javascript (most people don't block it), you can use something like modernizr and an analytics framework to answer this question directly.
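As a rough sketch of that kind of measurement (the reporting endpoint and payload shape here are made up, and you could just as well route it through whatever analytics you already run):

```ts
// Sketch: record whether this client can decode AV1 / VP9 / h264, then ship the
// result to a made-up analytics endpoint so the fallback question can be
// answered with real audience data instead of a guess.
const probe = document.createElement("video");
const support = {
  av1: probe.canPlayType('video/mp4; codecs="av01.0.08M.08"'),
  vp9: probe.canPlayType('video/webm; codecs="vp9"'),
  h264: probe.canPlayType('video/mp4; codecs="avc1.42E01E"'),
};
// sendBeacon queues the request even if the user is navigating away;
// "/codec-stats" is a hypothetical endpoint.
navigator.sendBeacon("/codec-stats", JSON.stringify(support));
```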
Pulling back a little: consider that dropping support for older codecs affects people running older hardware. Assuming you're releasing to a wide external audience, this tends to mean that you're preventing poorer (e.g. the Global South, Appalachia), disadvantaged (e.g. ethnic minorities), or oppressed people (e.g. anyone in a blockaded country w/o easy access to modern hardware) from accessing your content. That said, poor people aren't worth much per advertisement impression, and they can't afford to pay you enough, so the above might not be a loss for the company.
The sooner we move to AV1, the sooner we can have high quality video stored at smaller file sizes, which is a massive bonus.
This line stood out -- are you hosting your video files yourself, or using a video/streaming CDN? Normally the latter doesn't care about the format of your uploaded videos. And either way, I'd imagine that discount storage services (such as B2) should keep your fees fairly low regardless of compression algorithm. Not sure what your budgetary constraints are like, however.
Ah, gotcha, that makes more sense. Couple of notes:
I did test storing in B2 and playing back from there using rclone, it was pretty flawless.
I was thinking more of directly linking to files on it, or pulling them through a Bandwidth Alliance member to avoid egress fees. That said, since you're OK with self-hosting, you're already using the most cost-effective option!
[using vp9]
Seems like a reasonable argument, but it’s normally worth considering what you’ll do in failure scenarios — e.g. if someone has been playing your videos on an ancient Android tablet duct-taped to a meeting room table, or some such. Even a pop-up explaining why the video isn’t running might be worth it to avoid the support call.
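Something as small as this sketch would do, assuming the player sets the video's src directly (the selector and wording are placeholders):

```ts
// Sketch: if the media element fails to load or decode its src (e.g. a codec
// this device can't handle), replace the broken player with a plain explanation.
const player = document.querySelector<HTMLVideoElement>("#player");
player?.addEventListener("error", () => {
  const note = document.createElement("p");
  note.textContent =
    "Sorry, this video can't play on this device or browser. " +
    "Try a newer browser, or contact support and mention this message.";
  player.replaceWith(note);
});
```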
[20 TB SSD RAID 6 array]
Ah, I guess that’s where the video file size consideration comes in? I’m kinda surprised at using SSDs for low-access rate content storage; for a self hosted solution, I’d have expected a pile of HDDs behind an SSD cache (or a machine with a pile of RAM). Still, it explains the motivation at least.