FFmpeg and AV1 for HTML5 streaming
I've been looking around online at compatibility for HTML5 browser streaming, and it looks like plain AV1 in an MP4 container is becoming perfectly viable for browser playback on most devices.
Is anyone using this on webpages yet? The sooner we move to AV1, the sooner we can have high quality video stored at smaller file sizes, which is a massive bonus.
Right now my company's video hosting is purely H.264 in MP4, with the moov atom moved to the front as required so playback can start before the file fully downloads, and it plays back on everything with no fallback in a plain HTML5 video element. What are the chances of switching to AV1 and not having to worry about a fallback, for the most part?
Edit: I should have picked a better title. I used FFmpeg for the MP4 and AV1 creation/encoding; this question is more about the HTML5 video markup and direct AV1 file playback.
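For the curious, the encodes come out of FFmpeg along these lines (a rough sketch, not our exact production settings; the rate-control numbers are illustrative):

```sh
# AV1 in MP4, constant-quality mode; +faststart moves the moov atom to the front
ffmpeg -i master.mov \
  -c:v libaom-av1 -crf 30 -b:v 0 \
  -c:a aac -b:a 128k \
  -movflags +faststart \
  demo-av1.mp4
```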
AV1 adoption is growing steadily, but there are still plenty of holdouts, and I wouldn't deploy it with no fallback today, unless this is an internal corporate environment and you know exactly which client devices are in use.
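For what it's worth, the fallback itself is cheap to provide: list multiple source elements and the browser plays the first type it can decode. A minimal sketch (file names and codec strings are illustrative):

```html
<video controls>
  <!-- The browser tries sources top to bottom and plays the first it supports -->
  <source src="demo-av1.mp4"  type='video/mp4; codecs="av01.0.05M.08"'>
  <source src="demo-vp9.webm" type='video/webm; codecs="vp9, opus"'>
  <source src="demo-h264.mp4" type='video/mp4; codecs="avc1.640028"'>
</video>
```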
Consider that AV1 software decoding is very intensive on older hardware, far more so than H.264 or VP9 software decoding, and forcing clients without a hardware decoder to fall back to software will make for a very bad experience as their CPU usage shoots way up.
However, VP9+Opus in a WebM container is well supported today thanks to a decade of use by YouTube. VP9 software decoding places only a moderate load on older hardware, and VP9 hardware decoders have been around for nearly a decade, starting with Intel 7th gen (2016), AMD GCN 3 (2015), and Nvidia Pascal (2016) on desktop. VP9 will net you significant savings on the video track, and Opus over AAC will cut your audio track sizes in half (not that they were likely large to begin with...).
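If you go that route, a typical two-pass VP9/Opus encode looks something like this (bitrates are placeholders, tune them to your content):

```sh
# Pass 1: analysis only; no audio, output discarded
ffmpeg -i master.mov -c:v libvpx-vp9 -b:v 2M -pass 1 -an -f null /dev/null
# Pass 2: the actual encode, with Opus audio
ffmpeg -i master.mov -c:v libvpx-vp9 -b:v 2M -pass 2 -c:a libopus -b:a 96k demo-vp9.webm
```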
YouTube is still serving h264 alongside VP9, but only up to 1080p. 1440p/2160p streams, as well as "1080p Premium", are VP9 only, with videos of any popularity also receiving AV1 encodes.
I don't know much about media, but you've just helped diagnose why my 13-year-old laptop uses so much CPU on YouTube videos.
There is a reason old CPUs need to eventually die.
People seem to think that decade-old hardware can keep performing like a new CPU with the "right Linux distribution", but no, it really cannot. Give it anything modern and it'll show its aged colours pretty brightly.
This oversimplifies things a lot. Are you sure you're not just saying this because you've found out that, for older hardware, there's no positive answer to your question? ;)
In reality, things are a lot more complex. If the laptop still qualifies in all other aspects for their use case, it would be extremely wasteful to replace it just to get better YouTube performance. It also isn't really needed as YouTube provides various codecs so they could choose a lower resolution and be good to go.
Granted, 13 years is fairly old, but it does raise the question of where to put the cutoff line. Doing it for security reasons could be valid; it is what Microsoft did with the introduction of Windows 11. Lack of hardware decoding support feels like a less valid reason to say "just replace it".
Spitefully, I am happy that Windows 11 has a hardware barrier to entry. It means that fewer people will willfully adopt this awful, anti-consumer, data mining excuse for an OS
No, it's not a gut reaction on my part. It's like all things, there comes a time when it is no longer as useful as it once was.
Executing old code on old hardware is obviously fine. When the CPU and GPU struggle to render the BBC news website in Firefox simply due to the moving forward of web codecs and modern web technology, it's time to let it go and move on.
As many of you have probably worked out, I work for a mid-sized TV company, and only last week I asked one of my team to upgrade the COO's PC from one that is 9 years old to one that is pushing 5. I don't believe in creating waste where it isn't required. However, time is worth more than money: no one wants to sit and wait idly while something loads, especially if they can't even multitask.
That's where my annoyance with old hardware truly lies.
"Modern web technology" generally hasn't been progress in most people's opinion. I think a lot of people say that because a site that delivers mostly text weighing dozens of megabytes and requiring massive, high-clock-speed processors is patently ridiculous.
The argument has always been that excess processor time saves developer time, which is more expensive. As far as I've seen though, modern websites aren't any easier to develop than sites from 30 years ago were. If anything the opposite is true and the barrier to entry for web programming is higher than it's ever been.
That's fair.
Though I'd have to say that even though it is modern web technology, not all of it is a step forward. A lot of websites these days are poorly optimized and extremely heavy. If I can find it I'll link it, but there is an excellent article out there about how a lot of websites exclude a huge potential visitor base, simply because most developers work on top-of-the-line hardware with very good internet connections.
For example, the BBC website loads roughly 15 MB of data on page load alone. I am on a fiber connection, so I don't notice. Scrolling down the page with the developer tools open, I watched that figure double, which I'd never have noticed given my connection and hardware.
But a huge percentage of people out there are on relatively poor connections, not only in bandwidth but also in latency and general packet loss. This is true not only in a lot of developing countries, but also in rural communities in Western countries.
To bring it back to what prompted this conversation: there are likely budget smartphones sold today without AV1 hardware decoding support that, being budget devices, will also struggle to decode it in software.
That's why, in my answer, I took care to mention not only old hardware but also lower-end hardware.
You can install a browser extension like h264ify to force YouTube to only stream h264. You will, of course, lose the "1080p Premium"/1440p/2160p quality settings, but the regular 144p/360p/480p/720p/1080p will be available.
I do the same thing on my 2015 Retina MBP I use sometimes. I miss the higher quality settings (especially since it's got a ~1440p screen), but I'll still take that over the like 50% CPU utilization for VP9 software decode.
That is exactly what I did, and it seems to make a difference so that I'm not pushing 90 degrees and 90% CPU usage on a regular 720p video.
My laptop is a 2011 MBP into which I've put 16GB of RAM and an SSD. It still chugs along for any regular tasks I do on it, but the lack of software support is starting to show; for example, Firefox has just recently stopped supporting my OS.
To illustrate this, I recently tested ffmpeg offline software decoding (grabbing frames into RAM as fast as possible) for a small app I'm writing. Unfortunately I didn't save the data, so I can't give exact numbers, but AV1 was in the ballpark of 20x slower than H.264.
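If anyone wants to reproduce that kind of comparison, ffmpeg's null output plus -benchmark gives a quick software-decode throughput figure (file names are placeholders):

```sh
# Decode as fast as possible, discard the frames, print CPU/wall time at the end
ffmpeg -benchmark -i clip-h264.mp4 -f null -
ffmpeg -benchmark -i clip-av1.mp4 -f null -
```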
Hardware decoding on older or lower-end hardware is a realistic issue you could be facing. On the PC side, I believe AV1 decoding has only been supported since the previous generation or so of GPUs. On the mobile side I have no idea which SoCs have baked-in decoding support; what I do know is that a lot of that support is also fairly recent.
I suspect devices without hardware decoding support will fall back to software decoding, but given that these are precisely the older or lower-end devices, that's where it will cause issues, even more so as I believe AV1 software decoding is more CPU-intensive than H.264.
The reason you can currently use H.264 with no fallback is that even the cheapest, lowest-end devices sold today come with hardware decoding support for it. I'd say we are still a few years out from AV1 support being that ubiquitous.
Not that I work with you, but if I were asked this question at the office:
I'd suggest testing it yourself instead of guessing -- we don't know your audience, and every audience is different. Assuming your clients run JavaScript (most people don't block it), you can use something like Modernizr plus an analytics framework to answer this question directly.
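As a sketch of what the detection side could look like without any library (the codec strings are illustrative, and reportToAnalytics() is a placeholder for whatever framework you use):

```html
<script>
  // Ask the browser up front whether it can decode AV1-in-MP4 and VP9-in-WebM.
  const probe = document.createElement('video');
  const av1 = probe.canPlayType('video/mp4; codecs="av01.0.05M.08"');
  const vp9 = probe.canPlayType('video/webm; codecs="vp9"');
  // canPlayType returns "", "maybe", or "probably"; treat "" as unsupported.
  reportToAnalytics({ av1: av1 !== '', vp9: vp9 !== '' }); // placeholder hook
</script>
```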
Pulling back a little: consider that dropping support for older codecs affects people running older hardware. Assuming you're releasing to a wide external audience, this tends to mean locking out poorer (e.g. the global south, Appalachia), disadvantaged (e.g. ethnic minorities), or oppressed people (e.g. anyone in a blockaded country without easy access to modern hardware). That said, poor people aren't worth much per advertisement impression, and they can't afford to pay you enough, so the above might not be a loss for the company.
This line stood out -- are you hosting your video files yourself, or using a video/streaming CDN? Normally the latter doesn't care about the format of your uploaded videos. And either way, I'd imagine discount storage services (such as B2) would keep your fees fairly low regardless of the compression algorithm. Not sure what your budgetary constraints are like, however.
The audience would be huge broadcasting companies.
The videos are hosted in-house; there is no CDN because they're accessed ad hoc, once in a blue moon, as one-offs. If I wanted to show you a demo video of a new show, I'd send you a link and it literally renders a basic HTML5 page with a video src block. We don't chop the video up into multiple codecs and bitrates, nor stream with HLS. It's simply an MP4 container with standard H.264.
I stayed away from H.265/HEVC purely because of the licensing. By the sounds of it, though, I could go VP9 and Opus without much fanfare. AV1 just seems to be the new kid on the block that is rapidly maturing.
Storage-wise, it's again locally hosted on a 20TB RAID 6 on flash. It's nippy enough, and the video turns over slowly; we don't generate too much of it. I did test storing it in B2 and playing back from there using rclone, and it was pretty flawless.
In reality we could just get a Pro account on Vimeo, but in this world of enshittification I'd rather stay on-prem and know that next week I won't be parting with an extra £12k due to a licensing and storage change (here's looking at you, Wistia!).
Ah, gotcha, that makes more sense. Couple of notes:
I was thinking more of directly linking to the files on it, or pulling them through a Bandwidth Alliance member to avoid egress fees. That said, since you're OK with self-hosting, you're already using the most cost-effective option!
Seems like a reasonable argument, but it's normally worth considering what you'll do in failure scenarios, e.g. if someone has been playing your videos on an ancient Android tablet duct-taped to a meeting room table, or some such. Even a pop-up window explaining why the video isn't running might be worth it to avoid the support call.
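As a sketch of how you might catch that (when no source is playable, the browser fires an error event at the last source element; the message is up to you):

```html
<video id="demo" controls>
  <source src="demo-av1.mp4" type='video/mp4; codecs="av01.0.05M.08"'>
</video>
<script>
  // The text inside <video> only shows on browsers without <video> support at all,
  // so listen on the last <source> to catch codec failures instead.
  document.querySelector('#demo source:last-of-type')
    .addEventListener('error', () => {
      alert('This demo uses the AV1 codec, which this device cannot decode.'); // placeholder UI
    });
</script>
```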
Ah, I guess that’s where the video file size consideration comes in? I’m kinda surprised at using SSDs for low-access rate content storage; for a self hosted solution, I’d have expected a pile of HDDs behind an SSD cache (or a machine with a pile of RAM). Still, it explains the motivation at least.