Chrome Removing Theora Support (groups.google.com)
143 points by rebelwebmaster 6 months ago | 114 comments



Interestingly enough, I'm probably one of the few people who still has some Theora videos on the web, in some very old blog posts of mine.

Some history for people who are not aware: Theora became somewhat popular in free software / open source circles because, at the time, it was the best codec that was believed to be either free of patents or covered only by patents explicitly opened up for free use. Therefore, if you were concerned about patents and their impact on free software, you'd use it. But Theora wasn't a great codec, which we always knew; it was just the best we had before Google bought and opened up VP8.

It's an interesting tradeoff. Theora was never particularly popular, so you'll probably have a low number of sites being impacted. But we kind of have a tradition that the web platform rarely breaks things. You can mostly still use old HTML from 20-30 years ago, GIF will probably stay supported in browsers forever, and I don't think there are many examples of media formats in browsers being deprecated. Even odd things like BMP are still supported.


For videos there's an easy workaround - download the video file and open it in VLC. FFmpeg will likely never drop its support.

Dropping e.g. HTML/CSS features is much harder, since there likely won't be any workaround other than running an older version of the browser.
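For site owners, there's also a graceful-degradation version of that workaround. A minimal sketch (the element ID and file path here are made up for illustration): probe Theora support at runtime with canPlayType and fall back to a plain download link for VLC users.

    // canPlayType returns "", "maybe", or "probably"
    const video = document.createElement("video");
    const canPlayTheora = video.canPlayType('video/ogg; codecs="theora"') !== "";

    const container = document.getElementById("player")!; // hypothetical mount point
    if (canPlayTheora) {
      video.src = "/media/old-post.ogv"; // hypothetical path
      video.controls = true;
      container.appendChild(video);
    } else {
      // The browser dropped the codec: offer the raw file for VLC instead.
      const link = document.createElement("a");
      link.href = "/media/old-post.ogv";
      link.textContent = "Download the video (Ogg Theora) and open it in VLC";
      container.appendChild(link);
    }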


I suppose Flash could be considered a notable exception there.


Flash was proprietary. Open standards and formats are what we've always kind of expected would keep working.


Yeah, and java applets + some more tech enabled by plugins. But I guess we always considered the "plugin-web" to be "different" in that regard.


IIRC Flash was a plugin, but it was integrated into the Chrome install. I don't remember whether it was part of the default Firefox install, but I'd guess it wasn't.


Can't we do what Ruffle has done and just create a WebAssembly decoder for Theora? That way either the site host just needs to add a tiny piece of code to the page, or the browser user just needs to install a plugin to re-enable this functionality.


VLC has been working on a WASM version, so yeah, that seems likely.


I'd gladly accept if my daily browser traded some backwards compatibility for speed, security and the ability to move forward faster.

Obviously only if I'd be certain some other browser could still open it, or maybe some emulator would be able to display it. My earliest websites are 27 years old now (and live on a floppy disk); I'd hate to find that they're no longer readable in any software.

But I'm perfectly fine with my Firefox, with which I spend hours a day on the modern web, dropping support for that, when they need to shed some cruft or weight.


Firefox as well: https://bugzilla.mozilla.org/show_bug.cgi?id=1860492

It's a shame Theora never "made it". The peak of its popularity is long past.


It was never good. We used it in the spirit of OSS in our hackerspace, but it was always terrible and behind all the MPEGs of the time.


> It was always terrible and behind all the MPEGs of the time.

So was VP8, which Google ended up pushing at nearly the same time they were rejecting Theora as an inferior H.264 clone. The irony being that VP8 wasn't that different either[1]

[1] https://web.archive.org/web/20150301015756/http://x264dev.mu...


Interesting that you claim that - I actually put together a rather in-depth report at the time, and VP8/H.264 performed very closely as formats (that is, quality metrics vs. file size were very comparable, even if the encoding time of x264/libvpx wasn't), while Theora was just awful (SSIM and similar quality metrics were behind by like 30-40%, which was a lot).

The article you've linked is talking about some decoder implementation details which I'm not sure were all that relevant for end users - x264 was always the superior encoder, libvpx was "ok", and libtheora was terrible when it came to actual encoded video results. Not sure what the article you posted really proves around that.


Theora is VP3 with some more leeway in the hope that the encoder can use the additional parameter freedom to produce better video (which never really happened; most of the 1.1 branch was, as I understand it, just cleaning up a lot of old mistakes around e.g. rate control). It's pretty clear that VP8 outperforms it.


Is there any good video codec whose library is as small as, or not much bigger than, Theora's, and that has no patent/license issues?

I tried to use VP9 in the past but it's like a 20-40 MB DLL. The smallest I could find was dav1d, and it's still around 4 MB for the library DLL; encoding AV1 and getting a good compression rate was not trivial.


Maybe the best option will be just waiting for the H.264 patents to expire, just like happened with MP3 a while ago. On that note, does anyone know when H.264 expires? Or its earlier siblings, like AVC? It also has the benefit of vast access to an array of hardware decoders/encoders, although those might have their own can of IP worms.


> does anyone know when H.264 expires

I was wondering about this too. https://www.osnews.com/story/24954/us-patent-expiration-for-... seems to think 12/2027.

Would it be surprising if H.264 was replaced by something else by that point? We have multiple subsequent standards, and it seems like everyone producing or providing content would want improved codecs by then.


That's the very last patent, and I haven't looked at it in any detail to see whether it might even be relevant to a typical software implementation; a lot of patents are on special optimisations / hardware implementations, so may not be relevant to your use-case.

It's also noteworthy that of the full list of H.264 patents here:

https://scratchpad.fandom.com/wiki/MPEG_patent_lists

...the majority of them have already expired. IANAL but since the original H.264 spec became public 20 years ago, everything in it should be usable as prior art.

Also, all existing MPEG-4 part 2 (infamous DivX etc.) patents and anything older, e.g. H.263, MPEG-1/2 and H.261, have certainly expired by now.


The problem is that H.265's patent owners didn't form a single patent pool, so the licensing is a nightmare, with three separate pools, patent holders that never joined a pool, and lots of double-billing all over the place. H.266 isn't much better. So a lot of people just stuck with H.264 as the "good enough" codec as a result. The only people not using it were organizations who couldn't or wouldn't pay for any patented invention (e.g. most of the web standards people, Wikimedia, etc), who stuck with Theora and later VP8.

AV1 (itself derived from the On2 VP8 and VP9 formats) is supposed to be the answer to H.265's patent shenanigans, but support is very slow to manifest. Like, Apple only added it to the iPhone 15 - as in, the one that just came out a month ago. Implementations of AV1 in discrete GPUs similarly only landed last year with Nvidia 40 series, Intel Arc, and Radeon 7000 series cards.


>So a lot of people just stuck with H.264 as the "good enough" codec as a result. The only people not using it were organizations who couldn't or wouldn't pay for any patented invention (e.g. most of the web standards people, Wikimedia, etc), who stuck with Theora and later VP8.

H.265 is very popular in the piracy scene, since you can get a file size about half that needed for equivalent-quality H.264. Of course, being pirates, they don't worry about patents. And since decoders are freely downloadable for players like VLC, there's no good reason not to use it.


> Implementations of AV1 in discrete GPUs similarly only landed last year with Nvidia 40 series

NVIDIA has supported AV1 hardware decoding since the 30 series. I believe they were the first to market with it.


Radeon 6000 also supports accelerated av1 decoding. The 7000 series added encoding.


> something else

Isn't AV1 the "preferred" option as a replacement, at least by those who are looking for something high quality and without patents encumbrance?


According to Wikipedia it seems Safari is only now starting to support it, so there's still a large segment of the Apple market lacking support.


AVC is not an older sibling, it is H.264, just like H.265 is HEVC.


Strip out debug symbols if you care about size? There’s no reason either of those libraries should be that large unless you’re including them.


Honest question: in which scenario do you care so much about the size of the DLL these days?


Well, I'm looking for the same thing, for a lightweight 2D game engine supporting decades-old hardware (the game player is 2 MB zipped, 1 MB for a minimised build). Animations are likely to be low resolution and seconds long, so a huge decoding library would be pointless. Sure, most users don't care about file sizes, but some (including me) still do. Especially for the wasm and Win95 ports.

I considered libtheora, the library size is good but the compression/visual quality is awful compared to the alternatives.


Same use case here. We used to rely on system libraries, but video codec support is all over the place on each platform. Developers would test on their main platform and ship to others without fully testing, and end up with a broken video near the end of their games - because WMV runs on Windows but not on macOS, or the audio was a variant of AAC that didn't play in the web port in some places, and a bunch of other random differences between platforms. In the end, the only reliable way was to cut out the system video libraries and make sure all platforms have the exact same codecs. But then it turns out there isn't anything better than Theora or MPEG-1 without making the game player grow 10x or more in size - the game player is around 3 MB unzipped.


Does that hardware have the CPU power for a recent video codec? It doesn't sound like it.


Maybe to encode/decode video on a Mars rover.


Your primary concern is almost certainly power draw, so you're using a hardware encoder. The weight of a few more megabytes ROM would be inconsequential.


I mean the Ingenuity helicopter runs Linux, which seems to suggest that a few megabytes here and there isn't such a big deal; whatever the constraints of mars are, they seem to have a fair amount of onboard storage. I can't imagine that the improved video compression of AV1 compared to Theora or H.264 wouldn't outweigh the cost of that 4MB extra storage space.


That hardware selection was locked in years ago; I think that's the major hold-up with Ginny in particular. I don't think even the next helicopter they're building now will be able to get hardware AV1 encoding. Would be moderately funny if they could eek better performance out of ffmpeg with remote updates.


Just FWIW...

* eke out -- extend something by stretching it or using less.

* eek -- a noise of surprise or fear, stereotypically used upon seeing mice.

https://www.dictionary.com/browse/eke--out


Embedded would be my guess.


Embedded machine with big enough CPU for video encoding, but not having 10-20 MB flash? Doesn't sound too plausible.


An embedded machine that 1. has a software decoder, 2. presumably has some RAM for that decode, and 3. can't afford like 128 MB of flash on its design for its ROM?


This feels like an absurd ask. Who cares how big the codec is? If you watch a 480p video for 10 minutes, won't that have overwhelmingly dwarfed 20MB or whatever of savings?

This feels like a gross misoptimization that is actively harmful to 99.999999999% of user experiences.


Just because DVDs are several GB doesn't mean it's acceptable for the decoder to be a few hundred MB.

...and in fact it doesn't need to be, as I can say from having written an MPEG-2 (+MPEG-1) decoder myself, whose binary turned out to be less than 16 KB.

When one hears about a codec being dozens of MB, the natural instinct should be "for what?" and not "who cares?" The latter attitude is responsible for why software has gotten so much more inefficient, and serves only to line the pockets of hardware manufacturers.


The size of the codec is so unimportant as to be essentially irrelevant within reason outside of a few extremely niche cases. We care about video file size, we care about subjective quality, we care about power efficiency in compression and decompression, in many applications we care about latency; on all these metrics, modern codecs are stunningly well-optimised. Modern codecs are extremely complex, but that complexity is absolutely necessary for them to perform well on the dimensions that actually matter to users.

Nobody uses Theora, because it's a bad codec. It was worse than H.264 back in 2009 and it hasn't been updated since. Removing support for dead formats is generally a very good idea, particularly in a web browser, because it reduces the attack surface; we have recently seen a number of major vulnerabilities caused by archaic, neglected file formats and codecs that provided almost no value to users.


> The size of the codec is so unimportant as to be essentially irrelevant within reason outside of a few extremely niche cases.

Codec size matters if you're going to include the codec in a phone app, which is a huge niche.

You can try to rely on system codecs, but then you're at the mercy of system codec availability and system codec security.


If you're including a video codec in a phone app, you are almost certainly making a terrible mistake. Android and iOS have very comprehensive media APIs.


Is there one codec that's universally available on all supported versions of iOS and all versions of Android with significant use? With an encoder and a decoder? If not, something has to make up the difference, and that something is probably a server that can see all the content.

Is it space efficient? (Compared to whatever you're prepared to license and run on the cpu) Not shipping a codec to save app size but sending larger media doesn't help the user much.

Are all implementations secure? If you have to predecode to verify the file won't trigger buffer overruns etc, you've written half of a safe decoder, and maybe you should just include the rest of the owl. Media decoding is a bountiful area of security vulnerabilities... which brings risks to both using the system apis and using your own, but at least you have some control over your own.


H.264 and VP8 are available essentially everywhere. Reasonably recent versions of Android and iOS (5.0 and 11, IIRC) include support for H.265 and VP9. The APIs on both platforms make it very easy to query available media formats, including hardware acceleration capabilities. Hardware acceleration is particularly important on mobile for power consumption reasons, so you should use it wherever possible.

If any app developer believes that they are better able to implement a secure codec than Apple or Google, I have a bridge to sell them.
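On the web side, the analogous query is the MediaCapabilities API. A sketch (the candidate list, resolution, and bitrate are arbitrary illustration, not anyone's recommended values): ask the browser which codec it can decode efficiently, preferring power-efficient (usually hardware-accelerated) ones.

    async function pickSupportedCodec(): Promise<string | null> {
      const candidates = [
        'video/mp4; codecs="hvc1.1.6.L93.B0"', // H.265
        'video/webm; codecs="vp9"',            // VP9
        'video/mp4; codecs="avc1.42E01E"',     // H.264 baseline
      ];
      for (const contentType of candidates) {
        const info = await navigator.mediaCapabilities.decodingInfo({
          type: "file",
          video: { contentType, width: 1920, height: 1080, bitrate: 4_000_000, framerate: 30 },
        });
        // supported/smooth/powerEfficient are the three flags the API returns
        if (info.supported && info.powerEfficient) return contentType;
      }
      return null;
    }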


> If any app developer believes that they are better able to implement a secure codec than Apple or Google, I have a bridge to sell them.

Google isn't the only one providing system codecs on Android phones.


VLC is a mistake?


I said almost certainly making a terrible mistake. VLC has an extremely specific niche as "that one media app that plays literally anything in any format ever". You are almost certainly not developing that app. The overwhelming majority of apps have lots of very good reasons to rely on the platform APIs and no good reasons to eschew them.


> Removing support for dead formats is generally a very good idea, particularly in a web browser, because it reduces the attack surface;

This seems unfortunate, surely they could sufficiently sandbox the decoder


They do, but it's defense in depth


Yet websites can run arbitrary JS?


"I apologize for such a long letter - I didn't have time to write a short one." - Mark Twain, et al.

A codec bloated by megabytes to me is a little too close to a web site bloated by megabytes. It works...but isn't it also a little embarrassing?


Just because you can't fathom a legitimate case for needing a small codec lib doesn't mean there isn't one.

Typical engineering myopia.


> Who cares how big the codec is?

Small inefficiencies add up and in the end you have the janky laggy mess that is modern software.


A larger codec doesn't mean it's more inefficient in other areas than a smaller codec. A 400KB codec is not necessarily less laggy or janky than a 4MB codec.

Otherwise, why use a bunch of larger algorithms for map routing when BFS/DFS are so simple and small?


it does contribute to software bloat, I’m always impressed when I install an app and it weighs in at 20MB


esp32 with encoded video being streamed out to Wifi?


You won't be encoding any video with an algorithm more computationally intensive than JPEG on an ESP32, and even MJPEG will be excruciatingly slow.



this is decoding


There's MJPEG for that. Is there any evidence anyone has streamed to such a platform using real video codecs? No, there's not. People just make up endless shit to justify anti-use-cases.


How big is the VP8 codec? Did you try building libvpx with --disable-vp9?


I won't miss Theora - but as part of my job, I am helping a business migrate from some old open-source software which inexplicably re-encoded almost all audio uploads into Ogg Vorbis audio files.

I can just play those as they are in any browser except Safari. Painfully, macOS actually supports Vorbis, but only in a CAF (Core Audio Format) container instead of an Ogg container. Still hoping, because shipping an entire Ogg decoder in the browser with WASM works but is ugly.

Of course, I could also just re-encode them with a microservice but it’s just… bleh.
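If the re-encode route is "bleh", a remux might not be: since CAF can carry Vorbis, the container can be swapped without touching the audio data. A Node sketch, assuming ffmpeg's CAF muxer accepts a Vorbis stream copy (worth verifying on a sample file first; the filenames are hypothetical):

    import { execFile } from "node:child_process";
    import { promisify } from "node:util";

    const run = promisify(execFile);

    // -c:a copy keeps the Vorbis bitstream as-is; only the container changes.
    async function oggToCaf(input: string, output: string): Promise<void> {
      await run("ffmpeg", ["-i", input, "-c:a", "copy", output]);
    }

    await oggToCaf("upload.ogg", "upload.caf");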


You might find this helpful: https://github.com/brion/ogv.js

From the FF announcement linked elsewhere in these comments.
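For anyone curious, usage looks roughly like this - a sketch from memory of the project's README, so the exact import shape and the loader path are assumptions:

    import { OGVLoader, OGVPlayer } from "ogv"; // assumed import shape

    // Tell the loader where the decoder .js/.wasm modules are hosted (assumed path).
    OGVLoader.base = "/assets/ogv";

    // OGVPlayer mimics the HTMLMediaElement API, so it slots in like a <video> tag.
    const player = new OGVPlayer();
    player.src = "/media/old-post.ogv";
    document.getElementById("player")!.appendChild(player);
    player.play();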


Apple has always been an asshole when it comes to codec support.

VP9 on Safari? Sure, on desktop. On mobile? Oh yeah, only via WebRTC (why???).

Want to import FLACs into Apple Music? Nope, only inferior ALAC is supported for lossless.

AV1? Only just added to the iPhone 15 Pro series (not in the regular 15 because of its older SoC), and still not on MacBooks.

HEVC? Oh yeah, of course we use it as HEIC for photos and support HEVC playback in Safari, how could we not?


TIL filename extensions .ogv, .ogg = Theora


Ogg is just a container; the actual contents can be audio (usually Opus, but other audio codecs are supported) or video (Theora, plus some obscure ones).
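The container/codec split is visible right in the bytes: every Ogg stream begins with an "OggS" page whose first packet is the codec's ID header, each starting with a distinctive magic string. A sniffing sketch:

    function sniffOggCodec(bytes: Uint8Array): string {
      const ascii = (start: number, len: number) =>
        String.fromCharCode(...bytes.subarray(start, start + len));
      if (bytes.length < 28 || ascii(0, 4) !== "OggS") return "not an Ogg stream";
      const segments = bytes[26]; // page header is 27 bytes plus a segment table
      const p = 27 + segments;    // first packet (the BOS / codec ID header)
      if (bytes[p] === 0x80 && ascii(p + 1, 6) === "theora") return "Theora video";
      if (bytes[p] === 0x01 && ascii(p + 1, 6) === "vorbis") return "Vorbis audio";
      if (ascii(p, 8) === "OpusHead") return "Opus audio";
      return "unknown codec";
    }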


You can put basically anything in Ogg. I toyed with the idea of putting logs in it - A log file is append-only, same as encoding a video. It has a field for timestamps already. You can use Ogg's binary seeking to quickly find positions / ranges within a log file without building up an index. It's got checksums too, I think.

It was a "nifty but who really needs it?" idea


I once made a custom Ogg stream for a simple multimedia document format [1]. The idea was that I wanted to have AV streams in the same file as basic vector graphics slides, and back in 2008 it was Ogg that had some open source momentum.

It worked, but my experience was that Ogg isn't really a well-designed container at all. Even QuickTime / MPEG-4 with its 1990s warts is more flexible and efficient. I would definitely pick Matroska today if I really wanted to torture myself with this kind of document format again.

Somebody else wrote more eloquently about Ogg's bizarre design choices:

https://hardwarebug.org/2010/03/03/ogg-objections/

"The variable overhead in the Ogg format comes from the page headers, mostly from the segment_table field. This field uses a most peculiar encoding, somewhat reminiscent of Roman numerals. In Roman times, numbers were written as a sequence of symbols, each representing a value, the combined value being the sum of the constituent values.

"The segment_table field lists the sizes of all packets in the page. Each value in the list is coded as a number of bytes equal to 255 followed by a final byte with a smaller value. The packet size is simply the sum of all these bytes. Any strictly additive encoding, such as this, has the distinct drawback of coded length being linearly proportional to the encoded value. A value of 5000, a reasonable packet size for video of moderate bitrate, requires no less than 20 bytes to encode."

- -

[1] https://github.com/pojala/twentytwenty/blob/master/twtw-ogg....
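The additive encoding the quote complains about fits in a few lines; this sketch reproduces its arithmetic (a 5000-byte packet needs 19 bytes of 255 plus a final byte of 155, i.e. 20 bytes of header):

    // Ogg "lacing": a packet size is stored as N bytes of 255 followed by
    // one byte < 255, all of which sum to the size.
    function laceSize(packetSize: number): number[] {
      const out: number[] = [];
      let remaining = packetSize;
      while (remaining >= 255) {
        out.push(255);
        remaining -= 255;
      }
      out.push(remaining); // final byte < 255 terminates the value
      return out;
    }

    console.log(laceSize(5000).length); // 20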


.logg - haha


That's so clever that now I have to do it lol


Usually Opus-encoded audio uses .opus as a file extension, though it's not unheard of to use .ogg. Most of the time you will find Vorbis audio in .ogg files instead.


I still have a few Ogg Vorbis audio files from many years ago too. Wonder if I can still play them on anything...


My entire music collection is encoded as ogg/vorbis (16k+ tracks).

Just about everything I own can play them. Including rockbox on my sansa clip+.

Writing about ogg vorbis as if it is a historical format is silly. Sure, it wasn't adopted by streamers, but everything on Bandcamp (for example) is available as ogg/vorbis.


I did have a little sansa clip+ with rockbox a few years back, great little machine.

I think I thought of it more as a historical format because Xiph have said that Opus supersedes it.

I mostly stopped caring because of streaming some years ago now. I have my 'old' collection as a mix of mp3 and some ogg vorbis, dating from before the streaming era. These days I buy some CDs from bands I want to support, but listen to everything by streaming.

I get the whole "<Streamer of choice> pays nothing to the artists!" argument. OTOH, most of the stuff I stream I also own; I'm just using streaming services to save me the hassle of ripping and hosting somewhere for access by all my devices.


Just FYI, Apple Music supports automatic iCloud syncing of your personal music files that you drag into your library. Most of what I listen to is on streaming, but I can have non-streaming music, bootlegs, vinyl rips, Bandcamp purchases, and music only available on SoundCloud or YouTube (via yt-dlp) seamlessly alongside the streaming stuff.

I’ve tried to explain how limiting any other streaming service is to friends, but I’ve mostly been met with blank stares at the mere concept of wanting to listen to off-streaming music or even where one would come across actual music files these days.


That's pretty cool, especially as my collection has a bunch of 90s-00s goth and indie bits and pieces which just aren't present on the streaming services.

This was something google play music was supposed to do as well, though that's dead now!


Yeah, I’m big into ‘90s UK hardcore where a lot of the classics were only released on some cassette or vinyl at a show back then so being able to include the obscure stuff I’ve procured is a hard requirement for me.

It makes sense that Google Play Music offered this, as it seems like adding a whole cloud-storage stack would be prohibitively complex and expensive for a company that isn’t running their own datacenters and/or already maintaining cloud storage as part of their business. Spotify supports local files and (apparently) lets you sync them iPod-style to your phone, but that’s where I draw the line in terms of music-collection-anachronism.

Random note: you do have to convert FLAC to ALAC if you want to add lossless files to Apple Music, but that’s an easy ffmpeg one-liner.


I believe Spotify still uses Vorbis in at least some circumstances?


Uh, Firefox? Chromium? Did everyone else but me give up on Vorbis? It works great. Opus is nicer for bitrate but for a long time Vorbis used way less CPU to decode than Opus.

All my music is Vorbis, Opus, or ripped m4as, and browsers take it all fine.

The first version of WebM actually used Vorbis. It's the newer ones that are Opus.


Probably VLC


Winamp, foobar2000


Winamp still exists? I thought it was resold twenty times to various shady entities.


Relaunched about a year ago.

The website is so unremittingly awful I didn't even spin up a Windows VM to try it. Massive, all bitmaps, almost no text, no actual info at all, mentions blockchain.

https://www.winamp.com/


It has always existed. The final release from years ago still works, and whoever owns it now has been updating it under version 5.9 fairly recently. There is a "Winamp 6" on the way supposedly...


There’s a modernized fork of it: https://getwacup.com/

The official one is still around, but last I checked it had a crash bug in the media library plugin.


It's like if source tarballs didn't include the compression algorithm in the extension but just used .tar for .tar.bz2, .tar.gz, etc. and just left the extractor to figure it out (which they can do).


It's more like .zip (which technically supports more compression algorithms than just deflate), as the outer container format is the same for both Vorbis in .ogg and Opus in .ogg (more commonly given the .opus extension). That's unlike tarballs, where the archive format is wrapped by the compression format, so a .tar.bz2 and a .tar.gz look nothing alike until you remove the compression.


Interesting. Aligns with my belief that Wikipedia videos are rarely viewed. They’re often ogg theora.


I wish browsers were modular, so you could simply disable/enable the codecs you wanted and fetch and use the relevant libraries.


Someone will still have to invest engineering resources to maintain those codecs.


Not really; I didn't assume all the outdated codecs would be maintained even if no one uses them. You could just as well disable them all before the global Chrome team wakes up to all the insecurities and flips the switch for everyone.

But even then, in a modular environment the potential pool of engineering resources is much broader


Aren't modern video codecs prohibitively computationally expensive for mediocre office-class PCs?


That depends on what you mean by modern - AV1 is definitely a struggle, but VP9, H.265, H.264, and even VP8 are all much better than Theora in terms of quality and performance, and hardware-supported decode goes back many years, even on things like Intel integrated graphics.


Not really; my laptop is an Intel 5000-series dual core, and it can handle 720p AV1 video. I did get a few dropped frames, but I had to go into "Stats for nerds" to see that they dropped. When I switched to VP9, I still had some dropped frames anyway, so it's slow regardless.

My laptop doesn't have a 1080p screen, but trying 1080p it could do some videos better than others.

in this one I only dropped 2 frames:

https://www.youtube.com/watch?v=m1jY2VLCRmY&list=PLAMlLc3Zgg...


> it can handle 720p AV1 video. I did get a few dropped frames

I'm sorry for being ambiguous. I meant encoding, not decoding. In my opinion an average PC should be able to encode at least some minutes of video in reasonable time.

Whatever the case, having to resort to frame dropping (even to a negligible degree) means 100% CPU or IO load has already been reached, doesn't it? 100% load on 720p playback sounds bizarre. I've been accustomed to video playback taking just a few percent on any old computer with Intel graphics.


Encoding on a PC is an edge case, especially in software. Just hardware-encode H.264 at that point.

> I have been accustomed to any video playback taking just a few percents on any old computer with Intel graphics.

it's using like 160% of a hyperthreaded core, so not even completely saturating a laptop from 2015 (8 years ago) - I don't know why it needs to drop frames though (single core too slow?)


dav1d can decode 4K video in realtime on a pretty normal office-class PC. The real issue is mobile.


Mobile doesn't have 4K screens, any phone released in the last few years can handle 1080p or 1440p AV1

I mean ANY phone. My phone is $160 from 2019 and it can play 1440p AV1 on youtube (with a few frame drops, but nothing noticeable), even though it has a 720p screen. I'd have to get my OnePlus 3 (2016) to stutter in 1440p

note that the OnePlus 3 would stutter in h264 1440p as well, it's just old


It's not only about “handle”, it's about not eating through all of your battery before you're done watching the video. But no, “any” phone cannot reliably play 1080p AV1:

https://www.spiedigitallibrary.org/conference-proceedings-of...

“Overall AV1 real-time playback of 720p 30fps @ 2Mbps is feasible for low-end devices with 4 threads and 1080p 30fps @ 4Mbps is feasible for high-end and mid-range devices with 4 threads using Dav1d decoder.”


> it's about not eating through all of your battery before you're done watching the video

we're talking about hours of playtime before you run out of power, you can verify this on your own phone on youtube right now, go enable av1 and marvel at how YOUR phone can play these videos at 1080p

> Overall AV1 real-time playback of 720p 30fps @ 2Mbps is feasible for low-end devices with 4 threads and 1080p 30fps @ 4Mbps is feasible for high-end and mid-range devices with 4 threads using Dav1d decoder

Why only 4 threads? My phone from 2020 has 8 cores/8 threads and a 720p screen

https://www.gsmarena.com/oppo_a32-10454.php

So even if it doesn't handle some 1080p videos that I haven't tested, I should probably be watching them at 720p since that's all my phone can do.


> Why only 4 threads? My phone from 2020 has 8 cores/8 threads and a 720p screen

Of which half are small cores (Cortex-A53), which is pretty likely to throw a spanner in the works of the threading model. But hey, feel free to inform the dav1d authors that their dav1d paper benchmarked dav1d wrong.


I didn't say they did it wrong, I asked a question. It wasn't a rhetorical question.

Anyway, this point might be moot because dav1d has released several versions with further NEON optimizations after the paper

https://code.videolan.org/videolan/dav1d/-/tags/1.1.0

https://code.videolan.org/videolan/dav1d/-/releases/1.2.1

https://code.videolan.org/videolan/dav1d/-/releases/1.3.0


Yes, and there's _still_ not 1080p everywhere! There was a talk at Demuxed 2023 _yesterday_ where the last sentence in the abstract is:

“720p real-time AV1 software playback on the large majority of Android devices out in the wild is now a reality.”


Maybe for some difficult 1080p videos, but my phone can handle 30 FPS ones, from what I've tested. It depends, of course, on whether you require NO dropped frames or simply no noticeable performance degradation. I can't tell when it drops one frame out of a thousand.


> Mobile doesn't have 4K screens . . .

The Sony Xperia 1 series does (along with a headphone jack, microSD card support, notification LED, a hardware camera button, no hole in the screen for the front camera, etc.).


That's a high end device that can chew through 4K AV1 video like no tomorrow


Haven't used actual Theora since GTA:SA (Opus I still use daily).


FTA:

"Chrome will deprecate and remove support for the Theora video codec in desktop Chrome due to emerging security risks. Theora's low (and now often incorrect) usage no longer justifies support for most users. "


So I guess, regarding a theoretical "everything in the browser" future, Swiss-army tools like VLC and ffmpeg would either have to pack their own performant codecs in WASM, or stay as desktop / CLI applications.

(Not that it would make any sense to implement ffmpeg on top of WebCodecs on top of ffmpeg. Just needed an example.)

I hope I have not completely missed the train by focusing on other areas outside of web.


VLC has a port to WebAssembly: https://code.videolan.org/jbk/vlc.js#vlcjs-vlc-for-wasmasmjs

(it's actually usable to some extent)


WASM ffmpeg already exists. It was GREAT before Spectre led to massive restrictions on SharedArrayBuffer.
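A rough sketch of using it, with the older @ffmpeg/ffmpeg 0.x API (the 0.12+ API differs, and I'm assuming the core build ships the relevant codecs; the input path is hypothetical):

    import { createFFmpeg, fetchFile } from "@ffmpeg/ffmpeg";

    const ffmpeg = createFFmpeg({ log: true });
    await ffmpeg.load(); // multithreaded builds need SharedArrayBuffer,
                         // hence the post-Spectre cross-origin-isolation hoops

    // Transcode a Theora file to H.264 entirely in the browser.
    ffmpeg.FS("writeFile", "in.ogv", await fetchFile("/media/clip.ogv"));
    await ffmpeg.run("-i", "in.ogv", "-c:v", "libx264", "out.mp4");
    const mp4: Uint8Array = ffmpeg.FS("readFile", "out.mp4");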


Yeah, I figured. But last time I toyed with decoding audio in WASM, the speed was, I think, 5x worse. Not sure if it's because SIMD was not implemented or what.


I am asking the WMF to add support for MP4: https://phabricator.wikimedia.org/T329258



