JPEG XL

Info

rules 58
github 38692
reddit 688

JPEG XL

jxl-rs 0
tools 4270
website 1813
adoption 22069
image-compression-forum 0
glitch-art 1071

General chat

welcome 3957
introduce-yourself 294
color 1687
photography 3532
other-codecs 25116
on-topic 26652
off-topic 23987

Voice Channels

General 2578

Archived

bot-spam 4577

other-codecs

AccessViolation_
2025-12-31 08:44:22
they might do this for a bunch of patents. I was just thinking about the video format related ones :/
2025-12-31 08:48:00
this organization actually pisses me off
tee
2026-01-05 12:12:55
I want to reduce the file size of images losslessly. In the case of the JXL convert error `libpng warning: iCCP: profile 'icc': 'GRAY': Gray color space not permitted on RGB PNG`, stripping PNG icc profiles seems to be fine, as the icc profile doesn't match the image for whatever reason. But what about for images downloaded off the internet (eg. artwork) that have working icc profiles? Will stripping the icc profile result in any noticeable difference for how the image is displayed? Also, for photos taken with a camera/phone, should the icc profile be kept? `oxipng -o max --strip tEXt,zTXt,iTXt,eXIf,tIME,iCCP --force`
username
tee I want to reduce the file size of images losslessly. In the case of the JXL convert error `libpng warning: iCCP: profile 'icc': 'GRAY': Gray color space not permitted on RGB PNG`, stripping PNG icc profiles seems to be fine, as the icc profile doesn't match the image for whatever reason. But what about for images downloaded off the internet (eg. artwork) that have working icc profiles? Will stripping the icc profile result in any noticeable difference for how the image is displayed? Also, for photos taken with a camera/phone, should the icc profile be kept? `oxipng -o max --strip tEXt,zTXt,iTXt,eXIf,tIME,iCCP --force`
2026-01-05 12:17:03
the ICC should be kept in most cases. here's an extreme example of an image that doesn't look right without its ICC:
dogelition
2026-01-05 12:36:08
don't think jpg has an equivalent, but at least on png you can have a cicp chunk nowadays (probably not supported by all software yet) that can essentially carry the same information as the icc profile
2026-01-05 12:36:50
e.g. see <https://github.com/w3c/png/blob/main/Third_Edition_Explainer.md#hdr-support>
2026-01-05 12:38:37
probably the most common sdr, non-srgb profile you'd see is p3 d65, e.g. on photos taken on apple devices (don't know what android oems do nowadays, maybe the same)
2026-01-05 12:38:54
it won't look completely broken if you strip the profile but it'll certainly not look as it should
2026-01-05 12:43:42
also note, the icc profiles used to tag images are typically very small (just have a few parameters instead of huge 3d lookup tables, like you'd find on some display profiles). so you're not really saving a lot of space unless you have a ton of small images
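The chunk-stripping discussion above can be sketched in Python: a hypothetical minimal stripper that drops ancillary chunks (iCCP, tEXt) from a hand-built 1x1 PNG while leaving the critical IHDR/IDAT/IEND chunks, and therefore the pixel data, untouched. The tiny image and chunk contents are made up for the demo; real tools like oxipng do much more.

```python
# Minimal sketch of ancillary-chunk stripping; chunk layout per the PNG spec.
import struct, zlib

def chunk(ctype: bytes, data: bytes) -> bytes:
    # length + type + data + CRC over type and data
    return (struct.pack(">I", len(data)) + ctype + data
            + struct.pack(">I", zlib.crc32(ctype + data)))

sig = b"\x89PNG\r\n\x1a\n"
ihdr = chunk(b"IHDR", struct.pack(">IIBBBBB", 1, 1, 8, 2, 0, 0, 0))  # 1x1 RGB
idat = chunk(b"IDAT", zlib.compress(b"\x00\xff\x00\x00"))  # filter 0 + red pixel
iccp = chunk(b"iCCP", b"icc\x00\x00" + zlib.compress(b"fake profile"))
text = chunk(b"tEXt", b"Comment\x00hello")
png = sig + ihdr + iccp + text + idat + chunk(b"IEND", b"")

def strip(png: bytes, drop: set) -> bytes:
    out, pos = [png[:8]], 8
    while pos < len(png):
        (length,), ctype = struct.unpack(">I", png[pos:pos + 4]), png[pos + 4:pos + 8]
        end = pos + 12 + length
        if ctype not in drop:          # keep everything except dropped chunks
            out.append(png[pos:end])
        pos = end
    return b"".join(out)

smaller = strip(png, {b"iCCP", b"tEXt"})
print(len(png), "->", len(smaller))    # file shrinks, pixel data untouched
```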
tee
username the ICC should be kept in most cases. here's an extreme example of an image that doesn't look right without its ICC:
2026-01-05 01:20:39
Thanks, this is a good example
dogelition also note, the icc profiles used to tag images are typically very small (just have a few parameters instead of huge 3d lookup tables, like you'd find on some display profiles). so you're not really saving a lot of space unless you have a ton of small images
2026-01-05 01:21:32
While I do have ~2tb of images, it is good to know that keeping the icc profile isn't taking too much space
Adrian The Frog
2026-01-06 06:21:25
https://themaister.net/blog/2025/06/16/i-designed-my-own-ridiculously-fast-game-streaming-video-codec-pyrowave/
2026-01-06 06:23:21
Seems like a good choice for VR streaming IMO
jonnyawsom3
2026-01-06 07:32:50
Close enough, welcome back MJPEG
5peak
Close enough, welcome back MJPEG
2026-01-07 09:34:24
Quarter of the century too late.
spider-mario
2026-01-07 10:35:58
> … VMAF, are you drunk? sounds about right
Lumen
2026-01-07 10:57:19
A homosapien
2026-01-07 10:58:13
DZgas Ж
2026-01-07 11:15:18
<:monkaMega:809252622900789269> uncompressed
username
2026-01-08 02:10:31
apparently the Steam workshop just lets you upload arbitrary files as thumbnails: https://steamcommunity.com/sharedfiles/filedetails/?id=3384488018
monad
2026-01-08 03:26:16
arbitrary?
username
2026-01-08 03:42:30
I mean it seems like it, although I haven't fully checked myself. It could be some backend layer that accepts more formats than the rest of the layers recognize, but idk
jonnyawsom3
username apparently the Steam workshop just lets you upload arbitrary files as thumbnails: https://steamcommunity.com/sharedfiles/filedetails/?id=3384488018
2026-01-08 09:16:02
Yeah but... What's that one?
username
Yeah but... What's that one?
2026-01-08 09:37:24
AVIF
AccessViolation_
2026-01-08 11:15:25
Steam uses Akamai which has some sort of product similar to Cloudinary
2026-01-08 11:17:28
jonnyawsom3
username AVIF
2026-01-08 11:19:16
Doesn't steamdeck capture in AVIF by default?
username
2026-01-08 11:36:42
the MIME type that gets sent over by the servers is a fallback
2026-01-08 11:37:17
so this doesn't really seem like something that's actually officially supported
2026-01-08 11:38:56
could probably upload a JPEG 2000 file if you only want people on older versions of Safari on MacOS to see your thumbnail(s)
RaveSteel
Doesn't steamdeck capture in AVIF by default?
2026-01-08 01:29:24
No, the steamdeck only captures in PNG; HDR games are also just tonemapped to SDR PNGs
2026-01-08 01:29:59
the deck's settings have no options at all regarding screenshot capture
2026-01-08 01:30:21
you can change settings on the desktop, but it didn't carry over for me
HCrikki
2026-01-08 01:31:10
png v3 supports some form of (true, non-gainmapped?) hdr, so that'd be a safer solution, since png libs are gonna be upgraded with that support everywhere anyway
RaveSteel
2026-01-08 01:33:08
Valve could just let the deck capture screenshots as AVIF, since this is the default for gamescope and also an option that can be enabled in the steam settings
2026-01-08 01:33:51
I assume steam tonemaps HDR to SDR if AVIF capture is not enabled?
jonnyawsom3
RaveSteel Valve could just let the deck capture screenshots as AVIF, since this is the default for gamescope and also an option that can be enabled in the steam settings
2026-01-08 01:56:05
Right, that's what I was thinking of, with your tests showing it's awful compared to JXL
RaveSteel
2026-01-08 01:57:33
I actually included JXL capture into gamescope, but haven't shared it on here yet. I have been using it for over a week now
jonnyawsom3
2026-01-08 01:59:16
Oh?
RaveSteel
2026-01-08 02:01:54
Sadly I don't expect valve to accept a PR, even if the code conformed to their standards, which I assume it does not at this point in time. Maybe when Chrome, Electron and Firefox get support
Quackdoc
2026-01-08 02:02:36
probably worth posting a PR anyways
RaveSteel
2026-01-08 02:03:24
currently I just ripped out AVIF capture and replaced it with JXL. I am working on being able to capture AVIF and JXL, with AVIF being the default
jonnyawsom3
2026-01-08 02:04:00
Waiting another month or two for the browsers might not be a bad idea, they're both getting pretty close to finished
RaveSteel
2026-01-08 02:05:38
Will take a while anyways, since I am an absolute novice at coding
2026-01-08 02:06:37
But my JXL inclusion works perfectly (as far as I can tell) for SDR as well as HDR
2026-01-08 02:06:58
Critique and suggestions will be very much welcome when I get around to posting it
HCrikki
2026-01-08 02:25:34
make a patch and even 3rd-party builds available to the willing, if you've already produced them. upstreaming is the ideal, but not the only way
username
RaveSteel currently I just ripped out AVIF capture and replaced it with JXL. I am working on being able to capture AVIF and JXL, with AVIF being the default
2026-01-08 02:28:08
if JXL support becomes widespread enough you could probably switch the default to it
HCrikki
2026-01-08 02:28:13
blogging with screenshots and numbers about your attempt to add jxl support could help drum up interest and support the upstreaming effort
RaveSteel
2026-01-08 07:31:01
I will do some benchmarks when I find the time
Reddit • YAGPDB
2026-01-09 11:40:21
Meow
2026-01-09 11:49:07
Sisvel should feel impressive
jonnyawsom3
2026-01-09 12:37:40
https://fixupx.com/videolan/status/2009025832187252874
2026-01-09 12:39:00
Feels weird knowing the same cone is sat behind me from the conference last year
AccessViolation_
2026-01-09 01:08:24
if AVIF2 gets added to browsers within a few months of the AV2 announcement I will be on the news
jonnyawsom3
AccessViolation_ if AVIF2 gets added to browsers within a few months of the AV2 announcement I will be on the news
2026-01-09 01:21:13
"does not bring sufficient incremental benefits over AVIF2" https://chromium-review.googlesource.com/c/chromium/src/+/7416348
2026-01-09 01:22:55
Hopefully it gets fixed up and merged before they get given yet another excuse
AccessViolation_
2026-01-09 01:26:46
maybe AVIF2 should get shafted because it doesn't have enough incremental benefits over JXL <:KekDog:805390049033191445>
2026-01-09 01:30:11
nah, I'm excited for AV2, but only for video (unless they somehow made AVIF2 (AV2F?) an appealing image format for much more than web delivery)
2026-01-09 01:31:46
I wish JXL had the dev resources AV1 has <:SadCheems:890866831047417898>
jonnyawsom3
2026-01-09 02:16:13
Simultaneously having Adobe, Apple, Intel, Meta, Microsoft and more push for support, and yet giving almost no contact, making odd implementation decisions and not contributing to libjxl which they shipped to all their devices and software
2026-01-09 02:17:46
We're desperately in need of fresh and skilled talent, there's just so many features of JXL to improve and implement
AccessViolation_
2026-01-09 02:24:57
I think we have highly talented people working on it, but there just aren't a lot of them
2026-01-09 02:25:50
I have high hopes for jxl-rs, I hope when the decode side is ready enough, some developer time will be allocated to an encoder side to it as well
2026-01-09 02:31:49
hopefully with an architecture that makes it easy to experiment with different techniques. with how expressive JXL is, I think being able to add different encoding modes/tunes/heuristics relatively quickly, without too much risk of breaking the existing code, would be really good (but I'm not an expert on anything here so)
jonnyawsom3
2026-01-09 03:11:00
libjxl is nearing a decade old, with most of it spent on the private gitlab repo, so while it makes an easy base to change variables and enable different settings, it's hard to know *what* variables to change or if some settings work at all
AccessViolation_
2026-01-09 03:20:46
imagine if testing a new tune/feature, like screen content detection and segmentation, could just be a submodule, or even its own crate, in the package
2026-01-09 03:20:59
I have no idea if that's feasible, but it would be really really nice
_wb_
2026-01-09 05:22:07
libjxl has never been a particularly elegant or clean codebase, it was born by smashing together the pik and fuif code and then doing lots of experimental stuff on top of that. I think it was a bit too ambitious to try to be at the same time a reference implementation, cutting edge in terms of both speed and compression performance, and production ready. I mean, I think it kind of achieves all of that, but not in a way that is easy to maintain or contribute improvements to.
jonnyawsom3
2026-01-09 05:31:55
The fact it works, and it works well enough to be competitive against AVIF with all the money getting pumped into it, is a testament to JXL itself
username
AccessViolation_ nah, I'm excited for AV2, but only for video (unless they somehow made AVIF2 (AV2F?) an appealing image format for much more than web delivery)
2026-01-09 08:29:40
I feel like what might happen is that AV2 gets shoved into AVIF and then they try to do a rollout similar to when they added YCgCo-R: https://github.com/AOMediaCodec/libavif/issues/2077
RaveSteel
2026-01-09 09:32:30
I really hope for an improvement in the avif tools, because it is currently not really possible to decode an HDR AVIF to PNG mathematically losslessly
veluca
_wb_ libjxl has never been a particularly elegant or clean codebase, it was born by smashing together the pik and fuif code and then doing lots of experimental stuff on top of that. I think it was a bit too ambitious to try to be at the same time a reference implementation, cutting edge in terms of both speed and compression performance, and production ready. I mean, I think it kind of achieves all of that, but not in a way that is easy to maintain or contribute improvements to.
2026-01-09 09:48:51
_definitely_ not easy to maintain
2026-01-09 09:49:45
especially after the chunked encoding changes, I think everyone on the jxl team is afraid to touch the encoder in substantial ways xD (at least me and Zoltan are, and I suspect everyone else at least as much)
AccessViolation_
2026-01-09 10:20:07
first class support for glitch art in the reference implementation sounds like a boon to me :p
whatsurname
RaveSteel I really hope for an improvement in the avif tools, because it is currently not really possible to decode an HDR AVIF to PNG mathematically losslessly
2026-01-10 02:20:53
I'm not sure what you want; lossless AVIF -> PNG is already mathematically lossless, and of course you can't decode a lossy AVIF to PNG losslessly
Quackdoc
RaveSteel I really hope for an improvement in the avif tools, because it is currently not really possible to decode an HDR AVIF to PNG mathematically losslessly
2026-01-10 02:27:06
wut
2026-01-10 02:27:08
source?
RaveSteel
whatsurname I'm not sure what you want; lossless AVIF -> PNG is already mathematically lossless, and of course you can't decode a lossy AVIF to PNG losslessly
2026-01-10 03:04:05
Should have been more specific, I am specifically talking about decoding lossless HDR AVIFs to PNG. Decoding a 10 bit HDR AVIF to PNG via libavif's avifdec changes the checksum. Using FFmpeg does work if compiled from main at this time (a bug was recently fixed regarding this), but still results in a different checksum. Using ImageMagick results in an identical checksum, but HDR metadata is lost (there are two open issues regarding this). Same thing for libvips. libheif can be used to decode AVIFs, but guess what, the output also has a different checksum. Basically there is no library or tool that allows a bit-exact decode to other formats without either changing the checksum or losing HDR metadata
Quackdoc source?
2026-01-10 03:04:23
I opened an issue with libavif [here](https://github.com/AOMediaCodec/libavif/issues/2916)
2026-01-10 03:05:24
If anyone here knows of anything that works in a bitexact manner without losing metadata, please let me know
Quackdoc
2026-01-10 03:05:32
[monkastop](https://cdn.discordapp.com/emojis/720662879367332001.webp?size=48&name=monkastop)
RaveSteel
2026-01-10 03:06:38
The main reason why libavif decodes like it does is because the color values are rounded apparently. And it does not seem like they intend to change that behaviour
2026-01-10 03:07:22
This is the main reason why I got off my arse to include JXL encoding into gamescope lol
Quackdoc
RaveSteel This is the main reason why I got off my arse to include JXL encoding into gamescope lol
2026-01-10 03:13:22
oh right, looked at the code, steamcompmgr should probably save in 16bit for HDR so you can save scrgb without truncation
2026-01-10 03:13:57
well for jxl at least
RaveSteel
2026-01-10 03:16:19
gamescope by default does not seem to care about this, as screenshots are always saved as 10 bit AVIFs
2026-01-10 03:16:48
I would have to check how gamescope handles it
whatsurname
RaveSteel I opened an issue with libavif [here](https://github.com/AOMediaCodec/libavif/issues/2916)
2026-01-10 03:46:10
They are not lossless AVIF though, both the SDR and HDR image have MC=9, which means they are stored in YUV not RGB
RaveSteel
2026-01-10 03:47:19
I have lots of gamescope AVIFs, which are actually GBRP. Same issue with those
2026-01-10 03:48:10
libavif detects them as YUV too btw. FFmpeg built from main shows them as GBRP, as they actually are
whatsurname
2026-01-10 03:56:03
Do you have any examples? I would be surprised if libavif doesn't detect the CICP correctly. Maybe gamescope didn't write the right value
RaveSteel
2026-01-10 03:57:53
Here's an image you can test with
2026-01-10 04:01:07
And since you will find that libavif and ImageMagick report this image as YUV, here's the offending code from gamescope which captured this image:
```cpp
pAvifImage->transferCharacteristics = bHDRScreenshot ? AVIF_TRANSFER_CHARACTERISTICS_SMPTE2084 : AVIF_TRANSFER_CHARACTERISTICS_SRGB;
// We are not actually using YUV, but storing raw GBR (yes not RGB) data
// This does not compress as well, but is always lossless!
pAvifImage->matrixCoefficients = AVIF_MATRIX_COEFFICIENTS_IDENTITY;
```
2026-01-10 04:02:19
From [here](https://github.com/ValveSoftware/gamescope/blob/221394fedaed213f9bce6d18f60242e3120b661f/src/steamcompmgr.cpp#L2976)
whatsurname
2026-01-10 04:04:45
Yeah MC=0 means RGB, so gamescope did write the right value and libavif detected it correctly
2026-01-10 04:07:52
The "Format: YUV444" from avifdec output only indicates whether it's subsampled, the CICP is for its color space
2026-01-10 04:09:45
I did see the decoded PNG is not pixel identical though
dogelition
RaveSteel I opened an issue with libavif [here](https://github.com/AOMediaCodec/libavif/issues/2916)
2026-01-10 04:16:06
i think rather than comparing the chroma-upsampled png to the avif it would make more sense to do an avif -> png -> avif round trip test to judge whether the conversion is "lossless"
whatsurname
2026-01-10 04:16:55
libavif does not support encoding HDR PNG yet
RaveSteel
dogelition i think rather than comparing the chroma-upsampled png to the avif it would make more sense to do an avif -> png -> avif round trip test to judge whether the conversion is "lossless"
2026-01-10 04:17:28
Sure, it may be relevant if encode has similar issues, but I personally only care about decode
dogelition
2026-01-10 04:19:08
right but as mentioned in that thread, i would assume the difference in pixel data just comes from different chroma upsampling algorithms
2026-01-10 04:19:31
as comparing the avif to the png requires chroma upsampling the avif first. and that's only guaranteed to match the png if the exact same algorithm was used in generating the png in the first place
whatsurname
2026-01-10 04:21:03
No chroma subsampling here for the image above
RaveSteel
2026-01-10 04:21:04
PNG -> AVIF seems fine from a short test, the encoded AVIF has the same checksum as the 8 bit PNG I've used
2026-01-10 04:22:04
sorry, had it switched up there
whatsurname
2026-01-10 04:36:38
It's probably due to 10-bit -> 16-bit conversion. I don't think there's a way to only use the low 10 bits in PNG?
Quackdoc
2026-01-10 04:40:49
it shouldn't matter unless there are processing issues
2026-01-10 04:41:30
well, if you wanted to be sure it would be best to do 10->16->10 then hash
whatsurname
2026-01-10 04:48:07
That may introduce some rounding difference
dogelition
2026-01-10 04:53:25
you can do whatever you want with the LSBs in png (using all 0s or 1s is better for compression), and ideally it should have an `sBIT` chunk indicating the real bit depth
2026-01-10 04:57:00
how do you get a hash from `identify`?
whatsurname
2026-01-10 04:57:23
I don't think it really matters though if it's just libavif and imagemagick using different algorithms for 10-bit -> 16-bit conversion
dogelition
2026-01-10 05:02:30
```
avifdec source_10bit_gbrp_hdr10_smpte2084_bt2020.avif the.png
avifenc -l -d 10 --cicp 9/16/0 the.png the.avif
compare -metric AE source_10bit_gbrp_hdr10_smpte2084_bt2020.avif the.avif null
```
it roundtrips perfectly, so yeah, in this case, since there's no subsampling, the difference just comes from whatever is done to the padding LSBs
2026-01-10 05:03:42
if you compared the avif to the png MSBs they should also be identical
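A small sketch of the 10-bit to 16-bit question above: two common, equally valid expansions (zero-padding the LSBs vs. bit replication) produce different 16-bit words, and therefore different file checksums, even though the top 10 bits, i.e. the actual pixel values, are identical. The function names are made up for illustration.

```python
# Two ways to widen a 10-bit sample to 16 bits; both preserve the top 10 bits.
def pad_zeros(v10: int) -> int:
    return v10 << 6                      # LSBs = 0

def replicate(v10: int) -> int:
    return (v10 << 6) | (v10 >> 4)       # LSBs filled from the MSBs

for v in range(1024):
    a, b = pad_zeros(v), replicate(v)
    assert a >> 6 == b >> 6 == v         # same 10-bit value recovered either way

# but the 16-bit words (what lands in the PNG bytes) usually differ:
print(pad_zeros(1023), replicate(1023))  # 65472 65535
```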
RaveSteel
dogelition how do you get a hash from `identify`?
2026-01-10 05:09:15
`identify -format "%m %#\n"`
AccessViolation_
2026-01-10 05:13:02
I literally have to look up how to use imagemagick commands every single time, the cli is not intuitive at all
RaveSteel
2026-01-10 05:13:56
This is why I have an unlimited bash history <:galaxybrain:821831336372338729>
Quackdoc
2026-01-10 05:30:43
lol
TheBigBadBoy - 𝙸𝚛
2026-01-10 08:39:28
I've set my Bash history to load on startup, then unlink so I can add commands to `.bash_history` only manually <:KekDog:805390049033191445>
monad
2026-01-11 01:12:06
but then you can't count how many times you've run something!
username
2026-01-14 05:07:07
https://github.com/MikuAuahDark/simplewebp
Exorcist
2026-01-18 09:08:16
Today I learned:
> The key to good GIF compression is making LZW algorithm lossy
> https://github.com/ImageOptim/gifski/releases/tag/1.13.0
jonnyawsom3
2026-01-18 09:15:55
What's the AI link for? But yeah, it's been around for over a decade https://kornel.ski/lossygif
Exorcist
2026-01-18 09:20:15
I never thought entropy coding itself could be lossy <:Thonk:805904896879493180>
jonnyawsom3
2026-01-18 09:26:35
Anything lossless can inherently be lossy, it just depends what you store in it
_wb_
2026-01-19 08:07:33
anything lossless where the compressed size depends on the actual values (i.e. not something uncompressed like ppm) can be made lossy by replacing expensive-to-encode values with cheap-to-encode values
monad
2026-01-19 11:21:10
you can make ppm smaller by writing fewer values
_wb_
2026-01-19 11:27:20
downscaling the image you mean? yes, that would be a kind of lossy ppm, I guess
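_wb_'s point can be illustrated with zlib standing in as the generic lossless codec: nudging samples toward a cheap-to-encode value (here, the previous sample) before lossless compression shrinks the output while the codec itself stays lossless. The data and tolerance are invented for the demo; this is the general idea behind lossy GIF/PNG preprocessors, not any specific tool's algorithm.

```python
# Lossy preprocessing in front of a lossless coder (zlib as stand-in).
import zlib, random

random.seed(0)
# noisy gradient: nearby samples differ by a little
data = bytes((i // 4 + random.randint(-2, 2)) % 256 for i in range(4096))

def lossy_preprocess(data: bytes, tol: int) -> bytes:
    out = bytearray([data[0]])
    for b in data[1:]:
        prev = out[-1]
        # replace an "expensive" value with the cheap previous one
        out.append(prev if abs(b - prev) <= tol else b)
    return bytes(out)

exact = len(zlib.compress(data, 9))
lossy = len(zlib.compress(lossy_preprocess(data, 2), 9))
print(exact, lossy)   # the preprocessed stream compresses smaller
```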
RaveSteel
2026-01-19 03:16:18
<@184373105588699137> If you didn't come across a 128bpp JXR yet, I found one in my collection: https://files.catbox.moe/vbnm0p.jxr it is 128bppRGBAFloat according to JxrDecApp. Decoding it to EXR via magick only results in a 16 bit file, no idea why
Quackdoc
2026-01-19 03:32:53
I don't think magick does a superb job, I use JxrDecApp from jxrlib directly and found that it will make a large tiff
2026-01-19 03:33:05
I'll try that one too though ofc
RaveSteel
2026-01-19 03:33:15
I had to use libvips, which worked
2026-01-19 03:33:22
but the resulting JXL was larger than the JXR
Quackdoc
2026-01-19 03:35:33
I did get an unsupported format with it so I'll add logic to it tonight
2026-01-19 03:35:42
which is interesting since it should work
RaveSteel
2026-01-19 03:36:38
oh? your tool worked for me
2026-01-19 03:36:46
```
height x width: 2160 x 3840
Pixel depth: ThirtyTwoF
channels: 4
colorspace is: rgba128float, scRGB
Pixel Arrangement: RGB
image will be saved as a rgb 32fbpp jxl
```
Quackdoc
2026-01-19 03:39:04
what the fuck, jxrdecapp doesn't work for me at all, I think I downloaded the file corrupt lol
RaveSteel
2026-01-19 03:39:30
happens
Quackdoc
2026-01-19 03:39:37
lol I download the webpage one sec
2026-01-19 03:40:38
yeah it works for me too lol, typo'd an O instead of a 0 for curl
RaveSteel
2026-01-19 03:40:44
ah
2026-01-19 03:41:11
but still, sad that the JXL is much larger than the JXR
Quackdoc
2026-01-19 03:41:47
LMAO don't encode it to a png
RaveSteel
2026-01-19 03:42:02
Hadn't tried that, one second lol
2026-01-19 03:44:15
Smaller, but of course PNG only supports 16 bit. A 16 bit JXL is also much smaller than the JXR
Quackdoc
2026-01-19 03:45:00
yeah, also that is one hot image, I wonder what the nit range is on this
RaveSteel
2026-01-19 03:45:19
But since the JXL is around 2x larger than the JXR with the same image hash, I can even understand why Microsoft is keeping JXR in this case lol
2026-01-19 03:46:08
tev shows ~50 over standard SDR
2026-01-19 03:46:24
Does that scale linearly or like stops in a camera?
Quackdoc
2026-01-19 03:47:21
I can't remember off the top of my head, it's been a while, but this image is a good showcase of a place where jxl lossless is struggling; it shouldn't be losing to jxr here
2026-01-19 03:47:31
even at this bitdepth
2026-01-19 03:48:01
I'll probably add a force 16bit flag or something to the tool
RaveSteel
2026-01-19 03:49:35
I wonder how much potential savings could be had here
2026-01-19 03:50:33
hdrview shows ~12 over SDR instead of 50
jonnyawsom3
Quackdoc even at this bitdepth
2026-01-19 04:10:55
Could try using an older libjxl version before the palette was disabled, or just comment out the line
Quackdoc
Could try using an older libjxl version before the palette was disabled, or just comment out the line
2026-01-19 04:16:30
maybe it will depend on jpegxl-rs, what line to comment out? would be easiest route
_wb_
2026-01-19 05:08:00
wait a second, the JPEG XR wikipedia page says this: > 32 bits per component as fixed-point numbers or full-precision floating point numbers packed into 96 or 128 bits (for which lossless coding is not supported due to the excessively high precision)
2026-01-19 05:08:55
so if the jxr is not actually lossless, then it's not surprising that it compresses better than jxl lossless
Quackdoc
2026-01-19 05:17:28
I thought jxr was lossless 0.0
_wb_
2026-01-19 05:30:55
could be that wikipedia is wrong, but probably it's correct and jxr does not actually support lossless float32
RaveSteel
2026-01-19 05:37:06
oh lol
2026-01-19 05:38:29
well, no wonder that the lossless JXL is 2 times larger then
2026-01-19 06:00:23
Should be correct, I get the same image hash when converting 8 bit integer TIFFs to JXR, but not when I convert a 128bit float image
_wb_
2026-01-19 06:46:40
Possibly better compression of float32 is possible by not disabling RCTs when there are no negative input values. YCoCg expands the range by one bit, e.g. 0..255 becomes -255..255, which is why it is disabled when the image is already 32-bit, since we don't have that spare bit. But if a float32 has only positive values, it's really a 31-bit number, so we could apply YCoCg...
2026-01-19 06:50:41
The thing is that you need to inspect the entire image to know that all values are indeed non-negative, and in case of chunked encoding you cannot do that. Though you could still do a per-group RCT.
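For reference, a sketch of the lossless YCoCg-R lifting transform being discussed: integer shifts make it round-trip exactly, but Co/Cg span -255..255 for 8-bit input, i.e. one extra bit of range, which is the overflow concern for channels that already use all available bits.

```python
# YCoCg-R lifting transform: exactly invertible with integer arithmetic.
def forward(r: int, g: int, b: int):
    co = r - b
    t = b + (co >> 1)
    cg = g - t
    y = t + (cg >> 1)
    return y, co, cg

def inverse(y: int, co: int, cg: int):
    t = y - (cg >> 1)
    g = cg + t
    b = t - (co >> 1)
    r = b + co
    return r, g, b

lo, hi = 0, 255                       # 8-bit example
cos, cgs = set(), set()
for r in (lo, hi):
    for g in (lo, hi):
        for b in (lo, hi):
            y, co, cg = forward(r, g, b)
            assert inverse(y, co, cg) == (r, g, b)   # exact round-trip
            cos.add(co); cgs.add(cg)
print(min(cos), max(cos))   # -255 255: one bit wider than the input
```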
Quackdoc
2026-01-19 06:51:29
so far all of the images I've tested have had at least a couple negative value pixels.
RaveSteel
2026-01-19 06:52:16
And, knowing now that these JXRs are actually lossy, JXL's performance compared to 16 bit float, where both are lossless, is better than JXR's
_wb_
2026-01-19 06:53:30
I need to check the exact ranges of things but probably if the float32 values are in some range like [-1, 2], that's good enough to ensure that YCoCg roundtrips.
2026-01-19 06:56:10
https://github.com/libjxl/libjxl/blob/main/lib/jxl/enc_modular.cc#L893 you could try dropping that condition and see if it still roundtrips
veluca
2026-01-19 07:40:41
could per-group channel palette first, YCoCg later
2026-01-19 07:40:46
feels a bit weird but...
RaveSteel
2026-01-19 08:18:32
No idea if I did it right, but deleting the condition increases filesize a bit
Quackdoc
2026-01-19 08:44:45
I'm amazed by how hot some of these images are, just goes to show how some games just don't care and will send the display anything it can take.
jonnyawsom3
Quackdoc maybe it will depend on jpegxl-rs, what line to comment out? would be easiest route
2026-01-19 09:15:43
I'm extremely late and now we know the data is lossy, but it might still work. I got a 4x reduction in the past <https://github.com/libjxl/libjxl/blob/53042ec537712e0df08709524f4df097d42174bc/lib/jxl/modular/transform/enc_palette.cc#L189>
_wb_ Possibly better compression of float32 is possible by not disabling RCTs when there are no negative input values. YCoCg expands the range by one bit, e.g. 0..255 becomes -255..255, which is why it is disabled when the image is already 32-bit, since we don't have that spare bit. But if a float32 has only positive values, it's really a 31-bit number, so we could apply YCoCg...
2026-01-19 09:19:45
Palette was disabled for the same reason as that (I think?), in case it didn't reduce the effective bit depth enough to be lossless with the extra palette data. I could be wrong though, the commit just said "prevent overflow" without any more details
AccessViolation_
2026-01-20 09:24:00
how is this PNG able to load progressively? https://www.libpng.org/pub/png/img_png/pnglogo-grr.png
_wb_
2026-01-20 09:25:45
Adam7 interlacing
2026-01-20 09:26:13
Those marbles describe it 🙂
AccessViolation_
2026-01-20 09:34:12
oh! fun
2026-01-20 09:34:49
I thought interlacing was a tool for better compression in certain instances, I didn't know it allowed for progressiveness. TIL
jonnyawsom3
2026-01-20 09:36:41
It usually makes compression around 25-50% worse
AccessViolation_ how is this PNG able to load progressively? https://www.libpng.org/pub/png/img_png/pnglogo-grr.png
2026-01-20 09:37:50
<https://www.libpng.org/pub/png/pngpic2.html> > Thousands of people ask Greg every day, "Just how does two-dimensional interlacing work in the Portable Network Graphics specification, and what does it mean to my sex life?"
2026-01-20 09:40:26
There's some fun notes about how tech changed between the 90s and 2020, rendering the example image:
> The 320x240 images averaged about 40 minutes each on a 486-33; the 1024x768 monster required over 12 hours.
> 3840x2160 - 175.8 seconds
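Adam7, mentioned above, assigns every pixel to one of 7 passes based on its position within an 8x8 tile, so the early passes give a coarse full-image preview. A sketch using the pass pattern from the PNG spec:

```python
# Adam7 pass assignment: one digit per pixel in the repeating 8x8 tile.
ADAM7 = [
    "16462646",
    "77777777",
    "56565656",
    "77777777",
    "36463646",
    "77777777",
    "56565656",
    "77777777",
]

def pass_of(x: int, y: int) -> int:
    return int(ADAM7[y % 8][x % 8])

# pixels sent per pass in one 8x8 tile: each pass doubles the detail
counts = [0] * 8
for y in range(8):
    for x in range(8):
        counts[pass_of(x, y)] += 1
print(counts[1:])   # [1, 1, 2, 4, 8, 16, 32]
```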
spider-mario
2026-01-20 12:08:15
for a second, I thought you meant “rendering” as in rendering the interlaced PNGs
AccessViolation_
2026-01-20 02:38:32
ohhh
2026-01-20 02:38:48
that's what I thought too... I was very confused
mincerafter42
2026-01-21 01:08:41
PNG's two-dimensional progressive interlacing is an improvement over GIF's one-dimensional progressive interlacing :p
_wb_
2026-01-21 01:33:37
Yes, but you lose a lot of prediction potential, since PNG applies prediction independently to each scan. In FLIF, you also do 2D interlacing but prediction can use all nearby already-decoded pixels, which is funky for the last pass where you can predict from 7 neighbors, all except East.
2026-01-21 01:35:15
This funky interlacing was dropped in FUIF (and JXL) since it is not very nice for speed / memory access patterns, but it is fun.
DZgas Ж
2026-01-28 02:09:15
"EnCodec" neural audio compression (2022) https://discord.com/channels/794206087879852103/806898911091753051/1466071895601582296
2026-01-28 02:13:57
At 24 kbps, it's pretty mediocre; AAC-HE and Opus might sound better in mono at that bitrate. But stereo music at 12 kbps is absolutely unrivaled <:This:805404376658739230>
dogelition
2026-01-28 07:46:11
https://gitlab.com/AOMediaCodec/avm/-/issues/1258
2026-01-28 07:46:20
am i reading this right? av1 has 12 bit support and av2 won't?
AccessViolation_
2026-01-28 07:58:52
<:banding:804346788982030337>
TheBigBadBoy - 𝙸𝚛
2026-01-28 08:22:02
anyway most HW does not support 12bit decode
2026-01-28 08:22:20
never saw 12bit AV1 nor HEVC in the wild
dogelition
2026-01-28 08:25:35
isn't 12 bit support required for lossless 10 bit rgb images using ycocg-r?
2026-01-28 08:28:52
hevc and av1 support 12 bit and it looks like h.266 goes up to 16, so it just seems like a strange decision if they want to compete with those
2026-01-28 08:29:12
unless they just aren't intending it to be used as a high quality interchange format
2026-01-28 08:39:43
also i'm curious why they're doing the implementation out in the open but the spec development happens on a private repo
AccessViolation_
2026-01-28 08:41:48
maybe to prevent some patent troll situations?
2026-01-28 08:41:58
or make it harder for them to do it?
HCrikki
2026-01-28 08:52:39
iinm a big patent troll explicitly said they're watching just to find ways to prey on anything implemented. i doubt aom feels it's a threat, just an annoyance
AccessViolation_
2026-01-28 08:53:12
I guess you're talking about sisvel
2026-01-28 08:53:19
our favorite organization
dogelition
2026-01-28 09:02:50
recent paper with maybe interesting stuff https://arxiv.org/abs/2601.02712
AccessViolation_
2026-01-28 09:08:46
funky partitioning scheme
2026-01-28 09:10:40
laughs in JPEG XL where we can place blocks wherever we want ✨
spider-mario
dogelition isn't 12 bit support required for lossless 10 bit rgb images using ycocg-r?
2026-01-28 10:29:40
on the other hand, lossless AV[1-2] might not be worth rescuing
AccessViolation_
2026-01-28 10:37:34
hmm, I wonder if lossless AVIF uses PSNR tuning instead of some better metric
2026-01-28 10:38:57
PSNR is worse for visual similarity, but when losslessly encoding residuals, all that matters is getting the values as close to the source as possible, and that's basically what PSNR checks for afaik
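For reference, PSNR is just log-scaled MSE, which is why it directly rewards getting sample values close to the source. A minimal sketch over flat 8-bit pixel lists (the function name is my own):

```python
import math

def psnr(orig, recon, max_val=255):
    # mean squared error between two equal-length pixel sequences
    mse = sum((a - b) ** 2 for a, b in zip(orig, recon)) / len(orig)
    if mse == 0:
        return float("inf")  # identical images: PSNR is unbounded
    return 10 * math.log10(max_val ** 2 / mse)
```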
jonnyawsom3
2026-01-28 10:41:21
> lossless AVIF uses PSNR ....what
AccessViolation_
2026-01-28 10:41:59
yeah, doesn't lossless AVIF work by lossy encoding a frame and losslessly encoding the residuals?
username
2026-01-28 10:42:04
I think one of the core problems with lossless AVIF is that they're using a lossy codec for lossless purposes, which AFAIK has historically always produced mediocre results. The only formats with noticeably good lossless compression are ones where lossless is the only mode, or what WebP and JXL do, where the lossless mode is kind of a separate codec in a way
jonnyawsom3
2026-01-28 10:42:04
Oh, wait, didn't you say lossless AV1 is just lossy mode with residuals stored lossle- Right
AccessViolation_
2026-01-28 10:42:51
it seems like an easy oversight to continue using better visual similarity metrics, while you should actually be using PSNR, the most basic and least accurate one, for lossless :p
2026-01-28 10:42:58
no idea whether they do or not, just a random thought
2026-01-28 11:04:08
a massive screenshot of the Steam store page as a lossless webp that took 6 seconds to encode is the same size as an effort 9 lossless JXL (without patches) that took 30 seconds to encode <:SadCheems:890866831047417898>
2026-01-28 11:13:30
actually I should have done more testing before posting that. it seems lower efforts produce about the same file size, but getting over that mark used exponentially more compute for smaller and smaller gains (for this specific image)
RaveSteel
2026-01-28 11:13:37
lossless webp is king for screenshots
AccessViolation_
2026-01-28 11:14:14
it's really good
RaveSteel
2026-01-28 11:15:42
JXL is still better in general of course, but for mostly text webp is simply the best
AccessViolation_
2026-01-28 11:16:22
afaik the only coding tool it has that JXL doesn't is a palette of recently used pixels that it can reference. JXL has static palettes
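The recently-used-pixels tool is VP8L's color cache: a small hash-indexed table of recent ARGB values that the encoder can reference by index instead of coding a literal. A loose sketch (the hash multiplier matches libwebp's `kHashMul` as far as I know; everything else is illustrative):

```python
class ColorCache:
    def __init__(self, bits=10):
        # 2**bits slots; real encoders pick `bits` per image
        self.bits = bits
        self.slots = [None] * (1 << bits)

    def _index(self, argb):
        # multiplicative hash, truncated to 32 bits like in C
        return ((0x1e35a7bd * argb) & 0xFFFFFFFF) >> (32 - self.bits)

    def lookup(self, argb):
        # returns the cache index if this recently-seen color is present
        i = self._index(argb)
        return i if self.slots[i] == argb else None

    def insert(self, argb):
        # newest color simply overwrites whatever hashed to the same slot
        self.slots[self._index(argb)] = argb
```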
2026-01-28 11:17:05
though not all the palette features in JXL are currently working, like delta palette just has an experimental implementation and doesn't always work at all
RaveSteel JXL is still better in general of course, but for mostly text webp is simply the best
2026-01-28 11:18:18
which is a bit surprising. isn't lossless JXL conceptually very similar to lossless WebP?
veluca
RaveSteel JXL is still better in general of course, but for mostly text webp is simply the best
2026-01-28 11:18:21
that's just the encoder not being good enough 😛
AccessViolation_
2026-01-28 11:20:28
at least JXL has patches which do better still
RaveSteel
2026-01-28 11:21:05
So many improvements that can still be had, the potential is enormous
jonnyawsom3
AccessViolation_ though not all the palette features in JXL are currently working, like delta palette just has an experimental implementation and doesn't always work at all
2026-01-28 11:24:35
AFAIK the only time Delta Pallete is used is when you do `-d 0 --lossy_modular_palette`
AccessViolation_
AFAIK the only time Delta Pallete is used is when you do `-d 0 --lossy_modular_palette`
2026-01-28 11:37:11
effort 7: > (15.588 bpp) effort 8: > (1.639 bpp) it's experimental, alright :p
veluca
2026-01-28 11:39:31
I'm not sure the delta palette encoder is in an especially good state
jonnyawsom3
AccessViolation_ effort 7: > (15.588 bpp) effort 8: > (1.639 bpp) it's experimental, alright :p
2026-01-28 11:43:40
Effort levels only impact the lossless compression that encodes the delta residuals; the image itself always stays the same. Though, because it disables chunked encoding anyway, you can do `-e 10 -g 3 -I 100 -E 11 --patches 0` for basically free
monad
2026-01-28 11:45:52
free by what measure
AccessViolation_
monad free by what measure
2026-01-28 11:47:12
as someone that just spent 8 minutes encoding an image with delta palette at effort 9, I have some idea
monad
2026-01-28 11:47:29
i still don't
AccessViolation_
2026-01-28 11:48:56
all I know is that delta palette took much much longer than I expected. presumably jonny meant that those extra parameters add relatively little encode time compared to the massive encode time the delta palette itself has
2026-01-28 11:49:37
anyhoo I really need to sleep 👋
jonnyawsom3
2026-01-28 11:50:06
Yeah, it's already really slow, so relatively it's not much more expensive to go all the way
2026-01-28 11:50:26
2x slowdown compared against normal encoding where those settings might be 20x
monad
2026-01-28 11:56:27
ok, I wouldn't consider 2x slower to be free. and what about decode cost? anyway, that kind of inefficiency just tells me delta palette isn't worth it, but I never tried it myself.
2026-01-28 11:57:17
WebP is basically free on this scale, but for real.
AccessViolation_
2026-01-29 09:04:23
WebP doesn't have a delta palette coding tool. JXL is just as free if you disable it
spider-mario
AccessViolation_ WebP doesn't have a delta palette coding tool. JXL is just as free if you disable it
2026-01-29 11:49:07
(it does – https://groups.google.com/a/webmproject.org/g/webp-discuss/c/QlTNOR2b7ak/m/ovfJdkvuAQAJ )
AccessViolation_
2026-01-29 11:54:01
oh, huh. I didn't see that when I was reading about lossless WebP
2026-01-29 11:55:09
does JPEG XL have something equivalent to WebP near lossless, by the way?
username
AccessViolation_ does JPEG XL have something equivalent to WebP near lossless, by the way?
2026-01-29 11:56:51
lossy modular?
AccessViolation_
2026-01-29 11:56:56
from the description of how near-lossless works it doesn't seem like that's what lossy modular does by default
username
spider-mario (it does – https://groups.google.com/a/webmproject.org/g/webp-discuss/c/QlTNOR2b7ak/m/ovfJdkvuAQAJ )
2026-01-29 12:06:00
did* https://github.com/webmproject/libwebp/commit/0ec80aef3d96b95315577c3d121b771261a2274e
spider-mario
2026-01-29 12:08:17
I understood “coding tool” to mean that the format allows it?
username
2026-01-29 12:09:20
I guess it might still work but there's no encoder to do it anymore in practice
AccessViolation_
2026-01-29 12:15:50
I think I found the post that introduced the idea https://groups.google.com/a/webmproject.org/g/webp-discuss/c/W-iFWM1jGh8
2026-01-29 12:19:38
I'm not sure if this is also how the delta palette in JXL works. I'm a bit confused from the way it's been described, but it seems to be based on vectors in the palette's "color space" rather than encoding of quantized residuals to a palette entry. but maybe they're about the same in practice
2026-01-29 12:52:45
I only just found out lossy modular without squeeze is pretty good...
2026-01-29 12:53:53
I didn't know you could disable squeeze...
spider-mario I understood “coding tool” to mean that the format allows it?
2026-01-29 01:10:13
it looks like it's not a dedicated coding tool; rather, the possibility of delta palette emerges from being able to reverse the order of palettization and spatial prediction in the codestream
2026-01-29 01:17:12
hence why I didn't read about it in official papers
monad
AccessViolation_ WebP doesn't have a delta palette coding tool. JXL is just as free if you disable it
2026-01-30 12:59:49
Gotcha, I was working within the context where JXL was 5x the cost of WebP.
TheBigBadBoy - 𝙸𝚛
2026-02-02 08:19:51
what other discord servers exist for "general" data compression (images, archives, audio, ...) ?
DZgas Ж
TheBigBadBoy - 𝙸𝚛 what other discord servers exist for "general" data compression (images, archives, audio, ...) ?
2026-02-02 08:58:43
meh, https://encode.su/forums/2-Data-Compression
TheBigBadBoy - 𝙸𝚛
2026-02-02 08:59:27
I was specifically talking about discord
DZgas Ж
2026-02-02 09:01:12
then there is hardly anything else like this, it is such a highly specialized topic, too few people are interested in anything other than the very basics
TheBigBadBoy - 𝙸𝚛
2026-02-02 09:07:51
right thanks <:SadOrange:806131742636507177>
jonnyawsom3
2026-02-05 05:45:12
I just got in bed at a brisk 5am, and as my eyes adjusted to the dark I had an idea. VR headsets started using foveated streaming to save bandwidth and improve clarity where the user is looking, but our eyes only have color-sensing cone cells in the centre, with rods surrounding them. So instead of just lowering the resolution/quality in the periphery, what if we weighted luma and chroma differently? Full chroma and reduced luma in the very centre, then full luma and significantly reduced chroma in the periphery. I'd assume someone's already thought of it, and the quality reduction already lowers chroma more, but there might be some improvements from redistributing the spatial quality of the planes
DZgas Ж
I just got in bed at a brisk 5am, and as my eyes adjusted to the dark I had an idea. VR headsets started to use foveated streaming to save bandwidth and improve clarity where the user is looking, but our eyes only have cone cells to receive color in the centre and rods surrounding it, so instead of just lowering the resolution/quality in the periphery, what if we weighted Luma and Chroma differently? Full chroma and reduced luma in the very centre, then full luma and significantly reduced chroma in the periphery I'd assume someone's already thought of it, and the quality reduction already lowers chroma more, but there might be some improvements by redistributing the spatial quality of the planes
2026-02-05 09:31:10
I thought Steamframe was using some complex algorithm for this, but it's actually using the simplest one possible, the simplest in principle: plain resolution reduction, quantization, interpolation, that's all. And it's kept that simple purely for the speed of computing each frame. Your idea sounds reasonable, like using a 256-color palette for the entire blurred region. BUT it's not feasible given the current performance requirements for this algorithm, which is built around maximum simplicity.
2026-02-05 09:35:44
any conversion to another color space, or any algorithm for determining relative and sensible luma and chroma weights, would be too complex for all this, just for the sake of saving a few bits
jonnyawsom3
DZgas Ж I thought Steamframe was using some complex algorithm for this, but it's actually using the simplest algorithm possible, the simplest possible in principle: simple resolution reduction, quantization, interpolation, that's all. And it's all so simple solely for the sake of speed in calculating each frame. Your idea sounds reasonable, as if we were using a 256-color palette for the entire blurred space. BUT it's not feasible, given the current performance requirements for this algorithm, which is based on maximum simplicity.
2026-02-05 10:14:28
Hardware encoders support a region of interest map, it's feasible that could be extended to support independent Luma and Chroma weights
DZgas Ж
Hardware encoders support a region of interest map, it's feasible that could be extended to support independent Luma and Chroma weights
2026-02-05 10:18:40
the original algorithm doesn't do any compression; it's raw data transmitted directly, without any DCT-like processing: original pixels in view, reduced pixels outside the view. It's technical RAW, since rendering a game produces a large number of varied pixels, so compression is a poor fit. And any algorithm beyond that seems unnecessarily complex
dogelition
2026-02-07 07:47:40
missed opportunity to save 1 bit in some transform blocks in av2 https://gitlab.com/AOMediaCodec/avm/-/issues/1237
lonjil
2026-02-07 08:52:17
Quick someone check how much of a difference that makes
spider-mario
2026-02-07 09:56:53
would 9 even have been the most judicious value to reduce to 1 bit out of the three?
dogelition
2026-02-07 10:00:10
good point actually
RaveSteel
2026-02-08 08:48:27
https://www.phoronix.com/news/Glycin-2.1-Beta-JPEG-2000
DZgas Ж
2026-02-08 10:24:04
It's interesting that many compression formats (images, video, audio) are quite resilient to data, block, and byte corruption. But none can survive a byte shift. Except perhaps MP3, hmm, which was designed with CDs in mind, among other things. All the others require precise positional correspondence. mkv & webm containers have clusters that can be decoded separately, but they are usually 256 kilobytes in size, which is large: a 1-byte shift is enough to kill the entire cluster.
_wb_
2026-02-08 10:49:52
JPEG and JPEG 2000 are in principle both able to re-sync after byte insertion/deletion corruption, since they're using markers (0xFFXX) to separate segments.
2026-02-08 10:51:47
in JXL we didn't want to bother with having to escape 0xFF bytes and we wanted to be able to seek immediately to bitstream sections to dispatch parallel decode threads, so we use a TOC instead which is fragile for insertion/deletion corruption (the offsets will all become wrong)
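The marker-based re-sync works because entropy-coded JPEG data byte-stuffs 0xFF as 0xFF 0x00, so a bare 0xFF followed by a restart marker byte (0xD0..0xD7) is unambiguous and a decoder can skip ahead to the next one after corruption. A toy scanner, not a full JPEG parser:

```python
def find_restart_markers(data: bytes):
    # return byte offsets of JPEG restart markers (0xFFD0..0xFFD7);
    # stuffed 0xFF 0x00 pairs in entropy data never match this pattern
    positions = []
    i = 0
    while i + 1 < len(data):
        if data[i] == 0xFF and 0xD0 <= data[i + 1] <= 0xD7:
            positions.append(i)
            i += 2  # skip past the two-byte marker
        else:
            i += 1
    return positions
```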
2026-02-08 11:00:05
most modern media don't really produce insertion/deletion corruption though, and in any case corruption protection is generally something handled at a different system layer because you cannot really do error protection effectively without having some model of the kinds of errors that will get introduced. E.g. an SSD drive or a wifi transfer will behave quite differently, and different error protection strategies are applicable in different scenarios
2026-02-08 11:01:09
so it doesn't really seem like a good idea to try to solve it at the level of the file format
RaveSteel
2026-02-08 11:34:08
The best way to at least verify integrity is by implementing checksums the way FLAC does
2026-02-08 11:36:08
You cannot recover the file, but at least you'll know, that it lost its integrity
2026-02-08 11:36:19
Everybody ought to have backups anyway
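A FLAC-style per-frame integrity check can be sketched with a stdlib CRC (FLAC itself uses CRC-8/CRC-16 per frame plus an MD5 of the decoded audio; CRC-32 here is just a stand-in, and the framing is made up for this sketch):

```python
import zlib

def frame_with_checksum(payload: bytes) -> bytes:
    # append a CRC-32 of the payload; detection only, no recovery
    return payload + zlib.crc32(payload).to_bytes(4, "big")

def verify(frame: bytes) -> bool:
    # recompute the CRC and compare with the stored one
    payload, crc = frame[:-4], int.from_bytes(frame[-4:], "big")
    return zlib.crc32(payload) == crc
```

CRC-32 is guaranteed to catch any single-bit error, which is exactly the "at least you'll know" property.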
DZgas Ж
RaveSteel Everybody ought to have backups anyway
2026-02-08 11:55:43
This is precisely the approach I don't like ideologically. It's like "it's the 21st century, nothing breaks easily anymore, no damage, so we won't protect it at the format and container level" "But by the way, it's better to have copies, otherwise everything might break."
2026-02-08 12:00:03
Just think how much better it would be if inside each independent 256x256 JPEG XL block there were 2 bytes of Reed–Solomon recovery code.
_wb_ in JXL we didn't want to bother with having to escape 0xFF bytes and we wanted to be able to seek immediately to bitstream sections to dispatch parallel decode threads, so we use a TOC instead which is fragile for insertion/deletion corruption (the offsets will all become wrong)
2026-02-08 12:04:12
Interesting, does this mean that a single damaged bit, not even a whole byte, is enough to break the entire file? ...Well, I just tried exactly that and the file won't open anymore, oh welp.
_wb_
DZgas Ж Interesting information, does this mean that not a damaged byte, but a damaged single bit is enough to break the entire file?...Well, I actually just did that and it won't open anymore, oh wellp.
2026-02-08 12:14:02
Most bitflips should get detected in jxl, and libjxl will then refuse to decode. Of course decoders could do other things when detecting corruption, e.g. see https://github.com/libjxl/libjxl/pull/4062
DZgas Ж Just think how much better it would be if inside each 256x256 independent Jpeg XL block there were 2 bytes of Reed Salamon's recovery code.
2026-02-08 12:16:23
We kind of have that, in the sense that the final ANS state is used like a checksum.
2026-02-08 12:18:39
That gives us free checksums for every bitstream section - free in both senses: it doesn't have to be explicitly signaled and it doesn't have to be computed, it's just a side effect of the entropy decode.
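The "final state as checksum" effect can be demonstrated with a toy rANS coder: decoding must land back on the encoder's known initial state with the whole stream consumed, so corruption is detectable without any explicit checksum. A sketch (constants, alphabet, and layout are illustrative, not libjxl's actual ANS design):

```python
SCALE_BITS = 12           # probabilities sum to 1 << SCALE_BITS
SCALE = 1 << SCALE_BITS
RANS_L = 1 << 16          # lower bound of the normalized state interval

FREQS = [2048, 1024, 512, 512]   # 4-symbol alphabet, freqs sum to SCALE
CUM = [0, 2048, 3072, 3584]      # cumulative frequencies

def encode(symbols):
    x = RANS_L
    out = []
    for s in reversed(symbols):  # rANS encodes in reverse symbol order
        f = FREQS[s]
        # renormalize: stream out low bytes so the state stays bounded
        while x >= (RANS_L >> SCALE_BITS) * 256 * f:
            out.append(x & 0xFF)
            x >>= 8
        x = (x // f) * SCALE + (x % f) + CUM[s]
    return x, bytes(reversed(out))  # decoder reads bytes in reverse order

def decode(state, data, n):
    x = state
    pos = 0
    symbols = []
    for _ in range(n):
        slot = x & (SCALE - 1)
        s = next(i for i in range(4) if CUM[i] <= slot < CUM[i] + FREQS[i])
        x = FREQS[s] * (x >> SCALE_BITS) + slot - CUM[s]
        while x < RANS_L:  # renormalize: pull bytes back in
            x = (x << 8) | data[pos]
            pos += 1
        symbols.append(s)
    # the decoder must land back on the encoder's initial state with all
    # bytes consumed; if not, the stream was corrupted -- a free checksum
    ok = (x == RANS_L) and (pos == len(data))
    return symbols, ok
```

Because decoding is the exact inverse of encoding, a corrupted stream cannot reproduce both the original symbols and the expected final state.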
DZgas Ж
_wb_ Most bitflips should get detected in jxl, and libjxl will then refuse to decode. Of course decoders could do other things when detecting corruption, e.g. see https://github.com/libjxl/libjxl/pull/4062
2026-02-08 12:19:56
As I understand it, this is not in the current latest version?
_wb_ We kind of have that, in the sense that the final ANS state is used like a checksum.
2026-02-08 12:21:52
that's an interesting property, like no side effects are left at the end of decoding? no leftover remainders from the divisions
2026-02-08 12:22:11
all the sums converge and the result is 0
2026-02-08 12:24:40
In my personal tests, a 10-year-old HDD damaged 3 bytes while transferring 500 gigabytes. I'm not entirely happy with how jpeg xl completely dies from a single-bit error, like a solid archive
2026-02-08 12:30:35
Not all formats actually suffer from 1-byte corruption, but there is an example, here it is, 1 byte, jpeg, webp, avif, but jxl just died
2026-02-08 12:32:27
The lesson here should be that the bytes saved by more technologically advanced compression get reinvested into recovery data, but no such thing exists: you saved space, and you carry more risk
2026-02-08 12:35:04
Looks like I'll have to write the recovery data to the picture description block heh
2026-02-08 12:37:19
What I'm getting at is that using jpeg xl for archiving can have a very serious counterargument because of all this.
2026-02-08 12:41:09
The format, by its architecture, provides no mechanism for damage control, despite the block-based nature of the whole format. And here I was sure that unrelated 256x256 superblocks could be decoded independently <:SadCheems:890866831047417898>
VcSaJen
2026-02-08 01:50:21
The need to "separate" stuff into pretty theoretical categories at the expense of practicality _and_ parity with older formats is not something I like. I don't really care about checksums or corruption, but the absence of a losslessness indication (a single bit!) means I will always prefer a random PNG to a random JXL on the web when I want to save an image.
_wb_
2026-02-08 01:57:58
PNG does not have a losslessness indication bit, does it? At least I wouldn't know what chunk that info should go in when writing a lossy PNG.
DZgas Ж The format, by its architecture, does not imply any mechanism of damage control, even despite the block essence of the entire format. Although I was sure that unrelated superblocks of size 256x256 can be independent when decoding <:SadCheems:890866831047417898>
2026-02-08 01:58:55
Libjxl checks errors and refuses to decode corrupted files. It's trivial to make it produce corrupted output instead, if you prefer that behavior.
DZgas Ж
_wb_ PNG does not have a losslessness indication bit, does it? At least I wouldn't know what chunk that info should go in when writing a lossy PNG.
2026-02-08 02:00:58
I checked and PNG also breaks when any bit is damaged
_wb_ Libjxl checks errors and refuses to decode corrupted files. It's trivial to make it produce corrupted output instead, if you prefer that behavior.
2026-02-08 02:01:44
Yes, that's what I wanted to ask: libjxl refuses to decode without even attempting to; how can I force it?
_wb_
DZgas Ж I checked and PNG also breaks when any bit is damaged
2026-02-08 02:01:53
Also depends on decoder implementation/settings in case of PNG. Nothing forces you to refuse to decode if the CRC doesn't match
DZgas Ж
2026-02-08 02:03:18
Considering that png is line-based and built on archive-style compression, I assume decoding could in principle proceed up to the line containing the first error
_wb_
DZgas Ж Yes, that's what I wanted to ask, libjxl refuses to decode even without performing any decoding, how can I force it?
2026-02-08 02:03:53
The PR I linked earlier adds an option to make it decode corrupted files, at least if the corruption is in HF groups it will fall back gracefully and show upsampled LF instead for the corrupted HF groups.
DZgas Ж
2026-02-08 02:04:26
and yet it hasn't been merged in a year
username
_wb_ PNG does not have a losslessness indication bit, does it? At least I wouldn't know what chunk that info should go in when writing a lossy PNG.
2026-02-08 02:16:51
people just assume PNG is always lossless due to various factors, such as most software and tools not doing any lossy transformations on the pixel data in most cases (although I still see people get tripped up by that *one* export option in Photoshop). In most cases online it's safe to assume a PNG on a website will be the least lossy of the available formats, since most people and even some CDNs don't really think when exporting, so what you usually get is whatever the default settings of the most common encoders produce, which in PNG's case is 8bpc lossless
_wb_
2026-02-08 02:31:54
I wonder if you take a random sample of PNG files from the web, how many would be actually lossless. I suspect it may be less than half.
2026-02-08 02:35:40
JPEGs saved as PNG, 16-bit images getting saved as 8-bit, 8-bit images getting saved as a color-reduced palette image, transparent images accidentally getting flattened, layered images by necessity getting merged, etc etc: there are many ways to get a PNG file that is not actually lossless wrt the original source image.
2026-02-08 02:37:22
(not to mention downscaling and cropping)
AccessViolation_
2026-02-08 02:55:31
<https://xkcd.com/1683/>
spider-mario
_wb_ JPEGs saved as PNG, 16-bit images getting saved as 8-bit, 8-bit images getting saved as a color-reduced palette image, transparent images accidentally getting flattened, layered images by necessity getting merged, etc etc: there are many ways to get a PNG file that is not actually lossless wrt the original source image.
2026-02-08 02:56:50
also PNGs that come from GIFs
RaveSteel
AccessViolation_ <https://xkcd.com/1683/>
2026-02-08 02:59:44
AccessViolation_
AccessViolation_ <https://xkcd.com/1683/>
2026-02-08 02:59:51
interestingly, this is the only xkcd I've found where the responsive '2x' version of the image that you see if it's displayed large enough, actually has a lower 'real' resolution, like it was downscaled and then upscaled. could be a bug or a joke. probably a joke
whatsurname
_wb_ JPEGs saved as PNG, 16-bit images getting saved as 8-bit, 8-bit images getting saved as a color-reduced palette image, transparent images accidentally getting flattened, layered images by necessity getting merged, etc etc: there are many ways to get a PNG file that is not actually lossless wrt the original source image.
2026-02-08 03:11:43
That leads to a question: do you really need "true lossless" on the web? Honestly I think AVIF's lossless RGB mode might be a mistake
dogelition
2026-02-08 03:28:27
for regular website assets, probably not for the web as an application platform, definitely yes
spider-mario
2026-02-08 03:31:05
using AVIF specifically, probably not
dogelition
2026-02-08 03:33:12
only advantage for (non-animated) lossless rgb avif i can think of is that i *think* it had better hdr support, at least before the new png standard
_wb_
whatsurname That leads to a question: do you really need "true lossless" on the web? Honestly I think AVIF's lossless RGB mode might be a mistake
2026-02-08 03:38:10
"the web" is not one thing. Typically lossy suffices. But something like an image editor on a Chromebook is arguably also "the web"...
2026-02-08 03:39:35
Or a radiologist accessing a CT via some web interface, to give another example
lonjil
DZgas Ж What I'm getting at is that using jpeg xl for archiving can have a very serious counterargument because of all this.
2026-02-08 03:40:04
for archiving you should be using storage solutions that apply Reed-Solomon to all data, regardless of what kinds of files you have. Like, I've logged hundreds of read errors from my NAS hard drives, but I've never had any file corruption. Why? Because I've got them in mirror and raid configurations that automatically fix errors.
whatsurname
2026-02-08 03:44:08
For where lossless is needed, lossless YUV is almost always better than the RGB mode except for not being "true lossless"
_wb_
2026-02-08 03:46:06
Lossless YUV is essentially the same as RGB787, it's rather easy to improve compression if dropping some lsb is allowed...
username
whatsurname For where lossless is needed, lossless YUV is almost always better than the RGB mode except for not being "true lossless"
2026-02-08 03:47:15
many color values get shifted around with YUV, and it for example makes it impossible to get original intended color values out of reference sheets that don't put down hex codes
2026-02-08 03:47:33
I personally consider non-lossless lossless YUV to cause more problems than it solves
dogelition
2026-02-08 03:49:12
with ycgco-r you're not losing any data, at the cost of requiring an extra bit (and weirdly it seems like av2 will max out at 10 bit initially, so you can't use that for 10 bit rgb data anymore unlike with av1)
username
dogelition with ycgco-r you're not losing any data, at the cost of requiring an extra bit (and weirdly it seems like av2 will max out at 10 bit initially, so you can't use that for 10 bit rgb data anymore unlike with av1)
2026-02-08 03:50:46
that didn't exist before for AVIF which means older decoders will fail on it and a lot of software is still outputting with the old raw RGB mode AFAIK
2026-02-08 03:51:34
for example Valve's gamescope still outputs raw RGB lossless AVIFs when taking screenshots
jonnyawsom3
_wb_ I wonder if you take a random sample of PNG files from the web, how many would be actually lossless. I suspect it may be less than half.
2026-02-08 06:19:19
Recently I found out Britain's biggest bakery chain uses lossy PNG for their food photos, presumably because it has transparency <https://www.greggs.com/menu/product/sausage-roll-1000446>
whatsurname For where lossless is needed, lossless YUV is almost always better than the RGB mode except for not being "true lossless"
2026-02-08 06:26:13
> When you need lossless, lossy is better I don't think most would agree with that
monad
2026-02-08 09:27:50
Last I checked, on my system Reddit was delivering reduced-palette PNG when the original was PNG.
whatsurname
> When you need lossless, lossy is better I don't think most would agree with that
2026-02-09 03:15:02
Perceptually lossless is often what they actually need when people ask for "lossless". For the image editor case you probably need mathematically lossless, but AVIF is mostly served as a delivery format (for which many artists are already happy with high quality JPEG)
NovaZone
2026-02-09 03:30:54
"High quality jpeg" kek
2026-02-09 03:56:29
I wonder how many ppl see jpeg and think high quality fr
whatsurname
2026-02-09 04:02:38
I meant literally high quality, often q > 95; some artists even export at q 100
NovaZone
2026-02-09 04:03:28
I suppose but even then dct artifacts everywhere
2026-02-09 04:05:18
So even as a "delivery format" most would prefer webp
whatsurname
2026-02-09 04:10:09
Eh... No? You can't get 4:4:4 with WebP
NovaZone
2026-02-09 04:12:02
True but 444 doesn't magically make it immune to dct artifacts/ac errors
2026-02-09 04:14:26
which jpeg is well known for, whereas webp is mostly just blocking/loss of detail
whatsurname
2026-02-09 06:29:42
I doubt dct will be more of a problem than chroma subsampling at high quality. Chroma subsampling will destroy colors at edges (unless with sharp yuv, but that may introduce other artifacts)
jonnyawsom3
2026-02-09 06:31:08
Meanwhile most JPEGs I see are either q100 4:2:0 or q50 and only legible thanks to resolutions being so high nowadays
2026-02-09 06:32:07
We wanted to add sharp YUV to jpegli, since that should give it yet another boost for relatively little cost, but we were running into some issues
whatsurname
2026-02-09 06:39:06
I think Photoshop always uses 4:4:4 if you export with high quality preset
jonnyawsom3
2026-02-09 06:41:20
Not everyone uses photoshop
NovaZone
whatsurname I doubt dct will be more of a problem than chroma subsampling at high quality. Chroma subsampling will destroy colors at edges (unless with sharp yuv, but that may introduce other artifacts)
2026-02-09 07:05:29
Interesting, cause I could swear that even with jpeg q95 4:4:4, dot crawl, chromatic aberration, ringing, and chroma shift occur, specifically around borders/edges
2026-02-09 07:05:43
And especially with any kind of text
2026-02-09 07:22:39
don't get me wrong jpeg is still a necessary evil, but if I ask for lossless, I mean lossless, if an artist gave me "perceptual lossless" I would ask for my money back xD
whatsurname
2026-02-09 07:32:04
Well commissions and mass sales are different, some even provide PSDs for the former, while the latter usually only gets JPEGs
NovaZone
2026-02-09 07:36:59
I mean, for CDNs I get it, their pipelines are fixed and they're stubborn or lack the finances, but even for mass buys? Really?
jonnyawsom3
2026-02-09 07:40:14
I've helped half a dozen artists with their export formats, some not realising q100 JPEGs were lossy, others not knowing the compression level for PNG is lossless
NovaZone
2026-02-09 07:41:39
Wild
2026-02-09 07:42:38
Tho with "lossless" jpegs / lossy pngs out in the world I see the confusion
jonnyawsom3
2026-02-09 07:44:10
Well, lossless JPEG pretty much only exists as DNG files, most consumer software just throws an error. Lossy PNG generally has the reaction of "What do you mean lossy PNG?" followed by me telling them not to worry about it
NovaZone
2026-02-09 07:45:45
Was my reaction to lossy png as well 😂
username
_wb_ https://github.com/mozilla/mozjpeg/issues/444
2026-02-09 08:47:55
seems like someone has recently submitted a PR to fix this: https://github.com/mozilla/mozjpeg/pull/453
AccessViolation_
DZgas Ж Not all formats actually suffer from 1-byte corruption, but there is an example, here it is, 1 byte, jpeg, webp, avif, but jxl just died
2026-02-09 09:01:17
what's interesting about this to me is that you can clearly see AVIF and WebP attempting to exploit spatial redundancy/similarity, while the JPEG block is entirely self-contained. I don't think JXL really does that? at least from what I've read in the paper
2026-02-09 09:03:18
I know it does for the LF image, which is a Modular mode image and can use the predictors, but aside from that and Patches it has no spatial prediction mechanism I think?
2026-02-09 09:05:47
it does do chroma from luma and something else I've forgotten the name of for inter-channel prediction, but that's still done on the block level. it makes me wonder if JXL could've compressed better yet if it had spatial prediction modes in VarDCT like WebP and AVIF do in their lossy modes
2026-02-09 09:09:45
I suspect that in the high quality range, the contribution of these spatial predictors to the compression efficiency of the image as a whole becomes relatively small. I don't expect spatial prediction to work well at all for the higher frequency components, as they'll be all over the place, and those probably take up the bulk of the data
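Chroma-from-luma in general fits chroma as an affine function of co-located luma and codes only the residual; libjxl actually signals corrections to a color-correlation map rather than doing a per-block least-squares fit, so treat this as the generic idea only (function names are mine):

```python
def cfl_fit(luma, chroma):
    # least-squares fit: chroma ~= alpha * luma + beta over one block
    n = len(luma)
    ml = sum(luma) / n
    mc = sum(chroma) / n
    cov = sum((l - ml) * (c - mc) for l, c in zip(luma, chroma))
    var = sum((l - ml) ** 2 for l in luma)
    alpha = cov / var if var else 0.0
    beta = mc - alpha * ml
    return alpha, beta

def cfl_residual(luma, chroma, alpha, beta):
    # only these residuals (plus alpha/beta) need to be entropy coded
    return [c - (alpha * l + beta) for l, c in zip(luma, chroma)]
```

When chroma tracks luma well within a block, the residuals shrink toward zero and cost almost nothing to code.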
DZgas Ж
AccessViolation_ what's interesting about this to me is that you can clearly see AVIF and WebP are attempting to exploit spatial redundancy/similarity while the JPEG block is entirely self-contained I don't think JXL really does that? at least from what I've read in the paper
2026-02-09 03:05:32
I thought that jxl was self-contained in 256x256 blocks, but apparently that's not the case. In any case, there is currently no way to use the current decoder to decode such an image; it rejects it before it even starts working if there are errors.
Exorcist
NovaZone I suppose but even then dct artifacts everywhere
2026-02-09 04:59:00
Here we go again: https://github.com/victorvde/jpeg2png
NovaZone
Exorcist Here we go again: https://github.com/victorvde/jpeg2png
2026-02-09 07:10:02
Yea I use it sometimes, needs to be tweaked per source tho
2026-02-09 07:10:18
Same with quant smooth
_wb_
DZgas Ж I thought that jxl was self-sufficient in 256x256 blocks, but apparently this is not the case. In any case, there is currently no way to use the current decoder to decode the image; it rejects it before it even starts working if there are errors.
2026-02-09 07:14:34
It does start working; the errors can only be detected at the end of a section. But it will abort as soon as it detects an error. In the branch I shared earlier, you can make it continue instead (as long as the error is not in a critical place like the headers).
DZgas Ж
_wb_ It does start working; the errors can only be detected at the end of a section. But it will abort as soon as it detects an error. In the branch I shared earlier, you can make it continue instead (as long as the error is not in a critical place like the headers).
2026-02-09 11:06:04
This is good, but it doesn't change the futility of this action at the moment in the latest version of the decoder. No other format behaves like this; jpeg, webp, avif all produce an image even if the file is broken due to damage.
_wb_
2026-02-10 06:40:25
What to do with corrupted files is not a property of a codec but of an application. Some applications will also refuse to load broken jpegs or pngs.
2026-02-10 06:41:13
Arguably it is better to know that a file is corrupted than to silently use corrupt data
2026-02-10 06:45:23
Case in point: I remember a customer complaining that their images looked blurry when we served them, even when we served the original as-is! It turned out they were uploading truncated progressive jpegs where the final scans were just missing; all browsers happily show those, and little software shows even a warning that the data is truncated. After this, we made sure to check the integrity of uploaded images to make sure they're not truncated or otherwise corrupted.
jonnyawsom3
2026-02-10 06:52:12
I was playing with a truncated JPEG yesterday, DC only but the viewer still reported it as 4K, Quality 98, 4:4:4 naturally
AccessViolation_
2026-02-10 04:02:27
I was going to say it might be possible to recover a lost bit semi-consistently using surrounding context, but of course the whole point of compression is to remove redundancies like that, so the better your compression, the harder that's going to be
DZgas Ж
_wb_ Case in point: I remember a customer complaining that their images looked blurry when we served them, even when we served the original as-is! It turned out they were uploading truncated progressive jpegs where the final scans were just missing; all browsers happily show those, and little software shows even a warning that the data is truncated. After this, we made sure to check the integrity of uploaded images to make sure they're not truncated or otherwise corrupted.
2026-02-10 08:28:22
This is a good example, but I want to point out that in this case, a corrupted JXL just says the file is broken, and the user will delete it, because what else can they do with it? But in reality, it could be decoded, viewed, and even restored if the damage isn't critical.
2026-02-10 08:30:00
A damaged regular JPEG is truly damaged, but that's no reason not to decode it, because the chance of a file getting damaged somewhere, just like that, is still not zero.
2026-02-10 08:32:59
In the given example, the user is to blame for creating a broken file, not the decoder that decoded it. I don't see much point in refusing to decode even broken files.
2026-02-10 08:33:56
A decoder that doesn't open a file is a bad decoder.
A homosapien
2026-02-10 09:02:27
Rather, a decoder that rejects bad input is a good decoder.
_wb_
2026-02-10 10:26:12
It all depends on the use case, but in general it's a good idea to make a clear distinction between what is valid and what is not. Making decoders reject invalid input is generally the best way to force encoder compliance. That doesn't mean you cannot have decoders aimed at data recovery, but it shouldn't be the default behavior, or you are effectively broadening what is, de facto, a valid bitstream.
DZgas Ж
2026-02-11 12:34:07
Decoders do things strictly defined by the documentation. That's right. But a decoder could...
spider-mario
2026-02-11 12:38:53
the jxl specification is only concerned with how to decode conformant bitstreams
2026-02-11 12:39:17
it says nothing about error handling for non-conformant bitstreams (such as corrupted ones)
DZgas Ж
spider-mario the jxl specification is only concerned with how to decode conformant bitstreams
2026-02-11 12:42:14
I don't think something like that exists in any data compression format, so, well
spider-mario
2026-02-11 12:42:32
you don’t think what exists?
DZgas Ж
2026-02-11 12:43:34
I think that when Webp was developed, nowhere in the specification was it written what to do in case of damage to 1 bit of data, and yet I can decode it
_wb_
2026-02-11 12:56:47
In formats with weak error detection (not sure about WebP, but JPEG is an example), random bitflips can often turn a conformant bitstream into another conformant bitstream that just represents a different (corrupted) image. In JPEG for example (or JXL when using prefix codes instead of ANS), changing any prefix code into a different code of the same length will usually keep the bitstream conformant while changing the contents. If I recall correctly, in JPEG you can also change it in more ways as long as you don't cause overruns (underruns are fine), since padding at the end is allowed to be arbitrary. In that case a decoder has no choice but to decode the file as-is since it cannot detect that it is corrupt: it's still a valid bitstream, even if the image data is corrupt, so a decoder has to decode it.
2026-02-11 12:58:20
JXL has somewhat stronger error detection, in the sense that in the typical case (ANS) it is very unlikely that a random bitflip will keep the bitstream conformant. So errors can be detected. What to do in case of error is up to the decoder implementation though; the spec only describes how to decode conformant bitstreams.
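A toy illustration of that prefix-code point (a hypothetical 4-symbol code, nothing to do with JPEG's actual Huffman tables): flipping bits so one codeword becomes another codeword of the same length leaves the stream perfectly decodable, it just silently decodes to different symbols.

```python
# Toy prefix code: every codeword is uniquely decodable by prefix,
# so corrupting one codeword into another of the same length keeps
# the bitstream conformant while changing the decoded contents.
CODES = {"a": "0", "b": "10", "c": "110", "d": "111"}
DECODE = {v: k for k, v in CODES.items()}

def decode(bits: str) -> str:
    out, cur = [], ""
    for bit in bits:
        cur += bit
        if cur in DECODE:          # complete codeword reached
            out.append(DECODE[cur])
            cur = ""
    if cur:
        raise ValueError("truncated or invalid stream")
    return "".join(out)

bits = "".join(CODES[s] for s in "abcd")  # "010110111"
corrupted = bits[:-1] + "0"               # "111" (d) becomes "110" (c)
print(decode(bits), decode(corrupted))    # abcd abcc
```

Both streams decode without any error, which is exactly why a JPEG-style decoder cannot tell the second one is corrupt.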
2026-02-11 01:19:18
JPEG decoders in browsers tend to be very permissive (showing some image even when it can detect corruption), just like how they are with parsing technically invalid HTML. That's understandable, historically, but it is not a good practice: it basically means you are creating a de facto standard that extends and complicates the real standard, and everyone will be forced to comply with it.
TheBigBadBoy - 𝙸𝚛
2026-02-11 01:37:20
The worst imo is PDF, I've seen dozens of malformed/corrupted PDFs that are still shown correctly not only by browsers but also by other apps
DZgas Ж
2026-02-11 03:23:12
hm
2026-02-11 05:14:42
For experiments, I compiled libjxl https://discord.com/channels/794206087879852103/804324493420920833/1471165602130301073 to work more directly, connected the library via Python, and also changed a lot of the source code by hand. I will continue the topic of JPEG XL decoding broken files in <#794206170445119489>. It turned out to be so simple, and JPEG XL handles being turned into a byte field burned out by a nuclear explosion. Now I don't even know why all that whining about standards was said above.
whatsurname
2026-02-12 01:34:50
https://www.androidauthority.com/android-17-beta-1-3639738/ I'm a bit salty that Android is getting VVC support before JXL, I guess it's at some OEMs' request
derberg🛘
2026-02-12 02:09:37
Can then come together with ECM /s
DZgas Ж
2026-02-12 07:11:01
vvc still dead 🥹
DZgas Ж Not all formats actually suffer from 1-byte corruption, but there is an example, here it is, 1 byte, jpeg, webp, avif, but jxl just died
2026-02-12 08:13:15
I created a precise byte mask for JXL files. The bytes that must never be damaged are the head: the entire JPEG XL structure is located in the head, followed by the data. If the head is damaged, recovery is impossible; at best, in theory, one can recover the block grid stored in the head, but no image is present.

The percentage taken up by the head varies and depends on the image size, compression level, and block structure. For example, with e5 compression the structure is more chaotic, leading to a large division of blocks into subblocks and thus a larger head. With e1-3 and e9-10, the head is smaller thanks to a simple or very well-arranged block structure.

The important thing here is that, unlike WebP and AVIF, which completely break if a single byte within the stream is damaged, JPEG XL places the entire structure at the top of the file, which still allows decoding. The main thing is that the head is intact, which is approximately the first 10% of the file. Everything else can be completely destroyed, anywhere, in any quantity. All white pixels are bytes that can be destroyed. I performed a binary search for decoding errors, about 50,000 decodes, to fully map the exact structure.

That is, JPEG XL has a greater technological advantage here than any other codec, as long as the damage doesn't occur at the beginning of the file. AVIF/WebP will be completely broken if the stream is damaged: all subsequent pixels will be damaged, and if the damage occurs at the beginning, everything after it is lost. This is because the AVIF structure is located in the same place as the data. (Of course, classic jpeg will always be better in this matter.) ~320 kb files e1 | e5 | e10
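A minimal sketch of this kind of corruption survey, assuming `djxl` is on PATH; file names are placeholders, and the real experiment above used a patched libjxl rather than the stock CLI.

```python
# Hedged sketch: flip one byte at a time in a .jxl file and record
# whether the decoder still accepts the bitstream, producing a
# per-offset "survivability" mask like the one described above.
import os
import subprocess
import tempfile

def flip_byte(data: bytes, offset: int) -> bytes:
    """Return a copy of `data` with the byte at `offset` destroyed."""
    corrupted = bytearray(data)
    corrupted[offset] ^= 0xFF
    return bytes(corrupted)

def decodes_ok(jxl_bytes: bytes) -> bool:
    """True if `djxl` (assumed on PATH) accepts the bitstream."""
    with tempfile.TemporaryDirectory() as d:
        src = os.path.join(d, "probe.jxl")
        dst = os.path.join(d, "probe.png")
        with open(src, "wb") as f:
            f.write(jxl_bytes)
        return subprocess.run(["djxl", src, dst],
                              capture_output=True).returncode == 0

def byte_mask(path: str) -> list[bool]:
    """For each offset: does the file still decode after corruption there?"""
    data = open(path, "rb").read()
    return [decodes_ok(flip_byte(data, i)) for i in range(len(data))]

# Usage (placeholder path):
#   mask = byte_mask("input.jxl")
#   print(f"{sum(mask)}/{len(mask)} single-byte corruptions still decode")
```

One decode per byte is expensive for large files; the binary search mentioned above is the practical way to find the head/data boundary in far fewer decodes.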
spider-mario
2026-02-15 10:53:54
I’m starting to despise the SACD format more and more
AccessViolation_
2026-02-15 10:59:20
just predict the noise, why didn't we think of that
2026-02-15 11:02:32
ah they're using noise shaping
spider-mario
2026-02-15 11:05:05
yes, very aggressive
2026-02-15 11:05:22
DSD is basically 1-bit, 2.8 MHz audio
2026-02-15 11:06:41
wikipedia:
> DSD uses delta-sigma modulation, a form of pulse-density modulation encoding, a technique to represent audio signals in digital format, a sequence of single-bit values at a sampling rate of 2.8224 MHz. This is 64 times the CD audio sampling rate of 44.1 kHz, but with 1-bit samples instead of 16-bit samples.
2026-02-15 11:06:50
and all of this for what?
2026-02-15 11:07:26
(https://sjeng.org/ftp/SACD.pdf is a famous critique of this approach)
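For intuition, a first-order delta-sigma modulator (the core idea behind DSD/PDM) can be sketched in a few lines; this is a toy model, not the aggressive high-order noise shaping SACD actually uses.

```python
# Toy first-order delta-sigma modulator: each output sample is a single
# +1/-1 bit, and the quantization error is fed back through an
# integrator, pushing noise toward high frequencies (noise shaping).
def delta_sigma(samples):
    """Map samples in [-1, 1] to a +1/-1 pulse-density bitstream."""
    bits = []
    acc = 0.0       # integrator state
    feedback = 0.0  # previous output bit
    for x in samples:
        acc += x - feedback
        bit = 1.0 if acc >= 0 else -1.0
        bits.append(bit)
        feedback = bit
    return bits

# The density of +1 bits tracks the input level: a constant 0.5 input
# yields a stream whose running average is ~0.5.
bits = delta_sigma([0.5] * 1000)
print(sum(bits) / len(bits))
```

The catch the linked critique focuses on: with only 1 bit per sample, the shaped quantization noise is enormous and has to be shoved above the audible band, which is why the modulators are so aggressive.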
Exorcist
2026-02-15 11:23:35
spider-mario
2026-02-16 08:38:34
the paper doesn’t address DACs, only ADCs and the resulting signal
2026-02-16 08:38:50
the problems they refer to are baked into the digital signal
lonjil
Exorcist
2026-02-16 12:13:22
the last time I needed a DAC, I hooked up an 8-bit PWM to a low pass filter. Very quality.
AccessViolation_
2026-02-16 01:50:08
AVIF support, tell me about it <:KekDog:805390049033191445>
dogelition
2026-02-18 04:54:41
am i missing something or are OBS's bit depth/format settings wired up in an extremely confusing way? i'm using nvenc av1, and there doesn't seem to be any indication as to what color formats are supported. in the dropdown i can select anything like 16 bit 4:4:4 which isn't even part of the av1 spec, and it seems to just silently degrade to 8 bit 4:2:0
Quackdoc
2026-02-18 09:07:31
obs is dumb, but it uses standard naming conventions iirc, what options are there?
A homosapien
2026-02-18 09:19:28
Looks like nvenc's AV1 is limited to 420 only https://developer.nvidia.com/video-encode-decode-support-matrix
2026-02-18 09:20:21
It seems like 4:4:4 is h264/h265
Kaguya
2026-02-19 05:05:32
is it ok to use ffmpeg for screen recording
2026-02-19 05:07:06
maybe for screenshots too
cioute
2026-02-19 11:33:39
you can use ffv1 if you are not short on disk space, and re-encode it to any codec later
RaveSteel
2026-02-19 11:47:34
ffv1 is rather heavy on resources for recording. I would recommend utvideo or magicyuv and encoding to ffv1 afterwards. If archival doesn't matter, just use h265/h264 with 4:4:4
Quackdoc
2026-02-19 03:09:21
storage I/O is a resource too T.T
ignaloidas
RaveSteel ffv1 is rather heavy on ressources for recording. I would recommend utvideo or magicyuv and encoding to ffv1 afterwards. If archival doesn't matter just use h265/h264 with 4:4:4
2026-02-19 03:26:01
There is a GPU encoder for ffv1 now, however that might not be that helpful from resource consumption side if you're doing a whole lot of rendering on the GPU
RaveSteel
2026-02-19 03:27:00
ffv1_vulkan does not support full RGB, a common limitation with hardware accelerated encoding
2026-02-19 03:28:54
And although I haven't checked if yuv444p is supported, if you are limited to yuv420p for example, you may as well discard the idea of lossless recording entirely IMO
ignaloidas
2026-02-19 04:29:33
There shouldn't be any fundamental limitation against formats, since it's just compute shaders; maybe it's just not implemented yet
RaveSteel
2026-02-19 05:06:19
I hope so
2026-02-19 05:14:34
Hm, I dug some more and got it working with bgr0 for true lossless — but it is half the speed of CPU encoded FFV1. So I would still recommend a lighter codec
2026-02-19 05:15:35
Maybe the performance would be better on a 5090. If someone here is rich they could try <:KekDog:805390049033191445>
jonnyawsom3
2026-02-19 05:21:53
Did they ever ask for lossless?
RaveSteel
2026-02-19 05:23:16
no, we just went there from the original question lol
lonjil
2026-02-22 10:17:51
https://www.phoronix.com/news/AOMedia-OAC-Open-Audio-Codec
RaveSteel
2026-02-22 10:21:23
if it's based on opus why not just contribute lol
2026-02-22 10:21:55
It's not like opus isn't developed anymore after all
HCrikki
2026-02-22 10:30:46
back in the day we used to call this a new encoder for an existing format
2026-02-22 10:33:26
it can't proclaim to be a 'successor' to a codebase that's active and capable of even more than what its backward-compatibility-maximizing default parameters expose
2026-02-22 10:34:30
if it's serious about that, they should have some way to losslessly recompress already-lossy opus audio (like jxl's reversible lossless transcoding when the source is an unmodified jpg)
lonjil
2026-02-22 10:45:31
they're probably planning to make changes to the format
RaveSteel It's not like opus isn't developed anymore after all
2026-02-22 10:48:58
looks like the main dev of OAC is Jean-Marc Valin, so might be that Opus isn't actually actively developed anymore if he's being paid to work on this instead 😄
2026-02-22 10:51:13
From the man himself: > OAC is using Opus 1.6.1 as a starting point, but will keep evolving and will ultimately not be Opus-compatible, so we're not constrained in the changes we can make.
dogelition
2026-02-22 11:01:13
hoping that they'll remove or disable by default the stereo phase inversion
2026-02-22 11:01:28
(see https://gstreamer-bugs.narkive.com/p9l7iv3z/bug-791771-new-opusenc-opusdec-add-option-to-disable-phase-inversion)
2026-02-22 11:02:35
just fucks up any mono downmixing when the decoder (which can be far removed from the place where the downmixing happens) isn't explicitly told to ignore that flag
Demiurge
2026-02-23 09:25:35
Youtube butchers sound and opus didn't help
RaveSteel
2026-02-23 09:40:55
Let's be fair, youtube audio quality is the least offensive part of watching videos
spider-mario
2026-02-23 10:59:58
I can at least agree that “Stable volume” (DRC) butchers sound
2026-02-23 11:00:09
whether the encoding itself is bad is less obvious to me (but I have high-frequency hearing loss)
2026-02-23 11:01:18
I upload in the highest quality I can, and as far as I can tell (from YT’s choice of codec, spectrograms, and what I can hear), the result seems okay
2026-02-23 11:02:06
of course, it may be another story if the upload itself was already lossy and YouTube adds generation loss on top
2026-02-23 11:04:54
example: https://youtu.be/ORffRIkY7O0 (for headphones) / https://youtu.be/IirnyKNJNiA (for speakers)
AccessViolation_
2026-02-23 11:06:54
my issue with stable volume is that 9/10 times it's not available on videos that actually need it, whereas it is available on videos that get millions of views that have enough production quality to not need it...
2026-02-23 11:13:23
On that note, is there a word that's like 'consequently' but in the opposite direction? and is there a word for relationships between words like that? > videos that get millions of views and __consequently__ have enough production quality Production quality is not a (direct) consequence of the amount of views, it's the other way around, but I couldn't think of a word to express that. It's not an antonym, because antonyms of 'consequently' are 'whereas' and 'despite that', but I'm not looking to negate it, I want to reverse it
spider-mario
2026-02-23 11:15:08
“because”?
Demiurge
spider-mario “because”?
2026-02-23 11:25:42
That's what I was gunna suggest too
2026-02-23 11:28:29
I haven't done any precise measurements or tests but whenever listening to music on YouTube it always sounds worse than everywhere else like SoundCloud or Bandcamp
2026-02-23 11:28:49
It's like listening to a jpeg
AccessViolation_
spider-mario “because”?
2026-02-23 11:35:05
oh YEAH
spider-mario
Demiurge I haven't done any precise measurements or tests but whenever listening to music on YouTube it always sounds worse than everywhere else like SoundCloud or Bandcamp
2026-02-23 12:16:04
does that apply to my [examples](https://discord.com/channels/794206087879852103/805176455658733570/1475448330103554118) as well?
Mine18
dogelition hoping that they'll remove or disable by default the stereo phase inversion
2026-02-23 07:26:55
julio intends on sending a pr for that
jonnyawsom3
2026-02-24 01:15:27
It was today I found out my Android phone can't play h264 4:4:4
Quackdoc
2026-02-24 01:18:19
most cant
jonnyawsom3
2026-02-24 01:28:00
HEVC 4:4:4 didn't even decode on my desktop, h264 10bit is non-existent, and AV1 is too new for all my devices 😔
2026-02-24 01:28:12
~~Distance 25 animated JXL it is~~
dogelition
HEVC 4:4:4 didn't even decode on my desktop, h264 10bit is non-existent, and AV1 is too new for all my devices 😔
2026-02-24 01:56:13
amd gpu? reasonably new intel and nvidia gpus can do hevc 4:4:4 (but not av1)
jonnyawsom3
dogelition amd gpu? reasonably new intel and nvidia gpus can do hevc 4:4:4 (but not av1)
2026-02-24 01:57:32
An old 1070Ti, playing it went to software decoding
dogelition
2026-02-24 01:58:41
https://github.com/orgs/LizardByte/discussions/220#discussioncomment-8898271
2026-02-24 01:58:49
seems like it should be supported?
2026-02-24 02:00:49
nvidia's site says no though: https://developer.nvidia.com/video-encode-decode-support-matrix
2026-02-24 02:00:57
so could just be misinformation
jonnyawsom3
2026-02-24 02:02:35
Yeah, says 20 series or newer
Quackdoc
HEVC 4:4:4 didn't even decode on my desktop, h264 10bit is non-existent, and AV1 is too new for all my devices 😔
2026-02-24 02:08:07
I recently got a pixel 10 pro xl, av1 decode is such a massive difference vs my old s9+ lmao
jonnyawsom3
2026-02-24 02:15:10
Mine just crashes when it tries to play AV1 on Discord
NovaZone
It was today I found out my Android phone can't play h264 4:4:4
2026-02-24 02:50:17
Or 422 kek
2026-02-24 02:54:40
Well technically just discord hw dec can't do it and fallback fails
2026-02-24 02:55:01
Mpv tho, ofc no problem xD
Mine18
2026-02-24 09:12:15
tbf discord swdec seems funky performance wise
RaveSteel
2026-02-24 12:15:22
https://hydrogenaudio.org/index.php/topic,129191.0.html
dogelition
2026-02-24 12:53:18
lots of very uninformed people there for what's supposed to be an enthusiast forum 🤔
2026-02-24 12:54:33
(not that there's something wrong with being uninformed, but here that doesn't seem to stop people from confidently posting things)
lonjil
2026-02-24 01:06:48
why in the actual fuck is there an Opus patent pool
spider-mario
2026-02-24 01:25:32
to pool patents related to Opus
Exorcist
2026-02-24 01:32:02
lonjil
spider-mario to pool patents related to Opus
2026-02-24 01:44:32
I think you mean: to pool patents of dubious relation to Opus, so as to scare companies into paying fees :p
Mine18
dogelition lots of very uninformed people there for what's supposed to be an enthusiast forum 🤔
2026-02-24 07:27:51
you wouldn't believe the av1 off shoot thread
dogelition
2026-02-24 07:43:30
i was referring to that too actually
Mine18
2026-02-24 07:55:18
its wild
jonnyawsom3
2026-02-24 07:55:39
Someone give me a hand explaining to this guy https://github.com/google/jpegli/pull/135#issuecomment-3954307685
2026-02-24 07:55:44
My brain hurts
Demiurge
Someone give me a hand explaining to this guy https://github.com/google/jpegli/pull/135#issuecomment-3954307685
2026-02-25 06:52:02
The APP14 marker is required to tell the decoder not to do the reverse YUV transform.
2026-02-25 06:52:18
It has nothing to do with the ICC profile
jonnyawsom3
2026-02-25 06:52:46
<:tfw:843857104439607327>
Demiurge
2026-02-25 06:53:28
If you aren't using the jpeg YUV mode then always use app14, why is there an argument?
2026-02-25 07:01:29
According to the jpeg specs, the decoder always applies the reverse transform if app14 isn't there to say not to. So the color would be fucked up if the app14 marker is missing, because the decoder performs an unnecessary reverse transformation that the encoder never did
2026-02-25 07:01:46
It seems like a simple issue
2026-02-25 07:02:37
Maybe the confusion is that it's sometimes called "RGB mode" when it really is "do nothing mode"
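A rough sketch of what a decoder checks for, assuming a baseline marker layout (hypothetical helper, not jpegli's or libjpeg's actual code): walk the marker segments looking for an APP14 "Adobe" block and read its transform byte.

```python
# Hedged sketch: scan JPEG marker segments for the APP14 (0xFFEE)
# "Adobe" segment; its last byte is the color transform flag
# (0 = no transform / RGB, 1 = YCbCr, 2 = YCCK).
def app14_transform(data: bytes):
    """Return the Adobe transform byte, or None if no APP14 segment."""
    i = 2  # skip SOI (0xFFD8)
    while i + 4 <= len(data):
        if data[i] != 0xFF:
            break  # not at a marker: malformed or inside entropy data
        marker = data[i + 1]
        if marker in (0xD9, 0xDA):  # EOI or SOS: stop scanning
            break
        length = int.from_bytes(data[i + 2:i + 4], "big")
        payload = data[i + 4:i + 2 + length]
        if marker == 0xEE and payload.startswith(b"Adobe"):
            return payload[11]  # last byte of the 12-byte Adobe payload
        i += 2 + length
    return None

# Minimal synthetic stream: SOI then an APP14 segment flagging
# "no transform" (transform byte 0), i.e. the RGB / do-nothing mode.
adobe = b"Adobe" + b"\x00\x64" + b"\x00\x00" * 2 + b"\x00"
stream = b"\xff\xd8" + b"\xff\xee" + (2 + len(adobe)).to_bytes(2, "big") + adobe
print(app14_transform(stream))  # 0
```

If this returns None on a file whose components weren't actually YCbCr-encoded, the decoder will apply the YCbCr-to-RGB transform anyway, which is exactly the color corruption being discussed.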
2026-02-25 07:04:58
Idk it looks like you already explained it perfectly jonny
2026-02-25 07:06:55
Github is a place to fix issues. Doesn't seem like the place to explain basic things like this. Is there any point in arguing?
2026-02-25 07:08:22
The lack of app14 tag is a pretty major bug that needs fixing regardless
2026-02-25 07:08:34
I thought it was fixed already
jonnyawsom3
2026-02-26 03:52:59
Hmm, I just realised FFMPEG is buffering all the PNGs from jxl-rs in-memory before encoding progressive videos, adding up to quite a lot for high res decodes. Don't suppose there's a way to make it lazy-load instead? The command makes the dimensions even for x264 and adds a checkerboard background for transparency `ffmpeg -hide_banner -r 10 -i Progressive.partial%05d.png -vf scale=trunc(min(3840\,iw)/2)*2:trunc(min(3840\,ih)/2)*2:force_original_aspect_ratio=decrease:alphablend=checkerboard -pix_fmt yuv420p -movflags +faststart -preset slow -crf 18 -y Progressive.mp4`
dogelition
2026-02-26 07:26:57
https://x.com/i/status/2027002298417774719
jonnyawsom3
Hmm, I just realised FFMPEG is buffering all the PNGs from jxl-rs in-memory before encoding progressive videos, adding up to quite a lot for high res decodes. Don't suppose there's a way to make it lazy-load instead? The command makes the dimensions even for x264 and adds a checkerboard background for transparency `ffmpeg -hide_banner -r 10 -i Progressive.partial%05d.png -vf scale=trunc(min(3840\,iw)/2)*2:trunc(min(3840\,ih)/2)*2:force_original_aspect_ratio=decrease:alphablend=checkerboard -pix_fmt yuv420p -movflags +faststart -preset slow -crf 18 -y Progressive.mp4`
2026-02-27 01:00:00
<@853026420792360980> you're the resident FFMPEG expert, I don't suppose you have any ideas? The video encode is using over twice as much memory as jxl-rs for what should be a simple chain
monad
2026-02-27 01:55:37
-preset veryfast
Traneptora
<@853026420792360980> you're the resident FFMPEG expert, I don't suppose you have any ideas? The video encode is using over twice as much memory as jxl-rs for what should be a simple chain
2026-02-27 04:15:56
I'm not sure how you can say it's using twice as much memory as jxl-rs when you're encoding it with x264
2026-02-27 04:16:10
x264 on preset:slow uses a lot of memory
2026-02-27 04:17:11
preset=slow for example has an rc-lookahead of 50