It seems like either

- I get a .webp file when I don’t want it (downloading images)
- I try to use a .webp format, but it isn’t allowed (uploading images)
So who is trying to encourage people to use it, and who is trying to prevent adoption?
I’m constantly converting it with imagemagick and other tools
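For reference, a minimal Python sketch of that kind of conversion using Pillow rather than imagemagick (assuming a Pillow build with webp support; the filenames are made up):

```python
# Batch-convert every .webp in the current directory to .png.
# Assumes Pillow was built with libwebp support.
from pathlib import Path
from PIL import Image

for webp_file in Path(".").glob("*.webp"):
    with Image.open(webp_file) as img:
        # PNG keeps any alpha channel; JPEG would need a convert("RGB") first.
        img.save(webp_file.with_suffix(".png"))
```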
It’s being pushed by Google as a format exclusively designed for the web. It’s annoying as fuck because it just started showing up randomly and it’s annoying to use for the reasons you mentioned.
Interestingly enough, the first thing it put a speedbump in front of is downloading an image from a Google Images search. Imagine that.
I think browsers are starting to convert it locally as saving images seems to be slowly returning to some kind of normal. Maybe.
I don’t think browsers are converting the file when you save. The proper way to implement webp in HTML is to also include a link to the jpg or png version of the file (typically via the picture element with an img fallback), so older browsers that don’t support webp can still load the image. The browser just selects the jpg version when you save the image.
I’ve seen file browsers that do implicit conversions, which is really helpful. So if you rename a file from pic.webp to pic.jpg, it automatically gets converted. That’s quite useful if you don’t care about specific quality parameters. Maybe browsers should just let you save a picture in any major image format.

I think that’s just an issue with the image viewer you’re using supporting webp decoding but not its extension. Most media viewers won’t restrict decoding to formats the extension supports.
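You can see that in action with a rough Python/Pillow sketch (the filename is hypothetical): the format is identified from the file’s contents, not its name.

```python
# A file renamed from pic.webp to pic.jpg still decodes as webp, because
# Pillow sniffs the actual file contents rather than trusting the extension.
from PIL import Image

with Image.open("pic.jpg") as img:   # hypothetical file that is really a webp
    print(img.format)                # prints "WEBP", not "JPEG"
```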
Also when uploading images somewhere else you can just change the file extension to png and it works most of the time.
It’s being pushed by Google. If a website includes webp it will have a better score on Google PageSpeed Insights, since webp files are smaller than jpg thus the page loads faster. The PageSpeed score is one of the metrics that Google uses to rank the website on Google search.
It’s why I love Google, they make the best stuff, can only recommend!
I like webp for space and bandwidth reasons (300 KB PNG -> 35 KB webp) but support among applications/services seems spotty.
Like, my phone’s gallery supports webp no problem. But Google Voice or my phone’s SMS app will not handle them at all. I don’t have a converter app, so I open it in gallery, take and then crop a screen shot if I need to send it to someone (crude but effective).
With Lemmy, a lot of admins force conversion of uploads to webp on the backend to save storage costs and reduce bandwidth. If you want to get a different format, such as png, you can add ?format=png to the end of the image URL. That’s one of the options I’m working to add to my custom UI (download image as [webp, png, jpg]) because I’ve run into the same problems as you when trying to share things from here.
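As a rough sketch of what that looks like (the URL below is made up, and whether ?format=png is honoured depends on the instance’s image backend):

```python
# Fetch the png version of a Lemmy-hosted image by appending ?format=png.
# The URL is hypothetical; support depends on the instance's pict-rs setup.
import urllib.request

url = "https://lemmy.example/pictrs/image/abcd1234.webp"
with urllib.request.urlopen(url + "?format=png") as resp, open("image.png", "wb") as out:
    out.write(resp.read())
```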
lossless png vs butchered webp is not a fair comparison.
It’s not, but for memes and shit, IDGAF about lossy vs lossless. I just don’t want to have to jump through hoops to text it to a friend.
And for image storage on my instance, I’m not paying for people to upload lossless images. They can take webp or not upload anything :P
I think they meant comparing with jpeg might have been a fairer comparison.
Layman’s suspicion: adoption is hard when nearly everyone and their uncle knows and supports gif/jpg/png. At least for most end consumers there’s no major advantage to adopting early. And in such a scenario most people adopt when they are forced to because everyone else adopted. So it’s a chicken-and-egg problem.
Ideally when you introduce a new format you support both the old and new format concurrently over a long time to allow for a gradual transition. The major advantage of webp/avif is that they need less storage space for the same quality. However if you have to store everything in an extra format whilst also keeping the old ones you are completely reversing that storage advantage and now need even more storage volume than before.
As far as I can tell AVIF has much better prospects of being the future image format anyways. In the long run that is. Plus it’s open source and not just a single tech giant behind it. Suffers from the same slow adoption rates though.
Jpeg XL should win, but Emperor Google disapproved since it isn’t their format.
https://en.m.wikipedia.org/wiki/JPEG_XL
It was mainly based on a combination of a proposal called PIK,[9] submitted by Google, and a proposal called FUIF[10] — itself based on FLIF — submitted by Cloudinary.
I don’t know why Google is not supporting it, but it is their format.
Edit: removed insensitive comment.
Please do not use ADHD as a pejorative.
You’re right, my sincerest apologies.
I know that some CDNs like Akamai will serve it up in cases where it optimizes delivery of the image. They have an add-on service for their customers called Image and Video Manager. Among other things it can compress images to improve performance.
Imagine a website that displays a JPG file that has thousands of colors and a resolution of 10,000x10,000 pixels. That’s terrible for loading across the internet because the human eye can’t distinguish that many colors and most devices have screens smaller than 10k. The website in question could also be putting that image in a frame that’s only 1k in size.
So when Akamai’s service sees this image, after serving it up the first time it optimizes it into a number of different sizes and formats. It will make multiple copies in different resolutions & file formats, and reduce the number of colors without impacting the visual appearance of the image. The optimizations target popular devices & browsers, so you could end up with a bunch of different sized jpegs, webps, pngs, etc.
Once all those versions have been created they are automatically added to Akamai’s CDN cache. The next time a person visits that website Akamai will look at the characteristics of the device being used and serve the best optimized version of the image that the device supports. So if you’re running Chrome on a mobile device you might get an 800x800 webp version of the image rather than the original 10,000x10,000 version, for example.
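That selection step is basically content negotiation. A very rough Python sketch of the general idea (not Akamai’s actual logic; every name and size below is made up):

```python
# Pick a precomputed variant based on the browser's Accept header and a
# device hint. Purely illustrative; real CDNs key on much more than this.
PRECOMPUTED_VARIANTS = {   # hypothetical cache keyed by (format, width)
    ("webp", 800): "photo_800.webp",
    ("webp", 2000): "photo_2000.webp",
    ("jpeg", 800): "photo_800.jpg",
    ("jpeg", 2000): "photo_2000.jpg",
}

def pick_variant(accept_header: str, viewport_width: int) -> str:
    fmt = "webp" if "image/webp" in accept_header else "jpeg"
    width = 800 if viewport_width <= 800 else 2000
    return PRECOMPUTED_VARIANTS[(fmt, width)]

# Chrome on a phone advertises webp support, so it gets the small webp:
print(pick_variant("image/avif,image/webp,image/apng,*/*;q=0.8", 390))  # photo_800.webp
```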
That last bit drives me up the wall sometimes. If I find an image that I like, I want to see it in the highest resolution possible, even if it will cause my device to catch fire.
In that last case, it is usually done by rewriting the URI of the image. Those images will typically have query strings after the filename that select the optimized versions. If you remove the query string you normally get the original image.
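Something like this, as a rough Python sketch (the URL is invented for the example):

```python
# Strip the query string to try for the original, un-optimized image.
from urllib.parse import urlsplit, urlunsplit

url = "https://example.com/images/photo.jpg?w=800&format=webp"   # made-up URL
original = urlunsplit(urlsplit(url)._replace(query=""))
print(original)   # https://example.com/images/photo.jpg
```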
That’s not always the case. With Akamai I can very easily enable this tool on all images on my employer’s website and there will be no easy way for a user to bypass it.
I just took a look at www.frankandoak.com which Akamai says is a user of this tool (and not my employer). They’re a clothing retailer and have a lot of images on their site. Using the developer tools in Chrome I can see that a lot of their images of products are being served as webp even though the file extensions are jpg. It looks like they add version numbers as parameters on image urls, and removing those effectively does nothing. I’m still served webp versions of those images.
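If anyone wants to reproduce that check without dev tools, here’s a rough sketch (the URL is a placeholder, not one of their real product images):

```python
# See what the server actually returns for a ".jpg" URL when the client
# advertises webp support.
import urllib.request

req = urllib.request.Request(
    "https://www.example-store.com/images/product.jpg",   # placeholder URL
    headers={"Accept": "image/webp,image/apng,image/*,*/*;q=0.8"},
)
with urllib.request.urlopen(req) as resp:
    print(resp.headers.get("Content-Type"))   # e.g. "image/webp"
```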
That site uses Shopify (and runs through Cloudflare, at least going by the origin IP). Shopify has their own image and video ‘processing’ available to their hosted sites.
Curious then that Akamai touts them as a success story:
https://www.akamai.com/resources/customer-story/frank-and-oak