Let’s talk about srcset (an HTML attribute) and image-set() (its CSS cousin). In both, you can define which image the browser loads using 1x, 2x, 3x, etc. These are pixel-density descriptors. (In the case of srcset, you can use width descriptors instead, which sidesteps the issue I’m going to talk about, but it still occurs in image-set(), and the density behavior is still weird to me in srcset, even if you can sidestep it.)
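To sketch why width descriptors sidestep the problem: as I understand the spec, the browser turns each w descriptor into a density by dividing it by the slot’s layout width in CSS pixels, so the choice adapts to the layout rather than a fixed 1x/2x/3x mapping. This is a simplified model, not browser source; the file names match the example further down.

```javascript
// Simplified model: with width descriptors, each candidate's effective
// density = (width descriptor) / (slot width in CSS pixels).
function effectiveDensities(candidates, slotCssWidth) {
  return candidates.map(({ url, w }) => ({ url, density: w / slotCssWidth }));
}

const widthCandidates = [
  { url: "image_1000x666.webp", w: 1000 },
  { url: "image_1500x1000.webp", w: 1500 },
  { url: "image_3000x2000.webp", w: 3000 },
];

// If the image slot is laid out 500 CSS px wide, the same three files
// behave as 2x, 3x, and 6x sources.
console.log(effectiveDensities(widthCandidates, 500).map(c => c.density)); // [2, 3, 6]
```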
So, assuming, say, a 20" monitor at 1080p is 1x, then a 10" screen at 1080p would technically be 2x. In the real world, though, it’s more like a 6" phone screen with a roughly 1000x2500 resolution, which (without getting precious about the math) lands somewhere between 2x and 3x.
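The comparison is just pixels-per-inch arithmetic: same pixel count over half the diagonal is twice the density. A quick sketch, taking the 20" monitor from above as the 1x baseline:

```javascript
// Pixel density as pixels-per-inch: diagonal pixel count over diagonal inches.
function ppi(widthPx, heightPx, diagonalInches) {
  return Math.hypot(widthPx, heightPx) / diagonalInches;
}

const monitor20 = ppi(1920, 1080, 20); // ~110 ppi: call this 1x
const screen10 = ppi(1920, 1080, 10);  // ~220 ppi: same pixels, half the size
console.log(screen10 / monitor20);     // 2
```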
Let’s imagine a set of images presented like this:
<img src="image_1000x666.webp"
     srcset="image_1000x666.webp 1x,
             image_1500x1000.webp 2x,
             image_3000x2000.webp 3x">
then an iPhone 14 Pro Max (a 6"-ish screen with a 1000x2500-ish resolution, for a 2-3x pixel density) would load the 3000x2000 image, but my 27", 1440p monitor would load the 1000x666 image.
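A simplified model of how that selection plays out (browsers are given some latitude here, so this is a sketch of the typical behavior, not a guarantee): compare each candidate’s density descriptor against window.devicePixelRatio and take the smallest one that covers it.

```javascript
// Sketch of typical density-descriptor selection: pick the candidate whose
// density is the smallest one at or above the device pixel ratio, falling
// back to the densest candidate if none reaches it.
function pickCandidate(candidates, dpr) {
  const sorted = [...candidates].sort((a, b) => a.density - b.density);
  return sorted.find(c => c.density >= dpr) ?? sorted[sorted.length - 1];
}

const candidates = [
  { url: "image_1000x666.webp", density: 1 },
  { url: "image_1500x1000.webp", density: 2 },
  { url: "image_3000x2000.webp", density: 3 },
];

console.log(pickCandidate(candidates, 3).url); // phone-like screen: image_3000x2000.webp
console.log(pickCandidate(candidates, 1).url); // 27" desktop: image_1000x666.webp
```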
It seems intuitively backwards, but I’ve confirmed it: according to MDN, 1x gets the smaller image and 3x gets the larger one.
But as I understand it, an iPhone 14 acts as if it’s a 300x800-ish screen, using the concept of “points” instead of pixels, which, in the context of “1x” image size, makes a lot of sense. The browser isn’t reading that for density descriptors, though; all it seems to care about is how many physical pixels are in an inch.
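The points-vs-pixels relationship is just a division by the device pixel ratio. Using approximate iPhone-14-class numbers purely for illustration:

```javascript
// "Points" (CSS pixels) are physical pixels divided by the device pixel
// ratio. Approximate iPhone-14-class values, for illustration only.
const physical = { width: 1170, height: 2532 };
const dpr = 3;

const cssSize = {
  width: physical.width / dpr,   // 390
  height: physical.height / dpr, // 844
};
console.log(cssSize); // roughly the "300x800-ish" logical screen the page sees
```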
I made a little page to demonstrate the issue, though I acknowledge it’s not hugely helpful, since, other than using your actual eyeballs, it’s hard to tell which image is loaded in the srcset example; but take a look if you want.