I’m confused, but maybe two wrongs here do make a right. Please allow me to explain.
Putting aside for the moment that I personally find Apple’s iOS 26 design objectionable[1], I don’t understand why `backdrop-filter: blur` is the focus of recent implementations jumping on the Liquid Glass hype train.
Using background blur to create UI layer separation (often in combination with darkening/saturation or other contrast enhancements) has been around for over a decade on iOS and almost as long on the web. So what’s new here?
Adding to my confusion, I’m a bit surprised folks here think this is so challenging in CSS. A few commenters have pointed to great implementations elsewhere, but I think they’re underselling them.
Plainly: the solution that uses `backdrop-filter: url(#filter)`, where `#filter` references an inline SVG filter built from `<feImage/>` and `<feDisplacementMap/>`, works convincingly well.
Check out this demo for example:
https://codepen.io/Mikhail-Bespalov/pen/MYwrMNy
This implementation chooses to hard-code `<feImage href="data:image/png;base64,..."/>`. But if glass3d.dev wanted to implement 3D highlights in a similar fashion (and to support, e.g., a dynamic `border-radius`), this image could just as easily be rendered at runtime in a canvas and dumped into the filter with `toDataURL()`. Similarly, a component library with a CSS pre-processor could generate these maps for any static shape at build time.
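To make that concrete, here’s a minimal sketch of the runtime approach. Everything in it is assumed for illustration: the element ids, the dimensions, and the toy “lens” gradient (a real map would be derived from the element’s actual shape and `border-radius`).

```ts
// Sketch: generate the displacement map in a <canvas> at runtime and hand it
// to the filter's <feImage> as a data URL, instead of shipping a hard-coded
// base64 PNG. Assumes markup along these lines (ids are made up):
//
//   <svg width="0" height="0">
//     <filter id="glass">
//       <feImage id="glass-map" result="map" />
//       <feDisplacementMap in="SourceGraphic" in2="map"
//                          xChannelSelector="R" yChannelSelector="G"
//                          scale="40" />
//     </filter>
//   </svg>
//
// ...with the glass element styled `backdrop-filter: url(#glass)`.

function makeDisplacementMap(width: number, height: number): string {
  const canvas = document.createElement("canvas");
  canvas.width = width;
  canvas.height = height;
  const ctx = canvas.getContext("2d");
  if (!ctx) throw new Error("2D canvas unsupported");

  // For feDisplacementMap, a channel value of ~128 means "no shift":
  // the sample offset is scale * (channel / 255 - 0.5). This toy gradient
  // keeps the interior neutral and displaces pixels horizontally only near
  // the left/right edges, which is what reads as a refractive rim.
  const ramp = ctx.createLinearGradient(0, 0, width, 0);
  ramp.addColorStop(0.0, "rgb(64, 128, 0)");   // R below neutral: x-displacement
  ramp.addColorStop(0.15, "rgb(128, 128, 0)"); // neutral interior
  ramp.addColorStop(0.85, "rgb(128, 128, 0)");
  ramp.addColorStop(1.0, "rgb(192, 128, 0)");  // R above neutral: x-displacement
  ctx.fillStyle = ramp;
  ctx.fillRect(0, 0, width, height);

  return canvas.toDataURL(); // "data:image/png;base64,..."
}

// Point the filter's <feImage> at the generated map; re-run whenever the
// element's size or border-radius changes. (Modern browsers accept a plain
// `href` attribute on SVG elements; older ones needed `xlink:href`.)
document.getElementById("glass-map")
  ?.setAttribute("href", makeDisplacementMap(300, 200));
```

A build-time variant is the same idea with the canvas step moved into the pre-processor: for static shapes you rasterize the map once and bake the resulting data URI in, exactly like the hard-coded PNG in the demo above, just generated instead of hand-made.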
The closest thing I’ve seen to this is:
https://ruri.design/glass
These implementations are out there.
Coming back to the why (and the should) for a second, though.
[1] In its current realization, Liquid Glass cannot effectively replace what blur accomplishes. Because Liquid Glass layers are more transparent, the contrast issues are real, and the specular highlights distract the eye as much as (or more than) they help make up for that lack of contrast. It draws the eye where blur would relax it. It’s a UI meant for an XR system (where it arguably solves a real problem, much like a HUD does in a video game) hacked into devices where it makes no sense, all in the name of a “unified OS” design language.
If any aspect of Liquid Glass succeeds, it will be where it’s used sparingly to delight in places that are low stakes and where other affordances are present (like a small circular button floating in the corner of the screen, concentric with the hardware’s corner radius). A circle’s refractions would be smaller, softer, and more symmetrical, and therefore arguably less noisy and distracting, in a way that resembles a softly blurred background.
Which brings me full circle back to two wrongs.
This website doesn’t do anything new, but that’s why it’s good. Because the truth is, Apple failed to deliver an LLM-based Siri on the schedule it announced and is now trying to distract us with some shiny new thing. Damn, it worked.