This is a risky idea, actually — at least in its fully expanded form.
Sure, in the prismjs.com case, I have one of those comments in my code too. But I expect it to break one day.
If a site is a content generator and essentially idempotent for a given set of parameters, and you think the developer has a long-term commitment to the URL parameters, then it's a reasonable strategy (and they should probably formalise it).
Perhaps you implement an explicit "save to URL" in that case.
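As a sketch of what an explicit "save to URL" might look like (all names and the example site are hypothetical): state lives in memory while the user works, and only a deliberate share action serialises it into a query string.

```javascript
// Hypothetical sketch: state is kept in memory; only an explicit
// "save to URL" action serialises it into the query string.
function stateToShareUrl(baseUrl, state) {
  const url = new URL(baseUrl);
  for (const [key, value] of Object.entries(state)) {
    url.searchParams.set(key, String(value));
  }
  return url.toString();
}

// Wired to a "Copy link" button rather than run on every state change:
const link = stateToShareUrl('https://example.com/playground', {
  lang: 'css',
  theme: 'okaidia',
});
// link → "https://example.com/playground?lang=css&theme=okaidia"
```

The point of the explicit action is that the developer is making a promise: these parameters are a formal, supported interface, not incidental internal state.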
But generally speaking, we eliminated complex variable state from URLs for good reasons, all to do with state leakage: logged-in or identifying state ending up in search results and forwarded emails, leaking out via referrer headers, and all that stuff.
It would be wiser to assume that the complete list of ways user- or session-identifying state can leak from a URL has not yet been written, and to keep state in volatile, non-URL storage until you are sure you're dealing with something non-volatile.
Search keywords? Obviously. Search result filters? Yeah. Sort direction? Probably. Tags? Ehh: as soon as you see [] in a URL it's probably bad code, so think carefully about how you represent tags. Presentation customisation? No. A backlink? No.
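On the tags point, two URL-friendly encodings that avoid the PHP-style "tags[]=a&tags[]=b" smell (a sketch; the example URL and parameter names are made up):

```javascript
// Hypothetical sketch: two ways to encode a list of tags in a URL
// without resorting to "tags[]=..." syntax.
const tags = ['css', 'themes'];

// Option 1: one comma-separated parameter
// (assumes tag names never contain commas).
const u1 = new URL('https://example.com/search');
u1.searchParams.set('tags', tags.join(','));
// → https://example.com/search?tags=css%2Cthemes

// Option 2: a repeated parameter, read back with getAll().
const u2 = new URL('https://example.com/search');
for (const t of tags) u2.searchParams.append('tag', t);
// → https://example.com/search?tag=css&tag=themes
const parsed = u2.searchParams.getAll('tag'); // ['css', 'themes']
```

Either is easy to hand-edit, which matters given that people do hack on URLs.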
It's also wiser to assume people want to hack on URLs and cut bits out, to reduce them to the bit they actually want to share.
So you should keep truly persistent, identifying aspects in the path, and at least try not to merge trivial/ephemeral state into the path when it can be left in the query string.
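That split can be sketched like this (hypothetical site and helper; the idea is just that cutting the query string away still yields a working link to the same resource):

```javascript
// Hypothetical sketch: persistent identity in the path, ephemeral
// state in the query string where it can be trimmed off.
function articleUrl(slug, ephemeral = {}) {
  const url = new URL(`https://example.com/articles/${encodeURIComponent(slug)}`);
  for (const [key, value] of Object.entries(ephemeral)) {
    url.searchParams.set(key, String(value));
  }
  return url.toString();
}

articleUrl('urls-and-state', { sort: 'newest', page: 2 });
// → "https://example.com/articles/urls-and-state?sort=newest&page=2"

// Deleting everything after "?" still resolves to the same resource:
articleUrl('urls-and-state');
// → "https://example.com/articles/urls-and-state"
```

If removing the query string breaks the link, something identifying has leaked out of the path.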