I guess the part where I'm still skeptical is this: Google is also still pretty good at search (especially if I avoid the AI summary with udm=14).
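(For anyone curious, udm=14 is just a query parameter you can tack onto the search URL to get Google's plain "Web" results view. A minimal sketch of building such a URL; the helper name is my own:)

```python
from urllib.parse import urlencode

def google_web_search_url(query: str) -> str:
    # udm=14 requests Google's plain "Web" results tab,
    # which skips the AI Overview at the top of the page.
    return "https://www.google.com/search?" + urlencode({"q": query, "udm": 14})

# e.g. google_web_search_url("wikipedia encyclopedia britannica")
```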
I'll take one of your examples: Britannica being used to seed Wikipedia. I searched for "wikipedia encyclopedia britannica" and got search results back in less than a second.
I spend maybe 30 seconds scanning the page: past the Wikipedia article on Encyclopaedia Britannica, past the Britannica article about Wikipedia, past a Reddit thread comparing them, past the Simple English Wikipedia article on Britannica, and past the Britannica article on "wiki". OK, there it is: the link to "Wikipedia:WikiProject Encyclopaedia Britannica", which answers your question.
Then, to answer your follow-up, I spend a couple more seconds searching Wikipedia for Wikipedia, and find in the first paragraph that it was founded in 2001.
So, let's say a grand total of 60 seconds of me searching, skimming, and reading the results. The actual searching was maybe 2 or 3 seconds in total: once on Google, once on Wikipedia.
Compare that to nearly 3 minutes for ChatGPT to grind through all of that, plus the time for you to read it and, ideally, verify it by checking its references, since it can still hallucinate.
And what did you pay for the privilege? How much extra energy did you burn for this less efficient response? I wish that when you link to a chat transcript like this, ChatGPT would show the token cost of that particular chat.
So yeah, it's possible to do search with ChatGPT. But it seems like it's slower and less efficient than searching and skimming yourself, at least for this query.
That's generally been my impression of LLMs: it's impressive that they can do X. But when you add up all the overhead of asking them to do X, having them reason about it, checking their results, following up, and dealing with the consequences of any mistakes, the alternative of just relying on plain old search and your own skimming seems much more efficient.