Built this to test Chrome's new Prompt API, which runs Gemini Nano entirely on-device: no API keys, no external servers. The model downloads once (~2 GB) and then runs locally.
Requires Chrome 138+ with experimental flags enabled (instructions on the page)
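For anyone curious what calling it looks like, here's a minimal sketch based on the current API surface (`LanguageModel` is the global exposed in recent Chrome builds; names and statuses may still change while the API is experimental):

```javascript
// Sketch of Chrome's Prompt API (Chrome 138+ behind flags).
// Assumes the `LanguageModel` global documented in the explainer; subject to change.
async function askNano(question) {
  if (typeof LanguageModel === 'undefined') {
    return null; // Prompt API not exposed in this browser
  }
  // Status is one of: "available", "downloadable", "downloading", "unavailable"
  const status = await LanguageModel.availability();
  if (status === 'unavailable') return null;

  // create() triggers the one-time model download if it hasn't happened yet
  const session = await LanguageModel.create({
    monitor(m) {
      m.addEventListener('downloadprogress', (e) => {
        console.log(`model download: ${Math.round(e.loaded * 100)}%`);
      });
    },
  });
  const answer = await session.prompt(question);
  session.destroy(); // free on-device resources
  return answer;
}
```

The guard at the top means the same function degrades gracefully (returns `null`) in browsers without the API, so you can ship it behind a feature check.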
Curious to hear if others have been experimenting with browser-native LLMs