1) I use Abstract Syntax Trees to have the LLM modify existing code. It makes more targeted changes than I've seen from codex/gemini cli/cursor, etc. I wrote a blog post about how I do this if you want to know more: https://codeplusequalsai.com/static/blog/prompting_llms_to_m...
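To give a rough flavor of what an AST-level edit looks like, here's a minimal sketch using plain Python's ast module (illustrative only; my actual pipeline differs and is custom per language):

    # Minimal sketch only: a targeted AST transform instead of a full-file rewrite.
    import ast

    class RenameFunction(ast.NodeTransformer):
        # Rename one function and leave everything else in the file untouched.
        def __init__(self, old, new):
            self.old, self.new = old, new

        def visit_FunctionDef(self, node):
            if node.name == self.old:
                node.name = self.new
            return self.generic_visit(node)

    source = "def fetch(user_id):\n    return db.get(user_id)\n"
    tree = RenameFunction("fetch", "fetch_user").visit(ast.parse(source))
    print(ast.unparse(tree))  # ast.unparse needs Python 3.9+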
2) I double-charge for tokens, which creates a margin that you earn from when you publish your app. An API call that costs the user $0.20 breaks down to $0.10 for the LLM provider, $0.08 for you, and $0.02 for me. I'm trying to reduce the friction of validating ideas by making revenue happen automatically as people use your app.
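Roughly, the split works like this (illustrative only; the 80/20 split of the margin is just what falls out of the example numbers above):

    # Illustrative only: the 80/20 margin split is inferred from the example.
    def split_charge(llm_cost):
        user_price = llm_cost * 2              # "double-charge" for tokens
        margin = user_price - llm_cost
        creator_cut = round(margin * 0.8, 2)   # your share of the margin
        platform_cut = round(margin * 0.2, 2)  # my share
        return user_price, creator_cut, platform_cut

    print(split_charge(0.10))  # (0.2, 0.08, 0.02)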
3) I've built a "Marketplace" where you can browse the webapps people have created: https://codeplusequalsai.com/marketplace
I'm kind of trying to bring back an old-school web vibe in the AI world, where it's easier for people to create things and to discover neat little sites others have built. I also wonder if I can solve the 'micropayments' idea that never really took off, by baking the revenue model into your webapp.
4) I envision the future of large-scale software development being much more about writing clear tickets than ever before; we've all dealt with poorly written tickets, ill-defined use cases, and ambiguous requirements. This site is an early take on what I think the UX might be in a future where ticket-writing takes up a greater share of the work relative to code-writing.
What do you think?
--
Some more quick nerdy details about the behind-the-scenes tech: this is running on 3 Linode servers: 1 app server (Python/Flask), 1 db server (Postgres), and 1 'docker server' that hosts your webapps. The hardest part was getting the LLMs to write the AST code and setting up the infrastructure to run it. I have a locked-down docker container with Python and Node, and once the LLM responds to a code-change request, we run a script in that container to produce the new output. For example, to change an HTML file, it runs a Python script that feeds the original file contents in as a string to the LLM-generated code, which uses BeautifulSoup to make the changes the user asked for. It's quite custom to each language; at the moment I support Python, JavaScript, HTML, and CSS, and I'm currently testing React/TypeScript (with moderate success!)
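A minimal sketch of that HTML path (simplified stand-in only; the real version runs the generated code inside that locked-down docker container):

    # Simplified sketch; the real thing executes inside the locked-down docker.
    from bs4 import BeautifulSoup

    original_html = "<html><body><h1>Hello</h1></body></html>"

    # Pretend the LLM returned this snippet for "change the heading to Welcome":
    llm_generated_edit = 'soup.h1.string = "Welcome"'

    soup = BeautifulSoup(original_html, "html.parser")
    exec(llm_generated_edit, {"soup": soup})  # untrusted code, hence the sandbox
    print(soup)  # <html><body><h1>Welcome</h1></body></html>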