I haven't ruled out building AI-powered queries into the app, but I am firm about doing it in a way that respects data security (i.e. not shipping database schemas or data to ChatGPT without explicit user consent).
From my understanding, usable on-device LLM models tend to be several gigabytes in size, which makes them difficult to roll out to everyone. Apple Intelligence might make this feasible; I'll need to do more research on that when I do the iOS port.
I worked part-time on this project for about 2 years, and full-time with a small team for the last year. It's all TypeScript / WebGL / Node.js / Preact, and lots of microservices.
They are though. What if the API you're relying on doesn't have types and it returns something you didn't expect?
I think this is the biggest gotcha for developers using TypeScript: it doesn't do runtime type checks, so if you're calling out to something third-party that isn't written in TypeScript, you can't trust that it'll return what you think it returns.
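A minimal sketch of the usual workaround: write a runtime type guard and validate the parsed response before trusting it (the `User` shape and the sample payload here are made up for illustration):

```typescript
// The shape we *hope* the third-party API returns.
interface User {
  id: number;
  name: string;
}

// Runtime type guard: TypeScript's types are erased at compile time,
// so we have to check the actual shape of the value ourselves.
function isUser(value: unknown): value is User {
  return (
    typeof value === "object" &&
    value !== null &&
    typeof (value as { id?: unknown }).id === "number" &&
    typeof (value as { name?: unknown }).name === "string"
  );
}

// Simulated untyped response, e.g. JSON parsed from a fetch() body.
// Note the API returned id as a string, not the number we expected.
const response: unknown = JSON.parse('{"id": "42", "name": "Ada"}');

if (isUser(response)) {
  console.log(`User ${response.name}`);
} else {
  console.log("Unexpected response shape, refusing to continue");
}
```

Libraries like zod or io-ts do the same thing with less boilerplate, but the principle is identical: narrow from `unknown` at the boundary instead of casting.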
The DMOZ data is licensed under Creative Commons and there is an RDF export. I downloaded the data, registered a domain - http://www.zedurl.com/ - and I'll try to knock up a mirror of DMOZ tonight as a Rails app. I'll update this post if things go well.
Add an AI query tool - you could do it on-device with something like a functionary GGML model and llama.cpp, with a few functions:
getSchemaForTables(...)
getTableStats()
runQuery('...')
Then you could do a query like:
"show me all customers who regularly post between midnight and 1am"
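A rough sketch of what exposing those three functions to a local model could look like. The tool names come from the comment above; the parameter schemas, the dispatcher, and all the placeholder return values are assumptions for illustration:

```typescript
// Shape of a tool call emitted by the model (assumed format).
type ToolCall = { name: string; arguments: Record<string, unknown> };

// Tool definitions advertised to the model, in the JSON-schema style
// most tool-calling runtimes expect. Parameter names are hypothetical.
const tools = [
  {
    name: "getSchemaForTables",
    description: "Return column names and types for the given tables",
    parameters: {
      type: "object",
      properties: { tables: { type: "array", items: { type: "string" } } },
      required: ["tables"],
    },
  },
  {
    name: "getTableStats",
    description: "Return row counts and basic stats for all tables",
    parameters: { type: "object", properties: {} },
  },
  {
    name: "runQuery",
    description: "Execute a read-only SQL query against the local database",
    parameters: {
      type: "object",
      properties: { sql: { type: "string" } },
      required: ["sql"],
    },
  },
];

// Stub dispatcher: the model emits a ToolCall, and everything runs
// locally, so no schema or data ever leaves the device.
function dispatch(call: ToolCall): string {
  switch (call.name) {
    case "getSchemaForTables":
      // Placeholder: would read from the real catalog.
      return JSON.stringify({ customers: { id: "INTEGER", created_at: "TEXT" } });
    case "getTableStats":
      // Placeholder: would compute real stats.
      return JSON.stringify({ customers: { rowCount: 1234 } });
    case "runQuery":
      // Placeholder: would execute against the local DB, read-only.
      return `(results of: ${String(call.arguments.sql)})`;
    default:
      return `unknown tool: ${call.name}`;
  }
}
```

The model would turn "show me all customers who regularly post between midnight and 1am" into a `getSchemaForTables` call, then a `runQuery` call with the SQL it wrote, with the loop driven by whatever tool-calling runtime you pair with llama.cpp.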