
The use cases here are going to be really widespread in my opinion; here are just a few ideas off the cuff. Obviously the 30mb size means it won't really be for regular consumer apps, but for enterprise or specific tasks it can make a lot of sense.

1. Training websites

2. Interview challenges involving SQL

3. Client-side tooling that loads data onto your local machine and displays it in a SaaS web app without the SaaS app ever having your data

Appreciate the hard work from Supabase and Snaplet on this!




Supabase developer here.

I've used this to move data from a live Supabase database down to the browser for testing and playing around with things in a "sandbox" environment. Then I save snapshots along the way in case I mess things up.

To move a table over from my Supabase-hosted postgres instance to the browser, I just exit out of psql and run something like this:

    pg_dump --clean --if-exists --quote-all-identifiers -t my_table -h db.xxxxx.supabase.co -U postgres | psql -U postgres

Keep in mind that if you try something like this, our proxy is rate-limited for now to prevent abuse, so it might not be super fast. It's easy to remove the rate limiting at the proxy, though.


Correct me if I'm wrong, but given your profile (I assume someone in the tech world), there's nothing stopping you from doing all of the above with a local pg. If installing it is annoying, you could run it in docker.


You're absolutely right: you can use a local pg. This just makes it easier for me, as it's sort of a "sandbox" environment and I can easily take snapshots to do A/B testing or roll things back. I can also send a snapshot to a coworker so they can get my entire environment with all my data in a few seconds.
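(If you want to approximate that workflow with plain pg tooling, a snapshot is basically just a dump you replay later; the flags and file name below are only placeholders:)

    # save the current state
    pg_dump --clean --if-exists -U postgres > snapshot.sql

    # roll back to it later, or hand snapshot.sql to a coworker
    psql -U postgres -f snapshot.sql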


You can do that with docker, unless you're running ARM and your friend uses x86.


People really do want to make the browser be the new OS.


But now instead of the annoyance of installing pg, you have the annoyance of installing docker. And then writing a Dockerfile. And then bootstrapping docker. Etc., etc. =)


Who, working with these things regularly, does not have docker installed?

There is already an official image available, so you don't need to write a Dockerfile yourself. Having an instance up and running is literally just a docker run command away.
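Something like this, where the container name, password, and port mapping are arbitrary:

    docker run --name pg -e POSTGRES_PASSWORD=postgres -p 5432:5432 -d postgres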


Installing pg is hardly annoying. `brew install postgres`. Done.


Replied to the wrong person, but I'll take that bait anyway: "Ahahahaha, no".

That only works if you live in a blissful "all my hosted pg instances use the exact same version" world, which I've never seen be the case for even moderately sized projects. You're going to need multiple Postgres installs if you're going to need pg_dump/pg_restore, which you probably will.

(How you solve that problem, of course, is not one-size-fits-all, and Docker may be the answer... or it may not.)
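(For what it's worth, one way to sidestep the multiple-installs problem is to run a version-matched pg_dump out of the official image; the host, database name, and version tag below are just placeholders, and credential handling is left out:)

    docker run --rm -e PGPASSWORD postgres:14 \
      pg_dump -h db.example.com -U postgres mydb > dump.sql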


Doesn't pg_dump/pg_restore work across versions (so long as the CLI tools are the latest version)? I guess version compatibility could be an issue in theory, but I've yet to hit a backwards-compatibility issue with Postgres.


I wish. You'd think it'd just be able to check the db to see if there's any schema or procedure incompatibility, but it doesn't. Instead, it goes "Remote uses version X and you're using version Y, sort that out" and then it exits.


Oh, that's incredibly unhelpful!


I generally use docker myself for this... pg + pgadmin + a volume. Spin it up when needed, down when not. Works pretty well. I can see this being useful too, though.
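Roughly this, with the container names and pgadmin credentials being placeholders:

    docker volume create pgdata
    docker run -d --name pg -p 5432:5432 \
      -e POSTGRES_PASSWORD=postgres \
      -v pgdata:/var/lib/postgresql/data postgres
    docker run -d --name pgadmin -p 8080:80 \
      -e PGADMIN_DEFAULT_EMAIL=admin@example.com \
      -e PGADMIN_DEFAULT_PASSWORD=secret \
      dpage/pgadmin4

    # down when not needed; the data volume survives
    docker stop pg pgadmin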


IMHO the prime use case for all this WASM stuff is going to be platform independence. Web browsers themselves are not that interesting: for regular use they already have ballooning resource-use issues, and making web apps even more resource-intensive is not exactly inspiring. HOWEVER, web technologies are the only true multi-platform solution we have, so it makes sense to build everything with them, and everything instantly becomes multi-platform.

What I suspect may happen is the rise of web browsers of a third kind, which are not really for browsing the web but for running code written for native domains. So instead of browsing a web of linked text, we can have a web of algorithms that process data and requests.


Isn't that Deno and Node.js?


No, not really; those are all about JavaScript. Wasm makes everything as portable as JavaScript.


I can really appreciate the fun and technical challenge of running postgres in a browser. However, the use cases are extremely far-fetched.

1. Training websites: you can use a hosted PG, or SQLite compiled to WASM

2. Same as above

3. If the use case is being offline, then the web browser isn't very relevant. If the use case is to avoid load on the server, then SQLite in WASM will be just fine.

It's only if you go into triggers and such that it might start being relevant, but then I'd start seriously questioning what on earth you are trying to do :D

All of that to say: well done to the team that built it, really fun and interesting work. I just can't see the use from where I stand.


> Training websites: you can use a hosted PG, or SQLite compiled to WASM

From a Supabase POV (we're in the business of hosting Postgres databases), we will definitely be using this for training/tutorials. We have several thousand visitors to our docs every day, and hosting a database for every one of them is expensive.

Being able to provide a fresh database for every user, which they can "save/restore" with the click of a button, is huge.

> use case is being offline

The offline use case is definitely far-fetched in the current iteration, but that's the beauty of technology: something that seems impossible today can be mainstream in a decade.


It is awesome to be able to do things in isolation client-side and not have to deal with permissions and resources for something like a training website, which is all stuff you would have to deal with for a hosted version.

And there are plenty of reasons why you may want to use PG over SQLite, especially if you are trying to mimic a production environment that runs PG. Personally, I only ever use PG and never have a reason to use SQLite.


Also, in case you're curious, PostgreSQL logical replication to the browser wasm instance DOES work. I've done it. :)
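The SQL side is just the standard publication/subscription pair; the names, hosts, and credentials below are placeholders, and the subscriber (the wasm instance) has to be able to reach the publisher through the proxy:

    # on the source (publisher) database
    psql -h db.xxxxx.supabase.co -U postgres \
      -c "CREATE PUBLICATION my_pub FOR TABLE my_table;"

    # on the browser (subscriber) instance, reached through the local proxy
    # (credentials for the CONNECTION string omitted here)
    psql -U postgres \
      -c "CREATE SUBSCRIPTION my_sub
            CONNECTION 'host=db.xxxxx.supabase.co dbname=postgres user=postgres'
            PUBLICATION my_pub;"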


> Obviously the 30mb size means it won't really be for regular consumer apps

You know that it will end up being used for regular consumer apps. And once everyone is doing it, regular web pages being over 30MB and including an enterprise-grade SQL server engine will simply be accepted as normal, and everyone not doing it is a luddite.


How long until WASM things become "installable" so that other websites can use the same egregious 30MB things?


> 3. Client-side tooling that loads data onto your local machine and displays it in a SaaS web app without the SaaS app ever having your data

Member mini mongo?

Supabase will be Meteor in no time.


As someone who (unfortunately) used Meteor in the past, I disagree. IMO, from a dev perspective Meteor was just a poorly implemented promise of zero-effort real-time functionality on top of a database you were, at the time (~2015), already interested in or using. It compounded all the problems of MongoDB with an imperfect abstraction and JavaScript framework.

Whatever Postgres in WASM ends up being used for, there's no way it repeats all those circumstances; at minimum, Postgres is just a more appropriate tool than MongoDB circa 2015.


> Supabase will be Meteor in no time.

As in they will crash and burn?


Nah, I think Supabase will do well. Meteor also did well, all things considered. My point is really that they are revisiting all the same technology and product decisions and coming to the same conclusions.

There's only one way to architect a PaaS. Next.js and Vercel's offerings are, essentially, also the same.

The real risk is having one person build it all. They have a team but not really. Personally I believe that's a good risk to take.

But I think it will take a powerful psychological toll to operate this way, having to pretend to have a team (because investors like teams and not solo founders), having to pretend this isn't Meteor (because investors don't like being reminded of "losers"), etc. etc.

Downvote random Internet comments all you want, but I actually think it's a great idea to have one person build a "better Meteor"; it's not my fault investors don't.


> The real risk is having one person build it all

> having to pretend to have a team

In case you're talking specifically about supabase here, we're a full team: https://supabase.com/humans.txt


I just want to say what you've built is really great and works really well.


Without having seen the marketing pitch, I think running a full RDBMS inside your browser is not a great idea. Just idling, it becomes my most CPU-intensive Firefox tab, out of dozens, according to about:performance.

I shudder to think what performance a full-fledged application would demand. I know some people will embed this in an Electron app, for double the fun.



