Is there any tool that works the other way around? I simply need an alternative to cloudflared tunnel (https://blog.cloudflare.com/tunnel-for-everyone/) for exposing a localhost port on a public domain that supports anonymous clients. All the cloud solutions charge based on users, so unfortunately they don't work for me.
I think this would be great as an evolution of MDX (https://mdxjs.com/). MDX is already pretty popular for documentation and it plays well with React, but unfortunately there is no framework that adds interactivity to MDX, which would enable use cases like data applications.
Do you support scaling to zero? I wonder if the native offerings of the cloud providers (Cloud SQL/AlloyDB or Aurora) still make sense, as keeping hundreds of PG instances running at scale will likely be a challenge if you're managing them from your own control plane.
Also, is there any compliance requirement that forces the data onto different PostgreSQL servers? I assume most companies just use some form of isolation (a tenant_id column or a dedicated tenant database/table), so I wonder if this problem could be better solved as a proxy layer.
We support scaling to near-zero thanks to Aurora Serverless, but we're definitely looking into other solutions that could be cheaper to run or self-hosted.
Some regional regulations (GDPR, etc.) require local and/or isolated hosting. Most companies indeed solve this with either a tenant id column, dedicated tenant databases or both. We want to simplify those architectures, and a proxy layer is exactly our idea there - we're working on a solution that handles connection pooling and routing to remove the need to cache connections on the client.
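To make the routing idea a bit more concrete, here's a rough Python sketch (using asyncpg; the DSNs and tenant ids are placeholders, not our actual implementation): keep one small pool per tenant database and route each query by tenant id, so clients don't have to cache connections themselves.

    import asyncio
    import asyncpg

    # Placeholder mapping of tenant -> Postgres DSN; in a real proxy this would
    # come from a catalog and could point at differently-hosted regions.
    TENANT_DSNS = {
        "acme": "postgresql://app@db-eu.example.com/acme",      # EU-hosted tenant
        "globex": "postgresql://app@db-us.example.com/globex",  # US-hosted tenant
    }

    _pools: dict[str, asyncpg.Pool] = {}

    async def pool_for(tenant_id: str) -> asyncpg.Pool:
        # Lazily create one connection pool per tenant database.
        if tenant_id not in _pools:
            _pools[tenant_id] = await asyncpg.create_pool(
                TENANT_DSNS[tenant_id], min_size=0, max_size=5
            )
        return _pools[tenant_id]

    async def run_query(tenant_id: str, sql: str):
        # Route the query to the tenant's database through its pool.
        pool = await pool_for(tenant_id)
        async with pool.acquire() as conn:
            return await conn.fetch(sql)

    async def main():
        print(await run_query("acme", "SELECT 1 AS ok"))

    asyncio.run(main())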
SyncLite looks great for companies that are starting to build from scratch: ingest data into an embedded database and move it via CDC into a centralized database. I'm also trying out a similar idea, only with the reverse approach, from a cloud warehouse -> embedded DuckDB, to reduce the compute cost for BI and embedded analytics use cases. The combination of cloud and embedded databases is the future IMO.
For the project I'm working on, technologies such as Apache Iceberg and embedded DuckDB enable querying Snowflake + BigQuery tables directly from your BI tool, without any compute cost: https://github.com/buremba/universql
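As a rough Python illustration of the approach (the S3 path is a placeholder, and you still need credentials for the underlying object store), embedded DuckDB can scan an Iceberg table directly with its iceberg extension instead of spinning up warehouse compute:

    import duckdb

    con = duckdb.connect()
    # Extensions for reading Iceberg metadata and fetching files over S3/HTTP.
    con.install_extension("iceberg")
    con.load_extension("iceberg")
    con.install_extension("httpfs")
    con.load_extension("httpfs")

    # iceberg_scan reads the table's manifest files and only fetches the data
    # files the query actually needs.
    result = con.sql("""
        SELECT count(*) AS order_count
        FROM iceberg_scan('s3://my-bucket/warehouse/orders')
    """)
    print(result.fetchall())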
Absolutely agree that the combination of cloud and embedded databases is the future.
Universql looks interesting as well.
SyncLite also provides the ability to send custom commands back from the SyncLite consolidator to individual applications (devices), while edge/desktop applications can implement callbacks that are invoked on receiving these commands.
A command can be anything and could be a way to tell the application to download data from a cloud-hosted data warehouse and use it as a starting point.
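As a generic Python sketch of that pattern (plain illustration, not SyncLite's actual API), the device side registers callbacks by command name and a dispatcher invokes them when a command arrives from the consolidator:

    from typing import Callable

    _handlers: dict[str, Callable[[dict], None]] = {}

    def on_command(name: str):
        # Decorator to register a callback for a named command.
        def register(fn: Callable[[dict], None]):
            _handlers[name] = fn
            return fn
        return register

    @on_command("seed_from_warehouse")
    def seed_from_warehouse(args: dict):
        # Placeholder: download a starting dataset from a cloud warehouse.
        print(f"would download {args['table']} from {args['warehouse_url']}")

    def dispatch(name: str, args: dict):
        # Called when a command is received from the consolidator.
        _handlers[name](args)

    dispatch("seed_from_warehouse",
             {"table": "orders", "warehouse_url": "https://warehouse.example.com"})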
Hi all! I built this tool for data exploration on Snowflake tables without any need to run a warehouse.
Snowflake is very powerful at scale, but for small data (< 100 GB) I'd rather work locally and then deploy my models to Snowflake later on. Universql lets you keep using Snowflake clients through its proxy layer and executes the queries locally on DuckDB.
This only became possible with Iceberg, as it keeps the data-file references in manifest files, which lets you intelligently cache the data locally, similar to Snowflake's warehouse cache.
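As a hypothetical illustration of the proxy idea in Python (the host, port and credentials below are placeholders, not Universql's documented setup), an existing Snowflake client can simply be pointed at a local endpoint that speaks the same protocol and runs the query on DuckDB:

    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_account",   # still identifies your Snowflake account
        user="my_user",
        password="my_password",
        host="localhost",       # route the client to the local proxy instead
        port=8084,
        protocol="http",
    )
    cur = conn.cursor()
    cur.execute("SELECT count(*) FROM analytics.orders")
    print(cur.fetchone())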