Ask HN: Tools to visualize data in SQL databases?
244 points by dyml on Feb 13, 2022 | 134 comments
I’d like to hear: what tools do you use to easily visualize the data in a SQL table?

Preferably I’d just like to click on a MariaDB table and receive some plots and statistics on the columns.

What’s your experience with this?

Edit: to clarify, I don’t want to visualize the database itself (schemas, keys, etc.). Just the data within it.




This may be an unpopular opinion, but if you have US$70/mo to spare, it's hard to beat Tableau for this exact use case.

"Connect to an arbitrary database, create a view that joins numerous tables (including foreign tables, via blending) together, load to columnar storage on a local SSD for performance if necessary, add arbitrary derived columns (including well-defined lateral lookups for things like 'annotate this action with the date of the first action of this action's user' [0]), group by 4 of the derived columns, map two of the groupings to nested dimensions along the horizontal axis and two to the vertical axis, and show the sum or count at each cell in a resulting table, then when satisfied, drill down into a slice and turn it into a bar chart with colors that match your branding needs" - every one of those clauses can be accomplished with drag-and-drop mouse commands almost at the speed of thought.
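In hand-rolled form, the "two nested dimensions per axis, aggregate per cell" step is roughly a pivot table. A sketch in pandas with made-up data (the column names are illustrative only):

```python
import pandas as pd

# Toy frame standing in for the joined view described above.
df = pd.DataFrame({
    "region":  ["n", "n", "s", "s", "n", "s"],
    "channel": ["web", "shop", "web", "shop", "web", "web"],
    "year":    [2021, 2021, 2021, 2022, 2022, 2022],
    "quarter": [1, 2, 1, 2, 1, 1],
    "amount":  [10, 20, 5, 8, 12, 7],
})

# Two groupings on the rows, two on the columns, a sum at each cell:
cells = pd.pivot_table(df, values="amount",
                       index=["region", "channel"],
                       columns=["year", "quarter"],
                       aggfunc="sum")
print(cells)
```

Tableau's point is that each of those arguments is a single drag-and-drop instead of a keyword you have to remember.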

And once you get the hang of it, there's zero impedance mismatch with hand-rolled SQL; it's just way faster to iterate on, especially with schemas where you may not remember all the columns available to you, and especially when you're doing so over screenshare with non-technical colleagues.

[0] https://help.tableau.com/current/pro/desktop/en-us/calculati...


Tableau is my go-to too. People think of Tableau as a dashboarding tool (and it is), but it's actually a multidimensional exploratory data analysis tool. You can visualize more than 3 dimensions by way of colors, labels, sizes, etc. You can slice and dice your data and visualize it in so many different ways, and also do drill-downs and filters and aggregations in different ways. The downside is that you have to really understand SQL-like operations (GROUP BY, PARTITION BY, PIVOT) to truly take advantage of its power. Many Tableau users only scratch the surface of what Tableau can do, but there's a lot more underneath.

PowerBI on the other hand is a straight up dashboarding tool. I've used it and it isn't quite as powerful as Tableau, but it's an easier step up from Excel. It doesn't require you to reorient your mindset as much as Tableau does. But it also doesn't let you probe your data as easily as Tableau does. It's essentially a supercharged Excel dashboarding tool.

Excel viz is rudimentary. It gets the job done for simple plots, but it's a hassle to join data (you have to do cell VLOOKUPs or INDEX(MATCH())) and pivot tables are a poor-man's approximation of the true power of SQL operations. It doesn't scale to large datasets but the cell-based spreadsheet paradigm (vs. a relational database paradigm) is easy to understand which has an appeal of its own. But you're ultimately limited to what fits on a spreadsheet.
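For contrast, the join that takes per-cell VLOOKUP or INDEX(MATCH()) gymnastics in Excel is a single call once the data is in a relational tool. A minimal sketch with made-up tables:

```python
import pandas as pd

# Hypothetical tables, purely for illustration.
orders = pd.DataFrame({"cust_id": [1, 2, 1], "amount": [10, 20, 5]})
customers = pd.DataFrame({"cust_id": [1, 2], "name": ["Ada", "Bob"]})

# Equivalent of: SELECT * FROM orders LEFT JOIN customers USING (cust_id)
joined = orders.merge(customers, on="cust_id", how="left")
print(joined)
```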


Tableau is basically a webapp version of SAS JMP, especially the plotting engine. It's so easy to create plots - literally 100x faster than matplotlib. I can drag things faster than I can type.

Tableau is really amazing actually.


I'd throw in Excel Power Pivot over SSAS cubes. It fills in some of the gaps you described.


What's the closest Tableau alternative that's OSS?

PowerBI?

Apache Superset? https://superset.apache.org


I'm working on something in between a BI tool and a SQL IDE. It's definitely code-heavy on the user at the moment compared to a BI tool but an improvement on switching between Python scripts, Postman, a SQL GUI and Excel.

https://github.com/multiprocessio/datastation


Hey! Thank you for creating DataStation.

Are there any videos that walk a user through setting it up and using it?


There are primarily a bunch of tutorials for getting started [0] (these are up to date) and some old videos (not up to date) [1].

[0] https://datastation.multiprocess.io/docs/

[1] https://www.youtube.com/channel/UCGOQFKonPUVo5LgxQDW26yg/vid...


I've had bad experiences with superset (though it may have been misconfigured).

On the other hand, Redash has served my startup really well for the first 3 years (and now we're finally moving to Tableau)


In spirit, but with no UI: Vega (JS). It follows the same “grammar of graphics” idea from the same research. Altair does Vega charting in Python. Kibana uses Vega in a more UI fashion, but I haven’t tested it. I think someone should put a proper web UI on top of Vega…


I really like the visualization language idea.

I have used Vega + Jupyter a few times. As well as the Vega Viewer extension for VS Code. I haven't used it enough to be proficient and fast, but I have used it enough to know that I want to be.


As you get deeper into it, they hook you into the server and other stuff, and it ends up costing thousands.

Pandas is better but requires programming.
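A minimal sketch of that pandas route. SQLite stands in for MariaDB here; against MariaDB you'd pass a connection from a driver such as mysql-connector or SQLAlchemy instead:

```python
import sqlite3
import pandas as pd

# In-memory SQLite as a stand-in for a real database server.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, amount REAL);
    INSERT INTO sales VALUES ('north', 10), ('north', 20), ('south', 5);
""")

df = pd.read_sql("SELECT * FROM sales", conn)

# Column statistics in one call (count, mean, std, quartiles, ...):
print(df["amount"].describe())

# And one more line to a plot, if matplotlib is installed:
# df.groupby("region")["amount"].sum().plot.bar()
```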


I second pandas, and would also highlight seaborn, plotly, and dash as complementary data visualization libraries.

Tableau is fine for what it is, but I've found that the requests from stakeholders often grow to a point where you either can't do it in Tableau or have to move mountains to get it to work... so, in essence, sunk cost fallacy makes tableau millions.


PSA: If you haven't already learned pandas, learn dplyr/ggplot2 instead. Yes, R is a pretty clunky language (but it's closer to lisp than pretty much anything else as popular) but ggplot2 and dplyr are 100% the best currently available way to visualise tables in SQL.


I would second the decision to use ggplot2 / dplyr, and would also add data.table to the mix. That combination has been invaluable for me, allowing me to visualize all of my structured data.


I started with R, but then switched to Python because all pipelines were already written in Python (web-scrapers, some data-processing scripts, REST APIs), so I just learned pandas and it's been fine, although I do think dplyr's syntax is great and I prefer it to pandas'.


Yeah, totally. I spend a lot more of my time writing Python for anything that isn't data exploration/analysis, for exactly that reason.

I still refuse to learn pandas enough to replace dplyr though, as it's just so painful to use the API compared to how easy this stuff is in R.


Do you load the whole table into memory though?
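On the pandas side, not necessarily: `read_sql` accepts a `chunksize` and yields the result in batches, so only one chunk is in memory at a time. A sketch, again with SQLite standing in for a real server:

```python
import sqlite3
import pandas as pd

# Toy table; in practice this would be a connection to the real database.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE events (id INTEGER, value REAL);
    INSERT INTO events VALUES (1, 1.0), (2, 2.0), (3, 3.0), (4, 4.0);
""")

total = 0.0
for chunk in pd.read_sql("SELECT * FROM events", conn, chunksize=2):
    total += chunk["value"].sum()  # each chunk is a DataFrame of <= 2 rows

print(total)  # 10.0
```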


$70/mo includes a server (Tableau online)


>This may be an unpopular opinion, but if you have US$70/mo to spare, it's hard to beat Tableau for this exact use case.

This is a popular opinion, in my book.


Tableau is great, but that initial fee, while being peanuts in terms of absolute IT spend, has been a blocker getting it rolled out in one large BI team I worked with.

The backstory was a large BI team having no proper tools for reporting, except maybe Looker for tabular KPI reports. So any addition of Tableau was an all-or-nothing buy: every single analyst or whoever on the BI team needed a seat, even though most of them wouldn't ever use it. For that matter, 95 percent of the BI team would never move away from Excel. Anyway, the costs were huge for something that should have been a leap in capabilities.


My go-to has been Qlik Sense, and I have enjoyed using it. Tableau is good also, but compared to Tableau I’ve found Qlik to be faster and more responsive when slicing and dicing data.


Anything solved by Tableau is as easily solved by Excel, which likewise supports direct SQL connections. The template graphs and incorporation of function logic, for versatility and business availability, make it no contest. Though Tableau has some gorgeous templates, it doesn't also have DAX and Power BI.


I'm curious why you think of DAX as a virtue. My poor SQL-shaped peg brain has never really fit the DAX hole of MS software.

Also, it always struck me as something too complex for the non-technical folks, and not expressive enough for tech-literate analysts/data engineers &c.


Just options. VBA is there as well. Excel's virtue is not specializing in any specific task, but being versatile enough to express a multitude of business solutions. 'Excel is my database' wasn't always a punchline.

That's more than an equally domain-specific process like Qlik, and more than a specific vendor tool like Tableau. And anyway, if Power BI didn't have a pain point it wouldn't be a MS product.


It's good, but wait until you discover Mathematica :)


What functions of Mathematica can do the job? I'm interested in checking it out.


The various visualizations like MatrixPlot, Manipulate ... just google around and you'll see many interesting examples!


I love Metabase (https://www.metabase.com/) and have used it successfully with business users, too (small hedge fund). If it's not too much data, getting it out as .csv and using R + data.table + dplyr, etc. = incredible productivity. See [1]

Use Excel, QlikSense and Tableau if business users need visualization. Excel pivot tables = OG data reshaping. Resist more complicated solutions: do you really need more than Excel?

QlikSense doesn't get enough love. It's actually better than Tableau in some scenarios. Or PowerBI if you're a Microsoft shop. Last time I checked, Power BI's Q&A [2] was a KILLER FEATURE. "Show me Sales per Region, Quarterly" and then you get to fine-tune it. R and Shiny dashboards = last resort; too much bespoke work. 2 months using R + Shiny can be 1 week in Tableau / QlikSense / PowerBI.

1. "Efficient reshaping using data.tables" https://cran.r-project.org/web/packages/data.table/vignettes...

2. https://docs.microsoft.com/en-us/power-bi/natural-language/q...


If you like the idea of asking questions from your data in plain English, there are also other alternatives to Power BI, like Tableau's Ask Data, ThoughtSpot and Veezoo (free for 5 users). Here is a series of comparison videos to show their strengths: https://veezoo.com/compare-solutions/

Disclaimer: I'm one of the co-founders of Veezoo.


Oh, ThoughtSpot. I spent a couple of months testing that out. At the time, it was hard to deploy and didn't "scale down". That is, the smallest size we could deploy was something like 750 gigs.

Also, the UI looked half baked, but I'm sure they have fixed that by now


Metabase is fantastic for this use case.

Create a read replica, deploy metabase, connect it, start building visualizations. Or use the X-ray auto analysis feature to generate a bunch of mostly useful visualizations with a single click.


Metabase is what you're looking for.

https://metabase.com/


We use it and our staff are able to get at least a bit of value without knowing sql. but i still have to write most things or create a template they can copy and paste.

the visual builder still relies on knowing fundamentally what a db is, what a join is. like mega obvious stuff that actually isn't obvious to non technical people.

we are also multi-tenant and it's not the best for that. like the default permissions when you add a db are broad, the organization of dashboards, questions, etc is not great. through different versions they have reorganized and stuffed things oddly.

i've also messed up the elastic beanstalk a few times. i am not even close to an expert on aws so might just be me being stupid.

probably worth just paying for their service in the end..


My experience is that these kinds of tools have an interesting positive effect, if they are well made (and metabase seems to be):

Maybe 1 in 10 to 1 in 5 users, or more, will be enabled by them on a technical level - the same kind of people who will build a site out of WP themes and plugins, or build quite involved spreadsheets, and so on.

I think it’s generally important and useful to enable and respect power users like that. They occupy a unique niche.


metabase is pretty nice, and their "x-ray" feature that generates a lot of different visualisations is a clever way to get started and to have examples to iterate on.


That looks great, but AGPL makes everyone nervous.


Curious more than anything. If your intent is to use the tool as opposed to importing the tool as a feature or product, is the AGPL a problem?

Also, "everyone nervous" is an interesting choice of words - I'm pretty sure most people wouldn't really care one way or the other. What is it about AGPL that causes stress among the masses?


> Also, "everyone nervous" is an interesting choice of words - I'm pretty sure most people wouldn't really care one way or the other. What is it about AGPL that causes stress among the masses?

Things like Google's AGPL internal policy[1] existing as a public piece of corporate-friendly IP propaganda have snowballed into a lot of FUD about the license.

People like to parrot perceived and secondhand opinions about the AGPL, and "some companies don't like the AGPL" transformed into "the AGPL makes everyone nervous."

[1] https://opensource.google/documentation/reference/using/agpl...


Fair point.

Clarifications: no, using the tool directly is not a problem for us.

"Everyone" - appreciate your question and not guessing :-) I meant everyone at my company. That phrasing maybe speaks to my mindset :-D Basically I really mean legal and those in charge who might listen to legal over tech.


You’re not building against their code, you’re just using the end product, why would AGPL matter here at all?


They have a commercial license, but just using an AGPL product doesn't mean you must AGPL your project.


You should check out Arctype! (https://arctype.com)

We're trying to re-imagine popular SQL clients (phpMyAdmin, MySQL Workbench) to have the design and software quality of modern tools like Superhuman, Linear, etc. We make it very easy to query and create charts from your database then share it with your team.

We currently support MySQL, Postgres, PlanetScale, Yugabyte, and in the next couple of weeks SQLite and ClickHouse.


Looks nice. One question about desktop Mac version – is it an Electron app or a native app or something else?


+1 for Arctype - I'm a happy user :)


Looks like a non-native version of TablePlus, with a strange pricing structure.


We're huge fans of TablePlus but Arctype is free (all features) for most developers. We charge a low price for teams/startups above 10 users in a typical SaaS model. TablePlus is a great piece of software but it charges $79 for individuals (one year of updates) to unlock all features such as having multiple tabs. We're also focused a lot more on visualization and collaboration.


I like "strange" because it is free.


it's a great native SQL client since pgAdmin went all browser-based, and I like that it supports multiple SQL DBs, with more support in the future as well


The screenshot shows a multi-column dark-themed app running on a Mac. Let me guess: Electron?


Nowadays this is pretty much the space of either PowerBI or Tableau.

There used to be a lot of good candidates in this space even just a few years ago, but Power BI has improved its product and integrations very rapidly, and with its affordability has displaced them at many big companies. Power BI also recently added some NLP capabilities from one of Microsoft's acquisitions, which makes usage by non-technical users easier.

If you're willing to put your data on BigQuery, then Google Data Studio / Looker is an even better solution for larger datasets due to the seamless integration and intelligent caching, which (purely in my perception) seems to work better than Azure Analysis Services on the Microsoft side. Also, BigQuery ML works within SQL.

Source: I lead an Analytics and Data Science team at a Fortune 50


One thing that has become apparent to me over the years is that most of these exploration and visualization tools will be pretty ineffective unless the data is modeled correctly. Some of the tools mentioned here will actually do a lot of the modeling before the exploration and visualization kick in, but the tool is only doing the best it can, and will most likely not understand a lot of the structure and business nuance that has accumulated over the data's existence.

Moreover, if you plan to adopt and build upon one of these tools that infers and generates the models as well as providing the explore and visualize functionality, you might be painting yourself into a corner and forcing all current and future workloads to use this layer. Otherwise you'll have to reinterpret and reimplement your models all over the place: one-off SQL scripts/reports, web analytics, dashboards/visualizations/reports in other analytics tools. Then you'll also end up having to scale this tool up in both compute and storage to handle load that grows over time. This can end up being quite costly in time, money and responsibility.

While these tools will offer a lot of value providing visibility and insight into your data, it'll probably be worth circling back and seeing if the data can and should be modeled correctly (semantic layer) before hitching your wagon to your first choice.

Once your data is all modeled, it might be worth re-evaluating all the tools that you started with and see how they manage now that your house is a little more in order.

Remember, your modeling doesn't have to be done by the same tool that does your exploration and visualization.

This is a great article related to these ideas: https://benn.substack.com/p/is-bi-dead


Redash (https://redash.io/) is pretty easy to get up and running with, especially if you know SQL. It's free and open source.


https://www.metabase.com/ free and open source.


Maybe not exactly what you are looking for but Datasette is brilliant for SQLite (and csv)

https://datasette.io/


It doesn't speak MariaDB (yet - I have a long-term goal to investigate adding alternative database backends as plugins) but you can instead use the https://datasette.io/tools/db-to-sqlite CLI tool to convert a PostgreSQL or MySQL (or other SQL Alchemy supported) database to SQLite, then use Datasette against the resulting file.

This actually works pretty well for small (<1GB) databases, where you can run a cron periodically to build the SQLite version.

Then you can visualize with plugins such as https://datasette.io/plugins/datasette-cluster-map or https://datasette.io/plugins/datasette-vega

I also often load data into Datasette and then do custom visualizations in Observable Notebooks by fetching data back out through the Datasette JSON API - here's an example notebook that does that, using the Observable Plot charting library: https://observablehq.com/@simonw/datasette-downloads-per-day...



Came here to share this as well. Superset doesn't have quite the cult following of Tableau or other tools, but the project has come really far since its inception and is quite robust.


Metabase is amazing; it's very intuitive and quick to create questions (widgets) and assemble them into dashboards. I really liked the dashboard subscription to mail, and the configurable alerts based on any question built with the UI or SQL. I literally discovered it last Friday morning and now it's up and running at my company and also for 2 customers.


I tried most of the free tools but I don't like any of them. In my experience they're all clunky and don't have a beautiful design (I'm on Windows at work and Linux at home).

I just settled on DBeaver, but don't consider that an endorsement from my part.

I found DB Browser for SQLite to be the least bad, but it's obviously limited to SQLite.

My problem may come from the fact I have simple needs and they're all very complex apps. My SQL queries are rarely longer than 50 lines and I do DB admin tasks from the command line.

Among the unending list of apps I should code for myself there's a SQLPad project. Maybe one day.


If you're looking for DBeaver with better aesthetics you may like Arctype (https://arctype.com). Beekeeper is decent too.


Try https://www.beekeeperstudio.io/

It's simple and open source


Visidata[1] supports sqlite, mysql and postgres.

[1] https://www.visidata.org/docs/formats/


I use Gephi a good deal to visualize connections between tables and schemas in a messy production MySQL database, mostly with the aim to stop people (read: consultants, junior developers, managers trying their hand writing SQL) from developing ad hoc duplicate tables for their own niche purposes.

Gephi lets me show how this kind of table bloat happens over time and helps explain performance degradation.


This is interesting. I have seen Gephi before but never used it. Do you have any links or posts explaining how you use it to visualise connections in MySQL? I would like to do something similar with Oracle.


Take a look at the circle pack layout, it's what I use to group by schema

https://blog.miz.space/tutorial/2020/01/05/gephi-tutorial-la...


Couple of tools not yet mentioned:

PopSql - https://popsql.com

Trevor - https://trevor.io


No plots and statistics (well, maybe, but I've not used them if they are there). But DBeaver is nice for browsing a database.

For SQLite databases, I use sqlitebrowser.

Both tools are open source.


DBeaver is my choice. Saw it being used at Re:Invent by a presenter a few years back and switched immediately. I had been using SQLDeveloper and don't miss it one bit.


Same here, it's solid and consistent.


Oracle SQL Developer has a data model tab.

This is very much a Java application, and appears to allow several JDBC drivers for 3rd party databases.

It's free, and is designed to compete with (or drag underwater) Quest Software's Toad.

https://www.oracle.com/database/technologies/appdev/sqldevel...

3rd party drivers:

https://www.oracle.com/database/technologies/appdev/sqldev/t...


Off the bat I would say Metabase, but it'd be good to know what kind of data you have. You can also connect Grafana to MariaDB and it'll give you really nice graphics, but again, it depends on the kind of data you have.


Google Data Studio, if you're fine with something in the cloud hooking up to your DB (and whitelisting the IP).

Metabase or Apache Superset, as others have mentioned, can be deployed on-prem so it's a bit more isolated/secure.


I am building Jig (https://www.jigdev.com), which you could use for that.

It's based on Observable (https://www.observablehq.com), which has a nice Summary table feature that sounds like what you need (https://observablehq.com/@observablehq/summary-table).


Power BI is popular in many organisations https://powerbi.microsoft.com/en-us/


Free: Power BI is probably going to give you everything you need. It's free, easy to use and provides a lot of features to grow into.

Paid: If you have the budget, Aqua Data Studio gives you the database management functionality AND all of the visualizations you'll find in Tableau, in one product.

(My company shifted from Tableau to Power BI. At first it seemed like a beta product with lower fidelity. But Microsoft has made the whole power suite into a force to be reckoned with... highly recommended)


Power BI has been very abusive of my systems, although this has lessened of late.

I can see hundreds of logins to my database per user, and when I cut the logins per userid to 5, their applications collapse. Their queries cannot be tuned to available indexes according to my users.

They remain the very first ones that I throw off a database if there is a performance problem, with some degree of prejudice.


I've always preferred to shunt such analytics work over to a query replica dedicated to the purpose so the people doing analytics can generally only interfere with each other.

Whether that's a viable answer in any given situation is of course highly variable, but even if the analytics queries are my own hand-rolled SQL it's still my preference any time it -is- viable if nothing else so I don't have to worry as much about screwing up and taking too many locks / using too few indices while I'm iterating on the query in question.


Your users may benefit from using import mode rather than direct query mode.

Under the covers Power BI is running a tabular Analysis Services Cube, so import mode will be optimized for reporting regardless of the source database indexing.

The PBI dataset can be shared across users, so only 1 connection is required to update it, instead of dozens of users hitting the db directly.

Also, as another person mentioned, reporting is usually done on a separate database from production applications, like an operational data store, data lake, or warehouse.


I used to like qlik more than pbi/tableau but they stopped making the personal desktop version free. Qlik’s scripting and “associative” engine is great.


We use PowerBI. Wouldn't recommend. I also don't believe it's free, but perhaps that's just the version we use.


I guess it comes down to what you are trying to accomplish.

The desktop version is free, no strings attached. However the value comes from publishing to the web service for sharing etc. That's not free. It can be cheap, but when you have a lot of users the "premium capacity" can be quite expensive.


PowerBI is not free to use; it's part of Office 365 (Microsoft 365, or whatever it is called today). I remember 2 companies ago (in 2019) some data scientists had to get a license for it approved.


For personal analytics the desktop version is free. The web service has different pricing tiers (which can be very expensive)

Edit...just to add to this. I personally pay for it and it's $5/mth. Premium capacity can be a lot more...but if the Data Scientist needed a license, it may have only been $5/mth, paltry compared to a data scientist salary.


It was a non-profit, every single EUR had to be accounted for. On the good side, they did get heavy discounts on some licenses, including from Microsoft Windows/Office. Except IT manager didn't know (even though he was a Microsoft fanboy), he only figured that out after paying full price for years...


I'm curious why you don't recommend PowerBI. What are the pain points?


Thankfully my memory is cloudy :-) maybe two points:

1. In general, the idea of a separate reporting DB maintained by separate people who might not understand the semantics of schema changes seems inefficient in a world of cheap compute, compared to an application exposing key metrics directly. Upgrading every two weeks with an ETL pipeline off an app can result in the ETL breaking every two weeks. Better just to release metrics from the same team that makes any other changes.

2. I think connecting hosted PowerBI to an IP-whitelisted Azure blob store is not possible, and that just seemed silly.


*Free - just as in "beer".



Since Superset, Metabase and Redash were mentioned, I'll share some development insights regarding these OSS projects. Note, I've never used these solutions before, so I can't speak to their quality, but I can speak to how active and invested they appear to be.

Looking at recently merged pull requests that were less than 120 days old, apache/superset had 89 unique authors, which is very high, as the following shows:

https://oss.gitsense.com/insights/github?q=pull-age%3A%3C%3D...

Metabase had 37 authors, which is also quite high for an open source project, as the following shows:

https://oss.gitsense.com/insights/github?q=pull-age%3A%3C%3D...

And Redash had 11, but most of the contributions were more than 28 days ago; it is significantly less active than Metabase and Superset, as the following shows:

https://oss.gitsense.com/insights/github?q=pull-age%3A%3C%3D...

I was actually quite surprised by Superset, as I had never heard of them before, but they are backed by serious investment (https://preset.io/about/), which clearly shows in how active their repository is.


With Redash, their team was acqui-hired by Databricks in June 2020 (https://blog.redash.io/redash-joins-databricks/).

They gave their SaaS subscribers a bunch of notice that it was being wound up, and some Redash Community members have picked up some of the load for hosting those customers.

Development of the main Redash repo, though... has an uncertain future. There are people (Community members) improving things, and they seem to be getting shepherded/directed decently well by Jesse (a former full-time Redash employee).

So, things are ticking along. It might go really well and grow over time, or it might not. Not yet sure. Hopefully it does go well though. :)


Thanks for the insights.


I recently started using Arctype (https://arctype.com/) and have really enjoyed the experience so far. Relatively new tool, but the team is awesome. It's a SQL client that has some basic data visualization features, seemingly geared towards engineers.


agreed it's legit and a great native SQL client and replacement now that pgAdmin has gone all browser-based


It's not what you mean, but I noticed yesterday that Kaggle does what you're asking on datasets.

See https://www.kaggle.com/rhuebner/human-resources-data-set. I think it's a great view on top of a datatable.


Try the Pandas Profiling library [0]. Why do all the clicking and plotting and specifying variables when you can have a couple lines of code do it all for you?

[0] https://pypi.org/project/pandas-profiling/


I probably wouldn't use that to visualise more than a table at a time, but it sure is useful if you're on pandas anyway.


Maybe a bit off topic, but I created a simple tool that converts text to database scripts. It reduces the time needed to create database tables and foreign key relationships. Currently MySQL, PostgreSQL and MSSQL are supported.

https://text2db.com/


I’ve used tableau and such, but lately Apache superset is filling my needs. Check it out!


I recently made a tool called Daigo. It's not as powerful as Power BI, Tableau or any OSS alternatives. But if you're looking to create a line chart or a dashboard from a SQL table or a CSV file, you may find Daigo a quick and easy solution.

It works with SQL databases and CSV files. Since it's an offline desktop app, it's free to use and you don't need to set up a server or upload data.

https://daigoapp.com/


Reverse question: What tool do you use to get data into the database? Google Forms is great, but if you want the data to go into your own database what tools are available?


I don't use it, but LibreOffice has the Base component that can be a data entry front-end for many ODBC and JDBC supported databases. I think OpenOffice has/had the same.

If you're using Microsoft, you can use MS Access for data entry to similar ODBC and SQL Server backends. If you want to do some VBA programming you can set up a UI in an Excel workbook too.


Full Convert can copy data between over 40 database formats. https://www.spectralcore.com/fullconvert

Disclosure: I am the Spectral Core CEO (and author of Full Convert).


Depending on the source, debezium might be useful.

Otherwise, there are many ETL solutions.


You can use DronaHQ (a low-code tool builder with drag-and-drop UI components to quickly build dashboards, frontends, etc.) to visualize your data from SQL tables or other databases via table grids and Plotly charts. https://www.youtube.com/watch?v=vfaqaC2rdzs

Give it a go at https://www.dronahq.com


apache superset

or its commercialized offer: Preset (Cloud)


> or its commercialized offer: Preset (Cloud)

Which includes free version for five users. But it is SaaS so your database has to be visible from the internet.

I really like it though.


Not sure about Preset specifically, but most SaaS vendors advertise their IP address so you can limit incoming database connections to the SaaS vendor’s IP address.


wizardmac.com

The most effective, efficient data exploration tool I've ever used. I'm a data scientist, but I use this before I write so much as a line of code.


Piggybacking onto this: which of these options has the best out-of-the-box offering for geospatial data (coming from a Postgres backend or REST API)?

I've tried Google Data Studio and Superset, but what I need is an integrated control where users can filter the report based on their location. Or alternatively: which would make it easiest for me to develop this control myself?


https://basetool.io

Credentials to admin panel in one click.


ReTool. The fact it’s interactive and scriptable with JS makes it better IMO than all other BI tools I’ve seen.


I'm deeply confused as to why people are downvoting this - maybe it's not a suitable answer for many use cases (it doesn't look like it'd be my thing most of the time, certainly), but I'm aware of enough very happy ReTool users to assume it must be useful for at least a decent subset.

Can somebody please explain why they consider this to've been a terrible answer?


Could be reflexive downvoting due to a perception that "low code" is not a data tool (I would disagree with this.) It could also be because Retool's reputation took a bit of a hit on HN several months ago (though marak has since taken an even bigger reputational hit...): https://news.ycombinator.com/item?id=27252066

Just guesses.


Someone recently posted this list to a similar question:

https://github.com/thenaturalist/awesome-business-intelligen...

From this I picked Metabase and found it to be pretty good.


If you want live-updating charts with under 1s refresh, I make http://www.sqldashboards.com/. It allows interactive forms, as demoed in the video. Disclaimer: it costs $47.


How has no one mentioned Sequel Pro and its successor Sequel Ace? It’s a classic, free native Mac app.

https://github.com/Sequel-Ace/Sequel-Ace


I don’t think it does visualizations.


I briefly used Metabase - it looks a little too high-level but seems to work.


Check out Appsmith. It can be used to visualise data using charts or tables. There's an integration for MySQL/MariaDB that you can use. You can also write SQL in Appsmith.



This is a Windows app but can do the job: http://pebblereports.com/


You already said "on the columns", so it will be https://columns.ai, :)


I wrote an app recently for macOS called Metaset: https://metaset.io/


Maybe not exactly what you are looking for, but https://whaly.io can do the trick!



Grafana


I’ve enjoyed using Cluvio (cluvio.com)


Metabase, looker, redash, periscope


Metabase


grafana is what I use.


HeidiSQL possibly?


Infogr.am


Metabase



