We’re Andrew, Gary, and Rohan, and we’re the founders of Explo (https://explo.co). Explo is a platform that allows you to create external-facing dashboards and embed them directly into your application, admin portal, or website. Create usage reports for your dev tool, dashboards for your educators on an online learning platform, or sales and profitability dashboards for sellers on your e-commerce platform.
We applied to YC with an idea in the restaurant space (we knew nothing about restaurants), but quickly pivoted to build a tool that allowed you to analyze data directly in your database or data warehouse without knowing SQL. As former data analysts and engineers, we had spent hours diving into databases to understand data and conduct analyses, so we wanted to speed up this process. Our early customers used Explo to analyze data by creating charts and graphs. They then wanted to share their visualizations, so we built dashboards, and then they wanted to share these dashboards with their own customers. We actually dismissed that request at first because we wanted to focus on internal analytics. But as we continued to work with our customers, we learned that B2B companies were getting more and more requests to share data with their clients. For example, a construction tech platform we were working with wanted an easy way to securely surface customer data on purchase orders and contracting costs from their database, directly within their product. A virtual events platform needed to share stats on registrations, attendance, and engagement times with event admins after each event they hosted.
These companies want a snazzy dashboard in their application, but building one usually takes a dedicated engineer weeks or months, plus ongoing maintenance costs. They also don't use BI tools such as Looker, Tableau, or Metabase because those are either not great for embedded applications or too heavy for this use case. Instead, they settle for sending CSVs over email, taking screenshots of internal analytics tools, and uploading pictures to shared drives.
After learning about the various pains in sharing data with customers, we decided to pivot and build Explo. We saw that sharing data with customers was becoming increasingly important, and the external analytics space was much more greenfield than the internal space. Our goal is to be the easiest and cheapest way for companies to create dashboards that can be embedded directly into their application.
Our current platform connects directly to SQL databases and warehouses. We don't copy, cache, or manage any data. This makes security much easier to negotiate and allows us to offer a plug-and-play solution that's easy to stand up. We provide a SQL editor and dynamic parameters that you can inject into your SQL queries, so our users can transform and manipulate data before it renders into charts and tables. We've seen our customers use these temporary transforms as a template for future data pipelines so that the heavy lifting is done ahead of time. We work with companies with a variety of data infra setups, from startups that create a read replica of their production database for us to connect to directly, to companies with a dedicated Snowflake warehouse, multiple data pipelines, and a clean data model built out.
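To make the dynamic parameters concrete, here's a minimal sketch of the idea; the table, column names, and templating syntax are illustrative rather than Explo's exact implementation:

    // Illustrative only -- not Explo's exact syntax. A dashboard query holds a
    // placeholder that gets filled in per end customer when the embed renders.
    const queryTemplate = `
      SELECT order_date, SUM(total_cents) / 100.0 AS revenue
      FROM purchase_orders
      WHERE customer_id = {{ customer_id }}
      GROUP BY order_date
      ORDER BY order_date
    `;

    // Swap in the viewer's own id so each customer only ever sees their rows.
    // (A production implementation would use bound parameters rather than string
    // substitution, to avoid SQL injection.)
    function bindParams(template: string, params: Record<string, string | number>): string {
      return template.replace(/{{\s*(\w+)\s*}}/g, (_, name: string) => String(params[name]));
    }

    const sql = bindParams(queryTemplate, { customer_id: 4217 });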
As part of building out our product, we’ve had to tackle some pretty interesting and essential technical challenges. We created a SQL builder that can generate SQL across every major database and data warehouse. We implemented a git-like version control system for our no-code tool so that the embedded solutions could be versioned just like code. We’ve had to put our networking hats on to programmatically connect to very secure databases through firewalls and SSH servers.
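To give a flavor of why cross-database SQL generation is non-trivial, here's a highly simplified sketch (not our actual builder): even a basic "top N" query needs per-dialect handling of identifier quoting and row limits.

    // Simplified sketch of dialect-aware SQL generation (not Explo's actual builder).
    type Dialect = "postgres" | "mysql" | "mssql" | "bigquery";

    // Identifier quoting differs across engines.
    function quoteIdent(name: string, dialect: Dialect): string {
      switch (dialect) {
        case "mysql":
        case "bigquery":
          return `\`${name}\``;
        case "mssql":
          return `[${name}]`;
        default:
          return `"${name}"`;
      }
    }

    // Row limiting differs too: SQL Server uses TOP, most others use LIMIT.
    function topN(table: string, column: string, n: number, dialect: Dialect): string {
      const col = quoteIdent(column, dialect);
      const tbl = quoteIdent(table, dialect);
      if (dialect === "mssql") {
        return `SELECT TOP ${n} ${col} FROM ${tbl} ORDER BY ${col} DESC`;
      }
      return `SELECT ${col} FROM ${tbl} ORDER BY ${col} DESC LIMIT ${n}`;
    }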
We have a lot of ideas about where Explo will go beyond a dashboard platform to enable our clients to share data better, and we're excited to hear your thoughts on the topic. How do you currently share data with customers? Have you built out dashboards for customers before, or used embedded analytics solutions such as GoodData, Looker, or Tableau? We'd love your feedback and to learn more about your own experiences sharing data!
Looks like you are using Highcharts and chart.js ... and Leaflet for the maps? Since you are using React, Airbnb's Visx [0] might be a very good charting solution, so you have a unified chart library with the flexibility to meet customer demands in the future. Also, you will find customers demanding mobile access to their data and dashboards soon enough, and with a system built on low-level d3.js APIs it will be easy to implement the charts in React Native using the same strategy visx does.
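For illustration, a minimal visx bar chart looks roughly like this (made-up data and styling; in practice you'd feed it whatever your API returns):

    // Minimal visx bar chart sketch -- assumes @visx/group, @visx/shape, and
    // @visx/scale are installed; data and dimensions are illustrative.
    import React from "react";
    import { Group } from "@visx/group";
    import { Bar } from "@visx/shape";
    import { scaleBand, scaleLinear } from "@visx/scale";

    const data = [
      { label: "Mon", value: 42 },
      { label: "Tue", value: 71 },
      { label: "Wed", value: 33 },
    ];
    const width = 300;
    const height = 200;

    // Band scale for the x axis, linear scale for the y axis.
    const xScale = scaleBand<string>({
      domain: data.map((d) => d.label),
      range: [0, width],
      padding: 0.2,
    });
    const yScale = scaleLinear<number>({
      domain: [0, Math.max(...data.map((d) => d.value))],
      range: [height, 0],
    });

    export function TinyBarChart() {
      return (
        <svg width={width} height={height}>
          <Group>
            {data.map((d) => (
              <Bar
                key={d.label}
                x={xScale(d.label)}
                y={yScale(d.value)}
                width={xScale.bandwidth()}
                height={height - (yScale(d.value) ?? 0)}
                fill="#4f46e5"
              />
            ))}
          </Group>
        </svg>
      );
    }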
Thanks for the suggestion! We have mostly migrated over to Highcharts, though we have a few customers using legacy charts in some of the other libraries you mentioned. We've talked about how we want to build our own visualizations with more low-level concepts (we mostly haven't due to resource constraints), and visx looks like a really solid place to start.
To your point, we have already had a few customers that display their dashboards on mobile web apps and have had a few native applications want to use us.
Looking at where Highcharts is today, I might have jumped the gun on that comment, as they have come a long way. We were using them three years ago. When I built our React Native version of the data dashboard, I was given the hard requirement to implement charts on mobile, for which there weren't any out-of-the-box solutions at the time. Nonetheless, if you find yourself needing to implement custom charts, especially if you have larger customers previously using Tableau or if you want custom-branded charts, I do recommend considering visx.
Just to add another recommendation to the mix. I'm an extremely happy echarts user. I recently had somebody comment about the graph that was in my product as they were surprised by how it was done, which goes to show how flexible and powerful echarts is.
If you go to https://public-001.gitsense.com/insights/github/repos?r=gith... and look at the timeline chart, you can see that it is pretty flexible. I was able to convert a scatter chart into the timeline chart which supports scrolling left and right fairly easily.
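For anyone curious, the scroll behaviour is mostly just echarts' dataZoom applied to the series; a stripped-down version looks something like this (illustrative data, not my actual config):

    // Stripped-down echarts sketch: a time-based scatter series with an "inside"
    // dataZoom so you can drag/scroll the chart left and right.
    import * as echarts from "echarts";

    const chart = echarts.init(document.getElementById("timeline") as HTMLDivElement);
    chart.setOption({
      xAxis: { type: "time" },
      yAxis: { type: "category", data: ["repo-a", "repo-b", "repo-c"] },
      dataZoom: [{ type: "inside", xAxisIndex: 0 }], // horizontal scroll/zoom
      series: [
        {
          type: "scatter",
          symbolSize: 12,
          data: [
            ["2021-01-04", "repo-a"],
            ["2021-01-11", "repo-b"],
            ["2021-02-01", "repo-c"],
          ],
        },
      ],
    });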
This is super cool, thanks for sharing! I am always excited to see dashboards in different large products because it helps us understand how to push our product so that dashboards like this are buildable in Explo.
No problem. I'm also planning on open sourcing the frontend when I have the available resources as the value is really with the data. If others want to learn from what I've done and/or use it to integrate into their own solutions, that's perfectly fine by me.
Dope. I do think your pricing is high, but so are Looker and Tableau...
---
What would be cool is dynamic dashboards that are launched via QR codes. I am specifically thinking about the cannabis industry:
A few years ago, I was making cannabis labels - which are regulated as to what information you must include - and what I did was add a QR code that went to a bit.ly link, which then redirected to the lab results for the cannabis as tested.
This let me scan a QR on a package, be taken to the lab results, and also have interest in the product tracked just by counting who and what was scanned - where and when...
The point is that it would be interesting to have a QR generated for every dashboard, such that if I put a product dashboard up and printed the QR on whatever product, I could then be taken to that board with a scan of said QR.
This would allow for tying IRL metrics to physical products and tracking how many scans happen... and a QR could refer to a custom board based on a variety of inputs...
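As a rough sketch of what I mean, using the common "qrcode" npm package (the dashboard URL shape here is hypothetical): every dashboard gets a printable QR image, and each scan lands on that board with tracking parameters attached.

    // Hypothetical sketch: generate a printable QR code that deep-links to a
    // specific dashboard, carrying batch/source params for scan tracking.
    import QRCode from "qrcode";

    async function dashboardQr(dashboardId: string, batch: string): Promise<string> {
      const url =
        `https://dashboards.example.com/d/${dashboardId}` +
        `?batch=${encodeURIComponent(batch)}&src=qr`;
      // Returns a PNG data: URL that can be dropped straight onto a label.
      return QRCode.toDataURL(url, { width: 256 });
    }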
This is a really cool use case! Especially for environmentally and ethically conscious brands.
We sort of do the same thing with QR codes in a factory at the moment, where we trace production batches, stock management, and R&D test data to link labelled factory items back to the MRP data and dashboards. This was implemented using Lowdefy [0] - interestingly, we also started Lowdefy out of the need for customer-facing dashboards and have since widened the scope into a range of other things as well.
Explo looks really cool! Congrats on the launch. I would love to see some videos on creating dashboards, especially filters, etc.
Wow, the QR code resurgence is real! I'm curious why you chose to support data sources such as Mongo and Google Sheets first as opposed to others I saw that were coming soon (Postgres, MySQL, etc.), and what challenges you experienced building out the connections. Looks like we support a completely different set of databases.
And is your data pulled directly from your MRP system or loaded into another database first?
And we'll definitely be adding more examples and videos on creating and embedding dashboards in Explo.
We currently support MongoDB, PostgreSQL, MS SQL Server, MySQL, MariaDB, SQLite, Amazon Redshift, SendGrid, HTTP requests, Google Sheets, and S3. We started by building apps with MongoDB, and the aggregation framework really allowed us to do more and more complex things in terms of data analytics, which I doubt one could pull off in SQL. (I'm no SQL expert, so forgive me if I'm wrong.) Our Lowdefy operators and application schema also took a lot of inspiration from Mongo's query language.
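As a rough illustration of the kind of aggregation I mean (collection and field names are made up), grouping production records by day with a count and an average is a short pipeline:

    // Illustrative aggregation: daily batch counts and average cycle time.
    // Collection and field names are hypothetical.
    import { MongoClient } from "mongodb";

    async function dailyStats(uri: string) {
      const client = await MongoClient.connect(uri);
      try {
        return await client
          .db("factory")
          .collection("batches")
          .aggregate([
            { $match: { status: "complete" } },
            {
              $group: {
                _id: { $dateToString: { format: "%Y-%m-%d", date: "$finishedAt" } },
                batches: { $sum: 1 },
                avgCycleMinutes: { $avg: "$cycleMinutes" },
              },
            },
            { $sort: { _id: 1 } },
          ])
          .toArray();
      } finally {
        await client.close();
      }
    }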
We usually deal with fewer than 100k records, so scaling has mostly not been an issue for us, and in such cases we can run the analytic aggregations directly on the MRP read replica [0].
We built Lowdefy so that we could build better, more flexible, higher-quality apps faster for customers, and then decided to open-source it.
Lowdefy is designed to work with any number of connections, so we'll be adding more as we grow. Also, we prefer not to add a "thin" connection, but rather to build out a well-scoped and tested connection - this can be tricky, and I'm not sure if this is the best approach in terms of marketing, as we have an extensive list of connection requests we would like to get to [1]. We'll start to prioritise these more in the near future. We'll also finalise module federation for connections, which will enable custom connections.
That is a cool use case for customer-facing analytics/information!
Every part of what you described other than the actual QR code generation is possible today! For each user input in the dashboard, URL parameters can be defined to default the input to a specific value on page load.
Your point around using the Explo dashboard to show more information is super relevant to one of the longer-term ways we are thinking about our product. Rather than just typical "dashboards", what we've built with Explo is a way to create user interfaces that share and communicate data. And we want to take it a step further, since we've realized that a lot of web development and user interface work is really just visualizing and communicating data.
Hi! We offer different pricing packages since we work with a varied set of companies that use the product in different ways. Some example use cases: metrics and data viz on a landing page, admin panel dashboards, billing dashboards, custom dashboard share links, etc. Depending on your use case we'd be happy to chat more and figure out a pricing model that makes sense for you.
In general we don't charge for # of dashboards or even traffic to the dashboards. We charge based on the # of end customer groups you are presenting dashboards to. This has been the most aligned with our customers since you can use the full power of the tool and only pay more as your own business scales up.
Just note that for some people, no pricing means "enterprise"/expensive. (I never even consider a product without transparent pricing myself, but others have had more success with this approach).
Got it - thanks for the feedback. To be transparent, startups and smaller companies we work with are paying $500/month (pretty cheap for replacing months of engineering work and ongoing maintenance). Our pricing then goes up from there depending on the number of end customer groups.
We work with our clients on how the pricing scales up since some customers have very few end customer groups with tons of usage, whereas some customers have tens of thousands of end customer groups by virtue of being more consumer facing.
Let me know if you have more specific questions about pricing!
"startups and smaller companies we work with are paying $500/month. Our pricing then goes up from there depending on number of end customer groups." It's totally fine to be transparent about the starting price, since you appear to have one.
That would make an excellent entry in a FAQ on the site. It would likely help qualify leads as well.
(I don't have any issue with the pricing, but would also not reach out given no pricing typically means "really expensive".)
To echo the above sentiments, no pricing is also synonymous with enterprise for me and I don't give any consideration to products without transparent pricing at all.
That makes sense. We are definitely not trying to hide an outrageous enterprise price tag. I responded to mchusma's comment with how our pricing currently works. Let me know if you have any specific questions about it and happy to dive deeper!
Would it be possible to put it in print somewhere we could find it, without iterating over the same question on a third-party news aggregation website that is unlikely to ever be found by anyone who could benefit from your services?
I’m excited about the possibility here but as a dev I’m waiting for that self-serve functionality. I would have paid a modest fee for a month or so to try it out and maybe converted to a prod customer if it met my needs.
Super interesting (and thorough) thread of thoughts, thanks for sharing!
We are definitely hoping to launch self serve in the near future but have decided not to while we iterate on the core features and ensure things work with a qualified and controlled set of customers. However, it is helpful to understand how important self serve is to developers.
Very cool! We've spent a lot of time + money doing this ourselves. I'd be interested to understand the pricing model and self serve options. I'd also be interested in adding alarms so we could use this to monitor our system status. We do that now but without visuals and it would be nice to have a slick internal status page with graphs and alerts for unexpected values.
Startups and smaller companies we work with are paying $500/month. Our pricing then goes up from there depending on number of end customer groups.
We are definitely hoping to launch self serve in the near future but have decided not to while we iterate on the core features and ensure things work with a qualified and controlled set of customers. However, it is helpful to understand how important self serve is to developers.
Adding in monitoring and alerts is something we are working on and has been requested previously. With regards to monitoring your system status specifically, we would likely need to investigate a means to connect Explo to where this data is generated or being collected. Explo would need to query this source on a certain cadence (probably pretty frequently) to have high fidelity monitoring. Currently we pull the data via queries and data isn't pushed to us (doesn't make sense in the current model, but is something we've chatted internally about).
Would love to chat more about the push model to better understand how Explo can plug into your system status. Is there a way that we could connect to the system status and pull the necessary information? Or can you push the system status data somewhere that we can then process?
> Our current platform connects directly to SQL databases and warehouses. We don't copy, cache, or manage any data.
Well that may be convenient for you but since these dashboards are external, does scalability become the responsibility of your customers? I can’t imagine thousands of users all hammering expensive analytical queries on a single unoptimized replica will scale very well.
Scalability is definitely something we're thinking about. Agreed that a single replica DB probably won't work well after a certain extent. Currently we do help customers to set up a data warehouse and optimized data model in Snowflake or other sources if needed.
We also refer customers to services or contractors who help them set up their data infra to fit their needs, as it's not the core focus of our product.
We have also thought about implementing a caching layer to improve performance, although that adds a few hurdles regarding data privacy and pulling live data.
Clicked into one of the demos, was greeted by around 10 progress spinners, possibly more. This instantly reminded me of some combination of Google Cloud's deeply mediocre console and waiting for a Jira page to render. I'd simply never prefer a tool like this or see it fit to force someone else to use it, and I don't understand why anyone would intentionally design an app to behave like this.
Please kill those progress spinners, the app is only rendering little bits of 2d art. Count each one as an individual statement of "you asked me for something but I haven't done my job yet, and now I'll make you pay for my laziness, and I promise if you click anything I'm just going to show you a million more progress spinners because I hate you and don't value your time"
Also please don't shoot the messenger, spinners were briefly an acceptable UI cue sometime around 2 decades ago, nobody honestly likes seeing them any more. If you explicitly design an app UI around the expectation of delays, it gives endless room to cut corners and add more delays. In other words it is optimizing the whole user experience for intentional mediocrity.
I'm curious what you would like instead of spinners? This product makes database calls that could take an unknown amount of time to complete. How should they indicate to the user that they are waiting for the data to build the chart?
It knew which queries to run before I clicked, because those queries are baked in. Why did I have to wait for the app to do something it knew it had to do if someone clicked? Of course we can't make e.g. protein folding an instantaneous task, that doesn't mean common workflows and request patterns can't be. I guess it's the difference between "did I ask the computer a hard question?" and "did I simply ask the computer to do its job?"
In the example dashboards, I'm guessing something like 100% of requests make exactly the same queries. Maybe in a typical corporate dashboard, 70% of users will pull up the default view before leaving. These cases are easy to optimize for and definitely worth optimizing for.
If you own the frontend and the database, sure. But this product hits other people's databases. The only way they can optimize this is by making queries it thinks the user might want. They can't cache the response because they have no way of knowing if it changed, since again, they don't control the data source.
If I owned the database they were getting data from, I'd be mighty upset at the insane amount of useless queries they'd have to make guessing what the user wants.
Easily solved by a user-configurable staleness value with some reasonable default. Google doesn't crawl the entire web in response to every query, because for the vast majority of queries it's unnecessary. For those where it might be necessary (like news), they instead crawl at a higher frequency or use some special flow (like they do for tweets); either way the result is seamless, involves no progress spinners, and is well suited for the vast majority of users.
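Concretely, the staleness idea is just a cache keyed on query plus parameters, with a per-dashboard max age (a sketch, names illustrative):

    // Sketch of a user-configurable staleness cache: only hit the customer's
    // database when the cached result is older than maxAgeMs.
    type CacheEntry = { fetchedAt: number; rows: unknown[] };
    const cache = new Map<string, CacheEntry>();

    async function getRows(
      key: string,                        // e.g. hash of SQL + parameter values
      runQuery: () => Promise<unknown[]>, // hits the live database
      maxAgeMs = 5 * 60_000               // illustrative default: 5 minutes
    ): Promise<unknown[]> {
      const hit = cache.get(key);
      if (hit && Date.now() - hit.fetchedAt < maxAgeMs) {
        return hit.rows; // fresh enough -- no spinner, no database round trip
      }
      const rows = await runQuery();
      cache.set(key, { fetchedAt: Date.now(), rows });
      return rows;
    }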
Funny you should bring up Google. Google is so hard on infrastructure that most big sites have special handling for Google scraper requests. At reddit we put google on their own slower server cluster just so they didn't break the website.
We only did this because of the extreme value Google brings via traffic. But most crawlers and other things that made speculative queries like that were just banned.
Hi, really appreciate the feedback! I 100% agree that the way we handle initial page load is not ideal and is an area I have been meaning to prioritize for some time. We've just been swamped with other customer requests for new UI and charting functionality.
I am curious about your thoughts on a better way to approach this. My initial thought is a single loading state that ends when all the data is ready to display on the page. It feels inevitable that there is some loading moment while we are fetching the data. It sounds like, based on your comment, that you would prefer there to be no loading state at all, though?
Precompute whatever you can I guess, but I'm guessing the app has parameterized queries and stuff like that which are hard to precompute. In that case even a cache or LRU list is fine, say precomputing the handful of most common views people will encounter most often.
Imagine even if only Jira's new ticket page and 'open tickets' search result were pre-rendered so loading them took <200ms. The amount of hate would probably drop by 90%
Separately if you can reduce the workflow for a page from being a task in its own right (most of which is waiting around) to something as simple as a click, it can increase user confidence a lot. If something that took 5 seconds (+4 of which is just waiting) suddenly completes in 200/300ms, folk learn new tricks for your tool, like middle-clicking open a bunch of screens, or noticing they can open and close it much more easily. It makes the whole experience feel more agile, which definitely has an effect on loyalty
Hi Explo team. Congrats on the launch! My previous company and my new company both provide insights as a service, so customer facing dashboards are the core product, and therefore this is very interesting to me. A few questions for the team:
1. I see that startups pay $500/month. How do you compare that to Metabase which is $85/month (or $385 if you remove the Metabase attribution)?
2. Do you, or do you plan to, offer the ability to embed individual charts instead of whole dashboards?
3. Do you support custom chart creation via python/R?
4. Do you think Explo is a fit for our use case or a different tool: we're experts at SQL, good with Python, and looking to rapidly create and embed customer facing charts?
1. We have a few customers who have switched over from Metabase for a few different reasons. The main ones are that we offer much more extensive UI components, more chart capabilities, better security guarantees, and highly customizable styles. While $500 is more than Metabase's price point, we believe an embedded-first solution is worth the price since it will save you 10-20x that per month in development and maintenance costs.
2. Yes! A dashboard is just a collection of one or more charts/UI elements and so if you make a "dashboard" which is just a single chart, you can easily embed that. We have many customers who have this use case to embed analytics granularly throughout their app.
3. Not currently, though I'd love to understand more about how/why you would want that. We've heard this a few times as a "nice to have" but would love to build it out with a customer that really needs it.
4. It sounds like you'd be a great customer for Explo! It takes all of the hard work of building out user interfaces out of the equation and makes it so that you just need to specify data queries with SQL and then use our drag and drop interface for UI building.
Hi, thanks for the thoughts, and glad to hear this could be useful for you! Startups and smaller companies we work with are paying $500/month (pretty cheap for replacing months of engineering work and maintenance). Our pricing then goes up from there depending on the number of end customer groups.
We work with our clients on how the pricing scales up since some customers have very few end customer groups with tons of usage, whereas some customers have tens of thousands of end customer groups by virtue of being more consumer facing.
Let me know if you have more specific questions about pricing!
Thank you for your response. I work at a big corp. I think it is much easier for us to make something ourselves than to go through the sales cycle/approvals required to use this.
Having said that, I look forward to seeing what you will do in the next year or two.
“I think it is much easier for us to make something ourselves”
says every developer until they've burned 100x the cost/resources/time building a lesser version of the same thing that now has to be documented, supported, and maintained.
From what I know (especially for big corps), a $6k annual contract value ($500/mo) can usually be put on a credit card without triggering any lengthy 6-18 month enterprise sales cycles.
How does this compare with Google Data Studio and other SaaS dashboard tools?
Tableau and others are more like native apps that have a separate, overpriced "shareable" option, but there are already plenty of SaaS tools.
Explo is different from the other SaaS tools as it's designed specifically for external use cases, so our platform is built with features such as security, flexible design, responsiveness, and version control as first-class principles.
This looks great! We're extensive users of retool but while it's great for internal tools, the pricing and auth model don't really work for external tools. So this definitely fills that gap.
We literally just started our big partner dashboards project so I'm sending this over to the engineer working on it now.
One feature that's important to us is the ability to download tables as CSV. Sometimes those tables will be really large, so we'd want to paginate them but still allow the customer to download all the data in their csv. Is that something you can/will support?
We're fans of Retool, but definitely agree the requirements for internal and external use cases are quite different.
We already support CSV downloads from tables, so no worries there. We're excited to chat with your engineer and give them an in depth tour of Explo. Please have them reach out to founders@ or sign up for a demo on our landing page!
Not to jump in here, but have you tried tools like https://github.com/Budibase/budibase that would allow you to build both internal and external tools?
Our initial idea was a platform to digitize the restaurant cookbook! We wanted any chef or restaurant to be able to publish recipes and other food content on our platform.
For consumers, this would be a curated recipe platform with recipes from the best chefs as opposed to random users. For chefs and restaurant owners, this would give them an outlet to publish recipes and promote their work without the need to sign book deals and contracts that are often very unfavorable (unless you're already a celebrity chef).
We tried pitching various iterations of this idea to get restaurants onboard, but unfortunately we weren't able to sell them a platform that didn't immediately boost sales.
Regarding the SQL builder - Is it a visual editor that allows me to explore my database and create queries?
Also curious to know how you would differentiate your offering from say retool or other low-code builders which also support multiple databases and have chart components?
I am assuming that with screen rights, a company can expose a dashboard built using retool to their customers as well.
Hey! You can use Explo to explore your database and visualize data, but it requires using SQL to initially access the data, so it is not completely visual.
Retool is a great example of a tool that we feel is adjacent to us but different, specifically because it is not built to be customer facing. You can expose a Retool dashboard, but it isn't built to be embedded natively into your web app, and there is no way to customize the styles to fit your app.
Additionally, our charts, visualizations, and data representations are much more catered to application dashboards, whereas Retool has generic charting components in a more raw form.
This is interesting. We are currently evaluating Superset to provide dashboards to our customers, which is very feature rich and open source. Why would we want to go for Explo in lieu of Superset?
Not trying to be rude but I’m curious what makes you believe quicksight has a strong embedding story? I’m generally a quicksight fan (although it has so many things it could do better with a little TLC).
What setup are you using that makes quicksight work well when embedded?
Quicksight’s embedding story is worse than tableau’s. And tableau isn’t that great with embedding either.
Great question! Explo is designed specifically for embedding, so our product is designed with features such as design flexibility, security, and responsiveness top of mind.
We allow users to customize the design of their dashboards using CSS and Markdown, and have design elements such as containers to group analyses together. We found that users were hesitant to embed Tableau, Quicksight, or other BI dashboards because they didn't fit into their application.
Also, Quicksight and a number of other tools require you to define a data model, create analyses, and then add them to dashboards. Our platform is oriented around the dashboard, so you write SQL, create visualizations, and design dashboards all in the same view. This gives users a lot of flexibility when creating dashboards and also speeds up implementation.
Our product currently isn’t self-serve, but https://www.loom.com/share/fd63361f850d44f68cd395a552f5548d is a quick video walking through how to build dashboards. Feel free to take a look at a few examples of our embedded dashboards here as well: https://www.explo.co/gallery.