Tableau is a good product. I use it daily. There simply is no better alternative for quick ad-hoc, enterprise-level, intuitive graph and dashboard making.
I've been at large orgs where we evaluated all the cloud services like Mixpanel and their ilk, but they serve very different use cases.
Tableau + Redshift (or Vertica, but Redshift is way less expensive) is the typical BI tech stack. When I go to a client or start a new job and ask what they use for analysis, and they say "GoodData" or "MicroStrategy", I know my job will be a pain in the ass to get done. It's as if you're interviewing for an engineering job and they say "oh, we all use Notepad here". It's night and day in terms of productivity.
I don't know what this means in terms of the stock, but all I can say is that I vouch for the product and think it has a lot of room to grow (from a BI perspective).
We used Tableau for about a year and then decided to drop them. The reason was a sudden change in licensing terms where:
1. The minimum step increase in server cores changed.
2. The cost to run HA (high availability) increased.
3. The cost per core increased.
The only thing worse than an expensive product is one with unpredictable pricing.
Tableau is not scriptable. 3 places I have worked are moving away from it because of this severe limitation. You have no control over what queries it writes and you can't easily add simple scripts to make things more seamless or performant, like say a quick scripted query to populate a dropdown menu.
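To make "scriptable" concrete: the complaint is that even a five-line script feeding a dropdown from a live query has no hook in Tableau. A minimal sketch in Python of that kind of hook (the table and column names are made up for illustration):

    # Populate a dropdown's options from a quick query -- the kind of
    # small scripted hook Tableau doesn't expose. Names are hypothetical.
    import sqlite3

    def dropdown_options(conn, table, column):
        # Distinct values, sorted, ready to hand to a UI dropdown.
        rows = conn.execute(
            "SELECT DISTINCT {0} FROM {1} ORDER BY {0}".format(column, table)
        )
        return [value for (value,) in rows]

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (region TEXT)")
    conn.executemany("INSERT INTO orders VALUES (?)",
                     [("EMEA",), ("APAC",), ("EMEA",)])
    print(dropdown_options(conn, "orders", "region"))  # ['APAC', 'EMEA']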
It's not very scriptable but that's not the use case of Tableau. Generally data analysts do not know how to program. They meet with PMs to understand concerns, interpret concerns into metrics, write some SQL, and display them in dashboards.
While Tableau performance is a thing people debate, it's a non-issue. Generally, the data analyst produces a report that is consumed by 1-10 people, and if consumption grows beyond 10, the organization will "productize" the report by putting engineering resources on it and building something custom with Highcharts or D3.
>Generally data analysts do not know how to program
In my experience, the ones that don't often produce some really erroneous or nonsensical results. Or even if they do get some things right, they have no ability to drill deeper.
Most big companies are actually just fine with nonsensical results. It's a bit odd, but I've realized that nobody who uses BI actually cares much about data quality or integrity.
BI in large corporations is usually done by people with no clue about the data, producing bullshit results, leading to "insights" for management (who at that point have no clue that all the cost columns were added up regardless of currency), leading to a final "product strategy" that gets implemented over the next few years. Meanwhile, in the companies still producing innovation, this work is usually done by a rogue team applying common sense and ignoring that strategy all day long.
I was a SI/BI engineer for a while; let me chime in to defend my people just a little bit :)
Certainly there are lots of bullshit metrics, and there is often very little desire to audit data or even do a by-hand sanity check once in a while.
That being said, there were _many_ engineers who actually gave a fuck about making sure we had telemetry in actionable, meaningful, and appropriate places. Unfortunately, the gap in analytics would take place far over their heads, when upper management would produce documents with all the beautiful charts and graphs, communicating... absolutely nothing.
Graphs with mixed axis scaling (log vs. non-log), graphs with mixed units, conclusions that are a total stretch from the data that's there, and obvious conclusions ignored because they don't line up with what the managers want to say.
For there to be USEFUL BI (and such a thing certainly can exist) there needs to be a "this isn't bullshit" mindset up and down the whole stack, not just in engineer land.
While what I said was actually an amalgamation of true stories I've seen myself, I was certainly exaggerating a bit :)
I think a lot of things come together for this pattern, which definitely happens quite a lot. Best predictor for this scenario is a clear divide between product and BI people, usually exacerbated by the fact that non-technical people get hired for BI.
Group bias from backend engineers, who see BI as "just some point and click" that isn't really that challenging (but usually much better paid, as BI caters to sales/bizdev/C-level, which is always closer to the money), also doesn't help.
Also, as you say, often the value of BI doesn't trickle back down the chain, and so tracking is at most a second thought for application engineers when going to prod.
Having built two analytics stacks myself (and seen the perspectives from BI, backend, sales and marketing alike) these are exactly the drivers we tackled first at our current company.
Our recipes against this common failure are: Marketing directly working with product engineers for their tracking requirements (with just some coaching from tracking pros) - and dual-using our analytics stack (Snowplow -> Redshift) for both operations (user segmenting, push notifications) and business intelligence.
This is usually considered a big no-no in BI circles which all tend to duplicate data to be on the safe side, but it helps immensely to make sure product and engineering are just as interested in data quality as the BI guys, as they depend on the very same data.
It's certainly not for everyone, but for us, it works really well.
I've only ever worked at small companies, and I guess avoiding things like this is why, but I can't help but wonder how common this is, and whether in fact no one has ever used BI effectively. I find that hard to believe, but really I have no idea. It's an interesting thought; it makes you want to chuckle or shake your head, or both.
Of course I wouldn't go so far as to say no one has ever used it effectively, but it seems like in the vast majority of cases, it's bullshit all the way down.
It's a surreal experience when you realize how executives are making decisions based on something that, after a moment of critical thought and common sense, clearly has no more relevance than some integers pulled out of rand(). Chuckle and shake your head is indeed about all you can do.
Just try not to visibly smirk if you're ever involved in acquisition talks. Play along.
In other words, it is for big shitty companies that have hired a bunch of useless people who can't do anything and need reports to get their bosses off their back.
Run a trace on your database and watch the queries Tableau actually runs while building your worksheets and dashboards. It's doing a lot more than you think.
I've played with Tableau; is there no scripting at all? Their previous Director of Analytic Product Management, Stephen McDaniel, stated that in v8 they were going to add server-side JavaScript - and he said this in 2012...
Tableau is a great product, but with a high-price stigma. I really like it; when clients ask for a recommendation, my first answer is Tableau. They like the product and the features, but the reaction is always the same: nice, but it's too expensive for us, we'll go with PowerBI. Competing with Microsoft is a tough game.
It's not PowerBI they are competing with. It's Qlikview. Their biggest issue is that Qlikview is viral: you can use the evaluation version ("personal edition") for as long as you want, but it has restrictions. With Tableau, you get a 15 day trial. Consequently, it gets used in businesses for small data projects, gets seen to be really effective, then the business realises it's effective and buys licenses.
Tableau isn't SAP, it just isn't. If they want to really get into businesses, then they need to be sensible and give people the opportunity to use it and then get it into the businesses they work in.
For instance, I used Tableau to learn it for the 15 days they gave me, and I learned quite a lot as they have great documentation, but after the 15 days I had no more opportunities to go through their tutorials. Partly I got busy on other things and wanted to revisit it after a few days, but I was also setting up SQL Server SSAS, which took me a bit of time. I got a maximum of about 5 days' usage; after that there's no point having it on my workstation, as it's far too expensive for me to justify buying a copy.
If I could have had more time with the product, I guess I'd know how good it is so I can recommend it to the business I work at. Unfortunately, I can't without cracking their trial limit code, which I'm just not prepared to do. For now, I guess I'll be recommending Qlikview which is a known quantity and very good also, though nowhere near as intuitive.
This is definitely an issue. It's difficult to really sell a tool like this to your managers unless you can prove over time what it saves you. At orgs I've been at, the decision process was like:
- order 1 license to test it out
- analyst gets an order of magnitude more work done than co-workers
- team manager buys 10 licenses and a Tableau server
The price is high, but if I'm charging $100-$150/hr and it takes me twice as long to do something in PowerBI as it does in Tableau, the desktop software quickly pays for itself. Then you can multiply that across a team of experienced BI professionals, and even the product team, because they can easily view workbooks and give feedback. That would easily outweigh the costs of Tableau Server.
Looking at their pricing ($500/user/year) seems pretty reasonable for such a high-impact role. If you're at a company, the question is if it saves you ~5 hours in a year.
It absolutely does. I know what life was like before Tableau (MS Excel + webquery, custom Highcharts, cobbling together graphs from multiple SaaS services, custom Python+R code) and I can say it probably saves 5 hours in a day or two.
That is not an adequate comparison. You aren't done after you've bought the software. You need someone to set it up as well, and that cost scales with the quality of the software you bought. You need to look at the cost you have once everything is said and done.
It is expensive if you have a lot of people. Paying $2k per license for maybe two or three analysts is a good deal. Compare this to a startup like Looker, which costs substantially more to get your foot in the door (last I checked; correct me if I'm wrong).
IBM has chosen to position "Analytics" as a major piece of their company. Presumably they want investors to think this could generate billions of dollars in revenue yearly. The worst-case scenario for me would be IBM buying Tableau, but clearly, for those of us who use it, Tableau is one of the best tools out there for what it does. Will it be able to keep its market share 5 years out? Who knows.
For the record, Periscope has an incredibly poor UX. It suggests "run selected query" when you highlight a single column name... as if that's valid SQL.
Periscope is in a distinct, lower tier of usability. You write SQL. Or, put differently, the innovation of Tableau was so you don't have to write SQL for most visual encodings, even non-trivial ones.
However, that does limit you a bit when wanting to do deeper analysis. Looker/Periscope are tools for analysts who know and are comfortable with SQL. (Disclaimer: no affiliation, just trying to build a desktop tool that sits in the spectrum between Tableau and Jupyter Notebook.)
You can still write SQL in Tableau, that's table stakes.
Funny to me: it took years for Heap to introduce that.
Edit: we use and help customers with IPython/Zeppelin etc., but that doesn't change the fact that these things are 10 years behind as IDEs/visual interfaces. Adding charting & drilldowns is much harder than a button to turn a table into a line graph, and analysts are wasting a lot of time because of this.
Yes, completely agreed. There's a continuum of UX between BI tools like Tableau/Qlikview and the interactive prompts used by most data scientists (mainly around the non-interactivity of display) - it's sad that an 80-character static tty display is still the state of the art in that space. While I'm very comfortable in ipython/terminal, I often wish I could easily pop up a Tableau interface on top of my pandas analysis. Jupyter is getting there slowly, but the widgets people are building are still more geared toward display than interacting with data (both original and derived) directly.

Even the new set of SaaS BI tools (Looker, Mode, etc.) leave interaction as a secondary concern. I think that's the main difference between tools geared toward reporting/publishing (most of the BI world) and tools geared toward data analysis (R, Python, etc.). As you point out, the tools for data analysis are years behind the BI tools in terms of UI/UX (or their focus is more on IDEs for developers rather than analysts).
As a side note, I've been following your work for a while now (both Superconductor and Graphistry) - mind if I shoot you a few questions privately?
I am stuck with SAP Business Objects (Crystal Reports) at my company. It is ok, but the features of Tableau look like they blow it out of the water. I've tried to get discovery projects started up to look at other options for due diligence. However no one seems to go for it. Oh well :/
SAP have been gutting Business Objects for some time now. I know someone who is fanatical about Business Objects Universe Designer; she tried to use the later version and found that a lot of features were stripped away, so she went back to the older version and refuses to upgrade until she's certain the new versions are on par with the old one.
MS SQL Server is very expensive, especially since their recent license changes. Microsoft BI (PowerBI, etc.) exists to lock you into their ecosystem (MSSQL, Excel, SharePoint/Office365, Windows Servers/Azure cloud).
The thing is, if you already decided on MS SQL Server, why buy Tableau? The company I work for has both, and Tableau is the most likely to be dropped of the two.
Hello? The MS SQL Server license has been changed to be per CPU core. Consultants want you to create a separate MS SQL cluster (several bare-metal servers with many CPU cores). So MS BI will be a lot more expensive than you first think. Most just see that Tableau is more expensive than MS BI and decide for MS BI, only to find out MS SQL is rather expensive. Also, you need at least MS SQL Server 2014, despite dark-pattern talk (lies) that 2012 may work too. Oh, and don't forget they will mention that you need a separate SharePoint cluster (web and app servers) to run the Excel web service and the SharePoint-based dashboards. Oh, and you will need to upgrade all your clients to Office 2013. And they will invite your CTO to a Windows 10 event and give away some WinPhone10 devices and tablets. And then you will have to figure out how to comply with local state law by upgrading to the expensive Windows Enterprise LTSB license.
At least in the case I'm describing, the company already has an expensive MS SQL Server 2014 license (core licensing) which allows for an unlimited (ok, something like 50) number of VMs, and they already have the expensive bare-metal servers, so that's not a concern, since it's a sunk cost.
All clients are already on Office 2013, so that's not a concern either.
The Sharepoint cluster probably is a concern, but that's about the only thing, I'm pretty sure it'll be cheaper than Tableau.
I wish they'd invite the CTO to a Windows 10 events and give us some giveaways :) . No local state laws, this is for an insurance company in South America btw :)
Edit: the opinions are strictly my own, I'm not involved in any decision making sadly (no Windows 10 giveaways for me :P ), etc.
Finally someone else who has felt that pain of GoodData.. I was hired to make it work after GD's consultants set it up at my current company. What a nightmare. It took a year+ but I finally convinced everyone that it was terrible and we switched to Tableau, which let us use our own db's instead of GD's terrible "logical data model"
Fewer moving parts. If I get a request from a PM, I quickly open up Tableau, select a datasource or write some SQL, and deploy. Sometimes data asks that they think would take 24 hours or more can be turned around in under an hour with Tableau. You can also train and hire less technical people to use Tableau... it's basically like an advanced Microsoft Word.
It depends on whether you want your BI to be generated by a battle tested product, or by a melange of a rickety, buggy, half baked stuff that hardly anyone is able to use without paying $10K+/yr for a support contract.
We jettisoned Pentaho about 2 years ago because of the effort required to develop and maintain it.
At the time, between the ETL and reporting layers, it acted more like a set of different open source apps simply branded together, and interop required more effort than what should have been necessary.
Debugging was a nightmare as well. Huge stack traces on simple errors made locating problems difficult, and there seemed to be little information in the community. The number of Java library layers spewing out on a simple JDBC driver error was mind-boggling. Pentaho of course has an interest in revenue from support contracts, and most inquiries into simple issues in the forums led down that path. It may have gotten better since, but there was a long way to go.
We switched to Tableau on the front end, and "old fashioned" ETL scripting in Python (now some Go) on the backend. At the same point today I would consider something like R/Shiny, but for speed of implementation Tableau would also be a contender.
Basically same here, but a few years earlier. I consider a three hundred line stack trace to be a valid reason to replace a system with something less operationally challenging. When engineers are $100+/hr, and some way more than that, it often ends up being cheaper as well.
No, although it required a little ingenuity. We would have considered using Shiny Server Pro for the auth and other features but we wanted to use our single sign on service. What we did was we put Nginx in front and had it call a tiny Rails app that handles authentication via the SSO. If Rails returns the correct status to Nginx, then the client is redirected to the Shiny page they requested.
It's a bit of a one-off, but it does work well! One more step towards making R fit for production :)
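For anyone wanting to replicate the setup: nginx's auth_request module is one way to do this gatekeeping. Below is a minimal sketch of the auth endpoint; theirs was a Rails app, so this Python/Flask stand-in is purely illustrative, and the cookie name and validation logic are hypothetical.

    # Tiny auth service that nginx consults before proxying to Shiny.
    # (The original was Rails; Flask here is just for illustration.)
    #
    # Matching nginx sketch, using the auth_request module:
    #   location /shiny/ {
    #       auth_request /auth;
    #       proxy_pass http://localhost:3838/;
    #   }
    #   location = /auth {
    #       internal;
    #       proxy_pass http://localhost:5000/auth;
    #   }
    from flask import Flask, request, abort

    app = Flask(__name__)

    def sso_session_is_valid(token):
        # Hypothetical: in reality, validate the token against the
        # single sign-on service.
        return bool(token)

    @app.route("/auth")
    def auth():
        # nginx only looks at the status: 2xx lets the request through.
        if sso_session_is_valid(request.cookies.get("sso_session")):
            return "", 200
        abort(401)

    if __name__ == "__main__":
        app.run(port=5000)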
Just to offer a different opinion: I know a lot of people who have been switching away from Tableau, or not choosing it and using a different solution (or multiple) instead.
I've been meaning to write a blog post about this actually. Maybe I'll actually do it :)
I guess it depends what you're trying to do. What part(s) of what Tableau offers do you care most about? How big of an organization?
Chart.io is one product I evaluated that met our requirements. We cared most about collaborative querying and the ability to run templatized reports (including regenerating them via the API) so we ended up going with Mode Analytics
> what it means in terms of stock

Exactly. The problems (the big price drop) arise only when the stock price is at unreasonably high levels based on unreasonable growth assumptions by Wall Street. So the same people that cause it to go 'bubble' then cause it to go 'burst', wreaking all kinds of collateral damage (employee stock options, morale, etc.).
Wall Street isn't responsible for the bubble. It's the company leadership misrepresenting the company's road map. A stock won't become overpriced if the CEO blatantly says that sales will probably only go up 5% this year.
Tesla appears to have been overpriced for quite some time. The stock kept rising based on emotional reactions to videos of falcon doors, a robot recharging arm, Hyperloop test tracks, and reusable spacecraft landing. If Elon Musk can make all that stuff happen, surely he'll find a way to drive the stock up to $500? But even Musk was saying the stock was overpriced. And now it is correcting.
From experience, there's a nontrivial intersection between the BI stack and the operational admin stack.
The difference is that the admin stack, with ELK, Riemann, Graphite, InfluxDB, Airbrake, Icinga and whatever else, assumes that 30% of the information right now is worth more than 100% of the information 3 weeks later. You know: my application port is closed, first-level support is getting hell, I need any available information right now.
On the other hand, the full data warehouse/Tableau stack assumes that 100% of the information is more valuable than anything else. Good DWH guys and their analysts can do very awesome data voodoo, predictions and analysis, and the admin stack won't be able to reproduce most of that. It just takes 2 months of data collection and 2-3 weeks of analysis to get the result of that grand voodoo. And Tableau can automate that voodoo after it's been done once.
The ELK stack is really fancy next generation "tail -f | grep" in a web browser. Which is awesome and really important, but a very different problem from "tell me how many users have used my app for three or more days in a sliding thirty day window".
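For a sense of scale, that second kind of question is a few lines of warehouse-side code. A rough pandas sketch, with column names assumed and a toy event log standing in for real data:

    # Count users active on 3+ distinct days in a 30-day window.
    import pandas as pd

    events = pd.DataFrame({
        "user_id": [1, 1, 1, 2, 2, 3],
        "timestamp": pd.to_datetime([
            "2016-01-10", "2016-01-12", "2016-01-20",
            "2016-01-11", "2016-01-11",   # user 2: one distinct day
            "2016-01-15",
        ]),
    })

    window_end = pd.Timestamp("2016-02-01")
    recent = events[events["timestamp"] > window_end - pd.Timedelta(days=30)]
    days_active = recent.groupby("user_id")["timestamp"].apply(
        lambda ts: ts.dt.normalize().nunique()  # distinct active days
    )
    print((days_active >= 3).sum())  # -> 1 (only user 1 qualifies)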
Furthermore, their support is amazing. I've emailed their support in the last couple of weeks and they've called within an hour to help resolve the issue.
Their desktop publisher or Tableau Server? Because Tableau Server is the biggest POS I have ever used. Their API is fundamentally broken, their authorization model misses large gaps, and client-integration is seemingly something they never considered.
Never heard of DevExpress, but just from going to the website it feels far less intuitive than Tableau. Tableau is so freaking easy that you don't need to be experienced in BI to be a user.
Though the outcome is a bit worse than I think most predicted, this has been telegraphed since about November of last year. Tableau has had a very high short-interest ratio for some time; everyone was expecting poor results.
What really exacerbated their issue was that they didn't get out in front of this and let people know how bad they were going to miss. So when they released earnings everything went to shit on them.
I can't really find any excuse for this that passes Occam's razor, except to assume this is a very rookie CFO/CEO combo in charge.
I made a comment about 3 months ago saying that 2010-2014 were very kind to new tech IPOs and that 2016 would be the year companies would have to either make money or face the consequences.
I'm following this over the next 2 weeks to see what the stock does. It's now a pretty good pre-merger arbitrage (takeover) candidate:
- Market cap under $5 billion: check
- Institutional ownership > 50%: check
- Insiders hold less than 5% of the company: check; in fact it's at 1.09%. Wow, management doesn't even want to own the company!
- Below its IPO price: not yet, that was $31, but it's still falling.
LinkedIn got crushed today in a very similar fashion on weak earnings. So, it's not just Tableau. In fact, the whole tech sector is starting to show some weakness which can be seen from the NASDAQ's recent performance. A lot of last year's tech-related IPOs have also performed terribly (e.g. Etsy, Box, Match, First Data, Square). Especially recently a lot of the previous momentum darlings like Apple, Amazon and Netflix have started rolling over, too. Facebook and Google were holding much better after strong earnings, but even those are starting to show signs of weakness now. Seems like we might be at the end of the phase of explosive growth over the last couple of years.
Maybe the people buying yesterday didn't expect poor results. Or at least they expected the kind of poor results that would make the stock go up. It definitely did come out "a bit worse" than they predicted.
Do you have any insights into how badly core infrastructure institutional investors are going to get hit when these large, institutional-owned tech companies deflate? Any metrics on how much of the equities market is owned by pension/retirement vs. free, non-dependent, non-core institutional investments?
About the time Power BI was hitting general availability, Microsoft was announcing the acquisition of Datazen, an alternative mobile dashboarding solution. Both technologies are components of SSRS in SQL Server 2016.
Acquisitions aren't out of the question where MS BI is concerned, even with apparent conflicts in product.
I don't know what Tableau Software is, and I found it pretty amazing that when I Googled it I still could not figure it out, because all the results were about the stock price. I think people might be focusing too much on the stock price if Googling a company's name doesn't result in seeing the company's website on the first page of results...
Tableau is a surprisingly good desktop application for business intelligence and analytics that, IMO, grew too big for its own good. You plug it into data (spreadsheets, databases, you name it) and it makes it very simple (drag-and-drop simple) to crunch some numbers and produce interesting graphs, visualizations and analysis.
Some clever stuff they had was (early-RoR-style) guessing of data semantics based on heuristics (Column labeled "date"? probably worth grouping by months. Number pairs like -0.3242,0.12345 ? probably worth plotting on a map).
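A toy sketch of that guessing idea, to make it concrete (the rules below are invented for illustration, not Tableau's actual heuristics):

    # Invented column-semantics heuristics in the spirit of the above.
    def guess_role(name, sample):
        if "date" in name.lower():
            return "temporal: offer grouping by month/quarter/year"
        if all(isinstance(v, (int, float)) for v in sample):
            if "lat" in name.lower() and all(-90 <= v <= 90 for v in sample):
                return "geographic: plot on a map"
            return "measure: offer sum/avg aggregation"
        return "dimension: group and filter by it"

    print(guess_role("order_date", ["2016-02-05"]))   # temporal
    print(guess_role("latitude", [-0.3242, 47.6]))    # geographic
    print(guess_role("revenue", [120.5, 98.0]))       # measure
    print(guess_role("region", ["EMEA", "APAC"]))     # dimension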
I have used trial and educational versions in the past and was always pretty impressed with the ease of use and the results. The product itself was > $1k so I never actually purchased a license.
Even though I didn't follow the strategy very closely, from what I've run into it seems like their offerings were all over the place. I saw a local newspaper using a hosted Tableau product to display data visualizations on their pages (the kind of visualizations made by a company whose DNA is desktop apps, so kind of underwhelming). Their website is all about Gartner and enterprisey lingo.
The product itself is, well, some advanced version of Excel. This means you either sell a lot of it very cheap, or very little of it for a lot of money. Website design suggests the latter. Too bad it seems to not be working.
I'm honestly sad to see them do badly as I think the product was really innovative and it really changed my way of thinking about report building and analytics tools.
It's a pretty good product but everybody and his brother sells analytics and visualization tools, most of the companies that have been around a while sell a few different tools since they've grown by acquisition.
Tableau is more differentiated than it looks, but it doesn't look very differentiated. If Microsoft ever gets around to making many of the "smart" features in Excel actually work (i.e. csv import and automatically choosing what kind of graph to make) they will have nothing to stand on.
I'm just waiting for the street to realize that Hortonworks and their competitors also have zero moat, and if anything, a GUI interface for a Hadoop cluster is value subtraction in the age of devops.
Honestly, I have never used anything similar to Tableau in its balance of "straightforwardness" and quality of results.
My use case is: here's some data, let's take a look at it and make some decent graphs including any sort of comparative analysis.
Excel is... well, Excel. The kitchen sink bundled in it contains pivot tables. Graphs are hideous out of the box, and very limited.
Apple Numbers makes cute looking graphs, but is extremely basic.
Viz toolkits in JS make you code for stuff that should be straightforward, at different levels of abstraction (Highcharts -> D3). And I have to put stuff in SQL and do the plumbing if I need any meaningful preprocessing.
Self-proclaimed "business intelligence" tools don't really target this use case. They solve enterprise problems like connecting to MDX sources that are not there for me.
I think if you are a smaller operation and you have specific requirements that you've thought out and aren't covered well by basic tools, then getting comfortable with JS and SQL (or R or matplotlib or Jupyter or whatever) is going to be the best way.
I've spent too much time watching people spend months and tens of thousands trying to bend 12 different third party tools together when a developer could do the whole thing in a week and it would actually do what you want it to do.
Nailing down the actual requirements is the hard part of building your own dashboard, it sounds like you've done that. If your needs are stable-ish or you have consistent developer resources it's going to be hard to find a better fit than that.
This. Freelancing is the future. Hiring in-house developers, or better still, freelance programmers, will ensure that you get BI tools custom-built to your needs, at a reasonable price. Of course, finding and hiring good freelancers who are not only dedicated to their craft but also reasonably priced is a somewhat involved task, but certainly doable.
That's why I was repeating the importance of him having some well thought out requirements. In situations where this is true (and it really happens sometimes), I've regularly seen good developers come in under budget and mediocre ones at least come close.
In 99% of cases, that happens because your requirements change. If you hire a competent developer, the total cost of development is always a pittance compared to what you pay monthly/quarterly to a product-based company.
> And I have to put stuff in SQL and do the plumbing if I need any meaningful preprocessing.
Actually, the plumbing isn't as difficult as most people make it out to be. Nowadays, it's the era of abstraction and FOSS infrastructure tools like jQuery, Bootstrap and, like you mentioned, Highcharts. I'm a freelancer who quite recently developed a Tableau replacement for one of my clients. This client realized that all he basically wanted from Tableau was a chart and some basic data manipulation like sorting, filtering and grouping (count, sum, average, etc.). All the things needed to develop this little app already existed in the FOSS world:
1. Highcharts/jqplot for charting.
2. Twitter-Bootstrap for showing a professional front-page and UI elements.
3. jQuery and jQuery-ui for DOM manipulation, AJAX handling for SQL queries, enabling drag/drop, etc.
4. PHP/MySQL on the backend (which is needed in any case).
As for plumbing, each of these tools is so well-documented and also a simple Google search will point to tons of StackOverflow links that happily provide an answer to any and every question you may have!
tldr; Library/framework plumbing might seem complex initially, but for a practiced web developer, it's a cakewalk!
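For a sense of how small that "basic data manipulation" surface really is, here it is in pandas (their backend was PHP/MySQL doing the equivalent; this is just for brevity, and the data is made up):

    # Sorting, filtering and grouping (count/sum/average) -- the whole
    # feature set the client actually used.
    import pandas as pd

    df = pd.DataFrame({
        "region": ["EU", "EU", "US", "US"],
        "sales":  [100, 150, 200, 50],
    })

    filtered = df[df["sales"] > 75]                      # filtering
    grouped = filtered.groupby("region")["sales"].agg(   # grouping
        ["count", "sum", "mean"]
    )
    print(grouped.sort_values("sum", ascending=False))   # sorting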
Hey, question on how licensing for highcharts works as a freelancer. Do you have your client buy a single website license with you as the dev? Or you have a highcharts dev license? Or something else?
Nope, we went for jqplot (http://www.jqplot.com/), it is GPL. The client initially wanted to go with Highcharts, but when we found about its licensing, I started looking for a FOSS alternative and this was what I found.
rms_returns - are you interested in discussing a project I'd like to have a freelancer work on? You can reach me at http://www.simplelegal.com and my HN username.
jqplot is what I use extensively. I haven't used D3 yet, but heard that it takes a lot more initial coding to come up with even a basic line/bar chart.
Power BI is Microsoft's answer to Tableau. You can judge on how well it achieves this. There's a free (as in beer) standalone Windows desktop application, and an Azure-hosted collaboration and hosting portal.
Microsoft now has PowerBI in its Office suite, and it's pretty powerful. It's a Tableau competitor, and I believe it's a big reason Tableau's stock is falling. It integrates with Excel, Access, etc., and a ton of other non-Microsoft services.
I'm not sure about the Desktop app, but the web app should work in any modern browser today. Features are generally prioritized using user feedback, and this is one of the top suggestions on the ideas site:
The free version of the service is too limiting to be very interesting to us. However, it is available from Microsoft resellers as a standalone for about $10.00 per month per user with no contract.
The free desktop version is not limited. It's very powerful. The growth rate is astounding. We get significant new features every month, sometimes weekly.
If you compared PowerBI to Tableau 6 months ago (or even 3 months ago), you are out of date. Its data-model capability is far superior to Tableau's. The formatting and graphics capability is catching up quickly. The intuitive UX is first rate.
Zeppelin is really neat, but still quite early stage (I can't even export notebooks :( ). For day to day use & teaching I've had good results with Jupyter + Spark
I'm still amazed that CSV is so poorly implemented in Excel, especially since it's such an easy fix to honor byte order marks and default delimiters. I have to assume "sorta works" is the strategy. We all know how well that worked out for Internet Explorer.
> or very little of it for a lot of money. Website design suggests the latter
In the BI market, Tableau is very much one of the low-cost, high volume players. They've done a great job getting into lots of people's hands, the question has always been whether or not they can roll the volume of cheap(ish) single user licenses into large enterprise-type deals.
Nail on the head regarding either selling a lot cheap, or little at high price. They went for the latter as you mention. I manage software licenses at my employer, and have been put off the price of Tableau, and the inflexibility they have. We're a consulting house, and mostly use Qlik. We started using Tableau last year.
The problem for us is that as a consulting house, our staff turnover rate is high (as expected I suppose), and this doesn't work well with Tableau's named license model. We've run into troubles where we reassigned licenses after people either left projects using Tableau, or left our company altogether. They ended up forcing us to buy extra licenses, which I was unhappy with.
My stance since then has been to avoid buying more licenses. We're getting our team skilled up more on web-based viz, so we can reduce our dependence on desktop applications.
A recent client of mine dealt with this by buying dedicated laptops to run Tableau on, when you needed to use it you just tracked down one of them. Seems like there should be a rule of thumb around this--software licenses should not be so expensive or inflexible that it's cheaper to buy a dedicated $1000 piece of shareable hardware than to get one for each user.
Interesting approach! How practical is it, though? I can export a twbx or whatever the extension is, so that those who need the dashboards can use Tableau Reader. I'm wondering, however, whether such a model would work for us.
The one thing that I like about Tableau is that its documents are XML, which plays nice with git.
It seemed to work reasonably well since the typical user was only needing it an hour or two each week. All of those users needed some authoring ability so Reader wasn't a solution in this case.
Extracts are binary yes. If you're using a workbook without extracts embedded, you're working with an XML file. We do that most of the time. We only embed extracts when someone with a Tableau Reader license needs to use a dashboard. We normally keep those out of source control.
Would you consider building the visualisations yourself with something like R and Shiny? Self host the server, develop in-house, no more Tableau like problems.
I was working on a project where the client wanted to use Tableau and I pushed instead for Shiny, which is the way we went. Quite happy with the decision.
It mostly depends on client needs. We had one project where I was thinking of going d3.js and wrapping it in an Electron app. Our investment in R is still not at the level I'd want; we're mostly a SAS house...
Most of our clients are tier 1, so in some instances they already have heavy investment in Qlik, but some are willing to shell out cash for Tableau where it makes sense. I've seen senior management in clients complain about the price though at times.
> My stance since then has been to avoid buying more licenses. We're getting our team skilled up more on web-based viz, so we can reduce our dependence on desktop applications.
Anecdotal, but I've seen this happen several times with clients who previously used Tableau.
It's a bit ridiculous. We get global discounts, but even then we still end up concluding that it's expensive. We only have desktop and online licenses; server licenses are through-the-roof crazy.
My last gig paid high five figures for a server license, and after a year of trying to get it to work as wanted, we threw it out and wrote our own in-house web solution with open source tools.
I used to work at a company where, under duress, I had to push tableau to its limits. My company spent hundreds of thousands of dollars on it. The main use was to provide reports on the data created by the application we sold.
It would be great if it just stayed as a better version of the excel chart maker. Instead, it was seen in my company as a way to replace programmers. All of these types of applications have the same flaws:
They sell a dream that you can turn a complex task that requires experts, into a simple task that anyone can do. In reality, you either have a simple application that doesn't do much, or an extremely complex application that still doesn't offer the performance and flexibility of just using an expert.
You end up creating a programming-via-drag-and-drop application that is more complicated to learn than actual programming. You replace general programmers with useless Tableau specialists.
It took the tableau experts weeks to do what a programmer could do in days when complex requirements came up. Most requirements were complex.
Tableau Server is very, very slow and requires massive resources to run.
Tableau had to run the full, unfiltered query so that it could generate filter lists with all of the possible options. This query was often too large to run.
> desktop application for business intelligence and analytics
I tried to buy the desktop app a few months ago and just got bogged down with the sales guy. They tried to sell the server products (with very high per-user pricing) when all I needed was 1 desktop licence and the free viewer.
It was difficult to work out if they were a consultancy, a service, or selling a product. I think those things are in conflict with one another and make it difficult to understand the level of commitment and risk.
Tableau is visualization software that you can bolt on top of many different data sources, from spreadsheets to BI tools and even AWS Redshift, to get awesome-looking dashboards and interactive visuals.
They are/were extremely popular with many sales organizations for internal dashboarding and even some media outlets used their stuff for public websites.
Sad to see this state of things for them, as the product was truly good at one point. That said, I have not touched it in 3-4 years, so I'm not sure how competitive it is now.
The product is awesome. I made some really neat visualizations and analyses with it during an MBA project using the 15-day trial. But it's ridiculously expensive - $1000 per year with some real limitations on how many data sources you can connect.
If I could dabble with it for $50/year I would. $1000 though? That limits their market tremendously to big companies only. It's hard enough to get an $80 tool approved.
--
Looks like they give it free for students now, which is cool: https://www.tableau.com/academic/student. But after learning it, there's no path to bring the software with them when they move to actual companies, because of the cost.
Tableau is sitting on top of a data warehouse anyway. $company will be shelling out much more on the data warehousing and ETL side than on the vis layer.
Even with open source tooling, it's a large and ongoing development effort to do BI right.
PowerBI may be subsidized or not. But in our use-cases and in their documentation, it really doesn't funnel customers to SQL Server. Sure, it plays nice with SQL, SSAS/MD and SSAS/Tabular. But it also plays nice with many, many other data sources individually and in mashup. Its internal data modeling and ETL capabilities rival the power of SSAS and SSIS and they are free -- I see them as more of a competitor to SQL than a front end to it.
You may, but that's not how the strategy guys at Microsoft see it, according to their own words. Data modeling and ETL capabilities rival SSAS and SSIS only in very trivial scenarios.
I'm not saying that to bash on PowerBI. I love that product and use it heavily and promote it to clients. It just serves different purposes than Tableau.
The data modeling engine IS SSAS. The only major missing pieces are row-level security and defined KPIs. The query language, DAX, and the backing columnstore database are the same engine used in SSAS Tabular.
xVelocity is only a (small) part of SSAS and it has a very limited application compared to OLAP even in the upcoming 2016 edition which greatly enhances the columnar store functionality.
Tableau offers a richer end-user customization interface. Power BI offers custom visuals, but these are definitely developer-only at this point. The behavior you get from the baked in pieces is more rigidly defined than that from Tableau (monthly releases address these things one item at a time).
Out of the box mapping is miles better in Tableau.
Overall Tableau is more fully-featured than Power BI. Tableau aims to be a complete BI presentation layer. Power BI is positioned as the self-service and personalized consumption component as a part of a larger BI presentation layer. Microsoft would prefer that the entirety of the BI stack be made up of their technologies, but Power BI can consume other data sources, and its reports can be embedded in other apps, so it fits into other technologies as well.
We could go feature-by-feature and Tableau would win the majority of presentation sophistication bullets (more fine-grained control of display, filtering, interactions - richer collection of built-in visualizations).
The differentiator for Power BI is more on the self-service end. Personalized dashboards (dashboard and report are two distinct concepts in Power BI) can be trivially created from published reports. Customized reports can easily be extended from published reports and datasets. There's a strong collaboration framework based on Office 365 groups and with lessons learned from SharePoint. There is also a pretty seamless upgrade path. Power Pivot models can currently be promoted to SSAS, and the expectation is for the same to be possible with Power BI models (all the same backing database technology).
I hate to be so vague, but there's a lot to both products. I'd be happy to dive deeper into some specific cases if you've got questions.
Since you sound fairly familiar with both platforms, what's your take on if and how well MS is doing on making PowerBI functionally equivalent to Tableau? At the rate they are going do you expect they will reach parity in the foreseeable future?
I think the answer is yes if you buy into the Microsoft stack, which includes SSRS and SSAS being used in conjunction with Power BI. SQL 2016 is a big BI release for Microsoft.
Power BI as a standalone product is pretty brutally limited in terms of data volume (250MB compressed data model is the max that can be hosted in the cloud service), and is missing the extensibility and flexibility that comes from a tool like SSRS.
As a self contained product, Tableau will likely hold the lead for some time. As a platform, I think Microsoft is beyond Tableau - they cover far more of the BI spectrum (and well, especially with SSAS) than Tableau seems to ever intend to.
Those are quite big products, making feature-by-feature comparison would be quite a daunting task.
It all boils down to this, in my opinion: PowerBI is for dashboards and reporting mostly. Tableau can do dashboards, but it can also do "analysis". Uncovering insights in the data that go way beyond simple cross-filtering.
In my mind, they are different tools for different purposes.
Just because you can doesn't mean everyone will rush to do it. This is a self-service platform, putting the data in the hands of decision makers. There is a hierarchy in how people use such software:
The user: He uses only the apparent features in the GUI. Like the grid and aggregation formulas in Excel. He learns how to use the software from other people showing him how to do stuff.
The Power User: He has deeper needs, but can only be bothered to use features 1 or 2 levels deep in the GUI. Like pivot tables, vlookups and index/match, logical operator formulas in Excel. He learns how to use the software from tutorials.
The Advanced User: He has a task and does not mind getting his hands dirty in order to fix it. Uses DAX and Cube formulas. Perhaps even Macros. He learns by googling his problem and reading documentation.
The Developer: Solves the problems at the programmatic level.
Tableau occupies a very specific spot. It is brilliant for the User who only consumes dashboards via clicking on them. No explanation needed and it is super polished. It is also powerful enough for the Advanced User who can perform relatively sophisticated analyses from the interface. Generally speaking, it is not a good fit for the Power User who doesn't have the need to justify using it. It is also not a terribly good choice for the developer because it is too restrictive and the programmatic features are not well thought out.
Qlikview costs quite a lot and I'd argue is also enterprise-grade, and doesn't have the same cost issue plaguing Tableau. Which is sad, because Tableau is probably the better product in many ways.
I am pretty sure Qlik deployments are way more expensive than Tableau ones because, as far as I know, it is more capable data-modeling-wise. It is also nowhere near as polished.
We're using them for a client, and pretty much every client I've worked with lately is using them. I think they'll be just fine, it's still a good product.
The AWS RedShift integration was a really good move, as one of the reasons we did not adopt them at my previous company was the requirement that the data set fit in memory of the server.
With RedShift, you offload the actual crunching to the db that can handle billions of rows, and then visualize the results.
They still are popular, especially in the government contracting world. It's almost a mandatory requirement that business analysts use Tableau when creating dashboards for clients.
I find that hard to believe unless you were searching just on Google News.
Searching "tableau" in Google stills shows their website as well as a large knowledge graph on the right hand side which tells you exactly what the company is about. There's News results at the top of the page, but there's still plenty of information, including the company website, on the first page of results.
This will happen for Google searches where there's a hot news story that is more likely to be what people are looking for. For spiking searches, Google automatically applies a principle called "Query Deserves Freshness" and ranks fresher pages higher. So the brand site, Wikipedia, and other more authoritative explanations of the brand will be temporarily deranked.
LinkedIn is down 38%. Tesla is down quite a bit. My guess is that some VCs were leveraged and had to liquidate positions. Free money from leverage does not look as good when equities go down... I think we are heading for a most violent year in the markets. Pharma and biotechs will most likely be the only bright spots, especially anti-anxiety drugs... :-)
Keep plugging at building great software that actually helps make this a better place. Charge for it so you can survive this downturn. It's a cycle, this will weed out the nonsense.
This is probably wrong. IMHO what looks like it has happened is that a handful of HN-relevant companies had weak earnings announcements yesterday [1] after market close, coupled with a weaker US jobs report [2], [3] today. The overall market is off, and there are a handful of major drops for the companies mentioned, but this is not some public tech-market panic event (yet).
Jim Cramer et al. are saying that it's the strength of the jobs report that is hurting, because as long as unemployment is ~5% the Federal Reserve might raise rates, whereas if unemployment worsens, the Fed might not.
You're saying that VCs (i.e. funds that do not invest in public companies) figured out they're too leveraged and had to sell the (public!) share of LinkedIn exactly as LinkedIn announced horrible results/outlook?
A few things in the above explanation aren't completely consistent...
And they cannot choose to just liquidate these shares whenever they want. There are rules, and these things require multiple months of advance warning and lock-in.
I don't think gas prices impact the Model S/X, the price point is just too high to be a bracket where that's a major factor. I think that for the Model 3 they're going to be production constrained to a point where we'll see gas prices come back to normal.
Also a big factor in the TCO is the lack of maintenance and the overall simplicity of an electric drivetrain, and that's completely independent of gas prices.
"Tableau Software tumbled 36% after management warned it was unlikely to realize benefits of certain tax assets. The news sent shares spiralling despite the software company's better-than-expected quarter. The company earned 33 cents a share, more than double estimates."
Suppose hypothetically that you have an agreement from a firm to pay you money in the future. That agreement is carried on your business' books as an asset.
Your friendly neighborhood tax agency is a special case of firm, which can commit to paying you money in the future. One way they can do this is, when you lose money in year N, they can let you carry that loss forward for up to X years, so that when you're taxed on your income in year N+3 you might be able to offset some of that income with the loss you made years earlier, reducing the amount of tax you incur.
The guesstimated value of that offset in taxes is your tax asset. If your marginal rate is 30%, and you can offset $1 million in revenue, the implied value is +/- $300k. Importantly, if you guesstimate poorly or your friendly local tax agency decides to change rules on you in the interim, you have to adjust the value on your balance sheet. This can be problematic if the tax asset is a material portion of your notional value.
Thanks for the explanation! I would say that if your PAST LOSSES are a "material portion of your notional value" you have a pretty problematic company anyway.
This would not be uncommon for a high-growth VC-backed company. If you've raised a billion dollars and burned through $800 million you've probably lost a significant fraction of that $800 million. (Some is capitalized but much will be straight-up OPEX loss.)
If you've got, oh, $300 million in cash ($200 million in remaining investment plus we'll say $100 million in collected revenue) and another $25 million in accounts receivable then the tax asset is worth about, round numbers, $250 million, or 43% of the book value of the company.
Having to write down 43% of your book value would suck.
This is, again, not outlandish for a company on that trajectory. If the revenue is growing rapidly and forecast to continue growing rapidly the company is in a wonderful spot.
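Running those round numbers as a quick sanity check (all figures hypothetical, per the comment above):

    # Hypothetical round numbers from the scenario above.
    losses      = 800e6   # burned-through losses
    marginal    = 0.30    # assumed marginal tax rate
    tax_asset   = 250e6   # ~losses * marginal, rounded up from ~$240M
    cash        = 300e6   # remaining investment + collected revenue
    receivables = 25e6

    book_value = cash + receivables + tax_asset
    print(round(tax_asset / book_value, 2))  # 0.43 -> the "43% of book value"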
Usually it was a loss in an earlier period that you've rolled forward to offset profits in a later period. They can expire though, so if you don't make a profit in time they become worthless. Companies carry them as an asset on their books until one or the other happens.
To add-on to previous commenters, they're usually the result of discrepancies between how taxes are actually charged, and how they're calculated for financial statement purposes.
For example, companies will calculate depreciation on their capital assets using various methods. However the IRS/CRA have their own methods for calculating depreciation on these assets. The difference between these two amounts can create a deferred income tax liability or asset. That is, if you record depreciation higher than what the IRS calculates as, the income tax expense you record on your income statement will be higher than the amount you are actually charged.
There are also rules which identify whether or not companies can put a deferred tax asset on their balance sheet. Under International Accounting Standards (IFRS), companies must establish that they can realize these assets by having sufficient income to apply them against in the future. In the U.S., as was the case here, a valuation allowance was applied to decrease the asset as they indicated that they weren't likely going to have a sufficient net income in future periods to apply that asset to tax expenses.
Anyway, I'm a bit rough on it, as while I've got an education in accounting, it's not my day-to-day job anymore. Additionally, I'm not that familiar with U.S. accounting standards.
It's basically when you can claim that you operated at a loss in a given year and for tax purposes carry over that loss to a different year. This lets you pay less taxes in a year when you actually make profits.
Tableau is a decent product, but a decent product doesn't make a great company. Main issues are:
A) Its sales team sells it as an end-to-end analytics suite, but in nearly all actual implementations I've seen companies just use it for management to view a dashboard... essentially just a slightly fancier version of the Excel doc some analyst used to mail out
B) It's crazy expensive for what it is
C) If the company has actual data scientists onboard then those individuals can do far better analysis just using free open source tools
For the above reasons and more I've seen a lot of large companies that are far less excited by Tableau than they were a year or two ago. They've either halted larger roll outs that were planned or just moved away from the platform entirely. That leads to softer sales and slams the brakes on growth rates... hence why the stock has tanked. There's also the issue of valuation multiples against their financials which are also still frothy, even after the latest downward movement in the stock.
Those are excellent points. As an exploratory tool Tableau is well worth the money, but if you are just using it to replace some basic C-level reporting, then it's really expensive for that.
Before this drop they were trading at 14x forward revenue; now they're at 7x. Splunk was at 12x, now at 9.7x. We'll have to revisit what the typical multiple range is for BI/enterprise saas companies.
Seems like quite the correction for an earnings beat and the removal of a ~$50M deferred tax asset, though, especially since people already had negative expectations before the earnings release.
Tableau is definitely a useful tool. I tried getting folks with Tableau skills in a previous whoishiring post. The other positions I posted got a lot of responses, but not the Tableau-oriented ones. Looks like plenty of enthusiasts in this thread, though. Where were you all? Hey, on the off chance you are interested in working with Tableau at a think tank (RAND), send me (Chris) a note at dev.hiring@rand.org.
There's a reason for this. Unless you work in an organization that has already implemented Tableau, or is willing to take a punt on using it, then there is just no easy way to get skilled up on it.
It's limited to flat files, so you have to plan ahead on what data you'll explore. In terms of the viz - feature parity, but not quite the same at all if you want to do exploration. Which is too bad, because Tableau's ease of exploration is what makes it so much more valuable than PowerBI or custom d3.
What is it lacking in terms of exploration? I thought most of the exploratory benefits were in the visualizations, unless you mean the ability to ad-hoc connect to random databases and begin exploration (because you have to plan ahead to extract flat files from those sources)
I mean in terms of learning to use the exploratory/dashboarding functionality - it's fairly easy to do that with Tableau Public and public data sources (or your own if you are fine with the saved dashboards being accessible to the public).
Honestly, I don't care about stock rhetoric; it's an awesome product I use every week. I evaluated lots of BI options, because it's a pricey product, and I'm super happy with it. Tableau is actually fun (because it's so easy yet powerful) and meaningful. The only other decent alternative was Qlik, but it was pricier for the options I needed.
I'm gonna take a wild guess and say that they're going to keep expanding, possibly with a bit of delay. The running joke over here is that Fremont is basically TableauVille, or that they're to Fremont as Microsoft is to Redmond. Hell, rumor has it that new buildings are going up in Fremont right now just for their damn expansion (if you're in the neighborhood, take a wild guess which ones--it's not hard, just look for glass). I doubt even a big dip in share price is gonna quash their thirst for real estate for very long.
Ouch, good question. Although the building they want to lease in is part of a major development happening in Kirkland over the next few years. It will be full of other businesses and retail, so not just Tableau. It's also right across the street from their current Kirkland office I believe.
No, you don't get it. Tableau is for people who can't hack (or don't want to hack) d3js or crunch billions of records or whatever techie thing you seem to be so proud of. The mission is supposed to be self-service business intelligence, i.e., allowing a typical business analyst to drag, drop, and create good visualizations and analytics. Not dick around and waste time with databases and JavaScript.
Except that it only lets you do a set number of things. Most requests for information are not simple, and once you leave the comfort of simplicity, you have to become an expert in Tableau or, in many cases, just accept that it can't do what you want it to do.
Unless your queries are very simple, you are better off finding a programmer and asking them to do your job.
Lol, so I use Tableau every week. For example, today I had a government request to measure the quantity of customers outside a particular 40 km business radius: what percent of total visits are inside the radius vs. outside it. I had 8k postal codes.
This took me literally 5 minutes in Tableau. I would hate to see how long it would take you writing code.
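(For the curious, the scripted version isn't enormous either. Here's a minimal sketch in Python, assuming you already have a table of postal-code centroids, which, as the reply below points out, is the hard part. The file names, column names, and coordinates are invented for illustration.)

    import math
    import pandas as pd

    def haversine_km(lat1, lon1, lat2, lon2):
        # great-circle distance between two points, in kilometres
        lat1, lon1, lat2, lon2 = map(math.radians, (lat1, lon1, lat2, lon2))
        a = (math.sin((lat2 - lat1) / 2) ** 2
             + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
        return 2 * 6371 * math.asin(math.sqrt(a))

    visits = pd.read_csv("visits.csv")            # one row per visit, with a postal_code column
    centroids = pd.read_csv("postal_codes.csv")   # postal_code, lat, lon
    site_lat, site_lon = 43.65, -79.38            # the business location

    df = visits.merge(centroids, on="postal_code")
    df["dist_km"] = df.apply(
        lambda r: haversine_km(site_lat, site_lon, r["lat"], r["lon"]), axis=1)
    print(f"{(df['dist_km'] <= 40).mean() * 100:.1f}% of visits within 40 km")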
A programmer whose job is to make these kinds of reports will already have information about postcodes and so on, and will take less time than you. Tableau is quick at simple things but takes a lot of learning to do complex things. Often you have to resort to custom scripts, creating views, etc. Often it just won't have the ability to do certain things at all.
Anyone with any experience in drag-and-drop applications will tell you that once things get complex, the simple drag-and-drop solution ends up taking much more time, because you spend all of it trying to get the tool to do something that is outside its basic model.
You say "will already have information about postcodes and so on". How exactly do you get your hands on postal code geo data, have you tried that? Have you created your own geo postal code database, or maybe used an API. I'm talking international postal codes too.
I have, and I can tell you it's not easy, for example in Canada there are like over 10k postal code regions. Then imagine maintaining this dataset, no thanks.
Anyhow that's not the point, that's just one data set Tableau does very fast, but tomorrow could be something totally different and the next day, etc.
Sure for complex custom requirements no BI tool will work, but it's sure better than what people were doing just a few years ago with excel or hiring expensive firms.
Are the complex requests about getting/manipulating the data, or about the final visualization? I'm trying to build a desktop tool similar to Tableau but with the ability to easily drop down into Python and do data manipulation with the pydata stack. The intended audience is the new breed of quantitative analysts/data scientists.
Both. While Tableau has a visual query builder, sometimes the queries are too complex. For example, they may need self-joins, non-standard join criteria, temporary tables, etc. So ultimately the Tableau person has to ask a programmer or database person to create a script to use as the Tableau source.
One example that involves both is when the visualisation needs a contiguous date range but the data is missing some dates. As a programmer it is easy to just loop through a date range and put 0 where there is no data (see the sketch below).
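(To make that concrete, here's a minimal pandas sketch of the fill-the-missing-dates step; the daily_counts frame is a stand-in for whatever the real query returns.)

    import pandas as pd

    # stand-in for the real query result: two days of data, gaps in between
    daily_counts = pd.DataFrame(
        {"date": pd.to_datetime(["2016-01-01", "2016-01-04"]), "n": [12, 7]}
    ).set_index("date")

    # rebuild a contiguous daily index and put 0 where there is no data
    full_range = pd.date_range(daily_counts.index.min(),
                               daily_counts.index.max(), freq="D")
    contiguous = daily_counts.reindex(full_range, fill_value=0)
    print(contiguous)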
There are also lots of limitations in the visualisations themselves. The end result was that multiple visualisations had to be created to show something that a programmer could have built as one.
I would say mainly visuals; think Photoshop meets Excel meets JavaScript. One of Tableau's founders (Pat Hanrahan) was a founding employee at Pixar, so you can imagine that graphics are important to them. But it also has some great data options.
Seriously? It's almost as if companies don't want to hire a badass hacker to generate a business report used by non-tech workers. Tableau has a huge place in the BI community, and the last thing a manager wants is somebody hacking d3js to create some visualizations.
$1,000 to generate a business report? That's how much a single Tableau license is going to cost you. You still need to train the staff member to use it (yes, it's probably the most intuitive BI tool on the market, but like anything it has a learning curve).
1,000 USD/yr for a license. If you pay someone 100 USD/hr to do the work, compared to 50 USD/hr for a random Joe Blow analyst, the 50 USD/hr difference means the license only needs to save 20 hours a year to pay for itself.
ITT: A lot of people who don't realize 1k a year isn't an absurd enterprise price.
Also, to clarify: the licence is $1,000 for the first purchase. Subsequently, you only pay maintenance fees of about USD 200 per year to remain eligible for support and upgrades, and you can continue using the latest version you were on in perpetuity.
I can do all that too, but I'm fairly sure experienced Tableau people can do it significantly faster than I can for a comparable quality of result. (Tableau is the only non-open-source software in the stack my company runs).
Yup! Tableau isn't just about the end visualization. It is about the exploration of your data and the analytics that go behind the final visual. That is where the power of Tableau lies.
To some extent, companies might be trying to use too much data? Who knows.
Lifetime value capture in the analytics space is sufficiently difficult. Let's say you're an analytics customer:
Early-stage customer: use any one of the 100 analytics integrations listed on segment.com's website. You DO NOT need to track hundreds of parameters; conversations with customers have 100x the value of tracking the small things at this stage... usually.
Mid-stage customer: maybe you choose the one of those 100 companies that makes the most sense, and you start paying for it.
Having these early- and mid-stage companies as customers is tough; the vast majority of them fail.
Big Customer: SnowPlow Analytics? Splunk? Tableau?
That being said, I think there are businesses to be built here, but only semblances of unicorns and minotaurs.
15,000 web and mobile apps are coming out tomorrow. What % of them are analytics-related? (Hint: a larger share than you might think.) Just have a gander at the applications listed on promotehour.com and startuplister.com's list of app launch sites. There are now 100+ launch sites.
To win big VC money you have to take risks, and the analytics space seems well established, with a slew of extremely well-known best practices. I.e., not risky enough to warrant pouring VC-istan money into.
I would very much like to know what particular issues you are having as well. I work at Tableau, and I would like to make sure that the issues people are seeing are on our radar, if they aren't already.
Thanks for asking. I just installed 9.0.4, and here are a few examples:
- Tableau Server runs only on Windows, so why can't it use a TLS certificate and key from the CryptoAPI certificate store, rather than requiring these to be converted to PEM format (with Unix line endings!) and saved in the file system?
In an enterprise with an internal CA using Active Directory Certificate Services, these extra steps have to be done not only at installation but also every time the certificate expires. Compare the experience with Microsoft IIS: the server automatically requests a renewal from AD CS, retrieves the new certificate, and begins using it. (A sketch of the conversion step follows this list.)
- Tableau Server should be able to run as a Group Managed Service Account, so we can give it access to remote data sources without having to assign (and regularly change) yet another service account password.
- It would be helpful to have a scriptable installation process; as far as I can tell, there's no way to install Tableau Server without clicking through wizards.
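(The conversion step mentioned above, sketched in Python with the `cryptography` package: take a PKCS#12 export from the Windows cert store and write the PEM cert/key files, with Unix line endings, that Tableau wants. Paths and the passphrase are placeholders; this is just what the workflow looks like, not an endorsement of it.)

    from cryptography.hazmat.primitives.serialization import (
        Encoding, NoEncryption, PrivateFormat, pkcs12)

    # exported.pfx comes from the cert store (e.g. via Export-PfxCertificate)
    with open("exported.pfx", "rb") as f:
        key, cert, _chain = pkcs12.load_key_and_certificates(f.read(), b"passphrase")

    # PEM output is ASCII with \n line endings, which is what Tableau expects
    with open("server.crt", "wb") as f:
        f.write(cert.public_bytes(Encoding.PEM))
    with open("server.key", "wb") as f:
        f.write(key.private_bytes(Encoding.PEM, PrivateFormat.TraditionalOpenSSL,
                                  NoEncryption()))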
Thanks for the input. I am going to forward these on to the server dev team and follow up with them in person. They may be aware of some of these already but it is important to us to keep track of what is causing our users the most headaches. I appreciate you taking the time and letting me know your suggestions and the issues you are having!
1. No ability to use a 3rd party auth provider AFAIK, which means either keeping tableau passwords in a database or having users remember two different passwords
2. Embedded views use synchronous requests, which can easily hang the browser. Synchronous XMLHttpRequest has been deprecated for a while. I think I even saw a version of dojo from 2005 being loaded.
3. Reports are either static size or dynamic size, and unless you're using the (clunky but well documented) JS SDK, there's no way to tell.
4. Viewing reports in the browser is sloooooow. Browser console output is filled with warnings.
5. In order to put together sheets from multiple workbooks into a browser-based view, you need to either a) load the JS SDK for each of the workbooks and query for sheets, which is extraordinarily slow, or b) do it with the REST API, whose authentication is asinine in nature (see #1 and the sketch below).
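(For context, the REST API sign-in dance looks roughly like this, as I recall it from the 9.x docs: POST credentials as XML, pull a token out of the response, and send it as a header on every later call. The server name and credentials are placeholders; treat the endpoint shapes as approximate.)

    import requests
    import xml.etree.ElementTree as ET

    SERVER = "https://tableau.example.com"   # placeholder
    body = """<tsRequest>
      <credentials name="report_user" password="s3cret">
        <site contentUrl=""/>
      </credentials>
    </tsRequest>"""

    resp = requests.post(f"{SERVER}/api/2.0/auth/signin", data=body)
    resp.raise_for_status()
    cred = ET.fromstring(resp.content).find(".//{*}credentials")
    token = cred.get("token")
    site_id = cred.find("{*}site").get("id")

    # every subsequent call carries the token as a header
    workbooks = requests.get(f"{SERVER}/api/2.0/sites/{site_id}/workbooks",
                             headers={"X-Tableau-Auth": token})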
> 1. No ability to use a 3rd party auth provider AFAIK, which means either keeping tableau passwords in a database or having users remember two different passwords
The answer is SAML/ADFS; you should look into enabling that integration. If you are not using AD/LDAP, that's a whole different story, but SAML/ADFS is pretty much the standard way: since Tableau is a Windows service, it is very natural to just use AD/LDAP/SAML.
[1] I had to set the client-side map rendering threshold to a very high number (100,000, I believe) to get maps to render at all. Server-side rendering doesn't work, even though the server can contact the map servers and displays all of the examples in the documentation (Miami/Havana, I think?).
[2] It's been a few months, but I remember offline license activation being a weird process. Something like: point tabadmin at a license file, which generates a number or JSON or some other file, which you then paste into or point the UI at, which gives you another file to feed back to tabadmin... and at the end tabadmin gave me an error. Now when I go to "Manage Product Keys" it acts as though the server is unregistered, but the server still starts without error (it did not before the failed activation ritual).
I do have a ticket in with support for [1].
Given how much of a bitch it was to activate (or half-activate) I'm reluctant to investigate [2] further.
Also, I'd like to see a linux server. Tableau is our only Windows server, which weighed heavily against the product when we were considering alternatives.
So, I am not on the server team specifically. A lot of these issues that are mentioned may already be in the pipeline/on our radar. However, I think it is beneficial to make sure that we continually follow up to ensure the squeaky wheel gets the grease, so to speak.
All of these issues mentioned here will be sent to the server product owners and managers. :)
I am, however, on the maps team. I am curious about [1] above. I'll see what I can find internally on this. I am rather curious since this isn't something I have seen.
Offline license signing is a solved problem; Sophos, for one, has figured it out with the way they license their UTM product.
When they give you a license file, it's cryptographically signed with their GPG key, and the public key resides on the appliance for verification. All you have to do is get that license onto the system, whether by USB key, typing it in yourself in Vim, or simply uploading the license file in the web UI if you have access to it.
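(The scheme is simple enough to sketch. A minimal Python version using the python-gnupg wrapper, with invented paths: the vendor signs the licence file, the appliance ships the vendor's public key, and activation is just a signature check.)

    import gnupg

    gpg = gnupg.GPG(gnupghome="/etc/licensing/keyring")       # invented path
    gpg.import_keys(open("/etc/licensing/vendor_pubkey.asc").read())

    with open("/var/licenses/customer.lic", "rb") as f:       # the signed licence
        verified = gpg.verify_file(f)

    if verified.valid:
        print("licence accepted; signed by", verified.username)
    else:
        raise SystemExit("licence signature invalid; refusing to activate")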
Trusted Authentication is a poor solution to the problem of embedding views in my web app without having the end users of my web app hold Tableau Server accounts (a sketch of the flow follows the list), for the following reasons:
- I have to explicitly add each server IP address. I have no way to trust an entire subnet or range of addresses. This is a huge problem in an auto-scaling app server environment where I don't know the IP addresses my app servers will have. It is a major annoyance to developers whose DHCP-assigned, dynamic IP addresses keep changing.
- There is no API for adding trusted IP addresses. It is a manual process.
- The Tableau server must be stopped and restarted to add new trusted IPs.
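(For anyone unfamiliar, the flow being criticised works roughly like this; the host and view names are placeholders. The app server POSTs a username to /trusted and gets a one-time ticket back, but only if its IP is on the manually maintained whitelist, which is the whole problem.)

    import requests

    TABLEAU = "http://tableau.example.com"   # placeholder
    resp = requests.post(f"{TABLEAU}/trusted", data={"username": "alice"})
    ticket = resp.text.strip()

    if ticket == "-1":
        # what you get back when the requesting IP isn't in the trusted list
        raise RuntimeError("this app server's IP is not trusted")

    # redeem the ticket by embedding the view under /trusted/<ticket>/
    embed_url = f"{TABLEAU}/trusted/{ticket}/views/SalesWorkbook/Overview"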
There is so much low-hanging fruit. I feel like anything related to actually running and maintaining Tableau is ignored, and judging from the comments here I don't seem to be the only one.
I would add that I'm disappointed the only way these issues get attention is articles and threads like this.
It takes an ungodly amount of resources to do not very much. It's very buggy: the HTML charts frequently time out or just break with no indication why. Upgrades require a lot of manual work. The JavaScript API is reasonably documented but, again, buggy.
It's an unholy combination of Rails and Postgres somehow hacked to run on Windows. Really, they should just ship a Linux VM that runs these things decently.
Many Linux services have a concept of reloading: if the config file changes, you can send the running program a signal and it will re-read the config. This is very useful for production systems.
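(The standard Unix pattern, sketched in Python; the paths and the work loop are placeholders. The important part is the failure mode: if the new config is bad, the service keeps running on the old one.)

    import json
    import signal
    import time

    CONFIG_PATH = "/etc/myservice/config.json"    # invented path
    config = json.load(open(CONFIG_PATH))

    def reload_config(signum, frame):
        global config
        try:
            config = json.load(open(CONFIG_PATH))     # swap in the new settings
            print("config reloaded, zero downtime")
        except ValueError:
            print("bad config, keeping the old one")  # service keeps running

    signal.signal(signal.SIGHUP, reload_config)       # `kill -HUP <pid>` to reload

    while True:
        time.sleep(60)   # stand-in for the service's real work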
Tableau (9 at least) has no such concept.
Change the email address it reports to? Restart tableau.
Change the location of the SSL certificates? Restart tableau.
Want to apply an update to Tableau? Uninstall your current version and install the new one. Oh, and until recently the Tableau Server installer's file name didn't even contain the version number.
This product was not designed with ops in mind at all.
Edit: I forgot, I've actually had a Tableau server fill itself up with logs. Tableau writes logs in many different locations outside the Windows Event Viewer and doesn't include log rotation facilities for all of them.
Those things are indeed a pain with Tableau, and also common with a lot of Windows based applications.
I never understood why Tableau is Windows-only, or why it has the restart-to-reconfigure issue. Last I looked, it was largely a Tomcat and PostgreSQL based product.
What's the actual issue with restarts? Is the downtime due to a restart unacceptable? Is it that users lose application state when a restart happens?
Just trying to understand, since I've written software with the same restart-to-reconfigure workflow and would like to understand what makes it problematic.
If you run Tableau in an enterprise environment, you will likely have a lot of C-level executives, global sales teams, and more relying on Tableau being available outside your local business hours. This means any maintenance needs to be planned and communications sent out to all stakeholders.
If reloading was an option then there wouldn't be downtime, and I wouldn't need to schedule a maintenance window for something as simple as updating an email address. The idea being that if there is a config error during a reload, the system just continues uninterrupted with the original config. If I have to stop the system completely in order to run the config sanity checks when it starts again, the potential for prolonged downtime is much greater.
Thanks for that perspective, I hadn't thought of that.
Would a system that did something like an internal cut-over be useful? E.g., try to start a whole new instance of the application; if it loads, let it become the running application; if not, write an error log and shut down (rough sketch below).
It would still lose all the state associated with the previous instance, e.g. user sessions, but would avoid this specific issue.
I agree that it's pretty silly that things like email addresses need a restart, but I'm wondering in general how bad this pattern is.
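(Something like this, as a minimal sketch; the `myservice` binary and the health URL are invented. Boot a second instance against the new config, health-check it, and only retire the old one if the new one comes up.)

    import subprocess
    import time
    import urllib.request

    def start_and_cut_over(old_proc, config_path, health_url):
        """Boot a new instance; promote it only if its health check passes."""
        new_proc = subprocess.Popen(["myservice", "--config", config_path])
        time.sleep(5)   # crude; a real version would poll with a deadline
        try:
            urllib.request.urlopen(health_url, timeout=2)
        except OSError:
            new_proc.terminate()       # bad config: old instance untouched
            return old_proc
        if old_proc is not None:
            old_proc.terminate()       # cut over; sessions are still lost
        return new_proc

    # usage: proc = start_and_cut_over(proc, "new.conf", "http://localhost:8081/health")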
This is an extremely frustrating thread to follow. We have some people who have run a Tableau server with no issues, and now two people who are effectively saying "oh, it's awful but I have no interest in telling you why".
> I think a better question is what problems they haven't faced.
Deploying visualizations without having to develop any HTML or JavaScript code.
Publishing to the desktop (Windows or Mac), to the web, to the cloud, or mobile devices (iOS and Android). Publish to the server once, consume on all supported platforms.
Deploying a copy of a current site for redundancy, testing, or development: install the app, back up the primary Tableau database with its command-line admin utility, and restore it on the new box. All data, visualizations, users, and permissions are contained in that single restore step.
Tableau means I spend time working with my data instead of its presentation. It's not a perfect product by any measure, and it could obviously use some improvements, but it is a time-saver in many areas.
Which is pointless, since they had a great earnings report, were cash-flow positive for the quarter, and raised their guidance. Additionally, they mentioned on the call that no sector is more than 20% of their customer base, so they are diversified if a certain sector really tanks, like tech. They have $180 million in the bank, so they won't need the credit markets any time soon, and they are gradually approaching profitability.
Disclaimer: I have a position in NEWR and added today at the $21 level. This will certainly bounce back in the coming months or years. I will add if the irrationality continues.
It's trading at about 6x revenue right now and even lower when looking at forward revenues.
Their growth is tremendous and they are on a path to profitability. I get it: when the markets are getting banged around like this, there are few safe places. I just think the baby is being thrown out with the bathwater in a lot of cases. I'm using this to average down a bit and hold.
Yeah I see your point, but what I mean is that when so much of the current value of a company is based on future events, fear unfortunately is never irrational.
And for New Relic the expectations are really high. It could still turn out to be a great business story (I also think it will), but not necessarily a good investment, because so much of that future success is already embedded in the price.
AppDynamics, their direct competitor, is a late-stage VC unicorn ($2 billion valuation) and has delayed its IPO.
Dynatrace, another rather old-school competitor (like CA), was bought out of Compuware by Thoma Bravo (a private-equity firm) and taken private.
New Relic depends on SMB sales; AppDynamics and Dynatrace depend more on enterprise sales. New Relic has the easier-to-use product; AppDynamics and Dynatrace are better at enterprise bullshit bingo and show more "nested pages", often with less detailed data, to hide that their insight is worse or lacking. They stopped innovating about two years ago.
All three companies have a lot of employees and burn through their capital. It doesn't look good.
I use Tableau almost daily. Initially I found it hard to work with and avoided it for a while. On the second attempt, for some reason, everything clicked into place, and I now find it fairly intuitive. Some of my opinions/experiences:
- Beyond simple queries, it becomes easier to set up a view or two behind Tableau than to do everything purely in Tableau. E.g., multiple levels of "latest" times, more complex aggregations, etc. are more easily done in SQL than via Tableau.
- For some things, Tableau doesn't do what you'd typically expect. A couple of examples:
- If you run a custom query joining two tables that share some column names, Tableau cannot deal with it. Aliasing columns with "as" works fine in SQL, so I think it is not unreasonable to expect that to work.
- I recently tried to chart request/response lag in microseconds. The data is stored as TIMESTAMP in our Vertica database, but I found that Tableau doesn't show time at a granularity finer than seconds. That is kind of odd in today's common low-latency, high-performance technology environment.
Having said that, I really like the features it offers, viz. calculated fields that don't go away just because I change the data source, drag-and-drop dashboard construction, very nice visualizations, etc.
I wasn't part of the decision-making effort, but I believe QlikView was also considered and the analysts preferred Tableau. It is used by the analysts as well as for our internal monitoring/charting/analysis purposes.
I see. I've also found myself in a situation where the company was using it and I had them drop it.
For reporting, the idea of transforming and moving the data around really doesn't make much sense compared to solutions that directly leverage your DB, or that at least create a single consistent data source instead of many so-called extracts scattered around.
I was previously a Tableau expert at my then job. I was amazed at the opportunities Tableau missed to make itself indispensable to organizations. The core product was fine, but it seemed like they could use a good PM to implement features around sharing, permissions, pricing, etc. A $1,000 license for clients to view some basic report in a browser, the other option being a desktop-installed reader.
I'm sure Tableau has its rightful place in the plateau of BI & analytics for big companies. But for small businesses and custom-built analytics, as well as for operational data UIs, you'll probably have no alternative to hacking it yourself with d3js, or to using Tadaboard-like alternatives.
I was one of the first people to start using Tableau; I even bought a license for my own use so I could learn (yep, sick of the 14-day trial). But these days Excel / Power BI does most of the work, Salesforce with Wave is competing with them, plus there's so much open source software.
About 5 minutes of searching shows me that Tableau has had massive revenue growth, but slightly more massive growth in sales/administrative expenses. What that tells me is this: the market loves Tableau and keeps buying more of it, but the company has serious cost-control problems in its sales and marketing departments and has expanded there far too fast.
Anybody comparing typical open source or low cost BI solutions favorably to Tableau hasn't used it, or doesn't value their own time. Tableau is easy and deep.
Tableau's board needs to attack their out-of-control cost of sales (the pay-me-everything egos) and hopefully ignore the internet idiot-mob effect on their stock price.
I don't own Tableau stock. I just like the product.
The problem is that you make something cool in Tableau and then want to collaborate with someone... and even if you talk them into it, their trial soon runs out. Yes, it's powerful and full-featured, but today's scene is about collaboration and sharing... and the sticker shock is just too much for that.
I was an early adopter of Tableau and fairly enthusiastic. But then I had a truly terrible licensing experience with them and have since refused to have ANYTHING to do with them.
Nice enough software, but a sleazy organisation. Use one of the other good alternatives.
> upside: expect house prices in the SF bay area to become more affordable.
Until the Chinese peg falls, I don't think you can. I mean, aren't SF house prices driven by Chinese investors about as much as by domestic Twitter millionaires?
Are there any sources / links you can share on cash real estate purchases in SV or Bay Area? I'm interested in reading more about this and a preliminary search isn't yielding any recent news.
This sort of data would be interesting/useful for any location. I was recently in Phuket, where I was told property prices are dropping because the Russian currency tanked, so many of the Russian owners are selling up. Knowing how much of an area is owned by particular foreign nationals would definitely be valuable knowledge.
I don't know about SV, but foreign investment is pretty damn low in Victoria, BC, which could be used as a proxy for Vancouver: 97.8% of buyers were domestic. [1]
It's not foreign buyers driving up the market in Vancouver; it's Canadians willing to put themselves into serious debt in order to buy a house.
Did you read the article? They looked at the last names on the titles, and if they were Mainland Chinese names they assumed the buyers were foreigners, which is utterly ridiculous in a city like Vancouver, where the population is almost 50% Asian, most of them multi-generational immigrants.
That's like doing the same study in Boston and claiming foreign buyers from Ireland own 50% of Boston homes!
You get echo effects in places like Vancouver. Wealthy foreign buyers using high-end real estate as a bank account significantly increase the prices of top-of-the-line housing. Wealthier locals (dentists?), now priced out of the highest end, go down the scale and start purchasing lower-end stock with more money, raising prices for everyone in a domino effect.
The government, not wanting the housing market to go down, creates laws that allow subprime mortgage lending; buyers then buy houses without real limits on price, and the middle class can keep up with the price boom.
Well, it's been reported that foreign buyers are invested in a lot of the more expensive properties (> $3 million) in certain regions, but a lot of Canadians are STILL buying Vancouver properties. There's been a lot of concern over Canadians getting in over their heads with mortgages. It also doesn't help that Vancouver has crappy wages.
I'm not so sure about that; a lot of the conditions that previously supported real-estate investment have given way. Oil's down, China is volatile, and interest rates have risen. Arguably the trend could reverse as people who previously bought as investments need their money back.
That's assuming they need their money back. The way I understand it, there's a glut of money, and the rich are just piling it up and moving it around to wherever the best return seems to be.
It's the dot-com bubble all over again. Look at the NEWR stock chart: it looks like the 2000-2001 dot-com bust compressed into a few months. It's happening a lot faster now than it did back then.
The additional 18k people that Amazon apparently plans to hire will probably keep the Seattle market hot for the foreseeable future. Also, Chinese investors.
Tableau makes most straightforward analyses much simpler than attempting them in IPython/pandas, especially when connecting to standard corporate data warehouses (probably on the order of 10x faster). Of course you quickly hit a ceiling in terms of complexity, but most analysts never really get to that point.
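(For a sense of the comparison, here's roughly what the pandas version of a "straightforward analysis" looks like; the connection string and the table/column names are invented. In Tableau this is a connection dialog and a few drags.)

    import pandas as pd
    from sqlalchemy import create_engine

    # invented warehouse connection and schema
    engine = create_engine("postgresql://user:pw@warehouse:5439/analytics")
    df = pd.read_sql(
        "SELECT region, order_date, revenue FROM sales "
        "WHERE order_date >= '2015-01-01'", engine)

    # monthly revenue by region, one line per region
    monthly = (df.assign(month=pd.to_datetime(df["order_date"]).dt.to_period("M"))
                 .groupby(["month", "region"])["revenue"].sum()
                 .unstack("region"))
    monthly.plot()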
Simple analyses usually answer recurring questions; for that there is reporting, and Tableau is not a good reporting tool (too slow).
For ad hoc analysis, on the other hand, there is really nothing more powerful than R or Python, and there are plenty of tools & libraries that give you the speed you need.
Tableau is a great product, but it's very expensive, like Oracle expensive. I used Saiku (http://www.meteorite.bi/) to hack together some BI dashboards for free. It did the job with a little bit of work.
My exp with tableau was always that they obfuscate the final query...
Fuck that.
Looker doesn't && they even allow a meta language on top of them.
So for you to say "just do X" leaves many less sophisticated orgs in the dark.
So my comment is both:
Fuck their position and yours.
I don't want to pay bullshit fees to tableau (with perf hits) && not be able to see the final query && need an actually fairly highly competent DBA to know that x+x+x needs to be enabled in order for me to get to root.
If I pay service X give me the full fucking understanding as to how service results are generated.
(Yes yes I do understand how saas blah blah works and I'm just saying I personally find this process bullshit and immoral)
It's more that they imply the poster is really angry. If you saw two speakers in the street, one arguing reasonably and the other swearing his head off, I think most people would place less value on the ranter.
Uh... maybe you don't talk to your friends too often...
You should see me with many, many of my friends. Don't imply anger where there is none; we often get into "mock heated arguments" with smiles on our faces.
It makes sense to talk that way to your friends, because you're already friends and there are stronger bonds holding you together. That makes shredding each other fun, because everyone knows it's in jest.
A large semi-anonymous internet forum like HN is at the opposite extreme. We're almost all strangers and near-strangers. The community has low cohesion, so the same behavior, with the same good intention, does harm.
You know how rugby players are famous for beating the shit out of each other on the pitch, then going out drinking after the game? I always envied those guys that experience. It works because they have high cohesion. They also know not to walk up to a stranger, smack them onto the pavement, and expect to be bought a beer later. You need to adjust for context.
That makes sense, and yeah that is kind of sad. But HN is what it is. It would be wonderful if the community could evolve to become more cohesive but that's a slow process.
If you have problems with "violent language" then that shouldn't lead to valuable content being hidden from others who don't have mental issues triggered by words they fifth like.