I've been at large orgs where we evaluated all the cloud services like Mixpanel and their ilk, but they serve very different use cases.
Tableau + Redshift (or Vertica, but Redshift is way less expensive) is the typical BI tech stack. When I go to a client or a new job and ask them what they use for analysis and they say "GoodData" or "MicroStrategy", I know my job will be a pain in the ass to get done. It's as if you're interviewing for an engineering job and they say "oh, we all use Notepad here". It's night and day in terms of productivity.
I don't know what this means in terms of the stock, but all I can say is that I vouch for the product and think it has a lot of room to grow (from a BI perspective).
The only thing worse than an expensive product is one with unpredictable pricing.
While Tableau performance is something people debate, it's a non-issue. Generally, the data analyst produces a report that is consumed by 1-10 people, and if that consumption grows beyond 10, the organization will "productize" the report by putting engineering resources on it and building something custom with Highcharts or D3.
In my experience, the ones that don't often produce some really erroneous or nonsensical results. Or even if they do get some things right, they have no ability to drill deeper.
BI in large corporations is usually done by people with no clue about the data, producing bullshit results, which get presented as "insights" to management (who at that point have no clue that all the cost columns were added up regardless of currency), leading to a final "product strategy" that gets implemented over the next years. Meanwhile, in the companies still producing innovation, the real work is usually done by a rogue team applying common sense and ignoring that strategy all day long.
BI is pure feel-good for the upper echelons.
Certainly there are lots of bullshit metrics, and there is often very little desire to audit data or even do a by-hand sanity check once in a while.
That being said, there were _many_ engineers who actually gave a fuck about making sure we had telemetry in actionable, meaningful, and appropriate places. Unfortunately, the gap in analytics would open up far over their heads, when upper management would produce documents with all the beautiful charts and graphs, communicating... absolutely nothing.
Graphs with mixed axis scaling (log vs. non-log), graphs with mixed units, conclusions that are a total stretch from the data that's there, and obvious conclusions ignored because they don't line up with what the managers want to say.
For there to be USEFUL BI (and such a thing certainly can exist) there needs to be a "this isn't bullshit" mindset up and down the whole stack, not just in engineer land.
I think a lot of things come together for this pattern, which definitely happens quite a lot. The best predictor for this scenario is a clear divide between product and BI people, usually exacerbated by the fact that non-technical people get hired for BI.
Group bias from backend engineers, who see BI as "just some point and click" that isn't really challenging (but is usually much better paid, as BI caters to sales/bizdev/C-level, which is always closer to the money), also doesn't help.
Also, as you say, often the value of BI doesn't trickle back down the chain, and so tracking is at most a second thought for application engineers when going to prod.
Having built two analytics stacks myself (and seen the perspectives from BI, backend, sales and marketing alike) these are exactly the drivers we tackled first at our current company.
Our recipes against this common failure are: Marketing directly working with product engineers for their tracking requirements (with just some coaching from tracking pros) - and dual-using our analytics stack (Snowplow -> Redshift) for both operations (user segmenting, push notifications) and business intelligence.
This is usually considered a big no-no in BI circles which all tend to duplicate data to be on the safe side, but it helps immensely to make sure product and engineering are just as interested in data quality as the BI guys, as they depend on the very same data.
It's certainly not for everyone, but for us, it works really well.
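To make the dual-use idea concrete, here's a minimal sketch (not our actual code; the Snowplow-style table and event names are just placeholders) of the same Redshift events table serving both an operational segmenting query and a BI query:

    # Illustrative only: one Snowplow-style events table in Redshift feeding
    # both operations (user segmenting for push notifications) and BI.
    import psycopg2

    conn = psycopg2.connect(
        host="analytics.example.redshift.amazonaws.com",  # placeholder endpoint
        port=5439, dbname="analytics", user="etl", password="...")

    # Operations: users who added to cart in the last 7 days but never checked out.
    SEGMENT_SQL = """
        SELECT domain_userid
        FROM atomic.events
        WHERE collector_tstamp > dateadd(day, -7, getdate())
        GROUP BY domain_userid
        HAVING sum(CASE WHEN event = 'add_to_cart' THEN 1 ELSE 0 END) > 0
           AND sum(CASE WHEN event = 'checkout'    THEN 1 ELSE 0 END) = 0;
    """

    # BI: weekly signups, from the exact same table the product team depends on.
    WEEKLY_SIGNUPS_SQL = """
        SELECT date_trunc('week', collector_tstamp) AS week, count(*) AS signups
        FROM atomic.events
        WHERE event = 'signup'
        GROUP BY 1
        ORDER BY 1;
    """

    with conn, conn.cursor() as cur:
        cur.execute(SEGMENT_SQL)
        segment = [row[0] for row in cur.fetchall()]
        cur.execute(WEEKLY_SIGNUPS_SQL)
        signups_by_week = cur.fetchall()

Because the push-notification segment and the BI numbers come off the same table, a broken tracker breaks both, which is exactly why product and engineering stay interested in data quality.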
I've only ever worked at small companies, and I guess avoiding things like this is part of why, but I can't help but wonder how common this is, and whether anyone has ever used BI effectively. I find that hard to believe, but really I have no idea. It's an interesting thought; it makes you want to chuckle or shake your head, or both.
It's a surreal experience when you realize how executives are making decisions based on something that, after a moment of critical thought and common sense, clearly has no more relevance than some integers pulled out of rand(). Chuckle and shake your head is indeed about all you can do.
Just try not to visibly smirk if you're ever involved in acquisition talks. Play along.
Huh? Just write a view or a stored procedure and point Tableau at that. There's no need to compose a query in the Tableau interface.
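For anyone unfamiliar, that looks roughly like this - a minimal sketch with made-up table, schema, and column names; any warehouse that supports views works the same way:

    # Define the messy join/aggregation once as a database view, then point
    # Tableau at the view instead of composing the query in its interface.
    import psycopg2

    CREATE_VIEW_SQL = """
    CREATE OR REPLACE VIEW reporting.monthly_revenue AS
    SELECT date_trunc('month', o.created_at) AS month,
           c.region,
           sum(o.amount_usd)                 AS revenue_usd
    FROM   orders o
    JOIN   customers c ON c.id = o.customer_id
    GROUP  BY 1, 2;
    """

    with psycopg2.connect("dbname=warehouse user=bi") as conn:
        with conn.cursor() as cur:
            cur.execute(CREATE_VIEW_SQL)
    # Tableau then connects to reporting.monthly_revenue like any other table.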
Tableau isn't SAP, it just isn't. If they want to really get into businesses, they need to be sensible and give people the opportunity to use it, so those people can then bring it into the businesses they work in.
For instance, I used the 15-day trial they gave me to learn Tableau, and I learned quite a lot as they have great documentation, but after the 15 days I got no more opportunities to go through their tutorials. Partially I got busy on other things and wanted to revisit it after a few days, but I was also setting up SQL Server SSAS, which took me a bit of time. I got a maximum of about 5 days of usage; after that there's no point having it on my workstation, as it's far too expensive for me to justify buying a copy.
If I could have had more time with the product, I guess I'd know how good it is so I can recommend it to the business I work at. Unfortunately, I can't without cracking their trial limit code, which I'm just not prepared to do. For now, I guess I'll be recommending Qlikview which is a known quantity and very good also, though nowhere near as intuitive.
- order 1 license to test it out
- analyst gets an order of magnitude more work done than co-workers
- team manager buys 10 licenses and a tableau server
IBM has chosen to present "Analytics" as a major piece of their company. Presumably they want investors to think this could generate billions of dollars in revenue yearly. The worst-case scenario for me would be for IBM to buy Tableau, but clearly, for those of us who use it, Tableau is one of the best tools out there for what it does. Will it be able to keep its market share 5 years out? Who knows.
Funny to me: it took years for Heap to introduce that.
Edit: we use and help customers with IPython/Zeppelin etc., but that doesn't change the fact that these things are 10 years behind as IDEs/visual interfaces. Adding charting & drilldowns is much harder than a button to turn a table into a line graph, and analysts are wasting a lot of time because of this.
As a side note, I've been following your work for a while now (both Superconductor and Graphistry) - mind if I shoot you a few questions privately?
So what is cheaper?
All clients are already on Office 2013, so that's not a concern either.
The Sharepoint cluster probably is a concern, but that's about the only thing, I'm pretty sure it'll be cheaper than Tableau.
I wish they'd invite the CTO to a Windows 10 event and give us some giveaways :) . No local state laws; this is for an insurance company in South America btw :)
Edit: the opinions are strictly my own, I'm not involved in any decision making sadly (no Windows 10 giveaways for me :P ), etc.
What does Tableau offer that I can not quickly set up in either Mondrian coupled with a frontend like Saiku or Pentaho or eg. R and a Shiny server?
At the time, between the ETL and reporting layers, it acted more like a set of different open source apps simply branded together, and interop required more effort than what should have been necessary.
Debugging was a nightmare as well. Huge stack traces on simple errors made locating problems difficult, and there seemed to be little information in the community. The number of Java library layers spewing out on a simple JDBC driver error was mind-boggling. Pentaho of course has an interest in revenue from support contracts, and most inquiries into simple issues in the forums led down that path. It may have gotten better since, but there was a long way to go.
We switched to Tableau on the front end, and "old fashioned" ETL scripting in Python (now some Go) on the backend. At the same point today I would consider something like R/Shiny, but for speed of implementation Tableau would also be a contender.
It's a bit of a one-off, but it does work well! One more step towards making R fit for production :)
I guess it depends what you're trying to do. What part(s) of what Tableau offers do you care most about? How big of an organization?
Chart.io is one product I evaluated that met our requirements. We cared most about collaborative querying and the ability to run templatized reports (including regenerating them via the API), so we ended up going with Mode Analytics.
However, the difference is that the admin stack with ELK, Riemann, Graphite, InfluxDB, Airbrake, Icinga and whatever else assumes that 30% of the information right now is worth more than 100% of the information 3 weeks later. You know: my application port is closed, first-level support is getting hell, I need any available information right now.
On the other hand, the full data warehouse/Tableau stack assumes that 100% of the information is more valuable than anything else. Good DWH guys and their analysts can do very awesome data voodoo, predictions and analysis, and the admin stack won't be able to reproduce most of that. It just takes 2 months of data collection and 2-3 weeks of analysis to get the result of that grand voodoo. And Tableau can automate that voodoo after it's been done once.
What really exacerbated their issue was that they didn't get out in front of this and let people know how badly they were going to miss. So when they released earnings, everything went to shit on them.
I can't really find any excuse for this that passes Occam's razor except to assume this is a very rookie CFO /CEO combo in charge.
I made a comment about 3 months ago saying that 2010-2014 were very kind to new tech IPOs and 2016 would be the year that companies would have to either make money or face the consequences.
I'm following this over the next 2 weeks to see what the stock does. It's now a pretty good pre-merger arbitrage (takeover) candidate.
- Market cap under 5 Billion, check
- Institutional ownership > 50%, check
- Insiders hold less than 5% of the company, check; in fact it's at 1.09%. Wow, management doesn't even want to own the company!
- Below its IPO price: not yet, that was $31, but it's still falling.
Maybe the people buying yesterday didn't expect poor results. Or at least they expected the kind of poor results that would make the stock go up. It definitely did come out "a bit worse" than they predicted.
Privately disclosing them is different. Message me if you want numbers.
Acquisitions aren't out of the question where MS BI is concerned, even with apparent conflicts in product.
Some clever stuff they had was (early-RoR-style) guessing of data semantics based on heuristics (Column labeled "date"? probably worth grouping by months. Number pairs like -0.3242,0.12345 ? probably worth plotting on a map).
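Something along those lines, I'd guess - a toy sketch of that kind of heuristic (not Tableau's actual logic; the column names and thresholds are made up):

    # Guess how to treat a column from its name and a sample of values.
    def guess_role(name, sample):
        if "date" in name.lower() or "time" in name.lower():
            return "temporal: default to grouping by month"
        if all(isinstance(v, (int, float)) for v in sample):
            lo, hi = min(sample), max(sample)
            if -90 <= lo and hi <= 90 and any(isinstance(v, float) for v in sample):
                return "maybe a latitude: offer a map"
            return "measure: aggregate it (sum/avg)"
        if len(set(sample)) <= len(sample) // 2:
            return "dimension: use for grouping/filtering"
        return "free text: ignore by default"

    print(guess_role("order_date", ["2016-02-01", "2016-02-02"]))  # temporal
    print(guess_role("lat", [47.61, 47.62, 47.60]))                # maybe a latitude
    print(guess_role("revenue", [120, 99, 430]))                   # measure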
I have used trial and educational versions in the past and was always pretty impressed with the ease of use and the results. The product itself was > $1k so I never actually purchased a license.
Even though I didn't follow their strategy very closely, from what I've run into it seems like their offerings were all over the place. I saw a local newspaper using a hosted Tableau product to display data visualizations on their pages (the kind of visualizations made by a company whose DNA is desktop apps, so kind of underwhelming). Their website is all about Gartner and enterprisey lingo.
The product itself is, well, some advanced version of Excel. This means you either sell a lot of it very cheap, or very little of it for a lot of money. Website design suggests the latter. Too bad it seems to not be working.
I'm honestly sad to see them do badly as I think the product was really innovative and it really changed my way of thinking about report building and analytics tools.
Tableau is more differentiated than it looks, but it doesn't look very differentiated. If Microsoft ever gets around to making many of the "smart" features in Excel actually work (e.g. CSV import and automatically choosing what kind of graph to make), they will have nothing to stand on.
I'm just waiting for the street to realize that Hortonworks and their competitors also have zero moat, and if anything, a GUI interface for a Hadoop cluster is value subtraction in the age of devops.
My use case is: here's some data, let's take a look at it and make some decent graphs including any sort of comparative analysis.
Excel is... well, Excel. The kitchen sink bundled in it contains pivot tables. Graphs are hideous out of the box, and very limited.
Apple Numbers makes cute looking graphs, but is extremely basic.
Viz toolkits in JS make you code for stuff that should be straightforward, at different levels of abstraction (Highcharts -> D3). And I have to put stuff in SQL and do the plumbing if I need any meaningful preprocessing.
Self-proclaimed "business intelligence" tools don't really target this use case. They solve enterprise problems like connecting to MDX sources that are not there for me.
Do you know any other tools for this?
I've spent too much time watching people spend months and tens of thousands trying to glue 12 different third-party tools together when a developer could do the whole thing in a week and it would actually do what you want it to do.
Nailing down the actual requirements is the hard part of building your own dashboard, it sounds like you've done that. If your needs are stable-ish or you have consistent developer resources it's going to be hard to find a better fit than that.
It's for when you want to figure out what the data is telling you, and not how to wrangle the data into saying something.
In 99% of cases, that happens because your requirements change. If you hire a competent developer, the total cost of development is always a pittance compared to what you pay monthly/quarterly to a product-based company.
Actually, the plumbing isn't as difficult as most people make it out to be. Nowadays it's the era of abstraction and FOSS infrastructure tools like jQuery, Bootstrap and, like you mentioned, Highcharts. I'm a freelancer who quite recently developed a Tableau replacement for one of my clients. This client realized that all he basically wanted from Tableau was a chart and some basic data manipulation like sorting, filtering and grouping (count, sum, average, etc.) - see the sketch after this list. All the things needed to develop this little app already existed in the FOSS world:
1. Highcharts/jqplot for charting.
2. Twitter-Bootstrap for showing a professional front-page and UI elements.
3. jQuery and jQuery-ui for DOM manipulation, AJAX handling for SQL queries, enabling drag/drop, etc.
4. PHP/MySQL on the backend (which is needed in any case).
As for plumbing, each of these tools is so well documented, and a simple Google search will point to tons of StackOverflow links that happily provide an answer to any and every question you may have!
tldr; Library/framework plumbing might seem complex initially, but for a practiced web developer, it's a cakewalk!
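The same idea, sketched in Python/Flask instead of the PHP/MySQL backend described above (purely illustrative - the table, columns and endpoint are made up): one endpoint does the grouping/aggregation server-side and returns JSON in a shape a Highcharts chart on the front end can consume directly.

    from flask import Flask, jsonify, request
    import sqlite3

    app = Flask(__name__)
    ALLOWED_AGGS = {"count": "count(*)", "sum": "sum(amount)", "avg": "avg(amount)"}
    ALLOWED_GROUPS = {"region", "product", "month"}   # whitelist to avoid SQL injection

    @app.route("/aggregate")
    def aggregate():
        group = request.args.get("group", "region")
        agg = request.args.get("agg", "sum")
        if group not in ALLOWED_GROUPS or agg not in ALLOWED_AGGS:
            return jsonify(error="unsupported group/agg"), 400
        sql = f"SELECT {group}, {ALLOWED_AGGS[agg]} FROM sales GROUP BY {group} ORDER BY 2 DESC"
        with sqlite3.connect("sales.db") as conn:
            rows = conn.execute(sql).fetchall()
        # Highcharts-friendly shape: categories plus a single data series.
        return jsonify(categories=[r[0] for r in rows], series=[r[1] for r in rows])

    if __name__ == "__main__":
        app.run(debug=True)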
Disclaimer: I work for Microsoft.
The free desktop version is not limited. It's very powerful. The growth rate is astounding. We get significant new features every month, sometimes weekly.
If you compared PowerBI to Tableau 6 months ago (or even 3 months ago) you are out of date. Its data model capability is far superior to Tableau. The formatting and graphics capability is catching up quickly. The intuitive UX is first rate.
(We are NOT Microsoft employees.)
In the BI market, Tableau is very much one of the low-cost, high volume players. They've done a great job getting into lots of people's hands, the question has always been whether or not they can roll the volume of cheap(ish) single user licenses into large enterprise-type deals.
The problem for us is that as a consulting house, our staff turnover rate is high (as expected I suppose), and this doesn't work well with Tableau's named license model. We've run into troubles where we reassigned licenses after people either left projects using Tableau, or left our company altogether. They ended up forcing us to buy extra licenses, which I was unhappy with.
My stance since then has been to avoid buying more licenses. We're getting our team skilled up more on web-based viz, so we can reduce our dependence on desktop applications.
The one thing that I like about Tableau is that its documents are XML, which plays nice with git.
I was working on a project where the client wanted to use Tableau and I pushed instead for Shiny, which is the way we went. Quite happy with the decision.
Most of our clients are tier 1, so in some instances they already have heavy investment in Qlik, but some are willing to shell out cash for Tableau where it makes sense. I've seen senior management in clients complain about the price though at times.
However, one needs to do the analysis to create a worthwhile dashboard anyway and the report can be generated easily with the knitr package.
Anecdotal, but I've seen this happen several times with clients who previously used Tableau.
It would be great if it had just stayed a better version of the Excel chart maker. Instead, it was seen in my company as a way to replace programmers. All of these types of applications have the same flaws:
They sell a dream that you can turn a complex task that requires experts, into a simple task that anyone can do. In reality, you either have a simple application that doesn't do much, or an extremely complex application that still doesn't offer the performance and flexibility of just using an expert.
You end up creating a programming-via-drag-and-drop application that is more complicated to learn than actual programming. You replace general programmers with useless Tableau specialists.
It took the tableau experts weeks to do what a programmer could do in days when complex requirements came up. Most requirements were complex.
Tableau Server is very, very slow and requires massive resources to run.
Tableau had to run the full, unfiltered query so that it could generate filter lists with all of the possible options. This query was often too large to run.
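For contrast, this is the sort of thing you end up hand-rolling instead (a sketch with hypothetical table and column names): populate each quick-filter from its own SELECT DISTINCT rather than materializing the full, unfiltered result set just to harvest the filter options.

    FILTER_COLUMNS = ["region", "product_line", "sales_rep"]   # hypothetical

    def filter_options(conn):
        """conn is any DB-API connection (sqlite3, psycopg2, ...).
        Returns {column: list of distinct values} for the filter widgets."""
        options = {}
        cur = conn.cursor()
        for col in FILTER_COLUMNS:
            # One narrow scan per filter, not the whole joined result set.
            cur.execute(f"SELECT DISTINCT {col} FROM sales_wide ORDER BY 1")
            options[col] = [row[0] for row in cur.fetchall()]
        return options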
I tried to buy the desktop app a few months ago and just got bogged down with the sales guy. They tried to sell the server products (with very high per-user pricing) when all I needed was 1 desktop licence and the free viewer.
It was difficult to work out if they were a consultancy, a service, or selling a product. I think those things are in conflict with one another and make it difficult to understand the level of commitment and risk.
They are/were extremely popular with many sales organizations for internal dashboarding and even some media outlets used their stuff for public websites.
Sad to see this state of things for them, as the product was truly good at one point. That said, I have not touched it in 3-4 years, so I'm not sure how competitive it is now.
If I could dabble with it for $50/year I would. $1000 though? That limits their market tremendously to big companies only. It's hard enough to get an $80 tool approved.
Looks like they give it free for students now, which is cool: https://www.tableau.com/academic/student. But after learning it, there's no path to bring the software with them when they move to actual companies, because of the cost.
That's what the rest of the world might call a "bad omen."
On top of that, PowerBI is subsidized in order to funnel customers for SQL Server.
Even with open source tooling, it's a large and ongoing development effort to do BI right.
I'm not saying that to bash on PowerBI. I love that product and use it heavily and promote it to clients. It just serves different purposes than Tableau.
Out of the box mapping is miles better in Tableau.
Overall Tableau is more fully-featured than Power BI. Tableau aims to be a complete BI presentation layer. Power BI is positioned as the self-service and personalized consumption component as a part of a larger BI presentation layer. Microsoft would prefer that the entirety of the BI stack be made up of their technologies, but Power BI can consume other data sources, and its reports can be embedded in other apps, so it fits into other technologies as well.
We could go feature-by-feature and Tableau would win the majority of presentation sophistication bullets (more fine-grained control of display, filtering, interactions - richer collection of built-in visualizations).
The differentiator for Power BI is more on the self-service end. Personalized dashboards (dashboard and report are two distinct concepts in Power BI) can be trivially created from published reports. Customized reports can easily be extended from published reports and datasets. There's a strong collaboration framework based on Office 365 groups and with lessons learned from SharePoint. There is also a pretty seamless upgrade path. Power Pivot models can currently be promoted to SSAS, and the expectation is for the same to be possible with Power BI models (all the same backing database technology).
I hate to be so vague, but there's a lot to both products. I'd be happy to dive deeper into some specific cases if you've got questions.
Power BI as a standalone product is pretty brutally limited in terms of data volume (250MB compressed data model is the max that can be hosted in the cloud service), and is missing the extensibility and flexibility that comes from a tool like SSRS.
As a self contained product, Tableau will likely hold the lead for some time. As a platform, I think Microsoft is beyond Tableau - they cover far more of the BI spectrum (and well, especially with SSAS) than Tableau seems to ever intend to.
It all boils down to this, in my opinion: PowerBI is for dashboards and reporting mostly. Tableau can do dashboards, but it can also do "analysis". Uncovering insights in the data that go way beyond simple cross-filtering.
In my mind, they are different tools for different purposes.
Just because you can, doesn't mean everyone will rush to do it. This is a self-service platform. Putting the data in the hands of decision makers. There is hierarchy in how people use such software:
The user: He uses only the apparent features in the GUI. Like the grid and aggregation formulas in Excel. He learns how to use the software from other people showing him how to do stuff.
The Power User: He has deeper needs, but can only be bothered to use features 1 or 2 levels deep in the GUI. Like pivot tables, vlookups and index/match, logical operator formulas in Excel. He learns how to use the software from tutorials.
The Advanced User: He has a task and does not mind getting his hands dirty in order to fix it. Uses DAX and Cube formulas. Perhaps even Macros. He learns by googling his problem and reading documentation.
The Developer: Solves the problems at the programmatic level.
Tableau occupies a very specific spot. It is brilliant for the User who only consumes dashboards via clicking on them. No explanation needed and it is super polished. It is also powerful enough for the Advanced User who can perform relatively sophisticated analyses from the interface. Generally speaking, it is not a good fit for the Power User who doesn't have the need to justify using it. It is also not a terribly good choice for the developer because it is too restrictive and the programmatic features are not well thought out.
With RedShift, you offload the actual crunching to the db that can handle billions of rows, and then visualize the results.
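Roughly like this, for anyone who hasn't worked that way (a minimal sketch; the cluster endpoint and table are hypothetical): push the aggregation into Redshift and pull back only the small result set to plot.

    import psycopg2
    import matplotlib.pyplot as plt

    SQL = """
        SELECT date_trunc('day', event_time) AS day, count(*) AS events
        FROM clickstream          -- hypothetical multi-billion-row table
        GROUP BY 1
        ORDER BY 1;
    """

    with psycopg2.connect(host="cluster.example.redshift.amazonaws.com",
                          port=5439, dbname="analytics", user="bi",
                          password="...") as conn:
        with conn.cursor() as cur:
            cur.execute(SQL)
            rows = cur.fetchall()   # only a few hundred rows come back

    days, counts = zip(*rows)
    plt.plot(days, counts)
    plt.title("Daily events")
    plt.show()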
I have limited experience with it (only POCs), but I liked it very much.
They usually figure well in Gartner rankings.
Searching "tableau" in Google stills shows their website as well as a large knowledge graph on the right hand side which tells you exactly what the company is about. There's News results at the top of the page, but there's still plenty of information, including the company website, on the first page of results.
Keep plugging at building great software that actually helps make this a better place. Charge for it so you can survive this downturn. It's a cycle, this will weed out the nonsense.
That's not to say we're not due a correction.
Jim Cramer et al. are saying that it's the strength of the jobs report that is hurting. Because as long as unemployment is ~5%, the Federal Reserve might raise rates, as opposed to if unemployment worsens, in which case the Fed might not.
A few things in the above explanation aren't completely consistent...
You're not a VC if your money is in publicly-traded firms.
Also a big factor in the TCO is the lack of maintenance and the overall simplicity of an electric drivetrain, and that's completely independent of gas prices.
VCs cannot just sell public stock on a whim, these things are scheduled far in advance and can only happen during trading windows.
Also their growth is down.
Your friendly neighborhood tax agency is a special case of firm, which can commit to paying you money in the future. One way they can do this is, when you lose money in year N, they can let you carry that loss forward for up to X years, so that when you're taxed on your income in year N+3 you might be able to offset some of that income with the loss you made years earlier, reducing the amount of tax you incur.
The guesstimated value of that offset in taxes is your tax asset. If your marginal rate is 30%, and you can offset $1 million in revenue, the implied value is +/- $300k. Importantly, if you guesstimate poorly or your friendly local tax agency decides to change rules on you in the interim, you have to adjust the value on your balance sheet. This can be problematic if the tax asset is a material portion of your notional value.
If you've got, oh, $300 million in cash ($200 million in remaining investment plus we'll say $100 million in collected revenue) and another $25 million in accounts receivable then the tax asset is worth about, round numbers, $250 million, or 43% of the book value of the company.
Having to write down 43% of your book value would suck.
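Working the rough numbers above through in code (every figure here is the hypothetical from this explanation, not Tableau's actual balance sheet):

    marginal_rate = 0.30
    offsettable_income = 1_000_000
    implied_asset = marginal_rate * offsettable_income   # ~$300k implied value

    cash = 300_000_000        # $200M remaining investment + $100M collected revenue
    receivables = 25_000_000
    tax_asset = 250_000_000   # round-number deferred tax asset

    book_value = cash + receivables + tax_asset
    print(implied_asset)                      # 300000.0
    print(round(tax_asset / book_value, 2))   # 0.43 -> a write-down erases ~43% of book value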
This is, again, not outlandish for a company on that trajectory. If the revenue is growing rapidly and forecast to continue growing rapidly the company is in a wonderful spot.
For example, companies will calculate depreciation on their capital assets using various methods. However, the IRS/CRA have their own methods for calculating depreciation on these assets. The difference between the two amounts can create a deferred income tax liability or asset. That is, if you record depreciation higher than what the IRS calculates, the income tax expense you record on your income statement will be higher than the amount you are actually charged.
There are also rules which identify whether or not companies can put a deferred tax asset on their balance sheet. Under International Accounting Standards (IFRS), companies must establish that they can realize these assets by having sufficient income to apply them against in the future. In the U.S., as was the case here, a valuation allowance was applied to decrease the asset as they indicated that they weren't likely going to have a sufficient net income in future periods to apply that asset to tax expenses.
Anyways, I'm a bit rough on it: while I've got an education in accounting, it's not my day-to-day job anymore. Additionally, I'm not that familiar with U.S. accounting standards.
I am not an accountant, here's a likely better explanation:
A) Its sales team sells it as an end-to-end analytics suite, but in nearly all actual implementations I've seen companies just use it for management to view a dashboard... essentially just a slightly fancier version of the Excel doc some analyst used to mail out
B) It's crazy expensive for what it is
C) If the company has actual data scientists onboard then those individuals can do far better analysis just using free open source tools
For the above reasons and more I've seen a lot of large companies that are far less excited by Tableau than they were a year or two ago. They've either halted larger roll outs that were planned or just moved away from the platform entirely. That leads to softer sales and slams the brakes on growth rates... hence why the stock has tanked. There's also the issue of valuation multiples against their financials which are also still frothy, even after the latest downward movement in the stock.
Seems like quite the correction for an earnings beat and the removal of a ~50M deferred tax asset though, especially since people already had negative expectations before the earnings release.
"Tableau confirms big Kirkland expansion, plans to hire 1,000"
Answer: We want someone stable! Not some crazy hacker. Plus if we don't spend the cash, we don't have it in our budget next year.
Oh, I get it now. :-)
Unless your queries are very simple, you are better off finding a programmer and asking them to do your job.
By your logic, there's no need to use MS Word, people should just use TeX.
This took me literally 5 minutes in tableau, I would hate to see how long this would take you writing code.
..Just the tip of the example iceberg.
Anyone with any experience in drag and drop applications will tell you that once things get complex, it ends up taking much more time using the simple drag and drop solution as you spend all of your time trying to get it to do something that is outside of a basic drag and drop model.
I have, and I can tell you it's not easy, for example in Canada there are like over 10k postal code regions. Then imagine maintaining this dataset, no thanks.
Anyhow, that's not the point; that's just one data set Tableau handles very fast, but tomorrow it could be something totally different, and the next day, etc.
Sure for complex custom requirements no BI tool will work, but it's sure better than what people were doing just a few years ago with excel or hiring expensive firms.
One example that involves both is when the visualisation needs a contiguous date range in the data, but the data is missing some dates. As a programmer it is easy to just loop through a date range and put 0 where there is no data.
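For example, a minimal version of that gap-filling step (with made-up data):

    # Fill gaps in a daily series with 0 so the visualisation gets a
    # contiguous date range.
    from datetime import date, timedelta

    def fill_missing_days(counts_by_day, start, end):
        """counts_by_day: {date: value}; returns a dense list of (date, value)."""
        out = []
        day = start
        while day <= end:
            out.append((day, counts_by_day.get(day, 0)))
            day += timedelta(days=1)
        return out

    sparse = {date(2016, 2, 1): 12, date(2016, 2, 4): 7}
    print(fill_missing_days(sparse, date(2016, 2, 1), date(2016, 2, 5)))
    # -> [(date(2016,2,1), 12), (date(2016,2,2), 0), (date(2016,2,3), 0),
    #     (date(2016,2,4), 7), (date(2016,2,5), 0)]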
There are also lots of limitations in terms of the visualisations. The end result was that multiple visualisations had to be created to show something that a programmer would be able to create as one visualisation.
Some examples: https://public.tableau.com/s/gallery
That's a lot of money for a business report.
1,000 USD/yr for a license. If you pay someone 100 USD/hr to do the work, compared to 50 USD/hr for a random Joe Blow analyst, then it only needs to save 20 hours of time.
ITT: A lot of people who don't realize 1k a year isn't an absurd enterprise price.
Software eats the world and the education system. It's becoming increasingly easy to find developers. https://gh-prod-ruby-images.s3.amazonaws.com/uploads/image/f...
To some extent, companies might be trying to use too much data? Who knows.
Lifetime-value capture in the analytics space is difficult enough as it is. Let's say you're an analytics customer:
Early-stage customer: Use any one of the 100 analytics companies listed as integrations on segment.com's website. You DO NOT need to track hundreds of parameters. Conversations with customers have 100x the value of tracking the small things at this stage... usually.
Mid-Stage Customer: Maybe you choose one of the 100 companies that makes the most sense. You start paying for it.
Having these early and midstage companies as customers is tough. The vast majority of them fail.
Big Customer: SnowPlow Analytics? Splunk? Tableau?
That being said, I think there are businesses to be built, but only semblances of unicorns and minotaurs.
15,000 web and mobile apps are coming out tomorrow. What % of them are analytics related applications? (Hint: A larger one than you might think). Just have a gander at the applications that are listed on promotehour.com and startuplister.com's list of app launch sites. There's now 100+ launch sites.
To win big in VC money, you have to take risks, the analytics space seems well established with a slew of best practices that are extremely well known. Ie. not risky enough to warrant pouring VC-istan money into.
- Tableau Server runs only on Windows, so why can't it use a TLS certificate and key from the CryptoAPI certificate store, rather than requiring these to be converted to PEM format (with Unix line endings!) and saved in the file system?
In an enterprise with an internal CA using Active Directory Certificate Services, these extra steps have to be done not only at installation but also every time the certificate expires (a conversion sketch follows this list). Compare the experience with Microsoft IIS: the server automatically requests a renewal from AD CS, retrieves the new certificate, and begins using it.
- Tableau Server should be able to run as a Group Managed Service Account, so we can give it access to remote data sources without having to assign (and regularly change) yet another service account password.
- It would be helpful to have a scriptable installation process; as far as I can tell, there's no way to install Tableau Server without clicking through wizards.
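For what it's worth, here's roughly what that certificate conversion step looks like if you script it yourself (a sketch assuming the cert was exported from the Windows store as a PFX; uses the third-party Python "cryptography" package, and the file names and password are placeholders). The PEM output already uses Unix line endings.

    from cryptography.hazmat.primitives.serialization import (
        pkcs12, Encoding, PrivateFormat, NoEncryption)

    # Load the exported PFX (key + cert) from the Windows certificate store.
    with open("tableau.pfx", "rb") as f:
        key, cert, _extra = pkcs12.load_key_and_certificates(f.read(), b"pfx-password")

    # Write the certificate and the unencrypted private key as PEM files
    # in the layout Tableau Server expects.
    with open("server.crt", "wb") as f:
        f.write(cert.public_bytes(Encoding.PEM))

    with open("server.key", "wb") as f:
        f.write(key.private_bytes(Encoding.PEM, PrivateFormat.TraditionalOpenSSL,
                                  NoEncryption()))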
Any specific difficulties you faced with this?
1. No ability to use a 3rd party auth provider AFAIK, which means either keeping tableau passwords in a database or having users remember two different passwords
2. Embedded views use synchronous requests, which can easily hang the browser. Synchronous XMLHttpRequest has been deprecated for a while. I think I even saw a version of dojo from 2005 being loaded.
3. Reports are either static size or dynamic size, and unless you're using the (clunky but well documented) JS SDK, there's no way to tell.
4. Viewing reports in the browser is sloooooow. Browser console output is filled with warnings.
5. In order to put together sheets from multiple workbooks into a browser-based view, you need to either a) load the jssdk for each of the workbooks and query for sheets, which is extraordinarily slow, or b) do it with the REST api, authentication with which is asinine in nature (see #1).
The answer is SAML/ADFS. You should look to enable this integration. If you are not using AD/LDAP, that's a whole different story. But SAML/ADFS is pretty much the standard way since Tableau is a Windows service, it is very natural to just use AD/LDAP/SAML.
It's been a few months, but I remember that getting the license activated offline was a weird process. Something like: point tabadmin toward a license file, which generates a number or JSON or some other file, which you then paste into or point the UI toward, which gives you another file to use in tabadmin... and at the end tabadmin gave me an error. Now when I go to "Manage Product Keys" it acts as though it is unregistered, but the server still starts without error (it did not before the failed activation ritual).
I do have a ticket in with support for this.
Given how much of a bitch it was to activate (or half-activate), I'm reluctant to investigate further.
Also, I'd like to see a linux server. Tableau is our only Windows server, which weighed heavily against the product when we were considering alternatives.
All of these issues mentioned here will be sent to the server product owners and managers. :)
I am, however, on the maps team, and I'm curious about the issue mentioned above. I'll see what I can find internally on this. I am rather curious, since this isn't something I have seen.
When they give you a license file, it's cryptographically signed with their GPG key, and the public key resides on the appliance for verification. All you have to do is get that license into the system, either by USB key, typing it in yourself in Vim, or simply uploading the license file in the webUI if you have access to it.
- I have to explicitly add each server IP address. I have no way to trust an entire subnet or range of addresses. This is a huge problem in an auto-scaling app server environment where I don't know the IP addresses my app servers will have. It is a major annoyance to developers whose DHCP-assigned, dynamic IP addresses keep changing.
- There is no API for adding trusted IP addresses. It is a manual process.
- The Tableau server must be stopped and restarted to add new trusted IPs.
There is so much low-hanging fruit. I feel like anything related to actually running and maintaining Tableau is ignored, and judging from the comments here, I don't seem to be the only one who thinks so.
I would add that I'm disappointed the only way these issues get attention is through articles and threads like this.
Also, lack of Sharepoint integration / ability to handle federated login services with OData connectors.
Minor issues though, I'm a huge Tableau fan.
It's an unholy combination of Rails and Postgres somehow hacked to run on Windows. Really, they should just ship a Linux VM that runs these things decently.
It would be interesting to know what problems you have faced.
Many Linux services have a concept of reloading: if the config file changes, you can send the running program a signal and it will re-read the config (a small sketch of the pattern is included after the examples below). This is very useful for production systems.
Tableau (9 at least) has no such concept.
Change the email address it reports to? Restart tableau.
Change the location of the SSL certificates? Restart tableau.
Want to apply an update for Tableau? Uninstall your current version and install the new one. Oh, and until recently, when you downloaded the installer for Tableau Server, the file name didn't actually contain the version number.
This product was not designed with ops in mind at all.
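For anyone who hasn't seen the reload pattern, this is all it amounts to (a toy sketch of a generic Unix service, nothing Tableau-specific): re-read the config on SIGHUP instead of requiring a full restart.

    import json
    import signal

    CONFIG_PATH = "service.json"
    config = {}

    def load_config(*_):
        # Called at startup and again whenever the process receives SIGHUP.
        global config
        with open(CONFIG_PATH) as f:
            config = json.load(f)
        print("config reloaded:", config)

    signal.signal(signal.SIGHUP, load_config)   # `kill -HUP <pid>` triggers a reload (Unix only)
    load_config()
    # ... main service loop keeps running; no restart, no dropped sessions ...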
Edit: I forgot, I've actually had a Tableau server fill itself up with logs. Tableau has logs in many different locations outside of Windows Event Viewer and doesn't include log rotation facilities for all of them.
It's like that because R&D and Operations never talk; and of course your average Windows Ops person has a poor understanding of operating systems.
Never understood why Tableau is either Windows only or has the restart to reconfig issue. Last I looked, it was largely a Tomcat and PostgreSQL based product.
Just trying to understand, since I've written software with the same restart-to-reconfigure workflow and would like to understand what causes it to be problematic.
If reloading was an option then there wouldn't be downtime, and I wouldn't need to schedule a maintenance window for something as simple as updating an email address. The idea being that if there is a config error during a reload, the system just continues uninterrupted with the original config. If I have to stop the system completely in order to run the config sanity checks when it starts again, the potential for prolonged downtime is much greater.
Would a system that did something like an internal cut-over be useful? e.g. try to start a whole new instance of the application, if it loads, then let it become the running application, if not, write an error log and shutdown?
It would still lose all the state associated with the previous instance, e.g. user sessions, but would avoid this specific issue.
I agree that it's pretty silly that things like email addresses need a restart, but I'm wondering in general how bad this pattern is.
An interrupted session isn't a big deal if it's an infrequent occurrence and they can just login again.
I think an improved solution in this vein would be a tool that would let you sanity check the config before reloading.
But yeah, this thing is a mess.
Did you ever consider there might be valid reasons some folks prefer *nix-based servers?
Check the timestamps of those messages vs mine. At the time there were only the two responses I mentioned.
Publishing to the desktop (Windows or Mac), to the web, to the cloud, or mobile devices (iOS and Android). Publish to the server once, consume on all supported platforms.
Deploying a copy of a current site for redundancy, testing or development. Install the app, backup the primary Tableau database with its admin utility (command line), restore it on the new box. All data, visualizations, users and permissions are contained in that single restore step.
Tableau means I spend time working with my data instead of the presentation of it. It's not a perfect product by any measure, and could obviously use some improvements, but it is a timesaver in many areas.