Hacker News
Salesforce is buying Tableau for $15.7B (techcrunch.com)
950 points by albertwang 11 months ago | 376 comments



So I have to deal with both tools at my company. The reality is that it all feels so incomplete from a BI standpoint. Managers are throwing money at front end tools like Salesforce and Tableau, but the entire back end stack is still pretty much the same as 20-30 years ago (big expensive Oracle-ish databases).

I think the development of Python and Jupyter and other less known things like Vega are much more interesting. Python is today the only "glue code" that puts all of it together, from data to insights.


> the entire back end stack is still pretty much the same as 20-30 years ago (big expensive Oracle-ish databases).

Other than the expensive part, is it really such a bad thing? I feel like relational databases are a pretty good fit for a wide set of use cases and have a huge amount of tooling.


The businesses come with requests that require complex SQL on millions of records of data that normally sit in various sources (warehouse, Salesforce, etc.). Unless you hire expensive data engineers, you can't do this type of work reliably. You can stick things together with expensive GUI-oriented prep tools like Alteryx, but you pay in reliability and, quite frankly, sleep. And forget IT; IT is so stuck in its ways that you'd be waiting years for each analysis, plus you'd spend 10x what you should.


Isn't the problem space here genuinely complex in terms of business-complexity? Is there some better alternative that doesn't entail some other massive tradeoff such as managing your own servers, creating ingress mechanisms from multiple systems, building your own version of salesforce etc.?

In short, is there any solution that "does everything you could possibly want" while ensuring you _never_ need to hire a data engineer? This is a holy grail that I don't think exists.


Yes it is :)

You have to normalize data taken from various sources of various age and complexity. So you really have to understand the data. You also have to really understand the questions.

I've worked with (and on) lots of these tools and projects; the complexity is never in the frontend, it's dominated by getting the data, getting the data right and into the right format.

If all you want in the end is a good looking dashboard on a website then you might as well build it yourself; because of the cost structure that can even cost less than buying one of the BI frontend tools (there's not a lot of difference in development time, but the BI frontenders are more expensive because they are rarer and the licensing is high).


From my humble experience: if you have a sales or product team that keeps pumping out spreadsheets in weird formats, you need someone dedicating a few hours to get a proper ETL, and if they are constantly changing the format or adding new things, you need a dedicated person just for that. Modern tools like Python or Power Query are not enough for this eternal war.


It's not that, it's the systems. 15 years ago I built a data warehouse, pretty sophisticated for its time, for a company that ran call centers. The amount of data that came off the call systems was staggering, and the format arcane. Every vendor patch had the potential to wreck the ETL process. Then there was account data from clients, and other internal systems.

The people and their spreadsheets were the easy part to control.


This basically reads like, "You need to have a data engineer." Or half an engineer and half an analyst.


Let’s say you have 20,000 tables in total for a company, spread across 10 different databases. You have no overview of the data and no comments. You don’t have a starting point for where information x is.

Welcome to my reality.

Would I love a data architect and a domain expert in my team? Yeah.

Will I run around like a headless hen booking meetings with everyone who even hints at working with data? Yeah.

Is this the normal procedure for Data Scientists in big and old companies? More so than I would like.

Oh! And I forgot that the security department will constantly deny you access to data you need (until you force their hand).


Everything you mention is true, and it is compounded if the data is healthcare related: privacy concerns, data from different systems that claim to be the same, preventing reidentification.


If you can get your data safely to S3, Athena can handle a lot of reporting and analysis use cases. The table or view definition can handle the normalization process. Full on ETL pipelines are sometimes (but not always) more engineering than necessary.

(Disclaimer: I work in data engineering at Amazon and use those tools in my day to day)


The tools and stack of Salesforce make building your own version extremely appealing.


Expensive data engineer here, I see nothing wrong with this >:)


I am hiring one in Krakow! Seriously though, in a team of 10 business analysts I can barely afford 1 data engineer. Business analysts tend to cost less and also be more "business focused", so they are an easier sell to management.


You seem to have an engineering problem, so hire engineers and perhaps fire some of those analysts. Don't make your business depend on someone else's tailored IT products, they will reap the profits you could be making.


Yes, I am a bit confused by that statement too. Isn't it good that complex tasks are handled by specialized people? Maybe it is my bias as an ex-big-data engineer / current data scientist, but it seems to me that a lot of the tooling is about as simple as it can be (yes, yes, complacency is the enemy of good; I mean there are no obvious things to improve as low-hanging fruit).


Tableau has a data ETL tool called “Prep” that helps with this problem, but it only goes so far. I think that’s where the problem truly requires a data engineer.


There are plenty of modern-day ETL tools like Funnel, Improvado or Dataddo to help with that part of the puzzle, though it does mean you have to pay for another SaaS each month on top of Tableau.


Exactly. Instead of buying ETL tools, start writing your own Perl and various logical, reusable components. Roll your own ETL, however you want it, in a terminal. So what if you have to learn vim, big deal! Mouse-driven interfaces are a huge part of the dysfunction.


Yeah, I was a little confused here until I realized I would just write some bash, Python, Perl, etc. script where some would advocate for complicated tools.
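To make that concrete, here is a minimal sketch of such a script: pure standard-library Python loading a CSV export into SQLite and normalizing a couple of fields on the way in. All names and data here are invented for illustration.

```python
import csv
import io
import sqlite3

# Pretend this is a CSV export dropped by some upstream system.
raw = io.StringIO("id,amount,stage\n1, 100 ,won\n2,250,lost\n")

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE opportunities (id INTEGER, amount REAL, stage TEXT)")

# Extract, lightly transform (strip whitespace, uppercase the stage), load.
for row in csv.DictReader(raw):
    conn.execute(
        "INSERT INTO opportunities VALUES (?, ?, ?)",
        (int(row["id"]), float(row["amount"].strip()), row["stage"].upper()),
    )

total = conn.execute("SELECT SUM(amount) FROM opportunities").fetchone()[0]
print(total)  # 350.0
```

The point isn't the dozen lines themselves; it's that they are versionable, testable, and cron-able, which is exactly where GUI prep tools tend to struggle.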


And after a few years you leave your job, a new person comes in and gets stuck with your script soup and lack of documentation.

Companies prefer well known products like Alteryx or Tableau because, despite the cost, it makes people easier to replace.

But I can't blame you for writing your own things. I'm currently replacing a large SSIS-based ETL process with Python, because I'm sick of SSIS randomly breaking.


Ungodly expensive.

PostgreSQL on the other hand - so good, so free!


Just make sure that you tune it for analytical workloads. Its defaults are very conservative and lean towards transactional workloads.
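For a rough idea, these are the settings most often raised for analytical workloads; the values below are purely illustrative, since the right numbers depend on your hardware.

```
# postgresql.conf -- illustrative values for a dedicated reporting box
shared_buffers = 8GB                  # default is a tiny 128MB
work_mem = 256MB                      # per-sort/hash memory; big joins benefit
effective_cache_size = 24GB           # planner hint about the OS page cache
max_parallel_workers_per_gather = 4   # let large scans use parallel workers
random_page_cost = 1.1                # on SSDs, favors index scans
```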


Do you have a link for more info on how to tune Postgres for analytical workloads? TIA


RDBMS = data silo


You make it sound as if this were a bad thing. RDBMS work well for many use cases. There are plenty of tools around to work with them. Good open source implementations exist.


The problem space (business) is not complex; it's incredibly simple. Unfortunately, the design patterns we use and the tooling are flawed. It has been this way for at least 25 years, and the RDBMS, languages, design patterns, and architectures (2-tier/3-tier) are the cause.

They make it simple to get started and even without knowing what you are doing you can easily churn out something that works if it is simple, doesn't change often, doesn't need to scale and deals with small amounts of data.

But similar to http://blogs.tedneward.com/post/the-vietnam-of-computer-scie...

"It represents a quagmire which starts well, gets more complicated as time passes, and before long entraps its users in a commitment that has no clear demarcation point, no clear win conditions, and no clear exit strategy."

RDBMS are the root cause.


I feel like you are coming from an alternate universe. NoSQL is the quick and easy thing to start with; then your needs become more and more complex and NoSQL just won't cut it anymore. Sure, NoSQL can scale and perform, but only if your needs are very specific and simple.

There are no major systems out there of even moderate complexity that aren't built on an rdbms.


Sorry, I didn't explain it clearly. RDBMS are the reason we write business applications the way we do now. They are the root cause; swapping out an RDBMS for NoSQL will solve nothing, because our languages, architectures, patterns, and libraries, and how we even think about solving these problems, all evolved on top of this and are flawed.


> There are no major systems out there of even moderate complexity that aren't built on an rdbms.

I don't think this claim is accurate.

https://www.dynamodbguide.com/the-dynamo-paper/


That's the problem: business systems are not complex, they are incredibly simple. They are made complex by the way we think about and structure our models and then interact with them. We have the wrong boundaries and the wrong languages.


Would you be able to point me towards modelling approaches/boundaries/languages that would be more appropriate? I'd be interested to learn about better alternatives, as I don't yet see the big flaws in relational models


Datomic (https://www.datomic.com/), a not-SQL cloud database with a robust and principled information model, competitive in use cases with RDBMS. In agreement with your parent, Datomic is very different, and it is most commonly used from the Clojure programming language, which similarly rejects much mainstream thought/practice. https://clojure.org/ https://clojure.org/about/state The talk Database as a Value (https://www.infoq.com/presentations/Datomic-Database-Value/) is a good starting point for understanding the flaws in relational model implementations (it has been argued on r/Clojure that Datomic more faithfully implements relational algebra than RDBMS do: https://www.reddit.com/r/Clojure/comments/99f4ln/datomic_is_...)

Counter-intuitively, Datomic is in violent agreement with /u/rqmedes where he said "A better alternative is having the data, data model and business logic tightly bound in one place. Not separated in multiple 'tiers'". Datomic inverts/unbundles the standard database architecture such that cached database index values are distributed out and co-located with your application code, so database queries are amortized to local cost. Immutability in the database is what makes this possible without sacrificing strong consistency; basically, if git were a database you would end up at Datomic.


Unfortunately there are no real alternatives. It's like operating systems: one or two systems have so much momentum that using anything else becomes extremely difficult, even when they are inferior in certain domains. See http://www.fixup.fi/misc/usenix-login-2015/login_oct15_02_ka...


A better alternative is having the data, data model and business logic tightly bound in one place. Not separated in multiple "tiers"

When one of these things change it changes the rest.


Consider the case when you are a big company and have hundreds of them.


I played with Denodo (data virtualization software) a couple years ago and thought it was pretty legit.

In theory, it could be used to provide that industrial strength abstraction layer between your Tableau/Looker/etc. and your bajillion weird and not-so-weird (RDBMS) data sources.

That would seem to make sense to me from the point of view of -- I would want my data visualization/analytics-type company to be able to concentrate on data visualization/analytics, not building some insane and never-ending data abstraction layer.

The part that surprised me was that Denodo could allegedly do a lot of smart data caching, thus speeding things up (esp hadoop-oriented data sources) and keeping costs down.

I'm guessing the other data virtualization providers can do similar.


I have had to work with Denodo for the past year-plus: a total nightmare. Data virtualization is a "good in theory" concept but a "doesn't work in practice" reality. Going back to the original sources for each query doesn't work; it will always be slower than using a proper analytics data warehouse. Caching doesn't help, because at that point you can just do ETL. Also, Denodo itself is full of weird behaviors and bugs; my team collectively decided it deserves the most hate of all the "enterprise" tools we use. One thing Denodo is good for is as an "access layer", but then maybe PrestoDB would be worth a shot, or even just SQLAlchemy and Python.


I don't understand why this gets downvoted. While it may lack context (with the aim of being controversial), it sparked a healthy amount of discussion here!


My Salesforce clients have increasingly been considering Tableau. Tableau has great out-of-the-box discoverability of geo fields, dimensions, and facts. The ability to auto-suggest a map of the U.S. based on data containing City/State is "magic" to power users.

The only barriers to Salesforce + Tableau adoption I noticed were cross-object JOINs and live vs cached data extracts.

Both issues were remedied by denormalizing the data prior to export. For example, a nightly flattened "view" of Opportunities with key related objects moved into columns.
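The nightly flattening described above can be sketched in a few lines; every object and field name here is hypothetical, purely to illustrate moving related-object fields into columns.

```python
# Hypothetical source records, standing in for a Salesforce export.
opportunities = [
    {"id": "006A", "amount": 5000, "account_id": "001X", "owner_id": "005Z"},
]
accounts = {"001X": {"name": "Acme", "state": "CA"}}
owners = {"005Z": {"name": "Pat"}}

def flatten(opp):
    """Denormalize one Opportunity: pull key related-object fields into columns."""
    acct = accounts.get(opp["account_id"], {})
    owner = owners.get(opp["owner_id"], {})
    return {
        "opportunity_id": opp["id"],
        "amount": opp["amount"],
        "account_name": acct.get("name"),
        "account_state": acct.get("state"),
        "owner_name": owner.get("name"),
    }

flat = [flatten(o) for o in opportunities]
print(flat[0]["account_name"])  # Acme
```

A BI tool pointed at the flat view then needs no cross-object JOINs at query time.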

Mulesoft is well suited to smoothing out the ETL challenges. Bringing them to the table could be a win for everyone.


Tableau might be friendly to analysts/end users, but it's a piece of junk at the back end. The Linux version they released is no more than a wrapper of the original C code in a JVM, causing huge performance issues and memory leaks. That being said, GL CRM.


> I think the development of Python and Jupyter and other less known things like Vega are much more interesting.

In that case you may be interested in Dash (dash.plot.ly). It’s a free and open source library that you can use to create dashboards online with Python only.


We love Dash on our team; anything that needs more than a Tableau dashboard goes into Dash. You can basically just treat it as a Flask app.

We write our back ends with FastAPI[1], which is usually just a wrapper around our ML models, then serve both Dash and FastAPI with gunicorn. The backend is given the uvicorn[2] worker class via the gunicorn -k arg[3], which greatly increases speed as well.

For personal projects you can use this stack in GCP's AppEngine standard environment to basically host your (relatively low traffic) apps for free.

1. https://fastapi.tiangolo.com/ 2. https://www.uvicorn.org/ 3. https://docs.gunicorn.org/en/stable/run.html#commonly-used-a...
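For reference, the serving setup described boils down to an invocation of roughly this shape (the module and app names are placeholders):

```
# hypothetical module "main" exposing the FastAPI instance "app"
gunicorn main:app -k uvicorn.workers.UvicornWorker -w 4 -b 0.0.0.0:8000
```

The -k flag swaps gunicorn's default sync worker for uvicorn's ASGI worker, which is where the speedup comes from.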


Another +1 for Dash!


+1 for Dash. It's very good.


Plotly ain't exactly free: you get 25-ish free plots and then you gotta pay.


That's not totally accurate... If you use our SaaS Chart Studio product then yes, but otherwise (i.e. in 'offline' mode) you can use our Python, R and JavaScript libraries as much as you like: they're MIT licensed.


PS: and Dash, which is what the GP was talking about, is totally free and unrelated to Chart Studio :)


I'd like to add that Dash helped me grok React far quicker than I'd expected. I code in a lot of React now and I'm only here because of Dash. Thank you for your hard work!


The backend has also evolved massively, from HBase to cloud-powered data warehouses. We have the ability to ingest and query petabytes with single-second delays now. There's also on-demand querying like Presto/Drill/Dremio, ETL systems like DBT, and the growing space of "data lineage" for seeing how data is connected and has evolved over time.

The real issue has always been the organizational problems of larger teams and companies as data gets split into multiple silos and needs ETL and cleanup before it's useful. The new abilities we have gained have increased the complexity and scale which can lead to new challenges, but the tools are definitely getting better every day.


If you have to write code for this stuff 90% of people won't be able to analyze things, and 90% of analyses won't happen because they won't be worth the time.


> big expensive Oracle-ish databases

Don't forget Teradata.

I found the same thing with MicroStrategy. I spent a lot of time reverse-engineering what I could from MicroStrategy jars to expose additional functions in their plugin interface (which is so incomplete it shouldn't be advertised). But the reality is it's a 20+ year old system with front-end updates; you can only put so many band-aids on it.

I think the only thing keeping MicroStrategy alive is its cube functionality and the businesses who have invested too much into it.


What is Vega? I only found the genome browser tool, unless that is what you're referring to?


It's a JavaScript graphing library. We've started using it at work a bit for our online dynamic graphs. It's a grammar of graphics on top of the d3 JS library.

We seem to prefer the Lite version, which is simpler.

If you look at the examples, you can click a button and go to a dynamic editor, which we rather like.[1]

https://vega.github.io/vega-lite/

If JS and web browsers aren't your thing, there's a Python version called "Altair".

https://altair-viz.github.io

[1]https://vega.github.io/editor/#/examples/vega-lite/stacked_b...
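To give a flavor of the grammar, a minimal Vega-Lite bar chart spec looks roughly like this (data inlined for illustration):

```json
{
  "$schema": "https://vega.github.io/schema/vega-lite/v5.json",
  "data": {
    "values": [
      {"category": "A", "count": 28},
      {"category": "B", "count": 55}
    ]
  },
  "mark": "bar",
  "encoding": {
    "x": {"field": "category", "type": "nominal"},
    "y": {"field": "count", "type": "quantitative"}
  }
}
```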


My guess is this, since we are talking about tableau: https://vega.github.io/vega/


Thanks. Upon first glance it seems extremely similar to Plotly and its viz data structures.


Vega is nice for its declarative, portable nature. Also, it makes certain things very easy such as embedding interactive Vega charts and generating images of charts without needing something like PhantomJS.


Hey - I'm hacking on a product which allows you to build interactive browser-based apps/dashboards from Python and dataset blocks. You can return altair (vega) plots from your Python and they'll get rendered in the app, and we're just adding the ability to import existing Jupyter notebooks this week.

It's still really early, but feel free to have a play and create an app. Here is an example app using the Prophet forecasting library: https://nstack.com/apps/rdA647Q/

I'd love any feedback, and if you'd like to chat to learn more, reach out to me on leo@nstack.com.


I can relate. The sales pitch is still “so easy anyone can do it”. As a result, corporate money is thrown after tools that enable business folks to build dashboards...which isn’t a good use of their time. :). In truth, they need good data folks working with them. I guess that doesn’t make for a sexy product sales pitch.


I think the counterargument is that it's not a good use of their time to convince someone to build it for them, when they can finally do it themselves in half the time :-)


Lindy effect. If it has survived this long, it will probably survive in the future too. Unless the quality of individuals working in non-tech goes significantly up in tech-savviness, I find it hard to believe this will change.


When I look at Jupyter notebooks, I ask myself why anyone bothers with out-of-the-box reporting tools or growing their own. Jupyter notebooks are the right mix of customizable and self-documenting.


The kinds of analysis and visualizations you can build in 5 minutes with Tableau (and the ability to explore the space of possible analyses and visualizations) would take hours of futzing to reproduce with python.


You can build a basic report in 5 minutes. Then you’ll spend hours tweaking it and making all the changes the boss wants. And then hours more discovering there are things you simply can’t do (but your boss won’t accept that).

And next week you will have to do it all again, because it’s all manual.


How many business users are going to use Jupyter notebooks? Meanwhile, someone with basic computing experience can create in-depth reports with Tableau in less than an hour.


> How many business users are going to use Jupyter notebooks?

None, because it's too much programming for IT to let business people have access to it, and it's not disguised as an office productivity app the way Excel is.

If they had access to it and had basic training on it that anyone already competent in any vaguely quantitative domain could handle, plenty of them could and would.

At least judging by my experience with SQL shells and similar tools that are both less powerful and less friendly than Jupyter + Python, and yet plenty of business people used them productively in enterprise environments (often right up until IT ripped them from their hands).


Python is making some inroads in shops that have been using SAS and/or R, but it’s a hard sell. Jupyter is even harder to push because of the server element.


This is a repost but since you’re interested - Netflix data science is very pro-Jupyter/Python/data analysis packages. https://medium.com/netflix-techblog/notebook-innovation-591e...


SAS and R are pretty common business-user tools that are basically the same thing as Jupyter. Tableau is nice, but you generally need to use something else to prep data for it, whether that's an ETL process IT sets up or something manual that an analyst publishes.


Netflix uses Jupyter for data science quite heavily: https://medium.com/netflix-techblog/notebook-innovation-591e...

But that’s for data science which is (hopefully) the foundation of actionable BI.


There are loads of options for BI backends. Snowflake, Panoply, Redshift... the list goes on and on and on.


The backend stacks of both make development feel like swimming in glue. The same managers you are referring to cut us over to these tools, and we subsequently experienced a massive decline in our business.


Are you referring to APM/infrastructure monitoring, etc.? I was curious to find out if you're thinking BI capabilities need to be just as powerful at the backend as these frontend tools.


In business analytics, backend > frontend. If you are a business analyst you want all your data at your fingertips, plus the ability to run somewhat complex transformations and calculations. You also need to be able to validate data with your own rules, because you can't always trust data from warehouses and other sources.

Then you want to take all your work and schedule it, so it runs automatically and refreshes all the reports depending on it. Finally, you want triggers on data changes/updates and good-looking email notifications with screenshots, attachments, etc.

Of all of the above, end users might only see the dashboards, PDFs or Excels on the frontend part. If you have a solid backend, you will be able to serve your customers at multiple times the speed and with far more insightful reports. At that point the difference between a Tableau dashboard (which does look good) and, say, Superset (which does not look as good) is somewhat negligible.
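The validation-rules part of this workflow can be sketched simply: each rule is a predicate over a row, and failures are collected before any reports refresh. The rule names and sample data below are invented for illustration.

```python
rows = [
    {"order_id": 1, "revenue": 120.0, "region": "EMEA"},
    {"order_id": 2, "revenue": -5.0, "region": ""},
]

# Each rule: name -> predicate that must hold for every row.
rules = {
    "revenue_non_negative": lambda r: r["revenue"] >= 0,
    "region_present": lambda r: bool(r["region"]),
}

# Collect (row id, failed rule) pairs instead of silently refreshing reports.
failures = [
    (r["order_id"], name)
    for r in rows
    for name, check in rules.items()
    if not check(r)
]
print(failures)  # [(2, 'revenue_non_negative'), (2, 'region_present')]
```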


Confused why you call SalesForce “front end”?


I interpreted that as front end to a database.


It's really a whole working methodology that includes an object model, which one would call backend.

Did I mention it's a nasty, fugly POS?


Yeah, it's certainly more than just CRUD, and I agree it's awful. I never had to use it in the early years, so I don't know if their original platform was poorly made, or if the accumulation of bloat & bolt-ons is what has made it this way.


Yeah but to someone who is in to working directly with RDBMSs, Salesforce is 100% front end. It's like how CEs think C is high level, but web developers think C is low level.


Mostly non-functional?


What is Vega?


> The goal of the Vega project is to promote an ecosystem of usable and interoperable tools, supporting use cases ranging from exploratory data analysis to effective communication via custom visualization design.[0]

0: https://vega.github.io/vega/about/


Wow, Looker last week and now Tableau. I’m only 90 days into running the open source equivalent inside GitLab, and these big revenue multiples are telling a fascinating story. I would’ve loved to be a fly on the wall for this negotiation. What a great outcome for Tableau shareholders!

With all that’s happening we’re definitely looking to pick up the pace, and would love to work with more contributors on the free open source alternative at Meltano (www.meltano.com)

Edit: just wrote a quick post with some open questions I'd like to explore around this deal https://meltano.com/blog/2019/06/10/salesforce-is-acquiring-...


Feedback on your homepage, https://meltano.com/: there's not a single visual on it :( Please show what kind of results one can get with this product.

Contrast with https://www.tableau.com/, which has a sample graph and "See it in action" right at beginning.


Also seconding this. Product guy here who has been using Tableau for years, curious about Meltano. Went through some of the site / installation guide / videos, then the GitLab issues log, and then found myself back here :)

Would like to see a few simple, visual stories about how one could derive business value from Meltano: ideally real-life use cases, but if you don't have those yet, just make them up.


Thank you for taking a look, you're right and we are working on it right now. This is perhaps a little more attention than we expected for our tiny 5 person project, and lights a great fire under us!


Also, you may want to set up a Patreon page, or other crowd-funding site if you're making this all open source.

There are companies that will pay for it just to have the alternative available to places like Tableau.


Thank you for the feedback, I've created an issue here: https://gitlab.com/meltano/meltano/issues/691 and we are working on it now

In the meantime, one way to see more is to checkout our getting started guide: https://meltano.com/docs/quickstart.html and also our YouTube channel which has weekly "Demo Day" videos sharing our progress: https://www.youtube.com/meltano

Really appreciate you taking a look at what we're up to!


Here we all are in the comments complaining about how all the focus is on front end instead of fixing the back end, and one of the first comments to a legit alternative is "yeah so what's it look like?" !!


Reality is that both need to be covered.


Exactly what we’ve been looking for. An open source ETL tool.

This looks great. Will check it out for sure. Keep up the great work.

I wonder if there are any other open source tools in this space?


There are some great tools in this space, definitely worth exploring.

Our vision is to glue the steps from ETL to dashboard together in an end-to-end solution. We pick whatever we consider best in class and integrate it. So far, we've got Singer, DBT, Jupyter Notebooks and Apache Airflow and we're using VueJS for both the product UI and our website.

We're also working on a blog post exploring what other acquisitions might happen in this space. We're adding suggestions to the spreadsheet as we hear them on Twitter, HN, etc.: https://meltano.com/blog/2019/06/10/first-looker-and-tableau...


Thanks for the product. I checked out Singer and DBT, both of which are great ideas and products. I built one very similar (conceptually) to DBT in my last data project. (I would not have if I had known dbt existed.)

Airflow is too heavy weight for me, I use Scons to do the workflow management.

Meltano must do a lot of work to integrate all this together. I wonder what the general user experience is. To me, it seems too heavyweight.


Ideally, to abstract away complexity (which is what causes that “heavy” feeling)


I'm building one called Ohayo: https://github.com/breck7/ohayo

It is still beta but should be ready to go in weeks.

Here's an example of the top 100 stories on Hacker News:

https://ohayo.computer?filename=help-for-hackernews.top.flow...


Would love to see if we can collaborate. Can you drop me a note danielle@gitlab.com?


done


Back when I was at a company that had frequent ETL needs, we tried a few, and eventually settled on Rhino ETL. It's not a pretty drag-n-drop tool like some of the others but it was simple and (therefore) worked really well.


I did some integration work with the venerable Pentaho project some years ago. It's got both an Enterprise edition and a Community edition that's open source with tools including ETL and more. https://wiki.pentaho.com/ looks like the best place to start these days.


I too need an open source ETL tool where I could programmatically specify the object mappings and have it just work.


> I would’ve loved to be a fly on the wall for this negotiation

I'm not so sure, I'd probably be too worried about spiders to focus much.


Funny coincidence: I was reading about you in relation to Mattermark, then found your blog post on Meltano, and now I find your comment on HN. You're everywhere! :)

Would it be possible to chat with you about your past experience at Mattermark? I am trying to answer a few questions (not related to the company, but related to VCs and investing). If so, my HN username @ gmail. Thanks!


Of course! Dropping you a note now


I just wanted to pop in and say Meltano looks awesome! Good luck with it.


What's the business model?


Have not decided yet, but I’m leaning toward consulting services and possibly even a marketplace. There are so many independent analysts, data engineers, and other experts/specialists who could provide a service network beyond anything we could hire internally.

It’s still very TBD while we get the product to a place where those opportunities start to be more emergent. We definitely expect it to be an open core model though, similar to GitLab.

By remaining self-hosted we avoid the big expense (and risk) of storing users’ data, and they can pick whatever cloud they want. Our team is 5 core members working at GitLab, and we have about a dozen contributors. So it’s kind of a startup within a late-stage “startup” (unicorn).


The work you are doing at Meltano is simply brilliant. It needs to be done and if done well I have no doubt you will be very successful. I plan to use it.


One thing I've had to remind myself of - coming from a CS/Engineering background similar to most folks on HN (I'm guessing) - is that there are 2 types of people: Those who program and those who don't.

To me, Salesforce looks like a big shared Excel file with a bunch of sheets. Tableau... well I can do the same thing with some scripts or spin up a web server.

To others, this tech is just magical. Pay the money, do the integration and it just works... And clearly people will pay a lot of money for things that "just work".


>well I can do the same thing with some scripts or spin up a web server.

That's a massive waste of your time and effort. The maintenance costs become massive as well should you choose to create your own system. At the VERY least you should leverage existing free open-source tools such as Metabase or Superset or Dash, or free tools such as Google Data Studio or Mode Analytics, if you're not going to spend cash to get a tool like Periscope Data/Looker/Tableau. I mean this gently, but you likely underestimate the complexity of a reliable reporting/analytics infrastructure. Think about it this way - these tools are either collaborated on by a large open-source talent pool, or are created by teams of dedicated software engineers just as talented as you.

I've worked with quite a few companies in an analytics consulting type role, and your "I can do the same thing with scripts" statement is one I've heard countless times. The long-term maintenance costs and technical debt (and "rigidity cost") of rolling-your-own analytics far outweighs the cost of a true analytics platform.

If you decide to roll-your-own anyway, look at tools like DBT and Airflow to reduce long-term maintenance costs.


> that's a massive waste of your time and effort.

Yeah I work at a company in the analytics space and see that all the time. It peaks the curiosity of people who are software developers (yet their core competency at their job is something else). They think it's a fun work side-project and go after it. Write some Python scripts to do ETL and process the data... make a backend with pg, a web server, then do charts in d3.js.

A year later they have a bunch of nice demos for their bosses but nothing they can actually use in production, because it crashes, there's no UI for interactive queries, no reports for people in the business groups, no user management, etc. Then they drop it because they're busy with their actual job. So the cost of that engineer's time to do something that didn't work was about $20-30k over a year, while the product they could actually use in production was around the same price.


just an FYI, the word you're looking for is 'piques' not 'peaks'


This sounds like it should be included in my yet-to-be-written biography. I fell into exactly this trap at one point.


Out of genuine interest and perhaps even a need for such a solution, is there anything that does what Metabase/Superset/Dash/GDS/ModeAnalytics does but for realtime datastreams? For instance, parsing and recording and visualizing the events coming from a websockets or some event queue/bus?


Yes, Perspective is built for streaming data - https://perspective.finos.org. It's an open-source streaming pivoting engine that operates in the browser (using WASM in a web worker, with Apache Arrow integration for ingesting binary data streams off websockets).


I believe you'd want a tool such as Grafana (which I believe is free/open source), which I have seen eng teams implement for realtime streaming. There is also Kibana which I am less familiar with.


Those all work for that already, because they're just frontends to whatever backend system you're using. You use a database like BigQuery with real-time streaming inserts and rerun your queries to refresh results.

If you want analytics directly on the stream, then there are plugins available to support reading the query results of something like Kinesis Analytics or Confluent's KSQL.
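For intuition, "analytics directly on the stream" mostly boils down to windowed aggregation over events as they arrive. A toy sketch in plain Python (the `events` list stands in for a websocket/queue consumer; real engines like KSQL do this incrementally and at far larger scale):

```python
from collections import deque

def sliding_window_avg(events, window=3):
    """Average the last `window` values as each event arrives."""
    buf = deque(maxlen=window)  # old values fall off automatically
    out = []
    for value in events:
        buf.append(value)
        out.append(sum(buf) / len(buf))
    return out

# Each incoming event updates the rolling aggregate a dashboard would redraw.
print(sliding_window_avg([10, 20, 30, 40], window=3))  # [10.0, 15.0, 20.0, 30.0]
```

The dashboards above are essentially this loop plus a chart refresh on every update.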


You might want to check Striim - it wraps kafka + streaming transformers with a scripting language, and a visual pipeline design tool. It then offers some really nice dashboards with a real-time “feel”. Not affiliated in any way but I took it for a spin and was happy to see how much could get done with it in just a few hours and without prior knowledge.


If you're familiar with Druid, check out Metatron Discovery (https://metatron.app). Druid is strong for data streams.


Ah yes, as the other user mentioned, AWS Kinesis Firehose and GCP Pub/Sub may fit the bill for ingestion as well.

Grafana/Kibana are the actual charting tools.


How does Tableau make your reporting more reliable? My understanding is that Tableau can hook into different data sources -- like your warehouse and SalesForce, for example. Then you can write some kind of SQL to generate charts.

The auto-chart generation is nice. But what about Tableau makes it more likely to be accurate? Aren't you just as likely to make an error in the SQL in Tableau as if you didn't use Tableau?


I never said that about tableau in particular, which is why I listed half a dozen analytics platform solutions. Using any of those makes for a more powerful, flexible, sharable, usable tool than one engineer's self-rolled internal webpage with "analytics". Nothing special about Tableau in particular, in fact, I categorically prefer SQL-based visualization tools over Tableau.

The only exception to "using any of these is better than creating your own" is large companies like Google and Facebook, where they have entire teams of engineers dedicated to creating in-house SQL + visualization tools. It is absolute hubris for one engineer to think they can make a robust analytics platform!


Tableau is a GUI first. It's not designed for SQL based access and barely supports it other than as a type of custom datasource.

Since people aren't typing code, it can be more accurate to use, and it provides visual results beyond just a table that can be useful in detecting anomalies in your data.


Na. Roll your own and munge text however you want. Don't cede control to bad GUI/web tools.

If you are good, you aren't "scripting", you are making a rad MVC system.

Salesforce puts all the MVC together in nasty, nasty ways.


This... is not good advice. Do you tell people to roll their own databases and networking protocols as well?


As an engineer, this has really fascinated me. Tableau, Salesforce, Excel, etc. were things that never made a significant amount of sense to me. I thought mostly of Salesforce as a CRUD app (which it is!), Tableau as d3 with nicer ergonomics, and Excel as... well... something I never understood.

If this is something you guys are interested in, I started a company called Retool (https://tryretool.com) that is essentially Excel for developers. Imagine if every Excel cell — instead of being a cell — were instead a React component. So you drag and drop these components around, and you can connect them to any back-end datasource (postgres, APIs, etc.). So you could drag on a table and have it pull data from `select * from users` from postgres, and then drag on a button and have it `POST` the selected row back to your API, in order to ban a particular user. The goal is to let end users build CRUD apps (like Salesforce) around their existing datasources quickly.

If you guys have any feedback... I'd really appreciate it. We're just starting out, and really curious to get any feedback from developers. Thanks!


When I looked into this a few months ago, your pricing didn’t make sense to me. You’re saving developer time, but charging on user time. So I’ve got 800 people in this org and I could either pay a shit ton to use retool or just build it from scratch. The difference between both of these is my time so it makes sense I’d pay you for that.

That made it not viable for me. Building from scratch was way cheaper with Upwork. Anyway, Product was cool.


Thanks for the feedback — that's really helpful. I agree that our pricing isn't good for a large org that is looking to build a tool or two (instead of say, a few hundred).

Do you mind if I email you? (I can't see your email in your profile, but my email is david@retool.in if you're interested in reaching out.) I'm really curious — just to learn — what kind of pricing plan might work for you. Would a per-app model work for you, for example? Thanks!


I've since left the company but one of my friends working there is the one who suggested we try the product anyway. I'll tell him to contact you.


so you're rebuilding Visual Basic in react.


Do you mean that as a good or a bad thing? VB was fantastic to many people!


absolutely a good thing.

It was a light hearted jab at the description that made it sound like something completely original and never tried before :)

I hope he does well, those tools are super useful.


I would have to delve deeper to give actually useful feedback, but I wanted to say that from a presentational point of view it looks great. Well done for putting something like this together.


Unrelated to the tool, your homepage is absolutely gorgeous. I imagine the rest of your execution is along those lines, which is to say - excellent. Well done!


Uh this is super super cool, well done.


To put a finer point on it, they'll pay a lot of money for tools that require less-skilled labor to operate. They will gladly spend a lot of money on Tableau to avoid paying the salary premium to have "those who program" build them an analytics system. It costs much less to hire business analysts who don't program (aside from SQL).


Now if they just taught business analysts how statistics work.

If I had a dollar for every time someone showed me a wrongheaded graph they were using to make business decisions, I could retire.
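A concrete example of a wrongheaded graph: aggregation bias (Simpson's paradox), where the overall chart points the opposite way from every segment. The classic illustrative numbers:

```python
# Toy Simpson's paradox (the classic kidney-stone numbers): treatment A wins
# in every segment, yet the naive overall chart says B wins.
a = {"small": (81, 87), "large": (192, 263)}   # (successes, trials)
b = {"small": (234, 270), "large": (55, 80)}

for seg in a:
    # A has the higher success rate within each segment.
    assert a[seg][0] / a[seg][1] > b[seg][0] / b[seg][1]

overall_a = sum(s for s, _ in a.values()) / sum(t for _, t in a.values())
overall_b = sum(s for s, _ in b.values()) / sum(t for _, t in b.values())
print(overall_b > overall_a)  # True: the aggregate alone points the wrong way
```

An analyst charting only the overall rates would conclude exactly the opposite of what the segmented data shows.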


There are varying degrees of skill and competence in business analytics just like there are in software development.


>is that there are 2 types of people: Those who program and those who don't.

Is that fully accurate, though?

Where does that leave data scientists / data analysts? I know SQL very well, and I know python's data stack (numpy, pandas, matplotlib, plotly, seaborn, various stats toolkits). I have a strong understanding of the "programming ecosystem" i.e. concepts, terms, definitions, and so on. I understand (basic) computer architecture, I've used and am familiar with (basic) shell/terminal, and services like Docker/Heroku on the command line, and can certainly use GUI cloud tools for AWS, GCP, etc. I can read and understand code and how systems fit together. I've worked alongside engineers of all types.

But I'm not a software engineer. I don't tell people I "program" because my strongest skill is SQL and generally people do not refer to that as "programming".


I’d instead use the distinction “those who can sling code when necessary” versus “those who cannot/will not sling code.”


you are a programmer. At least in my book. You may not be experienced in architecting software projects, or writing highly maintainable code, or dealing with niche frameworks and tools (gui frameworks, network systems). But that is just a lack of training, interest, opportunity, etc...


This is nice to hear. Truth be told, I'd like to get some professional software engineering experience. The problem for me is:

1. It's hard to get considered to be a SWE in the first place when your job titles are more in the realm of data analyst. They'll toss out my resume for a fresh grad, much less someone with experience, without a second thought.

2. If I were to make the switch I'd likely have to start at level 0 on the career ladder. I've already career-changed once into tech, and at this point I do not wish to "reset" my experience another time.


"Data Analyst" is often programmer lite. If you've been doing that work for a few years you could easily transition into a full time SWE role. If you really need a confidence boost and/or some on-trend training, consider doing a full stack bootcamp.


And those who can program are split into those who can estimate work and those who think they could replace a product like tableau with a few scripts.


Yea no this is wayyyy underestimating the costs of data engineering. I can tell you from my very limited experience some queries I've seen, on some systems and some datasets, have estimated TTLs in YEARS -- and that's the difference between companies that invest in tooling and infrastructure (and big O analysis during system design) and those that don't. And you will invest a good deal of effort and time and money before you realize it, it's like the perfect ambush.

Never start a land war in Russia, and never neglect your data infrastructure if data is in any way a key business differentiator/fundamental in your market.


And this attitude right here perfectly exemplifies “NIH syndrome”. Beware if you work somewhere and a colleague suggests that some homegrown hacked-together non-feature-complete solution would be competitive and “cost less” than the professional product.


Wait until a manager asks you to do something that Salesforce was not built for and IT tells you a back-office resource can do it. None of those off-the-shelf enterprise solutions just work; they all need some degree of customization and integration. I sometimes struggle to justify the cost of a vendor solution plus customization versus rolling out my own, or an open-source one that can be more easily built on top of.


Everything you write, I cheer.

Massive customization, and then you are bound by this broke-ass object model to get it all done, caught between Apex and the nauseating Visualforce crap.

I don't want to pay hundreds of thousands of dollars for the right to write database-driven web pages.


> I can do the same thing with some scripts...

I thought that the point of Tableau was to provide a tool that end-users (who can't program) could use to interactively explore their data. That's not something you can replicate with a bunch of scripts.


I can see how Tableau would be great if you are in a situation where you can have business guys creating their dashboards and exploring their data.

The issue in practice is almost always getting the data in a workable state so that you can manipulate it easily in Tableau. In my experience in smaller and mid-sized places, Tableau tasks get punted to analytics and data science, because they are needed anyway to get and transform the data in the first place. And these people usually prefer and are capable of using more technical tools than Tableau. I know I would rather use Shiny or Dash.

Maybe that's not a difficult problem in larger corps.


If you just want a chart based on a DB query, then yes, it's a waste of time and money. The value of a lot of these products is that they have query/charting builders, which take a lot more effort to implement. These days a lot of businesses aren't willing to accept analytics being controlled by a single person or team, as they're used to anyone being able to do their own custom thing.
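For a sense of why the builders are the hard part: even a toy chart builder has to turn point-and-click selections into safe SQL. A minimal sketch (the `sales` table and its columns are made up; the whitelist stands in for a real semantic layer):

```python
import sqlite3

# Toy query builder: whitelist-validated so user picks can't inject SQL.
COLUMNS = {"region", "product", "amount"}
AGGS = {"sum": "SUM", "avg": "AVG", "count": "COUNT"}

def build_query(group_by, metric, agg):
    if group_by not in COLUMNS or metric not in COLUMNS or agg not in AGGS:
        raise ValueError("unknown field or aggregate")
    return (f"SELECT {group_by}, {AGGS[agg]}({metric}) "
            f"FROM sales GROUP BY {group_by} ORDER BY {group_by}")

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (region TEXT, product TEXT, amount REAL)")
con.executemany("INSERT INTO sales VALUES (?, ?, ?)",
                [("east", "widget", 10.0), ("east", "widget", 5.0),
                 ("west", "gadget", 7.0)])
result = con.execute(build_query("region", "amount", "sum")).fetchall()
print(result)  # [('east', 15.0), ('west', 7.0)]
```

Multiply this by joins, filters, date grains, permissions, and caching, and the gap between "a chart from a query" and a self-serve BI tool becomes clear.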


Yep. UX is by far the most underrated aspect of product development by engineers.


> Those who program and those who don't.

And you should know that the ratio between those two is something like 1:1,000,000.

So, for most of the world, tech is essentially "magic".


Are you sure your scripts and web server costs less than Tableau?


Probably not. From his comment I doubt he's even used Tableau (and I say this as a programmer who worked on something similar - https://perspective.finos.org - and yet I still find myself turning to tableau pretty often).


We were getting ready to do a Tableau API integration with our software, and the licensing came out to about $100k for two server dev licenses and six client licenses.


Yeah, Tableau is very expensive. Way more than a few weeks of dev time.


Ah of course. Just a few weeks of dev time to build the same thing a multi-billion-dollar company sells. Then after a few weeks of dev time you find out you're missing about 80% of the features that make Tableau useful. After that few weeks your manager asks how the project is going and you just mutter something about how you should probably just buy Looker or Tableau.


Nah, these sorts of folks are far too obdurate and prideful to ever willingly admit defeat. More likely they would keep telling the manager that they just need a few more weeks time to finish it while they frantically keep rolling that Sisyphean boulder up the hill.


> Way more than a few weeks of dev time.

Except it's not just a few weeks of dev time that makes up the overall cost. Consider infrastructure, maintenance, updates, support, training, etc. Those things start to add up, and you don't get the benefit of scale/community if you do it on your own. There's also the opportunity cost of building your own system when you could buy something existing and use that time to work on other things.


True, I'll give you that it depends on the situation, and it might indeed be the better option. For a startup I wouldn't recommend it, at least.


Tableau is $850/yr per seat. Very expensive might be overstating things a little.


So how many seats would it take to instead hire somebody who creates a custom Shiny + R app that does exactly what you want? How big is the average company that makes use of Tableau etc.? Let's do the math ...
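Taking the $850/seat figure cited above and a hypothetical $150k fully-loaded developer cost, the break-even is roughly:

```python
# Back-of-envelope: how many Tableau seats equal one developer-year?
seat_cost_per_year = 850        # per-seat figure cited in this thread
dev_cost_per_year = 150_000     # hypothetical fully-loaded developer cost

breakeven_seats = dev_cost_per_year / seat_cost_per_year
print(round(breakeven_seats))   # ~176 seats before one dev-year is cheaper
```

And that is one dev-year with zero ongoing maintenance, which is never the real cost of a homegrown tool.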


It’s 2019. Are we seriously still trying to argue that homegrown solutions are overall more cost effective than SaaS? Are we still of the opinion that developing tooling outside of your company’s core competencies is somehow adding value?


Absolutely. There are tons of people I encounter every single day who seem to think it is cheaper to build everything in-house. I think it comes from a complete misunderstanding of how a business delivers value, coupled with an RMS-like distrust of anything they cannot completely crack the lid open on and mess around with. Mostly the former, some of the latter.


Well it's probably a question of perspective and problem domain. Also my main concern is precision and correctness. Quite often you can choose between (1) customized tools that achieve almost what you want them to do, but often the people who implemented or customized them didn't fully understand the problem and did things wrong without anyone noticing for a long time, or (2) custom tools that do exactly what you want, that were tested by somebody who cares.


I actually felt the same way about Salesforce until about a year ago, when my company bought it. It certainly has a Duplo-blocks side that lets non-technical users do complex things (aka Excel), but their backend is extensible through a language that is more or less Java for custom handlers/queries/etc., and they are switching their front end to W3C-compliant web components for developer maintainability after getting burned by rolling their own front-end framework in the early Angular 1 era. They have super tight integration with Heroku Postgres DBs (because they own them) and other neat technical tricks up their sleeve, to the point that my opinion has changed quite a bit.


This seems really high for a company without earnings and a weird growth curve. Their ticker is cool, and maybe Salesforce wants to be DATA on the Nasdaq.

Otherwise, it will be hard to justify this high markup for a tool company.

It will be awesome if Salesforce can adjust their model and make Tableau spit out D3. Their desktop tool is nice for designing, but their server components seem frequently unnecessary for running the visualization. The catch is that creating serverless dynamic visualizations isn’t all that money-making and the cool UI/UX design tool is outside of OSS’ wheelhouse.


A lot of companies use Tableau to visualize their data. In the future, if they want to use Tableau they (speculation) might have to host their data in SalesForce and/or be directed to use a number of other SalesForce features if they want the latest coolest dashboards. To me this acquisition seems to be a play toward growing customer usage of SalesForce licensed functions and data migration to their cloud, rather than based on the current revenue stream. The lock-in is strong.

SalesForce has been pushing Einstein Analytics recently. I haven't used it, but I do see that moving an organization from Tableau to Einstein has a lot of costs involved so this would be a hard sell in many places. Having them both under one roof means they're able to bring a bunch of people across to their cloud and now that license revenue year over year is theirs with the additional data lock-in.

As someone that really dislikes vendor-forced lock-in and generally dislikes the way SalesForce controls your data, rate limits, maxes, licensing, etc, this move is about even more control up your stack that will seem like a "no brainer" to decisionmakers, which is dangerous. That said, I'm sure it will work well for some organizations.

EDIT: I would also love to see them spit out D3 or other open visual but then they'd be losing control of the secret sauce and the requirement for a license. Not sure there is an incentive to go that route.


My company is using Einstein Analytics; however, beyond the Salesforce integration it has been a terribly disappointing experience. Loading data is a pain and has to be done through a custom API. And of course Einstein Analytics needs its own language, SAQL, resulting in a steep learning curve.


Yeah, I put in another comment that this acquisition is a vote of no confidence in Einstein Analytics. You may see it phased out in the next few years.


Yeah SalesForce NEEDS Tableau. It is like a missing puzzle piece in their suite. Einstein is essentially enhanced reporting for SalesForce even though they sell it as an analytics tool.


That sounds right. This acquisition maybe gets rid of the curve there and for orgs like mine that use Tableau and SalesForce it sounds simpler / easier in future (Einstein has conversion and learning costs as you say). However, that comes with a cost - loss of any freedom to use your data without worrying about arbitrary license changes and cost. The way they structure the licensing will be interesting since they’ll have leverage over any customers.


You may want to take a look at [perspective](https://perspective.finos.org/), a WASM-based serverless dynamic visualization library with [D3FC](https://d3fc.io) charting.


Thanks, this is a good tool. There are a few libs that spit out D3, but the UI isn't as usable. Tableau is popular with the Excel crowd that I've been able to work with.


>This seems really high

Precisely what I thought. It looks like their annual revenue is ~ $1 billion, placing this price around 15x annual revenue.

However, Looker has about $131 million in revenue, so their purchase price was an even higher 20x annual revenue.

My conclusion is that these acquisitions are much less about sales revenue and much more about filling strategic holes in product offerings, and I can only assume it's a sellers market in that area.
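For reference, the multiples work out as follows (using Google's reported $2.6B price for Looker):

```python
# Rough revenue multiples behind the two acquisitions discussed here.
tableau = 15.7e9 / 1.0e9     # $15.7B price on ~$1B annual revenue
looker = 2.6e9 / 131e6       # $2.6B price on ~$131M annual revenue

print(round(tableau, 1), "x vs", round(looker, 1), "x")  # 15.7 x vs 19.8 x
```

So the smaller target actually commanded the richer multiple, which fits the "strategic hole" reading.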


Multipliers don't scale linearly. The lower your valuation, the easier it is to get a higher multiple, for various reasons (cash, number of bidders in the market, etc.).

At $1B/year in revenue there aren't a lot of companies that can realistically acquire you. At $131MM/year, there are.

But still I agree. Both are quite high.


Those strategic holes are probably worth more to the acquiring company than the price they paid in terms of new revenue.


> Their ticker is cool and maybe sales force wants to be DATA on nasdaq.

CRM is a pretty darn good ticker for Salesforce. Why would you change it? It perfectly explains your core business


CRM is the best ticker name for Salesforce. DATA works, but on a different level.


CRM is a much smaller market than cloud and data.


I have worked with some companies that love Tableau so much, they use the desktop and server versions a lot. The licensing costs are high, but it does the job.


>Their ticker is cool and maybe sales force wants to be DATA on nasdaq...Otherwise, it will be hard to justify this high markup for a tool company.

You make it sound like the ticker symbol justifies a $15.7B price tag...


I didn't intend that; I was just searching for reasons for such a high valuation.

I think we are many years away from $15B vanity ticker symbols.


Because they are the clear market leader in the category, and have been for a long time, only recently falling behind Microsoft (within the last 2 years of their 20 years as a company). Give it a year or two to integrate into Salesforce and they will be neck and neck. This is just as much a bailout of Tableau to save them from being dwarfed by Microsoft.

https://interworks.com/blog/ksloan/2019/04/03/the-2019-gartn...


A CEO of a soon-to-IPO company should most definitely take "AWS" as the ticker.


Very expensive acquisition.

Perhaps in 5 years we'll look back at this move as a prime example of the bubble we're currently in.


Or maybe the beginning of an even bigger wealth-creation period. "And why did I not invest earlier?"


Or maybe neither, and just a weird factoid about a corporate acquisition that wasn't indicative of any larger market trends. (All possibilities now hedged).


NP complete?


Tableau has a research division they themselves acquired recently in Boston.


Last year they acquired Empirical Systems, "an eight-person artificial intelligence startup based in Cambridge, Mass" [1].

However, 'Tableau Research' [2] has existed for years and its researchers regularly publish at major academic visualisation conferences like IEEE VIS (InfoVis/VAST) and EuroVis (see [3]).

[1]: https://www.geekwire.com/2018/tableau-acquires-mit-ai-spinof...

[2]: https://research.tableau.com/

[3]: https://research.tableau.com/papers


Yes, but they're not hardcore statisticians like the Empirical people, except for this Daniel Ting fellow.


> It will be awesome if Salesforce can adjust their model and make Tableau spit out D3

Can you clarify?


Tableau worksheets and dashboards are proprietary and need either Tableau Reader on the client's desktop or Tableau Server running proprietary server-side components for the JS libraries to draw them. The server is expensive to license and limited in how you can host it. In many situations the data is small enough to just send to the client.

It would be nice if tableau would just generate static content that could be hosted anywhere.

There's no client tool for D3 as nice as Tableau. I work with lots of scientists who learned Tableau but aren't really programmers and can't figure out D3 or other libraries.
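For what it's worth, "static content that could be hosted anywhere" already exists in the open-source world as declarative chart specs like Vega-Lite: the chart is just JSON plus data, renderable client-side on any static host. An illustrative spec (the field names and values here are made up):

```python
import json

# Illustrative Vega-Lite-style chart spec. A tool that emitted static specs
# like this, with the data inlined, could be served from any plain web server
# with no proprietary backend.
spec = {
    "$schema": "https://vega.github.io/schema/vega-lite/v5.json",
    "data": {"values": [{"region": "East", "sales": 120},
                        {"region": "West", "sales": 95}]},
    "mark": "bar",
    "encoding": {
        "x": {"field": "region", "type": "nominal"},
        "y": {"field": "sales", "type": "quantitative"},
    },
}
print(json.dumps(spec, indent=2))
```

What's missing, as the parent says, is a designer UI as approachable as Tableau's that writes these specs for non-programmers.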


Maybe they meant salesforce's business model, which doesn't usually include outputting stuff in an open format? I'm not sure.


Sort of. Salesforce is more interested in selling cloud licenses and space. They push proprietary apps and such, but maybe they won’t care so much about hosting tableau server specifically.


Salesforce currently has a P/E ratio of 104. The headline might as well read "Salesforce is buying Tableau for $15.7B in Monopoly money."

This deal makes a lot of sense for Salesforce. They should be (and are) on an acquisition spree.

But if I had stock options (or any kind of locked-up equity) in Salesforce, I'd be worried right now. Someone is going to be left holding the bag.


I held Salesforce stock from 2016-18 and it just helped us buy our first home. They have incredible lock-in with a very large and diverse customer base. They now have to defend themselves against Google's purchase of Looker and the rise of Power BI.


Are they serving the same market as Google? It seems as though the two are serving different use cases, both of which need data viz. Salesforce gives you a sales database and a front end out of the box, and you can tweak it and add to it with add ons. Google gives you compute and storage, but the application logic is all owned by you.


That's what makes this seem like a strange deal to me. There's pretty much 0 overlap with Google on CRM, and the next comparable tech stack that does compete with them is MS Dynamics CRM combined with Power BI. But Dynamics doesn't appear to be a growing threat for CRM marketshare.

That said, Salesforce (based on my usage ~4 years ago) has truly awful baked-in BI and analytics, necessitating third party products and data engineering to fill that gap. Tableau will fix that, but I'm staggered at the price.


Salesforce is a sales organization par excellence. They sell whatever they have, be it social media monitoring, CPQ, marketing automation, Heroku, Jigsaw data, IoT etc. they develop deep wallet share in large accounts and are constantly cross-selling. Their core audience is business people, and that is in stark contrast with GCP which is almost entirely dev focused. I’m not saying CRM is an afterthought but they have plenty of accounts that don’t use their CRM.

That said, Google has deep inroads with their apps suite, and it's really a race to have their tech run business processes. That, and ecosystem players like Insightly are making big inroads into CRM's SMB market.

Their analytics is shit, but I think the important thing here is that they know their audience very well, and business people love Tableau.

They bought $15B worth of sales leads for cross selling their portfolio of services.


So is Google making a play for Salesforce's core business? I don't know a lot about the details of what Salesforce does, but from my view it looks like there is not that much overlap.


As I think about it, I don't see this as a play for business or much of direct competition at all, they simply both had strategic gaps, but for very different reasons, in their BI/Analytics offerings.


Good points, it's easy (well for me it was) to overlook gradual expansion beyond the CRM sphere.


Interestingly, I think an earlier version of the title had "in all-stock deal", but I think the mods removed it. I wish they hadn't, as that's an important detail for the reasons you mention. (Also, ask James Baker, who was convinced by Goldman Sachs to sell Dragon Naturally Speaking in an all-stock deal ... of a company that turned out to be the Belgian Enron.)


I was in speech recognition tech during that time. It all tanked, but Dragon got screwed the hardest. ScanSoft (now Nuance) made out like a bandit with all the tech fire sales post-bust.


Yeah, I can't imagine how Baker (and all the employee-shareholders) must have felt about being advised not to do the standard half-cash-half-stock purchase, only for the acquirer to turn out to be a fraud so soon.


Yes, the title of the post was changed after my comment.


My guess is that Salesforce was also in the race to acquire Looker, and since that fell through they picked up Tableau because they had sunk the work into acquiring a data visualization company. The timing is uncanny, but the price tag is vastly different.


Few if any institutional investors are looking at CRM on a P/E basis. They're valuing the company usually on EV/Sales and EV/FCF.


That's a strange claim.

First, how do you know what all institutional investors are thinking?

Second, Sales and FCF straddle Earnings on the income statement. A focus on EV/Sales suggests that investors are optimistic about growth and ignoring the spending required to get there. A focus on EV/FCF suggests that investors are optimistic about increased efficiency and cutting costs.


I love this. Mulesoft, Tableau -- there's a very clear strategy here for Salesforce. You're running all of your customer-facing operations on Salesforce, so why not also integrate the BI stack into everything?

I've worked with a lot of companies who spend months (if not years) integrating their data into a few disparate systems... The finance team has one system (and underlying data lake), the commerce team another, the marketing team another... If Salesforce thinks they can run the entire underlying data infrastructure in addition to the actual customer-facing functions, then this is a smart play.


If Salesforce wants to run my entire underlying data infrastructure, I think we’re in trouble. I see this more of a land grab, with SaaS vendors trying to push their own BI stacks and create an even tighter era of lock-in. I don’t think it is necessarily a good thing.

Data is an asset and liability - when somebody else has all of yours in their proprietary platform and under the control of their cloud, that is a scary proposition to me.

Fair enough if you are running your infrastructure on open standard tech and common cloud platforms, but not locked away in Salesforce.

This is reminding me of the behavior of other large orgs, like Oracle.


On the flip side, if Salesforce does not own its own BI stack, their customer might go to a different vendor that has better integration (i.e.: Microsoft PowerBI).


Salesforce data is stored in Oracle DBs


It’s a risky play. At my enterprise company, we’re not allowed to use a single vendor for everything, we explicitly must use different companies if a favoured vendor hits a certain spend threshold.

This is to prevent cost overruns and solution capture, where every solution to your company’s problems becomes “give it to X vendor” and then X vendor kills a product line and you’re toast.

Salesforce needs to be careful or else they’ll hit that threshold where companies don’t want to use them because you as a client are too small. Google is facing this problem right now.


Is this somewhat common? It makes intuitive sense, but the big 3 or 4 or whatever number it is (IBM, P&Y, etc) consultancies have "use us for everything" as their explicit strategy / part of their sales pitch.

So while they might lose customers like you, there is clearly ridiculously large piles of money up for grabs if they diversify their products, rather than remain specialized. And, of course, any sufficiently good specialist is at risk of being acquired by one of these behemoth generalists.


Although a good % of the market works in a similar way to your business, it is worth noting that when a deal is exceptionally good, executives will create exceptions. The MS suite and some of its satellite offerings (Power BI, SharePoint, etc.) are a good example of such an exception.

There are also cases where a company will pick multiple vendors in an attempt to de-risk and/or for negotiating tactics. If you are fully dependent on a single vendor, the cost of migrating tends to skyrocket and the negotiating power moves towards the vendor.


> If you are fully dependent on a single vendor, the cost of migrating tends to skyrocket and the negotiating power moves towards the vendor.

This is particularly important with SaaS, as you lack the leverage to walk away. If you're fighting with a vendor, you had best resolve it by your contract date, as they will happily shut you off and wait for you to grovel (and pay).


Interesting! I haven't seen that requirement before, but it does make sense.


I’ve seen requirements from Target and Walmart that you not use any AWS infrastructure in the service you provide them.


I've seen that with retailers too, but this is often driven by a fear/hate of Amazon rather than any real IT policy.


Yes, that seems more like Walmart/Target not wanting to support their main competitor in the retail space (Amazon) even though it's a different division.


It also happens to be Amazon's most profitable division, so it kinda makes sense.

Actually, does Amazon's marketplace make any profit?


What are these spend thresholds?


It is smart. It works the other way too. Tableau customers will see more reasons to switch to Salesforce.


They've already got a product called Datorama; I'm not sure whether this acquisition might cannibalize Datorama customers.



Datorama was all about its pre-built hooks into every marketing platform on earth.


I remember the wave of BI vendor acquisitions in 2007-2008, when IBM acquired Cognos and SAP acquired BusinessObjects. Now, 12 years later, the wave repeats. Google is buying Looker, Salesforce is buying Tableau, and Qlik will probably be acquired within the next 1-2 years too.


Qlik was sold to Thoma Bravo, a private equity firm, about two years ago for $3B.

Not sure if that's a relative bargain compared to this deal or if it makes the $15.7B look totally unrealistic. As of a few years ago the total revenue of the two firms wasn't that different.


I've heard that Thoma Bravo specializes in re-selling assets. If that's true, then the $15.7B sticker on Tableau should only strengthen their intention to re-sell Qlik.


Haven't heard much about Qlik lately. A few years ago we were evaluating Birst and Qlik, and have since settled on Looker and Tableau.

Birst ended up getting acquired too. https://www.infor.com/news/infor-to-acquire-birst-infor


> I've heard that Thoma Bravo specializes in re-selling assets.

This is true, because Thoma is a PE firm and is literally in the business of acquiring and selling companies.


My interpretation of the rise of tools like Looker and Tableau is that they happened in part because of that '07-'08 acquisition spree. Neither IBM nor SAP capitalized on their investments; instead, their products mostly stagnated. Cognos 11, for example, looks more modern but really isn't much more than a re-skin of a product that hasn't changed much since the 8.x releases. At some point IBM had a Watson-branded product that was supposed to be a Tableau competitor, but it never really materialized and was almost completely divorced from Cognos.


Indeed, it's cyclical. About 7 years ago there was a social media platform buying spree. Salesforce bought Radian6 and Buddy Media (for ~$700M). Oracle bought Vitrue for $300M. Google made its own acquisitions in the space (Wildfire).

But now Salesforce has a bigger war chest to play with.

Looks like BI is the new hotness.


I would say all things data are the new hotness. Recently Google bought Alooma and Talend bought Stitch.

Personally I'm really interested in who if anyone will buy Snowflake.


Don't forget Metabase.


Isn't that a fully open-source project?


Mostly. It costs $300/mo to remove their branding badge from embeds.


I'm the CPO at data.world, a SaaS data catalog and analysis hub company. I was at the Snowflake Summit when the Looker acquisition by Google was announced; the folks at their booth seemed pretty happy ;)

With Google snapping up Looker ($2.6B) for Google Cloud, Salesforce's much bigger purchase of Tableau is a clear sign that the big guys see buying BI tools as a good way to expand the reach of their offerings into more of the business. We talk to companies every day that have made massive investments in data warehouses, viz tools, and highly paid data scientists, but they still aren't agile enough because they can't tie it all together so that their people can find and use the right data when they need it. I once read an article about the failure of self-serve BI, and the reality is that you just end up creating more sprawl. People need tools to reduce the clutter and stop the endless chain of emails trying to figure out which table to look at, which query to use, or whether your source is still updating.

We built our data catalog and analysis hub for exactly this, and it's extremely validating to see the big guys like Salesforce and Google investing in expanding the user base of big data tools and I really hope we can be part of the solution of sorting it all out!


Looker is way better than Tableau.

Let's break it down:

Both are overpriced.

Looker, however, has LookML.

Tableau obfuscates all code.

Looker has easier bolt-ons to Redshift/Postgres.

Tableau's BI tool-set is weaker than Looker's, albeit more widespread (more mature).

So I think Google got a steal and SF is playing catch-up... at a high cost.

Plus, SF's sunk costs in everything are going to make a $15B buy take at least two decades to pay off.


I disagree with you! Looker is more advanced, but that does not mean it is better. Tableau’s tools were MUCH easier for me to do ad hoc analyses with, as well as to create dashboards with. Yes, there were some transformations I couldn’t apply and had to fall back to custom SQL or ETL for, but the total time to finish my analyses and produce high-quality live dashboards was much lower than with Looker.

(I don’t know why you are being downvoted though, you’re entitled to an opinion same as me)


Can't wait to pay $1000 per gigabyte per month to host my analytics data there.


Yeah, Salesforce's presentation of pricing really can feel like a bait-and-switch. It's advertised as per-user, per-month pricing, but then to actually make the system work you get all of these add-on fees: for storage, for API calls to perform data integration, etc.


The data storage limit increased quite a bit just a few months ago:

https://releasenotes.docs.salesforce.com/en-us/spring19/rele...


I don’t think this data is going into the Oracle tables


Not far from the truth here.


Oh hey, maybe this will mean somebody at Tableau has to start to give a damn about enterprise features, such as a way to do product activation and registration that doesn't fail completely in non-persistent virtual desktop environments.

Or Tableau can continue to pretend that this isn't a real issue and stonewall customers and partners alike.

I think the actual visualisation part is neat, and better than many competitors', but many of the server-side parts are various levels of disastrous (as is their support), and their "data preparation" tool needs some serious improvements to be even borderline usable.

$15+ billion seems like a lot to me given how Tableau interacts with customers and partners, especially seeing how they are actively alienating existing enterprise customers in favour of new sales, but perhaps something will change for the better here.


Yeah, as the de facto Tableau Server administrator in my org, I am not happy with Tableau. Despite its power and its ability to let less-technical people generate insights from data, the administration experience as a whole is a total nightmare; purchasing and applying licensing across all installations is a disaster. Maintenance upgrades and backup procedures are also sort of wonky. As for the data visualization piece, I personally feel the software is difficult to use because it employs custom "branded" terminology for many things, so the knowledge you gain using Tableau rarely helps you outside of Tableau... and many features are placed in locations that are not intuitive and thus difficult to find without documentation or prior experience.

We're a Microsoft-heavy shop, and I've been trying to get them to move to Power BI simply because it's far more fully featured, easier to use if you're familiar with the "Windows way" of working, and has streamlined administration/installation/licensing/configuration in Windows environments.


I think Tableau Server works better on Linux, where upgrades etc. are also somewhat less painful and require less downtime.

That said, it baffles me why I have to restart Tableau Server 3 or 4 times during installation, and why I have to restart it for trivial changes more generally. For a piece of software that specifically ships with a cluster controller and a full-blown ZooKeeper, somehow their engineers (or "engineers", as I sometimes get the impression) manage to make things that should be trivially solvable with reloads, partial restarts or spawning new workers (e.g. swapping SSL certificates for the built-in Apache webserver) require a complete restart of the whole node.
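For contrast, the in-place reload pattern is standard practice elsewhere. Here's a minimal, hypothetical Python sketch (not Tableau code; the `Daemon` class and config keys are invented for illustration) of a long-running service that picks up new settings on demand instead of restarting the whole process:

```python
import json

class Daemon:
    """Toy long-running service that reloads its settings in place.
    A real server would re-read its TLS certificate here rather than
    requiring a full node restart."""

    def __init__(self, config_path):
        self.config_path = config_path
        self.config = {}
        self.reload()  # initial load
        # A real daemon would hook SIGHUP so `kill -HUP <pid>` triggers
        # a reload -- the classic zero-downtime convention:
        #   signal.signal(signal.SIGHUP, lambda *_: self.reload())

    def reload(self, *_):
        # Re-read configuration without tearing down the process.
        with open(self.config_path) as f:
            self.config = json.load(f)

# Usage: change the config file on disk, then reload -- no restart.
with open("settings.json", "w") as f:
    json.dump({"ssl_cert": "old.pem"}, f)
d = Daemon("settings.json")
print(d.config["ssl_cert"])  # old.pem

with open("settings.json", "w") as f:
    json.dump({"ssl_cert": "new.pem"}, f)
d.reload()
print(d.config["ssl_cert"])  # new.pem
```

The point is only that swapping a certificate or config value is a file re-read plus (at worst) a worker respawn, not something that inherently demands restarting an entire clustered node.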

edit: Regarding Power BI -- I feel that Tableau Server is (for better or worse) one of the killer features for many enterprise customers, because it means all of your data can remain within your own infrastructure and does not have to rely on external cloud providers. If that is not a requirement in your organisation, Power BI might make sense depending on your overall IT landscape, as well as your users' specific needs. On the other hand, if your organisation requires hosting things yourself, I guess it doesn't matter how miserable the experience is for you as an administrator. That's basically Tableau Server in a nutshell.

