That makes for nice headlines, but anyone who actually understands this stuff knows that hacking the public-facing web server is not a big deal and not really related to obtaining private info like this.
Parallelism is of course very important. But if serial speed isn't in the same range, "go parallel" becomes the same performance crutch as the Python folks' "drop to native." A Rust program shouldn't need to be parallel just to beat a single-threaded C++ program.
We do want Rust to be excellent at parallel execution, but that doesn't mean we ignore single-threaded performance. The way we make parallel/concurrent code better has no negative impact on single-threaded performance. In fact, when you know you're not using multiple threads you can choose more efficient data structures, like Rc&lt;T&gt; instead of Arc&lt;T&gt;, for example.
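To make the Rc/Arc trade-off concrete: Rc&lt;T&gt; uses plain (non-atomic) reference counts, while Arc&lt;T&gt; uses atomic ones, and the compiler refuses to let an Rc cross a thread boundary. A minimal sketch (the function names are made up for illustration):

```rust
use std::rc::Rc;
use std::sync::Arc;
use std::thread;

// Arc<T> is Send, so it can be moved into a spawned thread.
fn sum_shared_arc(data: Arc<Vec<i32>>) -> i32 {
    let d = Arc::clone(&data);
    let handle = thread::spawn(move || d.iter().sum::<i32>());
    handle.join().unwrap()
}

// Rc<T> is cheaper (non-atomic counts) but !Send, so the compiler
// rejects moving it into thread::spawn:
//     thread::spawn(move || data.iter().sum::<i32>());
//     // error[E0277]: `Rc<Vec<i32>>` cannot be sent between threads safely
fn sum_shared_rc(data: Rc<Vec<i32>>) -> i32 {
    data.iter().sum()
}

fn main() {
    println!("{}", sum_shared_arc(Arc::new(vec![1, 2, 3])));
    println!("{}", sum_shared_rc(Rc::new(vec![1, 2, 3])));
}
```

The point is that choosing the cheaper single-threaded type is not on the honor system; misuse is a compile error.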
It's absolutely intended to make it easier. Easier in the sense, for example, that you can write a multithreaded program that shares stack-allocated data, without having to debug the complex synchronization yourself.
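One way this shows up in the standard library today is `std::thread::scope` (a sketch of the general idea, not taken from the comment above): worker threads borrow stack-local data, and the borrow checker guarantees they finish before that data goes out of scope.

```rust
use std::thread;

// Scoped threads let workers borrow a stack-allocated slice; the scope
// joins all threads before returning, so the borrows are always valid.
fn parallel_sum(data: &[i64]) -> i64 {
    let mid = data.len() / 2;
    let (left, right) = data.split_at(mid);
    thread::scope(|s| {
        let l = s.spawn(|| left.iter().sum::<i64>());
        let r = s.spawn(|| right.iter().sum::<i64>());
        l.join().unwrap() + r.join().unwrap()
    })
}

fn main() {
    let v: Vec<i64> = (1..=100).collect();
    println!("{}", parallel_sum(&v));
}
```

No `Arc`, no cloning, no manual join bookkeeping: forgetting any of it simply doesn't compile.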
Rayon, being a multithreading library, is of course itself not trivial to write. What it does live up to is that users of Rayon can rely on the regular Rust type-system rules: if their use of Rayon compiles, it is thread safe.
I find Rust is still a complex language. I don't find concurrency any easier to write in Rust than with the C++11 concurrency API, for example. I don't have much real experience with concurrent Rust applications, though.
What Rust does give you is that once your code compiles in safe Rust, you can be very confident in it.
Overall, I like the complexity and I have been enjoying my time with Rust. I began using it for bare-metal ARM programming. After using Rust in "no standard library mode," I'm now convinced there are no valid reasons left for using C in this century. To me, Rust is a no-brainer replacement for C.
It's not, however, going to affect the adoption of modern C++, which occupies a slightly higher-level niche in performance-critical applications. It's the libraries that make C++ awesome, definitely not the language itself (although the core is improving).
Thanks for asking. I just installed 9.0.4, and here are a few examples:
- Tableau Server runs only on Windows, so why can't it use a TLS certificate and key from the CryptoAPI certificate store, rather than requiring these to be converted to PEM format (with Unix line endings!) and saved in the file system?
In an enterprise with an internal CA using Active Directory Certificate Services, these extra steps have to be done not only at installation but also every time the certificate expires. Compare the experience with Microsoft IIS: the server automatically requests a renewal from AD CS, retrieves the new certificate, and begins using it.
- Tableau Server should be able to run as a Group Managed Service Account, so we can give it access to remote data sources without having to assign (and regularly change) yet another service account password.
- It would be helpful to have a scriptable installation process; as far as I can tell, there's no way to install Tableau Server without clicking through wizards.
Thanks for the input. I am going to forward these on to the server dev team and follow up with them in person. They may be aware of some of these already but it is important to us to keep track of what is causing our users the most headaches. I appreciate you taking the time and letting me know your suggestions and the issues you are having!
1. No ability to use a 3rd party auth provider AFAIK, which means either keeping tableau passwords in a database or having users remember two different passwords
2. Embedded views use synchronous requests, which can easily hang the browser. Synchronous XMLHttpRequest has been deprecated for a while. I think I even saw a version of dojo from 2005 being loaded.
3. Reports are either static size or dynamic size, and unless you're using the (clunky but well documented) JS SDK, there's no way to tell.
4. Viewing reports in the browser is sloooooow. Browser console output is filled with warnings.
5. In order to put together sheets from multiple workbooks into a browser-based view, you need to either a) load the jssdk for each of the workbooks and query for sheets, which is extraordinarily slow, or b) do it with the REST api, authentication with which is asinine in nature (see #1).
> 1. No ability to use a 3rd party auth provider AFAIK, which means either keeping tableau passwords in a database or having users remember two different passwords
The answer is SAML/ADFS; you should look into enabling this integration. If you are not using AD/LDAP, that's a whole different story. But SAML/ADFS is pretty much the standard approach: since Tableau is a Windows service, it is very natural to just use AD/LDAP/SAML.
 I had to set the client-side map rendering threshold to a very high number (100000 I believe) to get maps to render at all. Server-side rendering doesn't work, even though it can contact the map servers and display all of the examples in the documentation (Miami/Havana I think?).
It's been a few months, but I remember getting the license activated offline was a weird process. Something like: point tabadmin toward a license file, which generates a number or JSON or some other file, which you then paste into or point the UI toward, which gives you another file to use in tabadmin... and at the end tabadmin gave me an error. Now when I go to "Manage Product Keys" it acts as though it is unregistered, but the server still starts without error (it did not before the failed activation ritual).
I do have a ticket in with support for .
Given how much of a bitch it was to activate (or half-activate) I'm reluctant to investigate  further.
Also, I'd like to see a linux server. Tableau is our only Windows server, which weighed heavily against the product when we were considering alternatives.
So, I am not on the server team specifically. A lot of these issues that are mentioned may already be in the pipeline/on our radar. However, I think it is beneficial to make sure that we continually follow up to ensure the squeaky wheel gets the grease, so to speak.
All of these issues mentioned here will be sent to the server product owners and managers. :)
I am, however, on the maps team. I am curious about  above. I'll see what I can find internally on this. I am rather curious since this isn't something I have seen.
Offline license signing is a solved problem, Sophos for one has figured this out with the way they license their UTM product.
When they give you a license file, it's cryptographically signed with their GPG key, and the public key resides on the appliance for verification. All you have to do is get that license into the system, either by USB key, typing it in yourself in Vim, or simply uploading the license file in the webUI if you have access to it.
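The flow being described is: verify the signature against a key already on the appliance before accepting the license. A dependency-free toy sketch of that verify-before-accept flow — the keyed hash below is only a stand-in for a real asymmetric signature like GPG's, and every name in it is made up:

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

// Stand-in "signature": a keyed hash over the license text. A real
// scheme would use an asymmetric signature (e.g. GPG) so the appliance
// only needs the vendor's public key.
fn sign(secret: &str, license: &str) -> u64 {
    let mut h = DefaultHasher::new();
    secret.hash(&mut h);
    license.hash(&mut h);
    h.finish()
}

fn verify(secret: &str, license: &str, sig: u64) -> bool {
    sign(secret, license) == sig
}

// Accept the license only if the signature checks out; a tampered or
// mistyped license is rejected before it touches the system.
fn install_license(secret: &str, license: &str, sig: u64) -> Result<(), &'static str> {
    if verify(secret, license, sig) {
        Ok(())
    } else {
        Err("signature mismatch: license rejected")
    }
}

fn main() {
    let lic = "serial=1234;expires=2026-01-01";
    let sig = sign("vendor-key", lic);
    assert!(install_license("vendor-key", lic, sig).is_ok());
    assert!(install_license("vendor-key", "serial=9999;tampered", sig).is_err());
}
```

The mechanism is simple enough that "paste the file in however you can get it there" is all the transport the appliance needs.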
Trusted Authentication is a poor solution to the problem of how I can embed views in my web app without having the end users of my web app have Tableau server accounts. For the following reasons:
- I have to explicitly add each server IP address. I have no way to trust an entire subnet or range of addresses. This is a huge problem in an auto-scaling app server environment where I don't know the IP addresses my app servers will have. It is a major annoyance to developers whose DHCP-assigned, dynamic IP addresses keep changing.
- There is no API for adding trusted IP addresses. It is a manual process.
- The Tableau server must be stopped and restarted to add new trusted IPs.
It's an unholy combination of rails and postgres somehow hacked to run on windows. Really, they should just ship a linux VM that runs these things decently.
Many Linux services have a concept of reloading. If the config file changes you can send the running program a signal and it will re read the config. This is very useful for production systems.
Tableau (9 at least) has no such concept.
Change the email address it reports to? Restart tableau.
Change the location of the SSL certificates? Restart tableau.
Want to apply an update for Tableau? Uninstall your current version and install the new one. Oh, and until recently, the file name of the Tableau Server installer you downloaded didn't actually contain the version number.
This product was not designed with ops in mind at all.
Edit: I forgot, I've actually had a Tableau server fill itself up with logs. Tableau writes logs in many different locations outside of Windows Event Viewer and doesn't include log-rotation facilities for all of them.
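The missing facility is not exotic; a basic size-based rotation can be sketched in a few lines (the file names and size threshold here are made up for illustration):

```rust
use std::fs;
use std::io::Write;

// When the active log exceeds `max_bytes`, rename it to `<name>.1`
// (overwriting any previous rotation) and let a fresh file start.
fn rotate_if_needed(path: &str, max_bytes: u64) -> std::io::Result<bool> {
    match fs::metadata(path) {
        Ok(meta) if meta.len() > max_bytes => {
            fs::rename(path, format!("{}.1", path))?;
            Ok(true)
        }
        _ => Ok(false),
    }
}

// Append one line, rotating first if the file has grown too large.
fn append_line(path: &str, line: &str, max_bytes: u64) -> std::io::Result<()> {
    rotate_if_needed(path, max_bytes)?;
    let mut f = fs::OpenOptions::new().create(true).append(true).open(path)?;
    writeln!(f, "{}", line)
}

fn main() -> std::io::Result<()> {
    for i in 0..100 {
        append_line("app.log", &format!("event {}", i), 256)?;
    }
    Ok(())
}
```

Production schemes (time-based rotation, compression, keeping N old files) add detail, but the core is just this check on every write.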
If you run Tableau in an enterprise environment you will likely have a lot of C-level executives, global sales teams, and more relying on Tableau being available outside of your local business hours. This means any maintenance needs to be planned and communications sent out to all stakeholders.
If reloading were an option there wouldn't be downtime, and I wouldn't need to schedule a maintenance window for something as simple as updating an email address. The idea is that if there is a config error during a reload, the system just continues uninterrupted with the original config. If I have to stop the system completely in order to run the config sanity checks when it starts again, the potential for prolonged downtime is much greater.
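Those reload semantics — parse the candidate config first, and keep the old one if parsing fails — can be sketched like this (the `key=value` format is an assumption for illustration):

```rust
use std::collections::HashMap;

type Config = HashMap<String, String>;

// Parse "key=value" lines; blank lines and '#' comments are skipped.
fn parse_config(text: &str) -> Result<Config, String> {
    let mut cfg = Config::new();
    for (n, raw) in text.lines().enumerate() {
        let line = raw.trim();
        if line.is_empty() || line.starts_with('#') {
            continue;
        }
        match line.split_once('=') {
            Some((k, v)) => {
                cfg.insert(k.trim().to_string(), v.trim().to_string());
            }
            None => return Err(format!("line {}: expected key=value", n + 1)),
        }
    }
    Ok(cfg)
}

// Apply the candidate only if it parses cleanly; on error the running
// config is untouched, so a typo never takes the service down.
fn reload(running: &mut Config, candidate: &str) -> bool {
    match parse_config(candidate) {
        Ok(fresh) => {
            *running = fresh;
            true
        }
        Err(_) => false,
    }
}

fn main() {
    let mut cfg = parse_config("email=ops@example.com").unwrap();
    // Bad candidate: old value survives, no restart needed.
    assert!(!reload(&mut cfg, "email ops@example.com"));
    assert_eq!(cfg["email"], "ops@example.com");
    // Good candidate: applied in place.
    assert!(reload(&mut cfg, "email=alerts@example.com"));
    assert_eq!(cfg["email"], "alerts@example.com");
}
```

A real daemon would trigger `reload` from a SIGHUP handler, but the validate-then-swap step is the part that keeps the service up through a bad edit.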
Thanks for that perspective, I hadn't thought of that.
Would a system that did something like an internal cut-over be useful? E.g., try to start a whole new instance of the application; if it loads, let it become the running application; if not, write an error log and shut down?
It would still lose all the state associated with the previous instance, e.g. user sessions, but would avoid this specific issue.
I agree that it's pretty silly that things like email addresses need a restart, but I'm wondering in general how bad this pattern is.
This is an extremely frustrating thread to follow. We have some people who have run a Tableau server with no issues, and now two people who are effectively saying "oh, it's awful but I have no interest in telling you why".
> I think a better question is what problems they haven't faced.
Publishing to the desktop (Windows or Mac), to the web, to the cloud, or mobile devices (iOS and Android). Publish to the server once, consume on all supported platforms.
Deploying a copy of a current site for redundancy, testing or development. Install the app, backup the primary Tableau database with its admin utility (command line), restore it on the new box. All data, visualizations, users and permissions are contained in that single restore step.
Tableau means I spend my time working with my data instead of on the presentation of it. It's not a perfect product by any measure, and could obviously use some improvements, but it is a timesaver in many areas.
It was pretty apparent in '08 that Obama wasn't going to _save_ anything. Obama is, was, and will always be a Statist. Everyone was exuberant about the removal of Bush and nothing more; the fact that he is considered black was just a bonus. In '12, it was literally the lesser of two evils.
I was responding to this claim by jernfrost: "Growth per capita in Japan has been completely normal in the supposed terrible years. It is the population decline which causes the overall GDP growth to look anemic."