Integrating with Streamlit was also very simple: in our case, we only had to expose how we serve TensorBoards and notebooks on our platform, and we created a couple of tutorials showing how to host an app. Several of our users then adopted it as the default way to share interactive, customizable dashboards on their Kubernetes clusters.
It's nicer than sending someone static results, and it isn't much more effort.
And vastly better than sending a notebook to someone unless you expect them to modify the notebook a lot.
And the learning time to make Streamlit useful for small internal apps is probably ~15 minutes for most people.
> And the learning time to make Streamlit useful for small internal apps is probably ~15 minutes for most people.
For the kinds of things Streamlit works for, it takes minutes to learn, and a useful app can take just minutes to write.
Now, however, as some of our internal users are comfortable with writing Streamlit, we're directly deploying apps from the notebook. It's useful to show results to clients without the user having to set up a VM, upload stuff, Docker, authentication, resources, etc.
It's not really the 15 minutes it takes one individual to learn. It's the SSH into something, send a link, shut down the VM or recycle it for next proto, remember the IP, etc...
- https://iko.ai/docs/appbook/
Dave Cutler of NT fame used to say leaky abstractions are often worse than no abstractions, and that early on it's often better to go with a lower-level API (for example, what TensorFlow did) because you can add a higher-level one later (e.g., Keras) based on usage and reasonable defaults.
Heavy reliance on strings with logical meaning? Another strong smell.
Streamlit looks very accessible - which is amazing in this space, and I hope they do well - but my prediction is people will be importing “streamlit.v2” within two years.
Let's say you implemented your model in Python and want to show it off:
With Streamlit, you can simply take your Python script and turn the variables you want to change, and the plots/dataframes you want to output, into Streamlit objects. That takes about 5-10 minutes, and then you can already serve your application. It's almost no extra work.
You can write a very small amount of Python code and get a nice interactive web app. It's aimed at, but not exclusively for, exploring data and results. Live reload etc. make development extremely fast.
If that's what you're taking away, we need to improve our marketing copy!
The point of Streamlit is to have a nimble way to create applications, where auto-refresh/hot-reload is part of the developer experience. But the goal of the overall project is to make interactive data apps available to the broad public, not just people who have front-end experience or a front-end developer working on their team.
We've been loving it for making internal point-and-click tools + external project starters (ex: tutorials, solution engineering, ..).
Today, if you and your users are coding-heavy for data flows (dataset -> pydata wrangling/ml/... -> UI), Jupyter notebooks are #1, and if you're in a bigco, maybe Databricks notebooks as #2. However, most operational users really want an interactive point-and-click dashboard UI. Tableau and friends don't make as much sense in the pydata world (they fit simpler SQL-only flows better), and the existing Python dashboarding tools (Voila, Panel, Plotly, ...) have been too much work, esp. when sharing.
I've been liking Streamlit as it's pretty opinionated + prestyled (less work!), simple interaction model (accessible!), etc. It clearly can be better, but is already so much more accessible than our experiences with other tools here.
As some examples:
* We've been building https://github.com/graphistry/graph-app-kit for people building mini graph apps (one-click self-hosted launch via docker + st dashboard + graphistry viz + optional graph db connectors + optional graph compute tools).
* We're releasing 2-3 more integration + tutorial sets this month, each with both a notebook mode and a dashboard mode.
* We just ran a hackathon for the same w/ the TigerGraph team: https://tigergraph-web-app-hack.devpost.com/
* projectdomino.org uses it internally for anti-misinfo dashboard tools. Our devs/data scientists/some advanced OSINT researchers are fine w/ notebooks, but everyone else needs dashboard UIs.
I'm excited to see what 2021 + 2022 bring here, and esp. if they can keep increasing accessibility all the way to no-code!
TL;DR: it's a coincidence / slightly obvious given our names. We discovered the similarity on the day we launched our logo, and it almost made us change our logo and/or company name XD
We have since met the getstream.io folks and had a good laugh about it together.