Show HN: Taipy – Turns Data and AI algorithms into full web applications (github.com/avaiga)
98 points by nevodavid10 on Nov 30, 2023 | 30 comments



"Show HN" implies that this project is your own personal work. Since you posted both this today and https://news.ycombinator.com/item?id=38461101 (as a Show HN) yesterday, I'm thinking that maybe you weren't aware of this rule? If so, it would be good to read https://news.ycombinator.com/showhn.html and follow the rules in the future.


? I searched their website and code base and can find no reference to AI with this project (other than a list of “AI veterans” involved).

It’s not generating code or UI using AI (as I initially assumed from the demo)… so how does it turn “Data and AI algorithms” into “full web applications”?

It looks like it’s a framework you write code in.

Am I missing something obvious?

(For example, this sentiment analysis just imports transformers => https://github.com/Avaiga/demo-sentiment-analysis/blob/devel...)


Yeah, you got it right. Taipy is not about AI but more about providing a way for people who work in AI and data to create a front-end for their project without having to learn other skills outside of Python.


It’s like Gradio or Streamlit.

It’s a Python UI framework targeted at the AI crowd.


A small piece of feedback:

> No knowledge of web development is required!

Then right below

> $ pip install taipy

If it's truly no-code and you don't need to know how to use pip, and your audience is people without web dev knowledge, showing a pip install command so high up on the page might scare that audience away.


Looks like their audience is backend python developers that don't have web-specific knowledge, not no-code folks.


Web development, not Python coding ;) I'm a data scientist; I know pip but have no knowledge of web dev (HTML, CSS, or JS).


Another fresh Python dashboard. Really cool. We need more of these for our Python apps.


Do we? I struggle to find anything that really differentiates all of these different products. Streamlit, Gradio, etc. are all doing the same thing. What would be the benefit of yet another way to get around writing HTML/CSS/JS? These tools are great for quick POCs, but from what I have seen, none of them make for great production-ready apps.


If this is really your sentiment, I strongly invite you to try out Taipy. This was exactly our reaction when we decided to build Taipy. Streamlit was already somewhat popular, but it would always fail at the production stage when we tried using it on consulting engagements. An application in production generally has a significant back-end workload, multiple pages, and multiple users. Streamlit's approach of re-running your code outside of cached variables limits it to POCs, as you said.

That is why we created Taipy. We wanted an easy-to-learn Python library for creating front-ends for data applications while remaining production-ready: we use callbacks for user interactions to avoid re-running unnecessary code, and the front end and back end run on separate threads so your app does not freeze whenever a computation runs.

We also focus on providing pre-built components to allow the end-user to play around with data pipelines quickly. These components allow the user to visualize the data pipeline in a DAG, input their data, run pipelines, and visualize results...
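
To make the callback point concrete, here is a minimal sketch of that model, assuming Taipy GUI's Markdown-like page syntax and its module-level on_change convention (the variable names are illustrative, and exact syntax may differ between Taipy versions):

```python
# Minimal sketch of the callback model described above (assuming taipy.gui's
# page syntax and module-level on_change hook; details may vary by version).
from taipy.gui import Gui

value = 10
result = value ** 2  # stand-in for an expensive computation's output

page = """
# Demo
<|{value}|slider|min=1|max=100|>

Result: <|{result}|>
"""

def on_change(state, var_name, var_value):
    # Only this callback runs when the slider moves; the script is not re-executed.
    if var_name == "value":
        state.result = var_value ** 2  # stand-in for the real computation

if __name__ == "__main__":
    Gui(page).run()
```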


Have you ever delivered a Python project (AI or not) with a multi-page GUI, multiple end-users, and dynamic graphics? Well, we failed completely with Streamlit. Gradio is even more limited for this.

Streamlit, and even more so Gradio, are simple tools. They won't make the cut for such projects. They lack so many things:

- not really multi-user
- the event loop is inefficient and creates side effects
- no support for large data in graphics
- difficult/impossible to call asynchronous functions (you get stuck in the GUI while waiting for the job to complete; see the sketch below this comment)
- fixed layout / no real way to customize the look & feel
- etc.

Don't get me wrong: Streamlit has benefits and was actually the first package to offer Python devs a low-code approach for building GUIs (for non-GUI specialists).
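
On the async point above: a minimal sketch of the non-blocking alternative, assuming Taipy GUI's invoke_long_callback helper for long-running jobs (the helper name and callback signature follow Taipy's long-running-callback docs but may differ by version; heavy_job and the status wording are made up for illustration):

```python
# Hedged sketch: run a heavy job off the GUI thread so the page stays responsive.
# Assumes taipy.gui.invoke_long_callback behaves as documented; details may vary by version.
import time
from taipy.gui import Gui, invoke_long_callback

status_text = "idle"

page = """
<|Run heavy job|button|on_action=start_job|>

Status: <|{status_text}|>
"""

def heavy_job():
    time.sleep(10)  # stand-in for model training or a long data pipeline

def on_job_status(state, status):
    # Invoked back on the GUI side when the job completes (or fails).
    state.status_text = "finished" if status else "failed"

def start_job(state):
    state.status_text = "running..."
    invoke_long_callback(state, heavy_job, [], on_job_status)

if __name__ == "__main__":
    Gui(page).run()
```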


I think Taipy is excellent for getting around HTML/JS; that's true, but it is not only that. Here are a few Taipy functionalities that I found handy for applications to be used in production by end-users:

- For example, the scenario and data management feature helps end-users properly manage their various business cases. We can easily configure scenarios to model recurrent business cases. I am thinking of standard industry projects like production planning, demand or inventory forecasting, dynamic pricing, etc. An end-user can easily create and compare the KPIs of multiple scenarios over time (e.g., a new demand forecast every week), or compare multiple scenarios for the same time period for what-if analysis.

- Version management is also a good example. Besides a development mode and an experiment mode for testing, debugging, and tuning my pipelines, a specific production mode is designed to make application version upgrades easy to operate. It helped me deploy a new release of my Taipy application in a production environment, including some data compatibility checks and possibly some data migration. I don't know any other system that helps manage application versions, pipeline versions, and data versions in a single tool. Plus, it's really easy to use with Git releases, for instance.

- The pipeline orchestration is also very production-oriented for multi-user applications. You have visual elements for submitting pipelines, managing job executions, tracking successes and failures, keeping a history of user changes, etc., which is more than helpful in a multi-user environment. Everything is built into Taipy (a rough configuration sketch follows below).
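
A rough sketch of what that scenario configuration and submission can look like with Taipy Core's Config API (the forecast function, data node names, and KPI are invented for illustration; exact signatures differ between Taipy versions):

```python
# Hedged sketch of a Taipy Core scenario: configure data nodes, a task, and a scenario,
# then create and submit it. Names and numbers here are illustrative only.
import taipy as tp
from taipy import Config

def forecast(history):
    # Placeholder for the real forecasting logic.
    return sum(history) / len(history)

history_cfg = Config.configure_data_node("history", default_data=[10, 12, 11])
prediction_cfg = Config.configure_data_node("prediction")
forecast_task_cfg = Config.configure_task("forecast", forecast, history_cfg, prediction_cfg)
scenario_cfg = Config.configure_scenario("weekly_demand", task_configs=[forecast_task_cfg])

if __name__ == "__main__":
    tp.Core().run()                    # start the orchestrator
    scenario = tp.create_scenario(scenario_cfg)
    scenario.submit()                  # jobs are tracked (success/failure) by Taipy
    print(scenario.prediction.read())  # the KPI you would compare across scenarios
```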


Demos look good! Could you please explain the advantages of Taipy over Streamlit and Shiny for Python?


We actually used Streamlit in the past. Our gripe with it was how the backend event loop was managed. Basically, Streamlit re-runs your code at every user interaction to check what has changed (unless you cache specific variables, which is hard to do well). When your app works with significant data or a significant model, or has multiple pages or users, this approach fails and the app starts freezing constantly. We wanted a product that combines the easy learning curve of Streamlit with production-ready capabilities: we use callbacks for user interactions to avoid unnecessary computations, and the front end and back end run on separate threads. We also run in Jupyter notebooks, if that helps.


The script re-run (and the band-aid of caching via decorators) is exactly what I don’t like about Streamlit. I’d love to see an example of how you’d use Taipy to build an LLM chat app, analogous to this Streamlit example:

https://docs.streamlit.io/knowledge-base/tutorials/build-con...

Then I’ll give it a shot
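
(For context on what that tutorial does: the Streamlit chat pattern looks roughly like the sketch below. st.chat_input, st.chat_message, and st.session_state are Streamlit APIs; the echo reply stands in for a real LLM call; and the whole script re-runs on every interaction, which is exactly the behavior being criticized.)

```python
# Rough shape of the Streamlit conversational-app tutorial (echo bot stands in for an LLM).
import streamlit as st

if "messages" not in st.session_state:
    st.session_state.messages = []

# The entire script re-runs on each interaction, so the history is replayed every time.
for msg in st.session_state.messages:
    with st.chat_message(msg["role"]):
        st.markdown(msg["content"])

if prompt := st.chat_input("Say something"):
    st.session_state.messages.append({"role": "user", "content": prompt})
    with st.chat_message("user"):
        st.markdown(prompt)

    response = f"Echo: {prompt}"  # placeholder for an actual model call
    st.session_state.messages.append({"role": "assistant", "content": response})
    with st.chat_message("assistant"):
        st.markdown(response)
```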


Another interesting one in this space is Reflex (formerly known as PyneCone). They have a ready-to-use LLM chat app, which makes it more likely I will check it out.

https://github.com/reflex-dev/reflex


I would love to read a comparison explaining the relative advantages of each framework from an experienced practitioner who has actually built apps with each. Add in plotly dash, bokeh/panel, and voila too.

Off the top of my head, bokeh and panel are more oriented towards high performance for large datasets, but have less overall adoption.

Voila is oriented towards turning existing Jupyter notebooks into interactive dashboards.

I'm always curious about the runtime model for these interactive frameworks. Building interactivity into a Jupyter notebook is fairly straightforward, but it's a very different execution model than the traditional HTTP model. Jupyter notebook widgets need a separate backing kernel for each new user, whereas in the traditional HTTP server model all request state is normally rebuilt from a session cookie plus database state. The complete interpreter per user makes for simpler programming, but it is much more memory- and process-intensive.
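
A minimal sketch of that traditional HTTP model, using Flask for illustration (an in-memory dict stands in for the database): each request is stateless, and the per-user state is looked up again from a session cookie on every call.

```python
# Hedged sketch: stateless HTTP handler rebuilding per-user state from a session cookie.
import uuid
from flask import Flask, request, make_response

app = Flask(__name__)
STATE = {}  # session_id -> user state; a real app would keep this in a database

@app.route("/click")
def click():
    sid = request.cookies.get("sid") or str(uuid.uuid4())
    user_state = STATE.setdefault(sid, {"clicks": 0})
    user_state["clicks"] += 1  # state is reloaded and mutated on every request
    resp = make_response(f"clicks: {user_state['clicks']}")
    resp.set_cookie("sid", sid)
    return resp
```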


Shiny has a DAG that enables intelligent caching. Here is a good talk explaining it: https://www.youtube.com/watch?v=YNCPc9aWm_8
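
A hedged sketch of that reactive DAG in Shiny for Python (decorator names follow the shiny package but may differ by version): sample() is a node that recomputes only when input.n() changes, and both outputs below reuse its cached value.

```python
# Hedged sketch of Shiny for Python's reactive graph and caching.
import random
from shiny import App, reactive, render, ui

app_ui = ui.page_fluid(
    ui.input_slider("n", "Sample size", 1, 100, 10),
    ui.output_text("mean"),
    ui.output_text("count"),
)

def server(input, output, session):
    @reactive.Calc
    def sample():
        # Recomputed only when input.n() changes; cached for every reader below.
        return [random.random() for _ in range(input.n())]

    @output
    @render.text
    def mean():
        return f"mean = {sum(sample()) / len(sample()):.3f}"

    @output
    @render.text
    def count():
        return f"n = {len(sample())}"

app = App(app_ui, server)
```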


What is the business model for https://www.taipy.io/, https://streamlit.io/, or https://www.gradio.app/? These are nice tools - but how will the sponsoring businesses support themselves? I didn't see any mention of enterprise plans, etc. Is the answer simply that "we've not announced our revenue model yet"? What should one expect?


It's a gold rush right now. Business models don't matter to investors because they want to fund 100 startups with the hope that maybe a couple will hit it big. The goal isn't for all 100 startups to be successful. And if these aren't VC-funded, maybe they're just cool side projects with no need for a revenue model?

Just like during the peak crypto hype, people were doing things that clearly had no obvious business model. Some figured it out, maybe pivoted a few times, and are successful. Most silently disappeared.

The AI space makes it really hard to have a long-term business model strategy because the market is evolving insanely fast. Your long-term business plan could easily be killed by one feature release from OpenAI, without notice. Perhaps that's another reason people are focusing more on user adoption rather than revenue.

The reality is that user adoption in AI has WAY more value than revenue growth. You can always monetize later, but first you need users. Most companies prioritize user adoption and worry about revenue later (that's what VCs want from early seed-stage AI startups... users are gold, revenue is just a cherry on top).

Obviously that's not sustainable. But that's OK... most companies don't fail because they didn't generate enough revenue. They most commonly fail because they couldn't build a product that attracted enough users (which made it impossible to generate revenue). So it kind of makes sense to stress about user adoption, because if you can't get that, your business model doesn't matter.


IMO there really isn't one. Nobody has been able to build a business here (yet). Unfortunately the folks who have this problem (data scientists) rarely have budget internally.

Ideally these solutions would bridge better to engineering teams, who DO have budget and get the best tools. But again, they don't have this problem (they can build and deploy web apps).


Streamlit, for one, was bought by Snowflake for $800M, and they've been adding the ability to deploy Streamlit apps natively in your Snowflake account.


Thank you. So they are, or would be, complements to somebody else's commercial platform.


They had essentially zero revenue, though. Nobody paid for Streamlit Cloud.

Snowflake's acquisition of Streamlit was similar to Google Cloud's acquisition of Kaggle. They bought a large community of end-users that spent money on their cloud databases.


This is cool! There's a similar project I'm working on, but it's more of a reactive Jupyter notebook than just a Python library. Very cool though. https://github.com/Zero-True/zero-true if you want to check it out.


This looks very user-friendly.


How is this different from Gradio/Streamlit?



Somewhat related topic: do you think the growth of ML/AI means Python could "dethrone" JavaScript as the generalist language of choice?


Well, Python lives in the backend and JS in the frontend. You can't really build rich front-end apps in Python yet.

There are 20 tools for building neat little data/ML dashboards in Python that spit out front-end code, but those don't have the features that JS/React have.





