Constructing this such that a single class works as both a Pydantic model AND a SQLAlchemy model - while maintaining an elegant user-facing API - is one heck of a trick! Congratulations.
Thank you for building this! So I'm running a FastAPI deployment in prod already today using both Pydantic models for the app / API itself and also reflected SQLAlchemy ORM models for the database. In my case I am lucky in that my app interacts (read only) with a data warehouse as opposed to a schema that it manages directly.
Am I understanding right that the main benefit of using SQLModel is not needing to maintain the 2 separate models for the app vs db that one would need in most cases?
Also, have you thought about an approach to migrations yet? Curious to hear your thoughts there as well.
Yep: less code duplication between Pydantic and SQLAlchemy, plus autocompletion and inline errors in editors (e.g. when creating new instances and when fetching data from the database), thanks to several type annotation tricks.
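For example, a single class covers both sides; a minimal sketch along the lines of the docs (the Hero model is just an illustration):

```python
from typing import Optional

from sqlmodel import Field, SQLModel


class Hero(SQLModel, table=True):
    # One class: a SQLAlchemy table model and a Pydantic model at the same time
    id: Optional[int] = Field(default=None, primary_key=True)
    name: str
    secret_name: str
    age: Optional[int] = None


# The editor can autocomplete the keyword arguments and flag wrong types inline
hero = Hero(name="Deadpond", secret_name="Dive Wilson")
```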
It's all just SQLAlchemy, so the same migrations with Alembic would do it, I plan on documenting that in the future and probably adding a thin layer on top.
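In the meantime, since SQLModel models register on SQLAlchemy metadata, the standard Alembic setup should already work; a rough sketch of the relevant bit of env.py (the app.models module name is just a placeholder):

```python
# alembic/env.py (excerpt)
from sqlmodel import SQLModel

import app.models  # noqa: F401  # import your models so their tables register on the metadata

# Point Alembic's autogenerate at the metadata that SQLModel collects
target_metadata = SQLModel.metadata
```

One caveat: autogenerated migration files may also need an extra `import sqlmodel` line, since SQLModel uses its own column type for plain str fields.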
Flask was incredible and a joy to build with, but it's not the future. It's only the present in that so many legacy apps exist that use it, and because people who don't know better will likely still stumble upon it first and use it for their new projects.
But we are at the point where async Python has really taken off, and where the performance gains are just impossible to ignore except for apps that are either dead-simple or super CPU-bound. And FastAPI is unquestionably leading the pack among async frameworks, so it makes sense to target that. (Also, yes, he's the creator.)
(Additionally, FastAPI is using Starlette under-the-hood, which is built on ASGI, part of the point of which is to standardize async servers and frameworks to allow for wide compatibility. Anything that works on Starlette or any ASGI framework will work with FastAPI, and vice versa unless the component is tightly coupled to FastAPI-specific code.)
The new Flask 2.0 supports async views and handlers, though the maintainers admit it's not as performant as async-first frameworks.
But I largely agree that Flask is now a thing of the past. If you want to very quickly bring up a small API server or a simple view page, then it's fine. For almost anything more complex, it's a huge pain. The ecosystem is extremely fragmented, and lots of vital plugins for core systems like caching, auth and such aren't maintained anymore. For someone starting a new project, even if it's a very small web server, I'd prefer FastAPI.
You can use it with anything you want, it's independent of any framework.
But as @tomnipotent says below, a SQLModel is also a Pydantic model, so, you can use the same model to do automatic API data validation, serialization, documentation, filtering... and now, also define the database.
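Concretely, a rough sketch of what that reuse looks like (it assumes the illustrative Hero model from above, and the engine URL is a placeholder):

```python
from fastapi import FastAPI
from sqlmodel import Session, create_engine

# Hero is the illustrative SQLModel class sketched earlier
engine = create_engine("sqlite:///heroes.db")
app = FastAPI()


@app.post("/heroes/", response_model=Hero)
def create_hero(hero: Hero):
    # The same class validates the request body, documents the response schema
    # in the OpenAPI docs, and defines the table row that gets persisted
    with Session(engine) as session:
        session.add(hero)
        session.commit()
        session.refresh(hero)
        return hero
```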
There's no additional integration beyond the fact that I built it to do just that, and I made sure everything works just right with FastAPI.
But for any other framework that doesn't have automatic data validation, documentation, serialization, etc. (and you are just doing that by hand) it's an even simpler problem, so SQLModel will work just right, as any other ORM.
> Do you have any reason to narrow down the scope to FastAPI when other frameworks like Flask are still more widely used?
FastAPI is basically the (async) 2021 version of Flask just built on Starlette instead.
If you have used Flask before, the learning curve will be pretty low, and the documentation is great.
But of course people have been using SQLAlchemy and Flask together for ages so I don't see any reason you couldn't use this with Flask instead if you prefer. That said, I bet you'll be able to whip up the MVP even faster using FastAPI.
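For what it's worth, a hypothetical Flask route reusing a SQLModel model might look roughly like this (Hero is the same illustrative model as above; nothing here is FastAPI-specific):

```python
from flask import Flask, jsonify
from sqlmodel import Session, create_engine, select

engine = create_engine("sqlite:///heroes.db")
flask_app = Flask(__name__)


@flask_app.get("/heroes")
def list_heroes():
    with Session(engine) as session:
        heroes = session.exec(select(Hero)).all()
        # .dict() is the Pydantic side of the model doing the serialization
        return jsonify([hero.dict() for hero in heroes])
```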
I do not have any affiliation with the project other than being a happy user today.
Using sqla as a base is such a sane decision. It leverages a rock-solid layer with async support and a battle-tested ecosystem, while offering a versatile query builder for when you need to opt out of the ORM paradigm.
I'm curious and excited. Have you talked about the differences between this and TortoiseORM yet? The latter has the big advantage of having an API familiar to Django users.
In short, what motivated you to write your own rather than "bless" one of the existing ones?
I see the approach is that this isn't exactly an ORM, it's more of an API wrapper around one (SQLAlchemy in this case), is that correct?
SQLModel isn't its own totally new thing so much as an abstraction that lets you create models that are both SQLAlchemy models as well as Pydantic models, simultaneously. If you use Pydantic, or want to be using it with your models, this sounds pretty great. Right now shoehorning the same kind of functionality into other frameworks can be a bit of a pain.
I do love Tortoise though, and it is what I am using currently -- but also one thing I'm currently battling with it is its approach to working with Pydantic. So I'll be giving this a try.
Yep, I wanted to build it on top of SQLAlchemy to inherit all its robustness and to be compatible with it, as it's the most widespread. SQLAlchemy now supports async too, so SQLModel inherits that as well.
And in particular, other libraries have done a great job, but I wanted to take advantage of the features that editors can provide, with autocompletion, inline errors, etc. SQLModel has lots of tricks to provide the best developer experience possible, e.g. autocompletion while creating a new model instance.
I faced this exact problem a few days ago, i.e. having to build one model for SQLAlchemy and an identical Pydantic model to validate the data (outside of the FastAPI context)! Will be using it the next time I need it.
@tiangolo I would like to use this for production at work, however I'm a bit hesitant about using the 0.0.x versions. Would you recommend me to wait until version 0.1.0 or 1.0.0?
I just released it yesterday, so almost no one is using it in prod (I bet there's a couple using it already).
But it's all SQLAlchemy underneath, the most widely used Python SQL library, with many years of usage, so that gives some safety.
And the test coverage is at 97%. I will release 0.1.0 once test coverage reaches 100% and I have the main docs I want there.
The only caveat with the version is upgrades: as of now, anything could change (though I'll probably not change anything, just add stuff). But you can handle that by pinning the version, writing tests, and upgrading only after the tests pass.
The only other issue you might have is that you really want to do some very advanced trick with SQLAlchemy and in some way it's still not supported. In that case you might want to use SQLAlchemy directly for that model, you can mix SQLAlchemy and SQLModel (although I haven't documented that yet).
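For reference, a rough sketch of one way that mixing can look, using Field's sa_column escape hatch to hand a raw SQLAlchemy Column straight through (the Event model is purely illustrative):

```python
from datetime import datetime
from typing import Optional

from sqlalchemy import Column, DateTime, func
from sqlmodel import Field, SQLModel


class Event(SQLModel, table=True):
    id: Optional[int] = Field(default=None, primary_key=True)
    # Drop to plain SQLAlchemy for anything SQLModel doesn't expose directly,
    # e.g. a server-side default timestamp
    created_at: Optional[datetime] = Field(
        default=None,
        sa_column=Column(DateTime, server_default=func.now()),
    )
```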
The main benefit is actually being able to have statically typed models instead of dynamic, to get autocompletion, error checks, etc.
In fact, I made a small utility library some time ago to create dynamic Pydantic models from SQLAlchemy models (https://github.com/tiangolo/pydantic-sqlalchemy), but that's only useful in a few cases, e.g. for response_model in FastAPI.
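Its usage is roughly this (User stands in for an existing SQLAlchemy model):

```python
from pydantic_sqlalchemy import sqlalchemy_to_pydantic

# Build a Pydantic model dynamically from an existing SQLAlchemy model, e.g. to
# use as response_model in FastAPI; because it's created at runtime, editors
# can't offer completion or type checks on its fields
PydanticUser = sqlalchemy_to_pydantic(User)
```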
Well done Sebastián - I have been waiting for this. Having separate SQLAlchemy and Pydantic models always seemed like a spot where the whole FastAPI developer experience wasn't ideal, IMO. Looking forward to trying this out :)
This example tries to play with some quirky table structures - mixins, surrogate primary keys, indexes, etc.
Both async and sync modes.
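A rough sketch of the async side, assuming SQLAlchemy 1.4's async machinery driving the same illustrative Hero model from above (the aiosqlite URL is a placeholder):

```python
from sqlalchemy.ext.asyncio import AsyncSession, create_async_engine
from sqlmodel import select

# SQLModel models are SQLAlchemy models, so the async engine can drive them directly
engine = create_async_engine("sqlite+aiosqlite:///heroes.db")


async def list_heroes():
    async with AsyncSession(engine) as session:
        result = await session.execute(select(Hero))
        return result.scalars().all()
```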
I like what @tiangolo has done with SQLModel, but I suspect that a lot of people who are already using SQLAlchemy in production will prefer to unify via dataclasses versus switching the DX to a new library.
As Pyright already supports it, and Pylance is built on Pyright, they can use it directly. Hopefully more editors will use it and hopefully it will be part of the standard Python (in a PEP, with typing.dataclass_transform, and with mypy support).
And just in case you're wondering, there's no downside to having the dataclass_transform: anything that doesn't support it is unaffected, and nothing else would give you completion either way.
Anecdata, but when I was digging into how dataclasses worked, Pylance had great completion, but only if I imported from the dataclasses module itself. If I pasted the whole implementation into a local file and used that, I didn't get the hints.
This makes me think that Pylance _does_ have some secret sauce.
Yep, but not secret, it's the in-progress dataclass_transform draft spec, already implemented in (open source) Pyright, and Pylance re-uses Pyright, so it gets that support.
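Roughly, the idea behind the draft spec looks like this (a sketch only; the Model base class is illustrative and the runtime side is omitted):

```python
from typing_extensions import dataclass_transform  # slated to become typing.dataclass_transform


@dataclass_transform()
class Model:
    # Type checkers that understand the spec treat every subclass of Model like a
    # dataclass: annotated fields become typed __init__ keyword arguments, which is
    # what gives autocompletion and inline errors when constructing instances.
    # (The actual runtime synthesis of __init__ is omitted in this sketch.)
    pass
```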
Super nice! Always delivering quality tools. This also seems to play well with Starlette. And I see async and migrations are coming soon.
Alembic is good but a bit cumbersome to use; I hope to see some improvements there as well.
Nice! Yeah, it should work well with Starlette, Flask, or anything else. And yeah, I plan on adding docs and a thin layer on top of Alembic, but yeah, it's just SQLAlchemy so it will be just the same Alembic behind the scenes.
While I'm a fan of both Pydantic and SQLAlchemy, I feel like support for this should be clearer in Pydantic itself. We have the `.from_orm` method and we have SQLAlchemy model metadata. This project wraps them nicely, but I shouldn't have to have separate glue code for this. Until we have cleaner Pydantic/SQLAlchemy integration for FastAPI, the OP's project will do just nicely. Great job. I fully expect FastAPI, with its reliance on Pydantic, to help push better support for ORM models, though. Anyway, enough ranting and back to coding.
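For context, the glue today looks roughly like this with Pydantic v1 (UserOut and db_user are placeholders):

```python
from pydantic import BaseModel


class UserOut(BaseModel):
    id: int
    name: str

    class Config:
        orm_mode = True  # lets .from_orm() read attributes off the SQLAlchemy object


user_out = UserOut.from_orm(db_user)  # db_user: an existing SQLAlchemy instance
```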
Does it work outside of FastAPI or is it tied to it? We sort of shoved SQLAlchemy into Starlette (which FastAPI also uses) and it would be great if this library also works with Starlette. Thanks!
Thanks! This could potentially save us a ton of work! Really appreciate all the amazing work you've done. FastAPI definitely being put to good use at my company.
Does this have async query support? I’ve been using ormar lately with FastAPI and found it a pleasure to work with. I even built a number of PG specific extensions easily for things like JSONB and native UUID fields
I was literally an hour ago trying to figure out how to get my complicated Pydantic models to convert correctly to the SQLAlchemy models I made. Thanks tiangolo!
The exact opposite for me -- I’ve been interested in using FastAPI with SQLAlchemy for some time, so this project is exactly what I’ve been looking for.
Why would I include a dependency that in turn pulls in a sprawling dependency like SQLAlchemy when many other lightweight ORMs exist? If I were already using a number of tools that expect or build on SQLAlchemy, sure, but I usually never am.
Thanks for sharing!
This is the biggest thing I've built since FastAPI and Typer...
SQLModel is a library for interacting with SQL DBs, based on Python type hints.
Each model is both a Pydantic and SQLAlchemy model, and it's all optimized for FastAPI.
GitHub here: https://github.com/tiangolo/sqlmodel
More info in this Twitter thread: https://twitter.com/tiangolo/status/1430252646968004612
And the docs are here: https://sqlmodel.tiangolo.com/