Learning Django and the Django REST Framework by working through William S. Vincent's Django for Beginners [1], Django for APIs and Django for Professionals was a turning point in my career. I started to take myself seriously as a developer, not just an electrical engineering grad who happened to like coding.
Not SICP as a teen, not the Haskell Book, not The Art of Unix Programming, not A Philosophy of Software Design. Those are all fantastic books, but they were the wrong things to read for someone whose definition of "Keep It Simple, Stupid" was "never build anything at all and stay unemployed". Django got me to shut up and build.
And build I did, and most of it did and does suck, and that's okay. And then I got a job, not doing Django, but using the things I learned from actually building with Django every day. I owe Django a great deal, and I still think of the DRF as my favorite approach to building a "well-tempered", maintainable API, on a deadline.
I have a very similar relationship with .NET MVC + EF. I suppose Django would win against it (them) by a mile in the "batteries included" battle, but oh boy was I productive with it, especially with some jQuery and knockout.js for the frontend. I really felt like I could build anything CRUD with the tools I had (I was specialized in system integration), and I had the fortune of getting paid for it. Now even the smallest memory of some of the code I wrote back then makes me cringe, but I learned a lot.
From those experiences you not only learn what to do; you also pick up "what not to do" and "ah, these parts are useful because of this" kinds of information as well.
Having done both in my career, I would pick either one depending on various factors, but both can get the job done.
One edge Python has is that it's like the modern-day Perl in terms of how many libraries you can use with it to get things done. It certainly feels that way, anyway.
On the other hand, .NET has cool things that Python barely scratches the surface of, like MAUI and Blazor. Not to mention that both MAUI and Blazor are directly backed by a major tech giant.
Even though I prefer Python over PHP, for web projects I'm probably down with PHP forever, because it so nicely supports stateless request handling without long-running processes.
In PHP, you can just throw a php file on a webserver and it works. To update, you just update the file. You don't have to restart anything.
On the dev machine, you can just have Vim in one window and Firefox in the other, change code, hit F5, and you see what you did.
I don't like having to run a code watching process which then recompiles the whole codebase every time I save a file.
PHP is underrated, especially as a learning resource. I'm surprised that I first built something with vanilla PHP on the job only a few months ago, and surprised by how fast I could go from idea to prototype.
There is a very natural learning progression: first build apps that run purely locally; then build a few static websites, maybe starting with hand-crafted HTML and eventually using something like Hugo; then build a small dynamic website with vanilla PHP; then finally build something with a more complex framework, like Laravel or Django. Going upwards through these iterations would, I think, help a lot of newer devs internalize where the tradeoff of initial complexity vs. future ease of development lands for them.
The differences you speak about arise from the nature of their design. PHP was created for the web, whereas web programming support wasn't part of Python's language design. When you speak about web programming in Python, the seam between the language and the web is usually a wsgi/asgi layer. This is where all the things you mentioned come into play. There's a whole lot of benefit, imo, to using Python over PHP beyond that seam.
There's a fair bit of nuance and it really depends on the setup. You can run Python with CGI and execute once per request, but it's much more common to use wsgi/asgi. Likewise, I think php-fpm, which runs long-running PHP worker processes, is still pretty common.
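To make the long-running process model concrete, here is a minimal WSGI app (purely illustrative, not from any framework discussed here): module-level state is initialized once per worker and survives across requests, which is exactly what PHP's one-execution-per-request model does not give you.

```python
# app.py - minimal WSGI application (illustrative sketch).
# REQUEST_COUNT lives in the worker's memory and persists across requests,
# because the Python process keeps running between requests.
REQUEST_COUNT = 0  # initialized once per worker process, not once per request

def application(environ, start_response):
    global REQUEST_COUNT
    REQUEST_COUNT += 1
    body = f"request #{REQUEST_COUNT} served by this worker\n".encode()
    start_response("200 OK", [("Content-Type", "text/plain"),
                              ("Content-Length", str(len(body)))])
    return [body]
```

Served with something like `gunicorn app:application`, each worker keeps its own counter; the equivalent counter in PHP would reset on every request unless you reached for a cache or database.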
The difference is about striking a balance between developer speed and performance. If we wanted to reload everything and run from scratch on each request, that's pretty easy to do with Python too. But we "chose" to keep the process always running with quick reloads, so requests aren't too slow [and it scales].
The rest of the things you mentioned are pretty much the same for Python as well.
You talk about performance, but I think that is another point for PHP. In my experience, PHP handles the same requests faster.
Yes, I could build everything from scratch myself in Python and have the same statelessness as in PHP. But parsing headers, creating headers, etc. feels like it should be handled by a framework. In PHP, it is built right in.
And if I built it myself, it would talk to the webserver via CGI. But I think CGI is slow. For PHP, you have mod_php, which is super fast.
If PHP doesn't recompile your script's imports, how does the reload tracking work with dependencies? Or does it punt somehow (e.g. only reloads your script but not its imports)?
There are various implementations of this kind of autoreload system for Python but it always seems to come with compromises on semantics (different initialization order causes behaviour differences).
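For what it's worth, the crude version of such a reloader is easy to sketch in Python; the module path below is made up, and the point is only to show where the semantic compromises come from: reload() re-executes the module body, but objects created from the old module keep referencing the old definitions.

```python
# Naive autoreload sketch (illustrative only; real reloaders usually restart the process).
import importlib
import os

import myapp.views  # hypothetical module being watched

_last_mtime = os.path.getmtime(myapp.views.__file__)

def maybe_reload():
    """Re-import myapp.views if its source file changed since we last looked."""
    global _last_mtime
    mtime = os.path.getmtime(myapp.views.__file__)
    if mtime != _last_mtime:
        importlib.reload(myapp.views)  # re-runs the module's top-level code
        _last_mtime = mtime
        # Caveat: anything that did "from myapp.views import foo"
        # still holds a reference to the old foo.
```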
Most major PHP frameworks and platforms like Laravel, Symfony, Drupal, and Magento develop all kinds of complex caching layers to work around PHP's stateless one-request/one-execution model, essentially poorly recreating the shared application state you would get for free with a long-running worker process.
Python's import model is not without its flaws either, but at least you have persistent application state; there's no need to fully initialize your app for every single request.
For simple apps that are contained within a few files, PHP is hard to beat for simplicity and speed.
Imports are also updated the moment you update the imported file.
I'm not sure if PHP stores any compiled binary or byte-code at all. Maybe it compiles it all on each request. It's super fast though, even with tons of imports.
My guess would be that it keeps compiled versions of each file in memory and on each request, it walks down the whole import path. And when it encounters a changed import, it compiles only that one.
Would be cool if someone with more knowledge could shed some light on what is actually happening.
AFAIK, before PHP even uses opcache's cached bytecode, it checks the file's metadata (last modified, most likely), so an updated file never gets served as a stale cache hit.
In PHP, dependencies are just .php files just like your own code, it all works the same.
PHP code is also typically far less complex than Python modules - modern PHP code consists of a single index.php with procedural code, and everything else is just class / function definitions, so there are no side effects.
When a framework "supports auto-restart", that usually means it has its own webserver for development and the auto-restart is supported when you use that.
I don't like having a different webserver in development and production.
For Python, gunicorn is suitable for production and development use and has a --reload option to reload on changed files. This functionality is framework-independent.
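For reference, the reload behaviour can also be configured in gunicorn's Python config file rather than on the command line; a minimal sketch, assuming a recent gunicorn that picks up gunicorn.conf.py from the working directory (values are illustrative):

```python
# gunicorn.conf.py - development settings sketch
bind = "127.0.0.1:8000"
workers = 2
reload = True  # restart workers when source files change
```

Then `gunicorn myproject.wsgi:application` (module path illustrative) uses these settings without any extra flags.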
The way I understand the gunicorn documentation, this has the same effect as setting up a script that listens for file changes and restarts the server (or its workers) every time a file changes.
That's way less efficient compared to how PHP handles it.
I don't want processes to be killed and new ones to be started every time I change a file.
PHP does it the right way: Only when a request that touches outdated code hits the server is that outdated code reparsed. As long as you just edit files, it uses up no resources at all.
I’ve never switched to the browser and reloaded the page fast enough to “beat” a gunicorn reload after editing a file. So I get not “wanting” a process restart but I don’t get why it’s such a big deal in a practical sense.
But hey if using what you use does what you want, then you do you.
As a long time Django and DRF user I'm really happy this release finally got out. If you look at the PRs being released some of them are years old. It's true that DRF is a mature and feature-complete framework but Django is still evolving slowly and DRF must keep up with that at the very minimum. Big thanks to the maintainers for getting it out!
Here are some of my highlights from this release:
* Defaults on models now get sent to the API docs
* OrderedDict replaced by plain dicts everywhere - plain dicts have been ordered in Python since 3.6, so this is a welcome simplification
* Automatic support for UniqueConstraints in ModelSerializers (see the sketch after this list) - this is one of the places where DRF was lagging behind. I found myself using the deprecated unique_together just to avoid writing validators on all my serializers
* There is a new router supporting path converters instead of regexp - Looking forward to trying this out! Also long overdue IMO as this came to Django in 2017 :)
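For the UniqueConstraint item above, a sketch of what this means in practice (model and field names are made up): the constraint is declared once on the model, and per the release notes a ModelSerializer can now derive the corresponding validator from it instead of requiring unique_together or a hand-written validator.

```python
# models.py / serializers.py - illustrative example of the UniqueConstraint case.
from django.db import models
from rest_framework import serializers

class Booking(models.Model):
    room = models.CharField(max_length=50)
    date = models.DateField()

    class Meta:
        constraints = [
            # Modern replacement for unique_together.
            models.UniqueConstraint(fields=["room", "date"],
                                    name="unique_room_per_date"),
        ]

class BookingSerializer(serializers.ModelSerializer):
    class Meta:
        model = Booking
        fields = ["id", "room", "date"]
        # With DRF 3.15 the uniqueness validator should be generated from the
        # UniqueConstraint above; previously you needed unique_together or a
        # manually added UniqueTogetherValidator.
```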
I find it hard to criticize DRF, but one thing that could be added to the documentation is how to deal with M2M relations. Last time, I tried a few different ways of doing it, only two of them worked, and then I had to figure out which way was better.
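One pattern that worked for me (not necessarily the official recommendation, and the Article/Tag models here are made up): expose the M2M as a writable list of primary keys and add a read-only nested representation alongside it.

```python
# serializers.py - one way to handle a ManyToManyField in a ModelSerializer.
from rest_framework import serializers
from myapp.models import Article, Tag  # hypothetical models; Article.tags is a M2M to Tag

class TagSerializer(serializers.ModelSerializer):
    class Meta:
        model = Tag
        fields = ["id", "name"]

class ArticleSerializer(serializers.ModelSerializer):
    # Writable side: clients send a list of tag primary keys.
    tags = serializers.PrimaryKeyRelatedField(many=True, queryset=Tag.objects.all())
    # Read-only nested side, exposed under a separate key.
    tag_detail = TagSerializer(source="tags", many=True, read_only=True)

    class Meta:
        model = Article
        fields = ["id", "title", "tags", "tag_detail"]
```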
I don't think http://django-vanilla-views.org/ ever really caught on, but it does show that ccbv.co.uk shouldn't have to exist. The DRF class tree is small enough to keep in your head once you've spent some time with it.
I honestly only managed to wrap my head around how class-based views work by studying Django Vanilla Views and Django REST Framework. I eventually learned and understood why the built-in class-based views are the way they are, in no small part thanks to https://ccbv.co.uk/. But it was definitely more instructive to have a simpler class and mixin hierarchy that I could use, put breakpoints on, and fully get my head around, in order to grasp the very concept of class-based views. After that it all made a lot more sense.
The class inheritance tree problem is solvable with a proper IDE/editor setup that lets you drill down into implementations. In a properly configured PyCharm, for example, you can Cmd+Click on any symbol and see its implementation (and do so recursively).
As someone who has used DRF in many projects over a decade, I 100% agree. In my current project I’m using django-ninja which gives me the power of the Django ecosystem and ORM with the simplicity of FastAPI. This is the way.
Is django-ninja maintained? I've always disliked DRF and liked FastAPI, so Ninja was a natural choice, but my team ended up moving away from it because of it being less popular than DRF.
It is very much maintained and got its 1.0 release not long ago. Sadly, there is a single guy doing all the work on Ninja, so the tempo of releases varies. I also think there is quite a way to go to match the stability of DRF and DRF's ecosystem. If you want permissions and throttling, for example, you'll have to use django-ninja-extra, which is pretty much "DRF, but it's Ninja".
For me personally, the micro approach of Ninja / FastAPI is at odds with what I want out of DRF. I just want to build my CRUD stuff and not worry about implementing throttling, etc., on my own.
I think it's a bad idea to judge tech based on popularity. It just becomes a game of high school fashion drama, instead of getting work done.
Also, I think the comparison is unfair due to the sheer age of DRF. Check out this graph: https://star-history.com/#encode/django-rest-framework&vital... Then click "align timelines" and you'll see that Ninja is more popular than DRF was at the same age.
>I think it's a bad idea to judge tech based on popularity. It just becomes a game of high school fashion drama, instead of getting work done.
I disagree. If one tool is way less popular, you risk it being discontinued. You do not want to end up maintaining a web framework on top of your application, nor do you want to rewrite your entire app.
People worry too much about that stuff. You can use non-maintained software for decades just fine. But I see people all the time freak out because there wasn't a release in the last week or whatever.
Also, most businesses fail long before their chosen framework stops being maintained. In fact, most businesses fail before they even get their first customer, let alone break even. We need to keep some perspective here.
You assume that popular, "high school drama" OSS is not productive and that you can't get work done with it. That assumption, and the point you're making from it, are incorrect.
DRF does what it says on the tin. It is REST by the definition of it: not an HTTP JSON API, but strict REST, an object with CRUD as an API. Everything else is outside its scope and mostly really hard to do.
That's why I rarely use DRF and either use function based views directly or use another library.
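For readers who haven't used it, the "object with CRUD as an API" sweet spot looks roughly like this (model and names are illustrative): a ModelViewSet plus a router gives you list/create/retrieve/update/destroy endpoints with almost no code, and anything outside that shape is where the friction starts.

```python
# Illustrative "CRUD object" in DRF.
from rest_framework import routers, serializers, viewsets
from myapp.models import Article  # hypothetical model

class ArticleSerializer(serializers.ModelSerializer):
    class Meta:
        model = Article
        fields = "__all__"

class ArticleViewSet(viewsets.ModelViewSet):
    queryset = Article.objects.all()
    serializer_class = ArticleSerializer

router = routers.DefaultRouter()
router.register(r"articles", ArticleViewSet)
urlpatterns = router.urls  # /articles/ and /articles/<pk>/, full CRUD
```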
IMO, DRF is one of the few libraries that can be considered stable and complete. The code is well structured, without a whole lot of hidden magic under the hood (compared to, say, Django itself), yet it provides immense functionality. I've learnt some neat patterns from digging into the code while working with it.
To the devs: well done and congrats on the release!
By magic I mean the complexity brought on by the heavy use of metaclasses, patterns employed via convention rather than enforced by code (meaning you have to read enough of the code before you understand the patterns), and other similar leaky abstractions.
Don't get me wrong, I do think Django is one of those deep modules[1] where the interface makes it a pleasure to work with but the internals do need effort. Especially the ORM layer.
Imo there's a fair bit of complexity in request dispatch with sync/async interop, wsgi/asgi, and channels. I think you end up with something like 10+ stack frames going through all these pieces before you hit the middleware.
I love DRF, but we don't really use it anymore for front to back communication. Too much working around the pure REST implementation to get anything done. However, we'll still throw it into our projects because the data folks love it. Easy way for them to shop for whatever data they want from the system.
Hobbyist dev here. I love Django; it's my go-to for rapid prototyping, and I have built a couple of web apps with it. I love the batteries-included approach over Flask, and the ORM is simply amazing. I tried a couple of times to use DRF but never really stuck with it; my reason was that my HTML templates have a lot of dependencies on views, and I just find the idea of using DRF to decouple frontend and backend hard. Not a DRF limitation but mine alone. I might just go back and revisit it again. Kudos and best wishes to the team on the release!!
You might find it easier to do piecemeal at first. Start by identifying where you have a pure data dependency that’s currently doing a round trip to render HTML redundantly or unnecessarily, and think about whether it can be served as both HTML and pure data via content negotiation. If it can, then you have the flexibility to decouple that dependency at your leisure.
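A sketch of what that content negotiation can look like in DRF (view, model, and template names are made up): one view serves the same data as HTML or JSON depending on the Accept header or ?format= parameter.

```python
# views.py - same data served as HTML or JSON via DRF's renderers.
from rest_framework.views import APIView
from rest_framework.response import Response
from rest_framework.renderers import TemplateHTMLRenderer, JSONRenderer
from myapp.models import Article  # hypothetical model

class ArticleList(APIView):
    renderer_classes = [TemplateHTMLRenderer, JSONRenderer]
    template_name = "articles/list.html"  # only used for the HTML rendering

    def get(self, request):
        articles = list(Article.objects.values("id", "title"))
        # Accept: text/html -> rendered template; Accept: application/json -> JSON.
        return Response({"articles": articles})
```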
I like the automatic model serializers, but I don't like the extra layer of syntax in views, i.e. different from normal Django. DRF's system can be replaced in many cases by these helper functions:
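The helpers themselves aren't shown in the comment above; here is a minimal sketch of the kind of thing meant, assuming plain Django function views (all names here are hypothetical, not the commenter's actual code):

```python
# Hypothetical helpers for skipping DRF's view layer in simple cases.
import json
from functools import wraps
from django.http import JsonResponse, HttpResponseBadRequest

def parse_json(view):
    """Decorator: parse a JSON request body into request.json before calling the view."""
    @wraps(view)
    def wrapper(request, *args, **kwargs):
        try:
            request.json = json.loads(request.body or b"{}")
        except ValueError:
            return HttpResponseBadRequest("invalid JSON")
        return view(request, *args, **kwargs)
    return wrapper

@parse_json
def create_article(request):
    # An ordinary Django view: take a request, return a response.
    return JsonResponse({"received": request.json}, status=201)
```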
I love DRF for CRUD APIs. It just gets the job done and you can focus on data modelling.
We built our data hub / data integration solution on top of it. [1] It was a good choice.
By far the most extensible and overridable library I have worked with so far. Even when you need to resort to hacks, they never seem to break when upgrading a version.
I don't think one should write APIs in Django. It's layer upon layer upon layer on top of a framework that wants hierarchical models and views; you have to bend over backwards to put a slightly more complex data model into it, and then again to write all the hooks to make it fit the API. It's like paint-by-numbers for software engineering.
You have to know what DRF is good for, and what doesn't fit well with it. Because DRF is on top of Django (one layer not many layers), you can always fall back to a function that takes a request and returns a response. You can also fall back to SQL instead of the ORM.
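A sketch of those fallbacks, with made-up table and view names: an ordinary function view returning JSON, and raw SQL through Django's connection instead of the ORM.

```python
# views.py - falling back to a plain function view and raw SQL.
from django.db import connection
from django.http import JsonResponse

def article_titles(request):
    with connection.cursor() as cursor:
        cursor.execute("SELECT id, title FROM myapp_article ORDER BY id")
        rows = cursor.fetchall()
    return JsonResponse({"articles": [{"id": r[0], "title": r[1]} for r in rows]})
```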
[1]: https://djangoforbeginners.com/