Developing Maintainable Django Projects for the Long Run
Django, the web framework for perfectionists, definitely comes with many handy features out of the box. Even though it is a very opinionated framework [there is always the "Django way" of doing things], in many cases there is more than one way to do the same thing. This can lead many beginners to poor structural and coding choices, as described in this blog post. These mistakes [along with many others] often result in unmanageable code in the long run as the project keeps growing.
If a bug causes downtime in production, fixing it in a poorly structured codebase can feel like finding a needle in a haystack. The above-mentioned link can work as a style guide [like many others, scroll to the end of this blog post for more Django style guides] for structuring Django projects and managing our code better. On top of that, let's dive into 11 ways that will definitely make life easier when developing and maintaining a Django project for the long run.
The codebase should be clean, concise and readable, as suggested by the Django Best Practices guide. To maintain such a codebase, the best approach is to follow a specific code style and format. The hard part is remembering all the formatting rules and getting every developer working on the project on board with them; that can eat up a lot of unnecessary development time.
Enter black - The Uncompromising Code Formatter. It supports all major IDEs, and upon saving a file it works its black magic to convert messy code into beautifully structured code. You can also run black against your whole codebase to format all of the files at once. Here is a handy guide on dev.to for setting up black in VSCode.
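As a purely hypothetical illustration [the function and model are made up], a cramped definition like the first one below comes out of black looking roughly like the second:

# Before black: a hard-to-read one-liner
def create_order(customer,items,discount=None,coupon_code=None): return Order.objects.create(customer=customer,items=items,discount=discount,coupon_code=coupon_code)

# After running black: the same function, reformatted automatically
def create_order(customer, items, discount=None, coupon_code=None):
    return Order.objects.create(
        customer=customer,
        items=items,
        discount=discount,
        coupon_code=coupon_code,
    )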
To help you catch syntax errors, pylint is an amazing tool. There is also a dedicated Django plugin for pylint called pylint-django. Pylint is the default linter in VSCode, and you can set up pylint-django just by installing it via pip and updating the settings as shown here.
Finally, to help you code even faster, you can use Kite for AI-powered auto-completion.
Your environment variables like SECRET_KEY or anything related to the database setup should never be placed directly in the Django settings file, nor in version control. These are sensitive data and you should keep them safely out of reach of anyone who isn't supposed to see them. You can use django-environ for this purpose, and the process is very straightforward. You just need to create a .env file, keep all your variables there, and import them into your settings file using the API of the django-environ package. Keep this .env file out of version control by placing it in the .gitignore file, but do keep an .env.example file in version control that contains a template of the original .env file.
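As a minimal sketch [the variable names and database URL are placeholders], the settings file can read those values through django-environ like this:

# settings.py - read secrets from the .env file instead of hard-coding them
import environ

env = environ.Env(DEBUG=(bool, False))  # cast DEBUG to a bool, default False
environ.Env.read_env()  # loads the .env file (pass an explicit path if it lives elsewhere)

SECRET_KEY = env("SECRET_KEY")
DEBUG = env("DEBUG")
# e.g. DATABASE_URL=postgres://user:password@localhost:5432/mydb in .env
DATABASES = {"default": env.db("DATABASE_URL")}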
Your development environment will obviously differ from the others in one way or another, and your project's settings need to differ to match. If you are keeping your development settings in an old-fashioned local_settings.py file that stays out of version control, you are probably doing it wrong, and there are many valid reasons for that.
A better approach is to split the settings into base, development, test, stage and production files, along with an __init__.py file, all living in a settings directory that replaces the settings.py file. The base.py settings file holds all the settings that don't change between environments, while the development, test, stage and production settings files contain the environment-specific settings and each start by importing everything from base.py with from .base import *.
One issue that comes up with this approach is that manage.py will still be pointing to the old settings file, so it will no longer work and you won't be able to run your development server or any other manage.py commands. To solve this, you can create a new environment variable called WORK_ENV in your .env file and a new engage.py file in the settings directory which will contain -
from .base import *

working_environment = env.str("WORK_ENV", default="development")

if working_environment == "production":
    from .production import *
elif working_environment == "stage":
    from .stage import *
elif working_environment == "test":
    from .test import *
else:
    from .development import *
And finally, update your manage.py file to replace the line
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'your_project.settings')
with
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "your_project.settings.engage")
So, where previously all the settings, sensitive information included, would have been stored in a single settings.py file, the split-up settings files together with the .env file should now look like this -
|-- settings
|   |-- __init__.py
|   |-- base.py
|   |-- engage.py
|   |-- development.py
|   |-- production.py
|   |-- stage.py
|   |-- test.py
|-- .env
|-- .env.example
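Each environment-specific file then only overrides what it needs. As a minimal sketch [the overridden values are just examples], a development.py could look like this:

# development.py - only the settings that differ from base.py
from .base import *  # noqa: F401,F403

DEBUG = True
ALLOWED_HOSTS = ["localhost", "127.0.0.1"]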
Django comes with the simple and lightweight SQLite3 database out of the box, which saves you the hassle of setting up a database and lets you go straight to developing your application. But there are many cases where it falls short and even restricts you from writing efficient queries against your data. One example, among others, is filtering for distinct values on specific fields as explained here; you can't really do that on SQLite3.
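As a concrete sketch [the app, model and field names are hypothetical], the field-based distinct() lookup in the ORM only works on PostgreSQL:

from blog.models import Article  # hypothetical app and model

# latest article per author; .distinct() with field names is PostgreSQL-only
latest_per_author = (
    Article.objects
    .order_by("author_id", "-published_at")
    .distinct("author_id")  # raises NotSupportedError on SQLite
)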
PostgreSQL works fantastically with Django, and many recommend using it in production. An even better approach is to use it in all of your environments [development, test, stage and production] so that your application behaves consistently everywhere. If you just go through setting up PostgreSQL in each environment [it's only a few commands anyway, and if you use Docker you can streamline the process even further], you never have to stop and think about which Django ORM features you can actually use.
If you are working with Django REST framework, it's a good idea to keep your API documented even if you are not exposing it to the public. One great tool is Swagger, and you can generate the Swagger schema for your DRF project automatically using drf-yasg. It is very easy to set up, and the schema is generated from your urls.py files. You can also choose between two different looks.
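A minimal sketch of that setup in the project's urls.py, following the drf-yasg documentation [the API title and version are placeholders]; the swagger/ and redoc/ routes are the two looks mentioned above:

# urls.py - auto-generated Swagger UI and ReDoc views via drf-yasg
from django.urls import path
from drf_yasg import openapi
from drf_yasg.views import get_schema_view
from rest_framework import permissions

schema_view = get_schema_view(
    openapi.Info(title="Your Project API", default_version="v1"),
    public=True,
    permission_classes=(permissions.AllowAny,),
)

urlpatterns = [
    path("swagger/", schema_view.with_ui("swagger", cache_timeout=0)),
    path("redoc/", schema_view.with_ui("redoc", cache_timeout=0)),
]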
Keep in mind that drf-yasg generates a Swagger/OpenAPI 2.0 specification of your REST API. If you are looking for OpenAPI 3.0, you can check out drf-spectacular.
The django-debug-toolbar package makes debugging a breeze. Django's default debug mode is a very useful tool on its own, but django-debug-toolbar is a step above. You can see how much CPU time it took to process a response, how many [and which] SQL queries were executed, which static files the request fetched, details about your templates, a list of all of your context variables, and much more.
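Setting it up mostly amounts to a few additions to your development settings and URLs; a sketch, assuming the package is already installed:

# development settings: register django-debug-toolbar
INSTALLED_APPS += ["debug_toolbar"]
MIDDLEWARE = ["debug_toolbar.middleware.DebugToolbarMiddleware"] + MIDDLEWARE
INTERNAL_IPS = ["127.0.0.1"]  # the toolbar only renders for these IPs

# urls.py: mount the toolbar's own URLs
from django.urls import include, path
urlpatterns += [path("__debug__/", include("debug_toolbar.urls"))]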
The Django shell is like a regular Python shell, but with everything set up according to your project's settings. You can access all of your apps, models, and really anything defined in your project. Just type python manage.py shell in your virtual environment and you're inside the Django shell. This comes in handy for testing different ORM queries against your app models as well as for trying out the APIs of third-party packages.
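A quick session might look like this [the app, model and data are hypothetical]:

$ python manage.py shell
>>> from blog.models import Article
>>> Article.objects.filter(published=True).count()
42
>>> Article.objects.create(title="Hello from the shell")
<Article: Hello from the shell>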
You can use custom management commands with manage.py for recurring tasks. For example, you might need to insert data into your database from a CSV file, or run a daily cron job that does something involving the database. You can't access the database through the Django ORM from outside the application [for example, from a plain cron script]. But if you define your task as a management command, you can call it from anywhere with the Python from that project's virtual environment using python manage.py your_command.
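A management command is simply a module under management/commands/ inside one of your apps; here is a minimal sketch for a hypothetical CSV import [the app, model and field names are made up]:

# yourapp/management/commands/import_products.py
import csv

from django.core.management.base import BaseCommand

from yourapp.models import Product


class Command(BaseCommand):
    help = "Insert products into the database from a CSV file"

    def add_arguments(self, parser):
        parser.add_argument("csv_path")

    def handle(self, *args, **options):
        with open(options["csv_path"], newline="") as f:
            for row in csv.DictReader(f):
                Product.objects.create(name=row["name"], price=row["price"])
        self.stdout.write(self.style.SUCCESS("Import finished"))

A cron entry can then simply call python manage.py import_products /path/to/file.csv from the project's virtual environment.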
Many useful management commands can already be found in third-party packages. One such package is django-extensions. It comes with many helpful commands for all kinds of development scenarios: you can drop the whole database, get a list of all of your URLs, and much more. One feature I particularly like is that I can generate a visual representation of my models using this command. When the codebase grows big and I need an overview of how all of my models are connected, it comes in very handy.
Well, the night is dark and full of (t)errors. And you can't just play Jon Snow when the users start complaining about getting server errors.
In development, when debug mode is on, you can simply read the very helpful debug messages Django gives you. It's not the same in production. You should definitely set up logging in your Django projects, but in addition, a third-party tool like Sentry can save you a ton of time and hassle. The setup process couldn't be simpler: you sign up, add a few lines to the Django settings file, and that's it!
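Those few lines, following the Sentry SDK's Django integration [the DSN below is a placeholder you get from your Sentry project]:

# settings.py - report unhandled exceptions and performance data to Sentry
import sentry_sdk
from sentry_sdk.integrations.django import DjangoIntegration

sentry_sdk.init(
    dsn="https://<your-key>@<your-org>.ingest.sentry.io/<project-id>",
    integrations=[DjangoIntegration()],
    traces_sample_rate=0.2,  # sample 20% of transactions for performance monitoring
)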
To get you started, the free tier lets you keep 5K errors, 10K transactions and 1GB of attachments for 1 user, with 30-day data retention. And you don't need to wait for your users to start complaining that your site isn't working; Sentry will email you when it catches new errors.
You should always pin the exact versions of Django and the other pip packages you are using in a requirements.txt file. You can easily do that with virtualenv and a pip freeze command. The issue with this approach is that it exports every package in your virtual environment. A single package can pull in many dependencies, and if you want to upgrade it, you'll need to manually update all of its dependencies along with it. Also, there are packages you only need in development, like black, pylint or django-debug-toolbar, that you definitely don't need in production. To deal with all of this, there is an awesome package called pip-tools.
With pip-tools, you only pin your direct dependencies in a requirements.in file, and when you compile this file it generates a nicely formatted requirements.txt file that lists every dependency, annotated with which package each one comes from. You can also pin your development packages in a dev-requirements.in file and keep your development tools separate.
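A sketch of the two input files and the compile step [the package choices are just examples]:

requirements.in [direct dependencies only]:
    django
    djangorestframework
    psycopg2-binary

dev-requirements.in [development-only tools, constrained by the compiled requirements.txt]:
    -c requirements.txt
    black
    pylint-django
    django-debug-toolbar

Running pip-compile requirements.in and pip-compile dev-requirements.in then produces the fully pinned requirements.txt and dev-requirements.txt files.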
Now that you have taken care of the pip package versions, how about your Python version? Or the exact PostgreSQL version you are using in development? Or, going even further, the version of the operating system all of these are running on?
When you run your application, its behavior depends on everything it's using [and even on things it isn't using that are running alongside it on the same operating system]. All of these can influence how it runs and performs. To isolate the application and make sure it works the same no matter where it runs, you should use Docker.
Getting started with Docker can easily feel overwhelming for a newcomer. Here is an awesome article on how you can use Docker with your Django project. It is a three-part series that walks you through using Docker in development as well as deploying your project live.
One of the handiest "batteries" included with Django has to be the Django admin. It is very useful for inspecting the data in the database through a model-centric interface, as well as for entering, inspecting and managing test data during development. When we add a model to the Django admin, the usual approach is to use ModelAdmin from django.contrib.admin, but if you use it with a custom user model, there is an issue. And if you are not already aware, defining a custom user model should be one of the first things you do when starting a new Django project, as suggested by the Django documentation itself.
Just like most other web frameworks, Django doesn't store passwords as plain text. It runs each password through a hash function and stores the resulting hash in the database. So even if the data is compromised, it won't be possible to work out what the actual password is for a specific user.
Going back to the issue with using ModelAdmin for a custom user model: the user create and edit forms will present you with password input fields where you would have to enter the password already in its hashed format. If you enter yourawesomepassword123 as the password and save the user, that user won't be able to log in with yourawesomepassword123, because that's not what gets stored; the stored value would be the hash of yourawesomepassword123.
The solution is to write user creation and update forms based on UserCreationForm and UserChangeForm from django.contrib.auth.forms, and use them in an admin class based on UserAdmin from django.contrib.auth.admin. You can find out exactly how to do it in this awesome tutorial. This way you have the freedom to create new users from the Django admin with ease.
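The gist of it, assuming a custom user model is already configured via AUTH_USER_MODEL [the class names here are illustrative]:

# admin.py - admin for a custom user model with proper password handling
from django.contrib import admin
from django.contrib.auth import get_user_model
from django.contrib.auth.admin import UserAdmin
from django.contrib.auth.forms import UserChangeForm, UserCreationForm

User = get_user_model()

class CustomUserCreationForm(UserCreationForm):
    class Meta(UserCreationForm.Meta):
        model = User

class CustomUserChangeForm(UserChangeForm):
    class Meta(UserChangeForm.Meta):
        model = User

class CustomUserAdmin(UserAdmin):
    add_form = CustomUserCreationForm
    form = CustomUserChangeForm
    model = User

admin.site.register(User, CustomUserAdmin)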
Test-driven development is where you write your tests even before you code a feature. The idea is that you think of the test case, write the test, and then write the code that makes the test pass. It is certainly not easy to follow this all the time, and getting started with it can feel overwhelming. But in the long run it will save you many hours of pain and misery fixing unwanted bugs that show up in production.
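In Django terms the loop can start as small as this [the URL name and view are hypothetical and don't exist yet when the test is written]:

# tests.py - written first; it fails until a "home" view and URL are implemented
from django.test import TestCase
from django.urls import reverse

class HomePageTests(TestCase):
    def test_homepage_returns_200(self):
        response = self.client.get(reverse("home"))
        self.assertEqual(response.status_code, 200)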
I would suggest checking out testdriven.io to get up and running with TDD [and Docker]. I can vouch that their paid courses are worth every penny, and I have no affiliation with them.
There are a number of things that need to be done at the start of every new project, and they are always the same. There are some fantastic starter templates that can help jumpstart your Django projects. I'm listing a few of them below -
As a project keeps growing and multiple developers work on the same codebase [or even when working alone], it is easy to end up with differently styled code in the same codebase. It's better to follow a specific style guide throughout the codebase so that it's easier to read, modify and maintain in the long run. It also makes it easier to get new developers on board with the project. A few of the popular style guides -