At Govnex we’re using MySQL on our development machines. It has several advantages, but one of the drawbacks is that unit tests run more slowly than with SQLite (big ups to MockSoul for posting his benchmarks). The reason is that when Django runs unit tests against an SQLite database, the database lives in memory (RAM) instead of being written to disk.
Django only lets you specify one database, and although a new database is created for unit tests (with “test_” automatically prepended to its name), it reuses most of the same settings, which means a Django install using MySQL also uses MySQL for its unit tests.
To get around this problem, I created a file in my project root called test_settings.py containing the following lines:
http://gist.github.com/269919 (file: test_settings.py)
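In case the embedded gist doesn’t render, here is a sketch of what such a test_settings.py could contain. This is my assumption, not the original gist: it uses the old-style (pre-Django 1.2) database setting names the post’s era implies.

```python
# test_settings.py -- a sketch, not the original gist.
# Old-style (pre-Django 1.2) settings: switch the database backend to
# SQLite, which Django keeps in memory when running the test suite.
DATABASE_ENGINE = 'sqlite3'
DATABASE_NAME = ':memory:'  # never touches the disk
```

With the sqlite3 backend, Django uses an in-memory database for tests automatically, which is where the speed-up comes from.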
I then opened my settings.py, imported sys, and inserted the following lines at the end of the file:
http://gist.github.com/269919 (file: settings.py)
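Again, since the gist contents aren’t reproduced here, the lines appended to settings.py presumably look something like this (a sketch based on the post’s description, not the gist itself):

```python
import sys

# When manage.py is invoked as "./manage.py test ...", swap in the
# SQLite test settings so the test database lives in memory.
if 'test' in sys.argv:
    try:
        from test_settings import *  # overrides the database settings
    except ImportError:
        pass  # no test_settings.py on this machine; keep the defaults
```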
If one of the command-line arguments is “test”, a unit test run is in progress, in which case Django will attempt to import test_settings.py, which overrides the database settings and uses SQLite instead. Migrations won’t be an issue, since Django South falls back to the plain syncdb behaviour when creating databases for unit tests.
Hope this helps you save some time testing.
Awesome, thanks. I used your approach to override DATABASES dict for multiple databases (since my Django app uses multiple databases) and the tests are now blindingly fast, and we don’t need test_* databases on the remote servers.
Thanks for commenting, I’m glad you found this useful.
This reminds me, I should update this Gist for the new dict style database configuration.
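For reference, a dict-style equivalent (Django 1.2 and later) might look like the following. This is my sketch, not the updated gist:

```python
# test_settings.py, dict-style (Django 1.2+) -- a sketch.
# Replaces the DATABASES dict so tests run against in-memory SQLite.
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': ':memory:',
    }
}
```

A project using multiple databases, as the commenter above describes, would override each entry in the dict the same way.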
Thanks for such a simple and elegant solution. It was taking approximately 6 seconds to create a remote PostgreSQL database to run 0.3 seconds’ worth of tests. Now…
$ time ./manage.py test backend profile
…
real 0m0.533s