It's annoying how there is no canonical way to do it. You need to spend some mental energy learning how to do this when starting up a Django project. Django should really integrate one solution into the core, and then everyone could just use that. It would make things much easier.
I feel the opposite. Over the 3 years or so I've worked with Django full time, there have been very good reasons to have different setups wrt environments, virtualenvs & web servers.
That said, I tend to use env vars to identify which environment an app is running in and build everything off that: which branch / db / logging target etc. to use.
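A minimal sketch of that pattern, deriving everything from one env var (the `APP_ENV` name and the per-environment values are illustrative):

```python
import os

# Read the environment name once; default to "dev" so a local checkout
# works without any extra configuration. APP_ENV is an illustrative name.
APP_ENV = os.environ.get("APP_ENV", "dev")

# Derive per-environment values from that single switch.
DEBUG = APP_ENV in ("dev", "test")
DB_NAME = {"dev": "myapp_dev", "test": "myapp_test",
           "staging": "myapp_staging", "production": "myapp"}[APP_ENV]
LOG_LEVEL = "DEBUG" if DEBUG else "WARNING"
```

With `APP_ENV=production` exported in the wsgi script, the same file picks the production database and quieter logging.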
Having an opinionated, best practice default doesn't have to prevent customization. I have always thought Rails is better than Django in this sense. There's one clear default way to do things, and then you can customize if you need to.
Environment variables work extremely well when deploying on Heroku [1]. You can't exactly commit a local_settings.py file to your repo, so setting env variables is the only way to exclude sensitive keys and passwords from your source control.
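Keeping the secret itself out of the repo then looks like this sketch (variable names are illustrative; on Heroku you'd set them with `heroku config:set`):

```python
import os

# Keep secrets out of source control: read them from the environment.
# Locally you can export these in your shell; the fallback value just
# marks an unconfigured development environment.
SECRET_KEY = os.environ.get("DJANGO_SECRET_KEY", "unset-insecure-dev-key")
AWS_SECRET_ACCESS_KEY = os.environ.get("AWS_SECRET_ACCESS_KEY", "")
```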
I'm aware of those; thankfully the things I set won't change within the life of a single environment. A single variable to say dev/test/staging/production can easily be set in bashrc & the wsgi script.
Not at all. Store settings_dev.py and settings_prod.py in your repository. What differs between the production machine and the local one is which file the settings.py symlink points to.
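A deploy script can create that symlink; here's a minimal sketch (filenames as in the comment, the function name is made up):

```python
import os

def activate_settings(environment, project_dir="."):
    """Point settings.py at the per-environment module, e.g. settings_prod.py."""
    target = "settings_%s.py" % environment   # settings_dev.py / settings_prod.py
    link = os.path.join(project_dir, "settings.py")
    if os.path.lexists(link):                 # replace any previous symlink
        os.remove(link)
    os.symlink(target, link)

# e.g. activate_settings("prod") once, on the production machine
```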
A fourth way: the django config is a template, with entries like:
settings.py:
CACHES = {
    'default': {
        'BACKEND': 'django.core.cache.backends.memcached.MemcachedCache',
        'LOCATION': [{% for IP in MEMCACHED_SERVERS %}'{{ IP }}:{{ MEMCACHED_PORT }}', {% endfor %}],
    },
}
Yes, that's a django template in python code. Upon deployment, my fabfile renders the settings file along with various other config files. This makes sure that {{MEMCACHED_PORT}} doesn't accidentally say one thing in settings.py, and another in memcached.conf.
This also allows me to keep my sitewide settings in a single file or two, namely a module in the fabfile folder.
It feels a little dirty to do things this way (though I can't figure out why), but it's saved me a lot of headaches.
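The render-at-deploy idea can be sketched without pulling in Django's template engine; a real fabfile would use Django's `Template` class as the comment describes, but the mechanics are the same (values here are illustrative):

```python
# Sitewide config lives in one place; every rendered file draws from it,
# so settings.py and memcached.conf can never disagree about the port.
SITE_CONFIG = {
    "MEMCACHED_SERVERS": ["10.0.0.1", "10.0.0.2"],  # illustrative values
    "MEMCACHED_PORT": 11211,
}

def render_settings(config):
    """Build the LOCATION list once, then splice it into the settings text."""
    location = ", ".join(
        "'%s:%s'" % (ip, config["MEMCACHED_PORT"])
        for ip in config["MEMCACHED_SERVERS"]
    )
    return "CACHES = {'default': {'LOCATION': [%s]}}" % location

rendered = render_settings(SITE_CONFIG)
```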
Where the first one has the defaults (usable for local development), and the others contain any overrides needed to run with Gunicorn on the corresponding dev/staging/production servers.
We have a very similar approach except we encapsulate settings into its own package just for readability/maintainability. So it looks something like the following
I wonder what kind of maintainability benefits the separate packages bring in this case. To me, having a lot of empty __init__.py files and subdirectories reduces readability, so I like to maximize simplicity.
I do something very similar. But I just have dummy files in the folder depending on the environment (so I do something like 'touch PRODUCTION' or 'touch STAGING'), and based on which file is present I know which settings to serve.
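A minimal sketch of that marker-file check (the function name and the "development" fallback are assumptions):

```python
import os

def current_environment(project_dir="."):
    """Detect the environment from a dummy marker file (e.g. `touch PRODUCTION`)."""
    for marker in ("PRODUCTION", "STAGING"):
        if os.path.exists(os.path.join(project_dir, marker)):
            return marker.lower()
    return "development"   # no marker file: assume a local checkout
```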
Then I have alex.py, prod.py and test.py all in a config subfolder. All under version control, with my settings.py automatically choosing the right environment based on where it's deployed to.
I've tried a few different strategies and found this method works best for me.
In my fabfile I have environment 'setters' that precede regular commands. So I can do 'fab qa deploy' or 'fab prod deploy' and the deploy command grabs the correct settings_local file for the target environment.
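The setter pattern, sketched without depending on Fabric itself (in a real fabfile, `qa`/`prod` would set attributes on Fabric's shared `env` object the same way; hosts and filenames here are made up):

```python
# `fab qa deploy` runs qa() first, then deploy(), which reads whatever
# the setter stored. env is a stand-in for Fabric's shared env object.
env = {}

def qa():
    env["hosts"] = ["qa.example.com"]          # illustrative host
    env["settings_local"] = "settings_qa.py"

def prod():
    env["hosts"] = ["www.example.com"]
    env["settings_local"] = "settings_prod.py"

def deploy():
    # Grab the settings_local file chosen by whichever setter ran first.
    return "scp %s user@%s:settings_local.py" % (
        env["settings_local"], env["hosts"][0])

qa()               # i.e. `fab qa deploy`
command = deploy()
```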
I toss most of my core settings into the settings folder's __init__.py, and then build on that in my development.py, staging.py and production.py.
Then, in each of my environment-specific files, I can just `from settings import *` and have direct access to all of the variables I set up in __init__.py. Most of the time you'll just have different database and cache settings for these environments.
Before you yell at me about the * import: yes, it normally is a bad idea, but in this case it works well. It often gets abused, and that is why so many people see it as "bad".
I like how Heroku does it; it's very smart. They basically inject the database settings into your settings.py: the code just grabs the connection settings from the environment, so you don't need to worry about it.
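Concretely, Heroku exposes the connection as a `DATABASE_URL` environment variable; here's a sketch of parsing it with just the stdlib (the `dj-database-url` package wraps this up for you; the fallback URL is an illustrative local default):

```python
import os
import urllib.parse

# Heroku sets DATABASE_URL, e.g. postgres://user:pass@host:5432/dbname
url = urllib.parse.urlparse(
    os.environ.get("DATABASE_URL",
                   "postgres://myapp:secret@localhost:5432/myapp"))

DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql_psycopg2",
        "NAME": url.path.lstrip("/"),
        "USER": url.username,
        "PASSWORD": url.password,
        "HOST": url.hostname,
        "PORT": url.port,
    }
}
```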
tldr; 'Scaling' WordPress is simple. Install a caching plugin like WP Super Cache.
WP gets a bad rap because it seems to spit out the oh-so-unattractive "Error Connecting to Database" message under the slightest load. Perhaps it's inefficient; I'm not sure why it seems to die so easily.
But... good news is, there is a simple solution. WP renders everything dynamically on every request. It fetches the post from the DB each time you load the page. But, more often than not, the content is only going to change when 1) you write a new post, or 2) a comment is made to a post.
A simple caching plugin solves this. It renders the page once, caches it with a specific timeout (3600 seconds, i.e. 1 hour, is the default I think) and then just serves up that HTML rather than hitting the DB. This will solve 99% of your problems. I've never really seen a WP site die when used with something like this. Duh, because it's just static HTML at that point, haha.
Combine this with a PHP cache like APC (my personal favorite opcode cache of the moment) and a fast webserver like nginx, and you're gonna pretty much survive anything.