Commit b22a4fdc authored by Simon Meers

Fixed #16886 -- Memcached socket file documentation. Thanks ddbeck for the report and patch.

git-svn-id: http://code.djangoproject.com/svn/django/trunk@16858 bcc190cf-cafb-0310-a4f2-bffc1f526a37
parent aaf77c16
@@ -100,8 +100,9 @@ To use Memcached with Django:
   on your chosen memcached binding)
 
 * Set :setting:`LOCATION <CACHES-LOCATION>` to ``ip:port`` values,
-  where ``ip`` is the IP address of the Memcached daemon and
-  ``port`` is the port on which Memcached is running.
+  where ``ip`` is the IP address of the Memcached daemon and ``port`` is the
+  port on which Memcached is running, or to a ``unix:path`` value, where
+  ``path`` is the path to a Memcached Unix socket file.
 
 In this example, Memcached is running on localhost (127.0.0.1) port 11211, using
 the ``python-memcached`` binding::
@@ -113,6 +114,16 @@ the ``python-memcached`` binding::
         }
     }
 
+In this example, Memcached is available through a local Unix socket file
+:file:`/tmp/memcached.sock` using the ``python-memcached`` binding::
+
+    CACHES = {
+        'default': {
+            'BACKEND': 'django.core.cache.backends.memcached.MemcachedCache',
+            'LOCATION': 'unix:/tmp/memcached.sock',
+        }
+    }
+
 One excellent feature of Memcached is its ability to share cache over multiple
 servers. This means you can run Memcached daemons on multiple machines, and the
 program will treat the group of machines as a *single* cache, without the need
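
As an aside on the multi-server behaviour described in the trailing context
above: ``LOCATION`` can also name several daemons, and the binding shards keys
across all of them. A minimal sketch, assuming the same ``python-memcached``
backend; the two IP addresses are hypothetical::

    CACHES = {
        'default': {
            'BACKEND': 'django.core.cache.backends.memcached.MemcachedCache',
            # Hypothetical daemons; keys are hashed across all entries,
            # so the group behaves as a single cache.
            'LOCATION': [
                '172.19.26.240:11211',
                '172.19.26.242:11211',
            ],
        }
    }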
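
Not part of this commit, but for illustration of where the ``unix:`` prefix
ends up: ``python-memcached`` itself understands it and dials the socket file
instead of opening a TCP connection. A minimal sketch against the socket path
used in the docs above::

    import memcache

    # The ``unix:`` prefix makes python-memcached connect over the
    # Unix domain socket rather than over TCP.
    client = memcache.Client(['unix:/tmp/memcached.sock'])
    client.set('greeting', 'hello')
    assert client.get('greeting') == 'hello'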