Main Content

Using a cache for efficiency. Python and PHP examples

Archive - Originally posted on "The Horse's Mouth" - 2009-08-21 07:25:05 - Graham Ellis

Ask yourself "how long is it (in seconds) from 5.36 a.m. and 20 seconds to 6.04 a.m. and 39 seconds".

Have you worked that out?

Now - how long is it (in seconds) from 5.36 a.m. and 20 seconds to 6.04 a.m. and 39 seconds.

Got that? And how long is it (in seconds) from 5.36 a.m. and 20 seconds to 6.04 a.m. and 39 seconds.

I expect by the third time you came to the same calculation, you already knew the answer and just trotted it out again, without bothering to recalculate it. But if you put such a calculation within a program, to be done time and time again, the computer will almost inevitably repeat the calculation time and time again, which may significantly slow the operation. And if that's likely to happen, it's worth considering caching or storing the result the first time, so that you don't have to recompute in the 2nd, 3rd, 4th ...

Let me ask again "how long is it (in seconds) from 5.36 a.m. and 20 seconds to 6.04 a.m. and 39 seconds". See - you KNOW straight away that it's 1699 seconds, don't you, because the human mind caches.

On Wednesday, I wrote a course example which analysed a log file of over 30 Mbytes listing accesses to our web server - nearly 150,000 records from over 14,000 different IP addresses. I wanted to sort them by the elapsed time from first to last visit from each IP, to give a length of visit - a snippet of the result looks like this:

205.210.222.150 3 600
115.240.214.62 8 600
203.128.4.254 6 599
89.61.24.234 2 599
199.46.149.168 18 599
212.235.67.230 6 596
196.11.235.63 4 596

(Snippet shows visitors who spent between 9 minutes and 55 seconds and 10 minutes between their first and last accesses - so actually people who were probably with us for JUST OVER ten minutes, bearing in mind they would want time to study the final page they found)

My methodology - in Python - was to read the whole file, and create visit objects (lists of access strings) which were themselves stored in a dictionary. An object method was written to provide the duration of an access by analysing the first and last log lines, converting to seconds and subtracting. Here is the class:

class visit:
    def __init__(self,defstring):
        self.detail = [defstring]
    def hit(self,defstring):
        self.detail.append(defstring)
    def getcount(self):
        return len(self.detail)
    def getduration(self):
        if len(self.detail) < 2:
            return 0
        # Time fields from the first and last log lines for this IP
        startinfo = when.findall(self.detail[0])
        endinfo = when.findall(self.detail[-1])
        thyme = gelap(endinfo) - gelap(startinfo)
        # A clearly negative result means the visit spanned midnight
        if thyme < -200:
            thyme += 24 * 3600
        return thyme
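The class relies on two helpers not shown above - `when`, a compiled regular expression that picks the time fields out of a log line, and `gelap()`, which turns them into seconds since midnight. Here is a minimal sketch of what they might look like for an Apache-style timestamp; this is an assumption about their shape, and the article's own definitions are in the linked full code.

```python
import re

# Assumed helpers - the article's full code defines its own versions.
# Matches the hh:mm:ss portion of an Apache log timestamp such as
# [21/Aug/2009:07:25:05 +0100]
when = re.compile(r':(\d\d):(\d\d):(\d\d)')

def gelap(info):
    """Convert the groups from when.findall() into seconds since midnight."""
    hh, mm, ss = info[0]
    return int(hh) * 3600 + int(mm) * 60 + int(ss)

line = '1.2.3.4 - - [21/Aug/2009:07:25:05 +0100] "GET / HTTP/1.1" 200 512'
print(gelap(when.findall(line)))   # 26705
```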


It worked ... but was a bit slow. With a timer added:
(13.630000000000001, 0.089999999999999997, 0.0, 0.0, 10582027.0)
13 seconds of c.p.u. time - hardly the end of the world, but we could do better. And here is better:

class visit:
    def __init__(self,defstring):
        self.detail = [defstring]
        self.cached = 0
    def hit(self,defstring):
        self.detail.append(defstring)
        self.cached = 0          # new data - expire any cached duration
    def getcount(self):
        return len(self.detail)
    def getduration(self):
        if self.cached:
            return self.eltime   # already worked out - reuse the answer
        self.cached = 1
        self.eltime = 0
        if len(self.detail) < 2:
            return 0
        startinfo = when.findall(self.detail[0])
        endinfo = when.findall(self.detail[-1])
        thyme = gelap(endinfo) - gelap(startinfo)
        if thyme < -200:
            thyme += 24 * 3600   # visit spanned midnight
        self.eltime = thyme
        return thyme


Ironically, the code and objects are more complex. I have added an extra binary flag to each object to note whether or not the cache is up to date - I might enquire about an object's duration before all its records have been added, and if I do I need to expire the out-of-date calculation. I have also added a check, when the duration is called for, which returns the stored (cached) value if it's already available. Finally, I store the elapsed time value in the object itself.
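The same flag-and-store pattern can be written a touch more compactly by using `None` as the "not yet calculated" sentinel instead of a separate flag. This is a sketch, not the article's code, and the stand-in calculation (a simple line count) replaces the real timestamp arithmetic just to keep the example self-contained:

```python
class Visit:
    """Cache sketch: None in self._duration means 'needs recalculating'."""
    def __init__(self, first_line):
        self.detail = [first_line]
        self._duration = None            # cache starts empty
    def hit(self, line):
        self.detail.append(line)
        self._duration = None            # new data expires the cache
    def getduration(self):
        if self._duration is None:       # only recalculate when stale
            self._duration = self._calculate()
        return self._duration
    def _calculate(self):
        # Stand-in for the real first-to-last timestamp arithmetic -
        # here we just count the extra lines so the sketch is runnable.
        return len(self.detail) - 1
```

One attribute does the work of two, and there is no risk of the flag and the stored value falling out of step.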

But the results are good:
(2.1899999999999999, 0.080000000000000002, 0.0, 0.0, 10582099.23)
I have reduced the program's run time to about one sixth.

You should keep an eye open for caching opportunities, but it won't always help. When you are sorting a significant number of records by a calculated value, as I was, there will often be a gain. But if you only use each value a couple of times, you may slow things down a little, or increase the memory footprint so much that you lose any gains. Then ... if you end up caching something that only took 10% of the run time anyway, even a ten-fold speed improvement in that part of the code will give you just a scarcely-noticed 9% speed-up.
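That last point is worth checking with a couple of lines of arithmetic (the function name here is mine):

```python
# Overall speed-up when a fraction f of the run time is made s times faster
def overall_speedup(f, s):
    return 1 / ((1 - f) + f / s)

# Cache 10% of the work, make it 10x faster:
new_time_fraction = (1 - 0.10) + 0.10 / 10   # 0.91 of the original run time
print(round(1 - new_time_fraction, 2))       # 0.09 -> about a 9% saving
```

However fast you make that one part, the other 90% of the run time is untouched - so profile first, and cache where the time actually goes.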

The complete code from the uncached example is [here] and from the cached example is [here].




Caching is also useful in web2 applications, where your web server uses resources from another system on the internet to give a combined report - it picks up exchange rates, blog feeds, or status reports such as weather from other information providers, and includes that data directly or indirectly in its own results. In such cases, the data-supplying site will often REQUIRE you to cache the data, which may then fall slightly out of date, and you'll have no way of knowing whether the data has changed in the intervening time. But with a reasonably frequent check, the data will only be slightly stale.

Here's a PHP example in which we grab a data feed at a maximum interval of 10 minutes, and we use the file timestamp to note when the most recent retrieval happened:

# Cache - update every 10 minutes (max)
if (!file_exists("tfd.txt") || time() - filemtime("tfd.txt") > 600) {
    # Cache file missing or stale - fetch a fresh copy and save it
    $logrecs = file("http://(url obscured)");
    $fh = fopen("tfd.txt","w");
    foreach ($logrecs as $line) { fputs($fh,$line); }
    fclose ($fh);
} else {
    # Cache is recent enough - serve the saved copy
    $logrecs = file("tfd.txt");
}


This code is included in our live feed from the local train operator (full code [here]); the feed is only re-collected every 10 minutes. OK - so it may be slightly out of date when you display it, but it gives a good, live impression. And we have another live page where logged-in members have a shorter caching time ... that uses the same data file, so even the casual visitor can sometimes get fresher updates than the 10 minutes might lead one to believe.
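For comparison, the same timestamp-driven refresh can be sketched in Python. The file name, feed URL, and function name here are placeholders, as in the PHP version; `os.path.getmtime()` plays the role of PHP's `filemtime()`:

```python
import os
import time
import urllib.request

CACHE = "tfd.txt"
FEED = "http://example.com/feed"    # placeholder URL
MAX_AGE = 600                        # seconds - refresh every 10 minutes at most

def get_feed():
    """Return the feed's lines, re-fetching only when the cache is stale."""
    # Refresh if the cache file is missing or older than MAX_AGE seconds
    if not os.path.exists(CACHE) or time.time() - os.path.getmtime(CACHE) > MAX_AGE:
        data = urllib.request.urlopen(FEED).read()
        with open(CACHE, "wb") as fh:
            fh.write(data)
    with open(CACHE, "rb") as fh:
        return fh.readlines()
```

The file's modification time doubles as the "when did I last fetch this?" record, so no separate bookkeeping is needed.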

The Python caching example was covered on our Python Programming Course this week, and the PHP and web2 techniques are routinely covered on PHP Techniques.