Dynamic page caching with PHP and output buffering
A while back I created a portfolio website for a friend of mine, Andrew Holder. As his artwork grew in popularity, traffic to his site began to spike. The portfolio section of the site is pulled dynamically from a database, and during traffic spikes the database connection would fail because the site exceeded the maximum number of connections allowed. Not good.
The website is hosted on Media Temple’s Grid-Service (gs), which allows only 30 database connections at a given time. Once that limit was exceeded, the site threw a mysql_connect() error.
Error: Warning: mysql_connect() [function.mysql-connect] : Too many connections
So, the solution seemed to be caching these pages. A few Google searches led me to an article over at PaperMashup. It works by using PHP’s output buffering: the script checks whether the page has already been cached and, if so, serves the cached content. Otherwise, it renders the page normally, capturing the output in the internal buffer, and writes a cached copy to disk before sending the contents to the user.
I’ve included a code snippet below, similar to how it’s used at andrewholder.net. For the original code and a more in-depth explanation, have a look at the article.
<?php
// Cache page contents
$filename   = isset($_GET['id']) ? basename($_GET['id']) : $cur_art; // basename() keeps the id from escaping the cache directory
$cache_file = 'cache/' . $filename . '.htm';
$cache_time = 3600; // 1 hour

// Serve the cached file if it's less than $cache_time old
if (file_exists($cache_file) && time() - $cache_time < filemtime($cache_file)) {
    include($cache_file);
    exit;
}

ob_start();

// Page contents goes here!

// Cache the contents to a file
$cached = fopen($cache_file, 'w');
fwrite($cached, ob_get_contents());
fclose($cached);

ob_end_flush(); // Send the output to the browser
?>
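One thing the snippet above doesn’t handle is invalidating the cache when the portfolio itself changes; a cached page sticks around for the full hour even if the artwork behind it was just updated. Here’s a minimal sketch of a clear-cache helper, assuming the cache/ directory holds only the generated .htm files (the function name is just mine, not part of the original code). You’d call it from whatever script updates the portfolio.

<?php
// Minimal cache-clearing sketch (assumes cache/ contains only generated .htm files)
function clear_page_cache($cache_dir = 'cache/')
{
    foreach (glob($cache_dir . '*.htm') as $file) {
        unlink($file); // remove the cached copy so the next request regenerates it
    }
}

// e.g. call clear_page_cache() right after saving a new portfolio piece
?>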