Last week I helped Aardman Digital set up the Wallace and Gromit site to cope with a huge traffic spike resulting from the 20th-anniversary site relaunch and the Google Doodle logo displayed on the Google homepage across 12 countries. Working with the in-house team, I wrote a script to scrape the live site and create a static copy. The initial scrape was done with wget, and a command-line PHP script then ran through the mirrored files, doing some searching, replacing and renaming of various things. With thousands of pages of forum posts and user uploads, the script took a while to run!

The static site was then transferred to Rackspace Cloud hosting, which could be scaled on demand as the traffic spiked. Traffic to the main URL was diverted to the static copy, on the assumption that most visitors would rather browse than interact. Anyone who wanted to log in to the forum was passed through to the live site.
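The post doesn't include the actual commands, but the two steps might look something like the sketch below. The wget flags, hostnames and sed rewrites here are my own assumptions for illustration, not Aardman's actual script (which was written in PHP); the demo operates on a tiny fake mirror so the rewrite pass can be run locally.

```shell
#!/bin/sh
# Step 1: mirror the live site (shown commented out, since it needs network
# access and a real hostname).
#   --mirror           recursion + timestamping
#   --convert-links    rewrite links so the copy browses locally
#   --adjust-extension save dynamic pages with an .html extension
#
#   wget --mirror --convert-links --adjust-extension --page-requisites \
#        http://www.example.com/

# Step 2: post-process the mirrored tree. Create a tiny sample "mirror"
# so the rewrite pass below has something to work on.
mkdir -p mirror
cat > mirror/index.html <<'EOF'
<a href="http://www.example.com/forum/index.php?topic=1">Forum</a>
EOF

# Rewrite absolute URLs to relative ones, then point forum links back at
# the live (dynamic) server, as the post describes.
find mirror -name '*.html' -exec sed -i \
    -e 's|http://www.example.com/|/|g' \
    -e 's|/forum/|http://live.example.com/forum/|g' {} +

cat mirror/index.html
```

In a real run you'd also rename `.php` files to `.html` (wget's `--adjust-extension` handles most of that) and batch the sed pass, since running it file-by-file over thousands of forum pages is where the time goes.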