Me on the WP Elevation podcast

I’m a huge fan of Troy Dean’s WP Elevation podcast: he’s the best interviewer in the WordPress space, and he has a knack of getting great people to spend an hour giving away their secrets.
So it was a real honour to be invited (courtesy of one Mike Little) to participate. We recorded the show via Skype – me at home, just back from the school run; him in South Australia, at a client’s office at the end of the day – and it’s gone up more or less unedited.

Watching it back just now was much more nerve-wracking than actually doing it! But apart from getting the launch date of wordpress.com wrong by a mere ten years, I don’t think I said anything too stupid.
Show notes etc on the WP Elevation website.

Shifting sites: a new home for the Wales Office web presence

Regular readers will know the pivotal role played by the Wales Office in recent gov-web history. In 2007, they took the then-radical step of moving their corporate web presence into an open-source web publishing platform, namely WordPress. Nobody died. A point was proven. From there, to Downing Street, to Defra, to Transport, to Health… etc etc.
We started out with two completely separate ‘single site’ installs: WordPress MU didn’t seem quite stable enough. But since 2010, they’ve been running a WordPress 3.x multisite – containing both their English and Welsh language sites, archived copies of their complete pre-2010 content, and more recently, the (bilingual) Commission on Devolution in Wales site. All fairly modest in traffic terms, but punching far above their weight (and price tag) in terms of functionality.
For some time, we’ve been trying to persuade the Wales Office team to change hosting provider. Getting any serious systems admin work done – including, I’m ashamed to admit, WordPress upgrades – was almost impossible with the legacy hosting company. The market price for hosting had crashed, but their hosting bill hadn’t. And to be quite blunt, they were getting a minimal level of service.
Our first step, early this summer, was to liberate the DNS. As with a lot of websites, the domain name info was held by the hosting company. Two eggs in the one basket. Taking the DNS to a third party gave us the freedom to move the sites at a time of our choosing – and the hosting company couldn’t really do anything about it. Would they have been deliberately obstructive? Probably not, no more so than usual. But ‘usual’ was precisely why we wanted to move.
Step two was to buy some new hosting space. And courtesy of GCloud (v1), this part was unexpectedly straightforward. In CatN, we found a hosting provider offering an appropriate level of service, with the kind of access and support we expected, for a tiny fraction of the cost.
Step three was migration. Assisted by regular partner-in-crime John Blackbourn, we did a number of dry runs, zipping up the entire WordPress installation – database and uploaded files – and transferring it to its new home. Not as straightforward as it probably sounds, given the relative inaccessibility of the incumbent server… but we found a way. One or two rules may have been broken, at least in spirit, along the way. And I’m very glad I’m on an unlimited broadband contract.
Today was step four. We implemented a content freeze at 9am, migrated everything one last time… and by lunchtime, we had everything up and running at CatN. At 2.30pm, the DNS changes began – some by us, some done on our behalf. (Thanks again to you-know-who-you-are.) With some having very low TTLs, we could see the changes starting to kick in almost immediately. With others, we had to wait an hour or two. But by 4 o’clock, it was done. And with the freeze unfrozen, there was even time for a new press release before going-home time.
No downtime, and no loss of data. Massive performance improvements, at massive cost savings. At long last, a fully up-to-date install. And best of all? If we hadn’t told them we were doing it, I doubt they’d even have noticed it happening.

Two projects make State Of The Word


Saturday saw the annual State Of The Word address by WordPress co-founder Matt Mullenweg, as part of WordCamp San Francisco. Worth taking an hour out to watch it, if you’ve got any interest in the WordPress project.
And I’m delighted to note that not one, but two Code For The People projects got a mention during the talk: our work for the Rolling Stones, and Oxford University’s Free Speech Debate (although the latter was a bit blink-and-you’ll-miss-it). We’d have been delighted to see one project among Matt’s hand-picked highlights of the year; having two is a bit of a shock.
The other important point to note is that WordPress 3.5 will be released on 5 December (‘I 100% believe it’s gonna happen’), even if it means dropping certain features. We’re already starting to see signs of what’s in store, including the Twenty Twelve theme, and changes to Media uploading.

Code For The People among only nine agencies with WordPress VIP accreditation


Big news today, if you’re into this sort of thing: Automattic have just announced an extension of their WordPress.com VIP Featured Partner program(me). It used to be only for other technology platforms; but they’ve now opened it to interactive agencies.
There are only nine in the initial group of agencies vetted ‘to ensure the agency’s capabilities fit the needs and scale of VIP customers’. And we’re among them.
It’s an elite bunch we find ourselves in: exclusively English-speaking, majority US-based, with a smattering of famous names (well, in certain circles). To be quite honest, we’ve no idea where it might lead. But with a client list like ours, we feel we’re already playing in that VIP league, and we’re excited at the possibilities coming from wider exposure.
To mark the occasion, we’ve (finally!) got a Code For The People website up – and we’re really quite proud of it. It’s in the ‘one page’ style – partly because we didn’t have time to write loads of editorial, but also because you don’t have time to read it either.
Although the content is hardcoded, it’s been built as a barebones WordPress theme, giving us easy access to extra functionality – there’s an embedded contact form, for example, and an embedded version of Simon’s Twitter Tracker widget. I’m sure we’ll migrate more stuff into it over time. But then again, I bet we’ve all said that, and then… well, you know.
It’s been a(nother) fantastic piece of work by Laura Kalbag, who led on the visuals, and took charge of the coding duties in the latter stages. Thanks, Laura.

Mick, Keef, Charlie, Ronnie – and us. Oh, and the Greatest 404 Page Of All Time.


I’m now into my 18th year earning a living building and running websites. I’ve been lucky enough to work for and with some household names. But never could I have imagined that I’d end up working for The Rolling Stones. And yet, for the past few months, that’s precisely what I’ve been doing. I’m still not sure it’s actually sunk in yet.
I’m not going to waste time explaining why it’s been an exciting project to work on. It’s The Rolling Stones, for dear’s sake. But one note I will add, for younger viewers, is that the Stones were one of the first ‘real world’ entities to have a website: here’s the Wayback Machine’s copy from 1996. A year earlier, they’d even live-streamed a concert (to an audience, one assumes, of literally dozens) – it was the very moment I decided there was probably a future in this stuff.
Simon and I were brought on board by designer James Stiff, who had worked with the Stones’ team last year, on spinoff site StonesArchive.com. Their existing site was black, and rather funereal. It read like a museum exhibit. It hadn’t kept pace with the development of social networking, or online music sales. And it was running on Drupal. They wanted a ‘total overhaul’, to coincide with the 50th anniversary of the band’s first concert (here’s the setlist), and the opening of an exhibition at Somerset House. Of course we said ‘yes’.
Behind the scenes, it’s the now customary mix of WordPress custom post types and taxonomies. We’re running separate post types for people, songs, albums (etc), videos, and photo galleries. Albums and people also exist as taxonomies, allowing us – for example – to show full credits and track listings for each album, including audio previews.

The entire site is underpinned by the iTunes API. Our starting point was a big data scrape, pulling down details of the 400-ish separate songs in the Stones back catalogue, which we then associated with the relevant albums (plural, in many cases). Of course, in doing this, we couldn’t have picked a harder back catalogue to work with: so many compilations, live albums, and so on. So we’ve also included an indication of the ‘canonical’ version: in other words, the album most normally or naturally associated with a given track. This gives us something sensible to offer in search results… and powers a feature we didn’t quite have ready for launch, but which is truly awesome.
iTunes is also the source of the 30 second audio clips, delivered in m4a format. Great in Chrome and Safari, Android and iOS, even IE: all of which can play it natively using HTML5’s <audio>. But not Firefox or Opera. So we’re having to include a Flash-based fallback, using the jQuery-based jPlayer. Hang on, wasn’t HTML5 meant to spell the end of cross-browser chaos?
There’s plenty more I could waffle on about here: the all-widget homepage, our use of WPEngine and Twitter Bootstrap (thanks to both!)… but it’s clear there’s one feature which has excited people more than most. Ladies and gentlemen, I give you the error page they’re calling ‘Probably the Greatest 404 Page Not Found Error of All Time’. (And that’s a guy called Jesus saying that, so…)

It’s not unusual for a relaunched website to see a lot of people hitting its ‘404 not found’ page, as people click on now-invalid links or bookmarks. But we’re seeing people deliberately trying to hit it, in vast numbers. It’s getting its own articles on sites like Gizmodo and Mashable. I’m half-expecting to see it trending on Twitter by the end of the day. People are sending me vast numbers of suggestions of musical 404 pages for other Rock Legends. It’s all just a little bit nuts.


What’s more exciting for me, personally, is the way we (by which I really mean Simon W) approached the site development. The functionality, wherever possible, has been written for easy reuse. So the next time a legendary rock act with an extensive back catalogue comes calling, we’ll be able to get them to a working prototype in double-quick time.
Our thanks go to all those who made it possible: EC, the guys at WPEngine, the Twitter Bootstrap crew, the WordPress community… and Bruce Springsteen, whose website – relaunched on WordPress earlier this year – was a source of inspiration.
But in particular, thanks to James Stiff for bringing us on board in the first place, providing us with some gorgeous visuals to work with, coming up with the 404 idea, and from my own selfish perspective, supplying everything in perfectly web-ready form.
PS If anyone happens to see David Bowie, tell him I said ‘hi’. 😉
This has been a Code For The People production, on behalf of the Greatest Rock N Roll Band In The World.

Cutting down on cookies: practical tips


The Government Digital Service’s Implementer Guide for the new cookie rules recommended that site owners should audit their sites, and look to reduce ‘unnecessary and redundant cookies’. With or without the new rules, it’s still sound advice. So I thought I’d share a couple of things we’ve done for clients, which might be helpful to other people.
It’s easy enough to look at the cookies being dropped by your own site, but life becomes a lot more difficult when it comes to third party services. You might not realise it, but every time you embed a YouTube video on a page, you’re exposing your users to YouTube cookies. And if you’ve included Twitter’s excellent profile widget on your site, guess what? – it’s dropping cookies too.
Both services would probably argue that any user tracking is ultimately for users’ benefit: and in fact, unlike many in the web industry, I have some sympathy for that argument. But I’m not entirely comfortable with government websites acting as (unwitting?) conduits between users’ personal web histories and third-party services.
YouTube
YouTube offers a seamless solution: a parallel domain, youtube-nocookie.com, which gives you the exact same YouTube playback function, but with tighter controls over cookies. If you’re ever embedding a clip manually from youtube.com, you’ll see an option to ‘Enable privacy-enhanced mode’: tick this, and you’ll see the embed code’s reference to youtube.com change to youtube-nocookie.com. Easy as that.
(The name is slightly deceptive: it doesn’t completely eliminate the use of cookies. YouTube’s help pages indicate: ‘YouTube may still set cookies on the user’s computer once the visitor clicks on the YouTube video player, but YouTube will not store personally-identifiable cookie information for playbacks of embedded videos using the privacy-enhanced mode.’)
On a couple of client sites with large quantities of videos, FreeSpeechDebate and the Government Olympic Communication site, we use a WordPress custom post type to simplify the process of adding YouTube content. All they need to do is paste the URL of the clip’s page into a WP editing screen, and we derive all the rest: embed code, thumbnail image, dimensions and so on. The videos are then included automatically at the top of the appropriate page.

As seen at goc2012.culture.gov.uk

We’ve now altered that functionality to serve all videos from the youtube-nocookie.com domain; and also to include the youtube-nocookie.com domain in the embed code we offer. A fairly simple case of find-and-replace, initially in the page template’s PHP, and subsequently also in JavaScript if users want to customise the dimensions.
Twitter
Avoiding Twitter’s cookies has been slightly trickier. Our solution has been to move clients away from the official Twitter widget, instead deploying my colleague Simon Wheatley’s well-established Twitter Tracker plugin (downloaded well over 10,000 times), which we’ve adapted to permit cookie-free usage.
Twitter Tracker adds two new WordPress widgets: one showing Twitter search results for your chosen term or hashtag, the other displaying all tweets by a given user. It includes local caching of the data, minimising traffic to Twitter and (in all likelihood) rendering the pages much faster – for the loss, admittedly, of a ‘real time’ view, which may or may not be important to you.
However, because the widgets call users’ profile images live from twitter.com, cookies were still being dropped. So there’s now a ‘partner plugin’, called Twitter Tracker Avatar Cache, which – as the name suggests – downloads any Twitter profile images and saves them locally within WordPress. No need to call them in from twitter.com, and hence no cookies. (For those who don’t want this extra functionality, the base plugin will continue to work as it always has.) It’s available now from the WordPress plugin repository: find it via the ‘Add New’ screen in your WordPress admin interface.
For most people, this will probably seem like overkill – and in fairness, it probably is. But for quite a few of our clients, it’s been a helpful way to avoid some of the more sensitive issues around cookies and usage tracking, without compromising on site functionality.

Code For The People presents: HMG's Olympic & Paralympic media centre


The run-up to the Olympic Games starts in earnest today, with the arrival of the flame on British shores – and Whitehall is opening up its dedicated Government Olympic Communication operation, providing ‘a single point of contact for London 2012-related media enquiries … until the end of the Paralympic Games on 9 September.’ There’s a dedicated press team, drawn from across Government – and a dedicated website, which I’m genuinely proud to say we built.
DCMS asked us for a site which could draw together the many streams of information – text, photo and audio – already being produced in government, and make them easy for journalists to explore. Many departments were issuing press briefings, or posting fantastic material on Flickr or YouTube, but there was no easy way to browse through it, or conduct targeted searches.
Cue some WordPress-powered magic. 🙂
A trained eye will spot our heavy use of custom post types. Some, like ‘backgrounder’, were fairly straightforward: identical to posts or pages, but separated out for convenience. Others, like ‘theme’ and ‘region’, were more complex – and were also synced up to custom taxonomies, allowing us to ‘tag’ other post types as being relevant to a given theme/region. We then interfere with WordPress’s default selection of display templates, to show collated results pages: editorial on one side, search results on the other.
Here’s an example: a ‘theme’ page, showing relevant results from the other post types.

There are specific custom post types for photos (specifically Flickr) and videos (specifically YouTube). Why these two? Because pretty much every department is already using them. And in both cases, we’ve written custom code to interface directly with the host sites’ APIs, making the process of adding new material a breeze.
Let’s take YouTube, for example. Editors simply click ‘Add New’, then paste the URL of a YouTube clip’s page into a clearly labelled box. We extract the clip’s unique ID, then query the YouTube API to get its thumbnail, which we save as the WP post’s featured image; and the YouTube-recommended embed code. Couldn’t be easier.
Then, when you view the clip’s page on the site, the video gets embedded automatically – and we display the YouTube embed code, for journalists or bloggers to take away to their own sites.
We let the journalists and bloggers customise the embed’s dimensions via an ajax call back to YouTube; so if you need a clip to be a certain size, we’ll recalculate the width and height accordingly. We store your preference using a cookie – meaning that now, every time you look at a video page, the embed code is pre-customised for you. 🙂

Then there’s the multi-dimensional search function. Each post type has a number of taxonomies associated with it: theme, region, originating department, and so on. So when you’re browsing, say, the photo archives, you can specify that you want photos on a given theme… or from a given region… or by a given department… or in a given month. Individually, or in combination.

It’s the first time we’ve tackled this kind of ‘advanced searching’ functionality, and it probably doesn’t sound all that complicated: but I can assure you, it is. 🙂
It’s also the first time we’ve delivered a responsive design on a client site. We originally planned three versions: phone, tablet and desktop. But a late change of code base, and (to be honest) questions over its real value, led us to drop the tablet view. For the most part, it’s just been a case of un-floating the various blocks in the layout grid – but a few elements, like the primary navigation and homepage carousel, needed a bit more work. Give it a try if you’ve got a smartphone handy; or resize your (non-IE) browser to a really narrow size. It should kick in at 480px width.
Behind the scenes, working with our very good friends at CatN Hosting, we’ve added a Varnish cache – just in case there’s a sudden huge leap in traffic. Hopefully it won’t ever be required. But for the same reason they’re putting missiles on top of east London tower blocks, we’re planning for a worst-case scenario.
My thanks to Nick at DCMS/GOC for commissioning us, and protecting us from the internal wrangling. To Joe at CatN, for leaping into action when called upon, and for very kindly volunteering to help with Varnish. And to the G-Cloud process, directly and indirectly, for its help in buying Joe’s services. To the other clients who, knowingly or not, contributed ideas and code to the site’s development. But most of all, to designer Laura Kalbag, who developed the visuals and did the bulk of the front-end functionality. You’re all wonderful.
Let the Games begin.
This has been a Code For The People production.