News today that Liam Maxwell has been appointed Deputy Government CIO, replacing Bill McCluggage.
Maxwell joined the Cabinet Office last summer, on an 11-month sabbatical from his job as Head of ICT at a Berkshire secondary school. Liam's belief in open source is well documented, and it's quite remarkable to have someone like that in such a senior position.
The Guardian says he will be retaining his responsibilities as Cabinet Office director of ICT Futures.
My attention has been drawn to the commitment on page 42 of yesterday's Budget 2012 document.
from 2014, new online services will only go live if the responsible minister can demonstrate that they themselves can use the service successfully
It's so simple, it's brilliant. And quite funny too.
But don't overlook the non-highlighted bit which follows. It's a commitment, the first I'm explicitly aware of, that:
all information is [to be] published on a single 'gov.uk' domain name by the end of 2012
In other words, the Single Domain will at least be 'dual running' with all departmental websites within 9 months. But it's surely more likely, given that efficiency is a key selling point of the Single Domain strategy, that we'll see all departmental websites closed by then. There was no deadline mentioned in the Martha Lane Fox report of November 2010, or in last October's ICT Implementation Plan.
Update: blogging on the GDS's WordPress.com-based site, Mike Bracken adds some clarification:
We're working with colleagues across Government to get all information for citizens and businesses (what's currently covered by Directgov and Businesslink) published on GOV.UK by the end of this year and this gives us the hurry up. We're also working towards migrating Departmental sites onto 'Inside Government' but that will take a little longer, with a more gradual transition as current contracting arrangements for individual Departments come to an end.
I began my career at the Foreign Office, joining what was known as 'Guidance Section'. Its job was to be the in-house newswire service for British embassies far and wide. The day started by editing down a daily news summary and press review, based on BBC World Service scripts; at the click of a button on a VT100 terminal (look it up), these were delivered to hundreds of British diplomatic missions by the best means available. Could be fax, could be telex, could be telegram, one or two had something called E-Mail. Cutting edge stuff for 1995, believe me.
We would spend the rest of the day gathering news items from around Whitehall - press releases, transcripts of speeches, whatever. We'd edit these down to the essential, decide which embassies would be likely to receive media enquiries on the subject, and send it out to them. Then, at lunchtime and 5pm, we'd produce a 'shopping list' from which embassies could request anything they were interested in, but hadn't already received.
Departments were generally more than happy to work with us: often we'd get significant announcements ahead of delivery, so that Our Man In Wherever could have a head-start. The one massive exception was the Treasury, on Budget Day.
They would send an official on the short walk up Horse Guards Avenue to our office in the Old Admiralty Building, just by the Arch. He or she (usually he) would have the Chancellor's speech on a floppy disk. He would sit stony-faced in our office, one of few in the building to have a TV, whilst we all listened to the speech. When the Chancellor's bottom touched the front bench, the speech having been delivered to the House, he would hand over the floppy disk. And finally, we could begin the work of reformatting the text file, editing out the party-political bits, double-checking it, then sending it out.
Today, any Embassy press officer who's interested will be reading the same advance press coverage we all are. They'll watch the speech live - CNN, BBC World, streamed online, whatever - before hitting the Treasury website. And they probably won't get a single call asking for a copy of the speech.
Much consternation in certain political circles this afternoon, as Boris Johnson renames his Twitter account... and takes a quarter of a million people's details over to his election campaign HQ.
Johnson was elected on 4 May 2008. His first tweet came on 8 May 2008 ('Setting up social marketing accounts!') - although it's not entirely clear what username the account used when it was created. In January 2009, though, he changed that username to MayorOfLondon. And the account has been quoted since at least May 2009 in official City Hall press releases, as his official account. Or in the case of that May 2009 press release, 'the Mayor's Twitter site'.
Before today's change, the URL associated with the account was http://www.london.gov.uk/ - and the biography read:
City Government for Greater London under the auspices of the Mayor of London
Could it have sounded more official?
(Something similar has happened to his Facebook account too; facebook.com/borisjohnson is now adorned with BackBoris2012 logos, and contains no history prior to 17 March 2012. And yes, that Facebook URL has similarly been promoted in the past as his official presence.)
In response, there's a statement on the BackBoris website:
As some of you may have noticed, earlier today Boris changed the name of his Twitter account from @MayorofLondon to @BorisJohnson. While the name of the account may have changed, rest assured that the account is still - and has always been - controlled by Boris.
No City Hall resources will be used to update or maintain the account - that would be against the rules. Given we're now in the official election period, this change is being made so there can be no question of Boris using official resources to campaign.
Of course, those who no longer wish to follow the account are welcome to "unfollow" at any time.
Of course, the issue isn't that future City Hall resources will be used; it's that past City Hall resources have already been used to build up a significant following. And the last line is somewhat ill-advised, in my opinion.
I'd be very interested to find out from people at City Hall - or indeed from HM Government's Deputy Director of Digital Engagement, Emer Coleman, who used to be City Hall's head of digital projects - whether City Hall thought it 'owned' the account on behalf of the office of Mayor.
If the account was always personal, Boris should have used his personal name. By using the name of his elected office, he created the unmistakable assumption that followers were following the individual in his elected capacity - as was the case with the Prime Ministerial Twitter account.
Here's a tip. If you're working in a government web team, I strongly advise you get something in writing to confirm who exactly owns any Ministerial accounts - rapidly.
Update: a climbdown of sorts. Boris has tweeted:
To be clear- @borisjohnson will only be used for discussing mayoral duties. To follow me on the campaign trail, follow @backboris2012
'As he entered the campaign he was determined to ensure there was no confusion between him as Mayor and him as a candidate and therefore changed the name of his Twitter account.
'He did not expect this openness and honesty to have created such hysteria.
'So in case there is even one Londoner who has a problem with what he did, he will not use that account for the campaign and instead can be followed from the political front on @BackBoris2012.'
Has he reverted back to being @MayorOfLondon? No. But the username hasn't been abandoned - someone, and you have to hope it's someone close to Boris and/or City Hall, has bagged it. Hopefully for safe keeping. We don't want this happening again, do we.
Updated update: Somewhat inevitably, Boris has - pardon the pun - backed down. He's now reverted to using @MayorOfLondon as his account name, and the BorisJohnson account has gone blank again.
I've just started work on a project to build a first-ever intranet for a small UK government entity. I've been waiting for ages for an opportunity to put BuddyPress, the semi-official WordPress add-on which promises a 'social network in a box' experience, to the test... and this is it.
It's still early days in the thought process - but the plan is to make heavy use of BuddyPress 'groups', to generate a personalised real-time view of activity in the areas in which you have a specific personal interest. Each team or department would be a group. Each cross-departmental project would be a group. There might also be groups based on physical location, social activity, union membership and so on. Some would be mandatory (eg 'all staff'); some would be open for anyone to join; some would be invite-only, or totally hidden.
The BuddyPress 'activity stream' filters itself automatically according to each signed-in user's group memberships; so your homepage (tbc) view would consist only of updates - news, forum discussions, events, document uploads, new members etc - from the groups you belong to. No two users' views would be identical. It's easy to see how powerful this could be; and in a post-Facebook world, it shouldn't be an unfamiliar concept.
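For the avoidance of doubt about the mechanics: a theme can produce that personalised view with the standard BuddyPress activity loop, using the 'groups' scope, which restricts the stream to groups the signed-in user belongs to. A minimal sketch, assuming BuddyPress is active (markup and per-page count invented for illustration):

```php
<?php
// Personalised homepage loop: only activity from the current user's groups.
if ( is_user_logged_in() && bp_has_activities( array(
	'scope'    => 'groups', // groups this user is a member of
	'per_page' => 20,
) ) ) :
	while ( bp_activities() ) : bp_the_activity(); ?>
		<div class="activity-item">
			<?php bp_activity_action(); // e.g. "Jane posted an update in Finance" ?>
			<?php bp_activity_content_body(); ?>
		</div>
	<?php endwhile;
else :
	echo '<p>No recent activity in your groups.</p>';
endif;
```

Because the filtering happens in the query itself, there's no per-user template logic to maintain: two users hitting the same page simply get different result sets.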
Anyway... I started preparing wireframes yesterday, and hit an immediate question. What should go in the 'logo' space, reserved by convention in the top left corner?
Most intranets I've had the misfortune to use in the past have had names. But I wondered: did people actually use those names when referring to them? When asked 'where can I find that document?', would people generally answer 'On the intranet' or 'On [insert name here]'? I'd instinctively say the former; but after 17 years in this business, I'm used to the fact that I'm not 'normal'.
So I asked Twitter. And to be honest, I was surprised by the response.
Almost without exception, people responded that yes, their intranet did have a name... ranging from the fairly dull ('Cabweb' at the Cabinet Office) to the fantastic ('Narnia' at the National Archives!) to the quite unfathomable (one digital agency chose, er, 'Agnes'). And yes, people used the name in common parlance.
One or two people reported failed attempts to name their intranet: but the names they mentioned - '[organisation name] Online', or 'The Hub' - seemed very generic. It's almost as if people will make an effort to use the name, if you've clearly made an effort to make one up. If the name seems half-heartedly conceived, it shouldn't come as a surprise that the staff don't buy into it.
I'm not claiming any scientific validity for these results; but I'm left in no doubt that I'm going to have to think up a name.
The next phase of the gov.uk beta programme was opened last night: a six-week public testing phase for the 'Whitehall' information, now renamed 'Inside Government' (complete with tautological URL). Ten departments are covered initially, including all the obvious online big-hitters such as Health, BIS, Defra, FCO and DFID.
It looks very much like the rest of the gov.uk platform - as you'd expect, with a Global Experience Language - so it feels more like an extension than an enhancement. This is most striking with the individual department 'subsites': a unique 'spot colour' aside, and with an unexpected exception made for the MOD crest, they all look identical and carry the same navigation. Departments aren't going to recognise these as 'their' sites - but that's kind of the point.
It's far too early to make definitive judgments about the presentation, not least because the team admit it's much more unfinished than previous previews. It's hard, therefore, to decide what's deliberately minimalist, and what's just 'not done yet' - and therefore, hard to offer helpful criticism. A lot of the pages feel very plain, probably too plain. In particular, I'm not fond of the very 'boxy' presentation of many pages: see the main News or Publications pages as good examples. I just don't find my eye being guided anywhere, and I don't get any sense of significance. But maybe they just haven't been 'done' yet.
Writing on the GDS blog, Neil Williams describes the 'custom publishing engine properly tuned to the needs of multiple users and publishers across Whitehall, and built specifically for the kinds of things governments produce. ... On average, publishing to GOV.UK was 2.5 minutes faster than WordPress and 11 minutes faster than Directgov,' he claims: I've already taken him to task on that one.
As a website, it's what they said it would be, and it looks like we knew it would look. So it doesn't feel like much of a leap forward, and could actually be quite a tough sell around Whitehall. But this part of the gov.uk project isn't about a website. It's about redefining how government departments see themselves, present themselves, and talk about what they do. And that's w-a-y more difficult than building a website.
Puffbox's longest-standing working relationship in Whitehall is with the Wales Office; it was there, don't forget, that the whole WordPress-in-government thing started back in late 2007. We moved them on to a multisite setup just before the 2010 general election; and we're seeing the benefits, through sites like the one we launched in November for the Commission for Devolution in Wales.
They're about to start a round of public engagement events, and they asked us if we could add a Google Map to the site... which, of course, is bilingual, English and Welsh. It's not rocket science these days, but it's probably the smoothest implementation I've done, and I thought it might be worth sharing.
We've defined 'event' as a custom post type, non-hierarchical (ie more like posts than pages), with a full set of fields. It gives the 'more info' pages a nice URL, and keeps them nicely self-contained, with benefits for both admin interface and template access.
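In WordPress terms, that registration is a few lines on the init hook. A minimal sketch - the cdw_ prefix and the 'events' URL slug are my own invented names, not the site's actual code:

```php
<?php
// Register a non-hierarchical 'event' custom post type.
function cdw_register_event_post_type() {
	register_post_type( 'event', array(
		'labels' => array(
			'name'          => 'Events',
			'singular_name' => 'Event',
		),
		'public'       => true,
		'hierarchical' => false, // behaves like posts, not pages
		'has_archive'  => true,
		'rewrite'      => array( 'slug' => 'events' ), // gives /events/venue-name/ URLs
		'supports'     => array( 'title', 'editor' ),
	) );
}
add_action( 'init', 'cdw_register_event_post_type' );
```

The rewrite slug is what produces the 'nice URL' for each event's more-info page, and the separate post type keeps events out of the main news flow in both the admin screens and the templates.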
We've then added a 'metabox' to the 'edit' screen, for the various elements which define an event: basically date, time and location. When you click into the 'Event date' box, you should get a popup jQuery-based calendar - but if you don't for some reason, or if you're a keyboard wizard, you can still enter it manually. We've left the 'time' field freeform: we didn't plan to do anything too clever with the event times, and besides, times are often rather vague.
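A sketch of how such a metabox hangs together - field names, meta keys and the cdw_ prefix are all illustrative, not the site's actual code. The date field stays an ordinary text input even with the jQuery UI datepicker (bundled with WordPress as 'jquery-ui-datepicker') bound to it, which is why manual entry still works:

```php
<?php
// Add the 'Event details' box to the event edit screen.
function cdw_add_event_metabox() {
	add_meta_box( 'event-details', 'Event details', 'cdw_event_metabox_html', 'event' );
}
add_action( 'add_meta_boxes', 'cdw_add_event_metabox' );

function cdw_event_metabox_html( $post ) {
	wp_nonce_field( 'cdw_event_save', 'cdw_event_nonce' );
	$date = get_post_meta( $post->ID, '_event_date', true );
	$time = get_post_meta( $post->ID, '_event_time', true );
	// #event-date gets a datepicker via JS; it degrades to plain typing.
	echo '<p><label>Event date <input type="text" id="event-date" name="event_date" value="' . esc_attr( $date ) . '" /></label></p>';
	// Time is deliberately freeform: '2pm', '14:00', 'afternoon' are all fine.
	echo '<p><label>Event time <input type="text" name="event_time" value="' . esc_attr( $time ) . '" /></label></p>';
}

// Save both fields as post meta, after the usual nonce and type checks.
function cdw_save_event_meta( $post_id ) {
	if ( ! isset( $_POST['cdw_event_nonce'] )
		|| ! wp_verify_nonce( $_POST['cdw_event_nonce'], 'cdw_event_save' )
		|| 'event' !== get_post_type( $post_id ) ) {
		return;
	}
	update_post_meta( $post_id, '_event_date', sanitize_text_field( $_POST['event_date'] ) );
	update_post_meta( $post_id, '_event_time', sanitize_text_field( $_POST['event_time'] ) );
}
add_action( 'save_post', 'cdw_save_event_meta' );
```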
I'm quite pleased with how we're doing the location. We ultimately want two things: a text-based name, which should make sense to humans rather than computers; and an exact geolocation, ideally latitude and longitude, for the machines. So, looking down the page, first thing you come to is a text search box. If you know the address, particularly if you have a postcode, you can enter it here; then click 'find on map'. This sends the query to Google, and makes a best-guess for the precise location, indicated by the crosshair hovering over the centre.
Google's guesses are usually pretty good, as you'd expect. But you can fine-tune them by dragging the map around - even to the specific building. And every time the map moves, whether via the search or via dragging, the coordinates update automatically.
The text name and the coordinates are saved separately - which means, once you've pinpointed your venue, you can then go back and edit the text-based name, to make it less of a search query, and more of a human-friendly description.
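Under the hood, that search-then-drag behaviour is only a few lines of the Google Maps JavaScript API (v3). A sketch of the admin-side logic, with invented element IDs - the hidden lat/lng inputs are what get saved as post meta alongside the human-friendly name:

```javascript
// Assumes the Maps API v3 script is loaded, and the edit screen contains
// #venue-map, #venue-search, #find-on-map, and hidden #venue-lat / #venue-lng.
var map = new google.maps.Map(document.getElementById('venue-map'), {
    center: new google.maps.LatLng(52.4, -3.6), // somewhere in Wales
    zoom: 7,
    mapTypeId: google.maps.MapTypeId.ROADMAP
});
var geocoder = new google.maps.Geocoder();

// 'Find on map': geocode the free-text query, centre on Google's best guess.
document.getElementById('find-on-map').onclick = function () {
    var query = document.getElementById('venue-search').value;
    geocoder.geocode({ address: query, region: 'gb' }, function (results, status) {
        if (status === google.maps.GeocoderStatus.OK) {
            map.setCenter(results[0].geometry.location);
            map.setZoom(15);
        }
    });
    return false;
};

// Whenever the map stops moving - after a search or a drag - copy the
// centre coordinates (under the crosshair) into the hidden fields.
google.maps.event.addListener(map, 'idle', function () {
    var c = map.getCenter();
    document.getElementById('venue-lat').value = c.lat();
    document.getElementById('venue-lng').value = c.lng();
});
```

Keying everything off the map's centre point, rather than a draggable marker, is what makes 'fine-tune by dragging the map' work with no extra code.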
That gives us enough data to put the markers on the map - with accuracy down to a few metres if you're so inclined! - and to generate some meaningful text content too, in the form of a table and stand-alone page. And yes, we've got all the info in both English and Welsh - although this site predates our work on Babble, so it uses WPML. (I say 'all': it turns out, Google Maps doesn't do Welsh.)
Like I say, it's not rocket science. But it's always a joy when you can hand what is actually quite complex functionality over to a client, and it just works*.
Since last spring, Mr Wheatley and I have been working with Guardian columnist Timothy Garton Ash, in his role as Professor of European Studies at St Antony's College, Oxford, on a project called Free Speech Debate. It's been the main focus of my attention for the last six months or more - which explains why the blog has been rather quiet of late.
As the name suggests, it's a site for editorially-led discussions on issues of freedom of speech, in today's socially-networked world. And as you might expect from someone so well-known and well-connected, the Professor has managed to secure contributions from a host of famous names in the field, from around the world - from Jimmy Wales to Max Mosley, not to mention numerous writers, and the odd Nobel Prize winner.
We've done similar 'editorially-led discussion sites' for numerous clients in the past, but never anything on this scale. You see, one of the site's 'ten draft principles' includes the right to speak in your own language. So we had little choice but to publish in multiple languages. And yes, that includes the difficult ones too.
The site went live initially in English-only. But in the last week, we've rolled out German, Turkish, Spanish and Chinese... with Arabic, Farsi/Persian, French, Hindi, Japanese, Portuguese, Russian and Urdu to follow shortly.
Our original plan was to use the WPML plugin for WordPress: we knew it had weaknesses, but it was the best of a disappointing bunch. However, for reasons I won't go into here, we subsequently decided to write our own plugin, based on modern WordPress features such as custom post types and taxonomies. It's called Babble; and I'm delighted to say, as soon as we've tidied it up a bit, we'll be open-sourcing it.
The way we've implemented Babble on FSD, you enter each post (etc) in the site's default language first, then - using a menu added to the Admin Bar - you can 'create' a translation in any of the languages you've enabled on your site. All the metadata - taxonomies, comments on/off, and so on - are inherited from the default language; so all you need to do is translate the text, and hit Publish. You'll even see the WordPress admin interface in the appropriate language as you're translating.
Comments are merged across translations: which means you'll potentially have a discussion thread made up of comments in different languages. Problem? No. We've implemented Google Sectional Translate, to let you translate each comment instantly into the language of the page currently being displayed (via ajax).
The entire site, in every language, is being generated by the same WordPress theme, be it a left-to-right language like English, or a right-to-left like Arabic. Building bi-directional code has left my head spinning on numerous occasions, I can tell you - and prompted me to write, in effect, my own direction-agnostic CSS framework. If you think that sounds easy enough... go ahead and try it. Then think about the implications of showing - in some circumstances - content in both right-to-left and left-to-right languages simultaneously on the same page.
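To give a flavour of what 'direction-agnostic' means in practice: every horizontal rule effectively has to be written twice, keyed off a direction class the theme puts on the body. A tiny illustrative sketch (class names invented, not the FSD stylesheet):

```css
/* Default, left-to-right rules... */
.entry-meta {
    float: left;
    margin-right: 1em;
    text-align: left;
}

/* ...mirrored when the theme adds an .rtl class for Arabic, Farsi, Urdu etc. */
.rtl .entry-meta {
    float: right;
    margin-right: 0;
    margin-left: 1em;
    text-align: right;
}

/* Mixed-direction fragments on one page: give each block its own explicit
   dir attribute, so the browser's bidi algorithm doesn't reorder the
   punctuation around the boundary. */
blockquote[dir="ltr"] { direction: ltr; text-align: left; }
blockquote[dir="rtl"] { direction: rtl; text-align: right; }
```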
So much for the translation of page content: what about the stuff that's inevitably hard-coded into the theme? For that, we've used GlotPress - the translation management web app, as seen at translate.wordpress.org. To be completely honest, it doesn't yet feel like a finished product: one bug in particular derailed us for a day or more, and I consider myself very lucky to have found a solution. But when it works, it's excellent.
There's a dedicated section for audio/video content, powered by a custom post type. In fact, this is probably the code which gives me the most pleasure: to post a video, all the editor needs to do is paste in the URL of its YouTube page, and we do (nearly) everything else automatically - even going as far as pulling the video's preview image into WordPress, and attaching it as the post's Featured Image. For audio clips, we're doing a combination of HTML5 and Flash embedding of MP3 files, to cover as many browsers as possible.
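The featured-image trick is less magic than it sounds: YouTube serves a predictable preview image for every video, and WordPress can sideload it into the media library. A sketch of the approach - the function name is invented, and it assumes a WordPress version where media_sideload_image() can return the attachment ID:

```php
<?php
// Given a standard YouTube watch URL, attach its preview image
// to the post and set it as the Featured Image.
function fsd_attach_youtube_thumbnail( $post_id, $youtube_url ) {
	// Extract the video ID from the ?v= query parameter.
	$query = parse_url( $youtube_url, PHP_URL_QUERY );
	parse_str( (string) $query, $params );
	if ( empty( $params['v'] ) ) {
		return;
	}

	// YouTube's predictable preview image URL.
	$thumb_url = 'http://img.youtube.com/vi/' . $params['v'] . '/0.jpg';

	// Download it into the media library, attached to this post...
	$attachment_id = media_sideload_image( $thumb_url, $post_id, null, 'id' );

	// ...and make it the post's Featured Image.
	if ( ! is_wp_error( $attachment_id ) ) {
		set_post_thumbnail( $post_id, $attachment_id );
	}
}
```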
That's not to mention the seamless MailChimp integration on the WordPress register page. Or the voting mechanism. Or the multi-server configuration, in preparation for some anticipated traffic peaks. Or the live webcasting. Or the connections into Sina Weibo, LiveJournal and Mixi, as well as Twitter and Facebook. Or... To be honest, we've been working on it so long, there are probably a few custom features in there that I've completely forgotten.
It's unquestionably the most ambitious project I've ever taken on: and seeing it launch this week has prompted feelings of pride, anxiety and relief in almost equal measure. I now know more about quotation marks, soft hyphens and non-Latin typography than I probably ever wanted to know.
Oh. Next week, we have our first scoping meeting for Phase Two. Gulp.
The Telegraph Media Group began embracing WordPress two and a half years ago: first its blogs were migrated over, then its My Telegraph community. They then began embracing WordPress people, hiring BuddyPress core developer Paul Gibbs, and hosting London WordPress meetups.
Now they've gone a stage further: releasing a WordPress plugin in the company name. Expire User Passwords has obvious applications in a more corporate environment: it's a zero-configuration plugin which you simply install and forget about. Until you reach the 30-day expiry point, when you're prompted to renew your password.
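I haven't read the plugin's source, but the usual WordPress pattern for this sort of thing is straightforward: stamp a timestamp in user meta whenever a password is set, and check it on login. A minimal sketch (prefix, meta key and redirect behaviour all my own invention, not the Telegraph's code):

```php
<?php
// Illustrative 30-day password expiry, not the actual plugin.
define( 'PWX_EXPIRY_SECONDS', 30 * 86400 ); // 30 days

// Stamp the time whenever a password is reset.
function pwx_stamp_password( $user ) {
	update_user_meta( $user->ID, 'pwx_last_reset', time() );
}
add_action( 'password_reset', 'pwx_stamp_password' );

// On login, check the stamp; if it's stale, push the user to
// the lost-password screen to choose a new one.
function pwx_check_expiry( $user_login, $user ) {
	$last = (int) get_user_meta( $user->ID, 'pwx_last_reset', true );
	if ( ! $last ) {
		pwx_stamp_password( $user ); // first login: start the clock
		return;
	}
	if ( time() - $last > PWX_EXPIRY_SECONDS ) {
		wp_redirect( wp_lostpassword_url() );
		exit;
	}
}
add_action( 'wp_login', 'pwx_check_expiry', 10, 2 );
```

The zero-configuration quality the Telegraph describes falls out naturally from this pattern: there's nothing to set up, because everything hangs off hooks that fire anyway.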
It's available from the WordPress repository, where it's owned by Paul and a new Telegraph user account. Alternatively, they've just started making use of a Telegraph GitHub account which they seem to have registered two years ago.
Well done, Team Tele. Great to see a large corporate giving back to the WordPress community. I'd love to know how they got over the inevitable concerns about plugin support, liability and so on.
I came away from this year's UKGovCamp with an uncomfortable sense of there being an 'us' and a 'them'.
The day opened with Dave Briggs declaring the event was different because, among various examples he quoted, it didn't have a keynote address. The day concluded with a keynote address by a senior Cabinet Office civil servant, who proceeded to tell us what his team of hired specialists were going to do.
But the 'us and them' was even more apparent in the first session I attended, led by the Cabinet Office's Liam Maxwell, on the subject of open standards. The substance of the presentation was:
- we think open standards are very important;
- we're doing lots of very important things, none of which we can talk about;
- but we'd value your input when the time comes.
I voiced a certain amount of frustration in the questions which followed, so it won't surprise Liam if I say it all felt thoroughly unsatisfying.
Having said that, I did - and do - have some sympathy. Open standards are commercial dynamite: software lock-in is worth £££millions to the big vendors. Enough for those vendors to put up a hell of a fight, in defence of an unsustainable and #unacceptable status quo. And to extend my metaphor just one step further, Liam and his colleagues were keeping their powder dry.
The aforementioned time for our input has now come: the Cabinet Office has opened its consultation process, with Liam asking for 'as much feedback from the IT community as possible... There’s a lot of strong opinion on this subject,' he says, 'so I’m urging people to take this opportunity and let us know what they think.'
The consultation 'document' is online, and it's been done on WordPress.
The interactive part of the site comes in three pages of questions, two of them very long and very scary, powered by a bespoke plugin (by the look of it). At the very top, it declares:
which may not be quite what they meant. Based on the error message displayed following a blank submission, it looks like only name and email address are actually required, plus an answer to at least one question. And if there's an asterisk anywhere, I've yet to find it.
The exercise itself is all rather semantic, and the language inevitably technical. It goes way over my head, to be perfectly honest. But my feelings on open standards are easily summarised:
As open as possible, as standardised as possible, as soon as possible.
Based on my experience in the Civil Service, it's that final point which is probably most important. I've been scarred by past experiences - notably around the Government Category List and eGMS, which both took several years, went through numerous iterations, and yet seemed to deliver no tangible benefits. (Correct me if I'm wrong.)
This time round, hopefully, things are different. The 'cloud computing' narrative has been widely accepted; and implicit in that is the belief that government's needs are not unique. Government should be looking to embrace standards that are already being widely adopted - and where there are any (perceived) deficiencies, it should play a part in their development.
Exactly how it does that, frankly, is up to smarter people than me.