Much consternation in certain political circles this afternoon, as Boris Johnson renames his Twitter account... and takes a quarter of a million people's details over to his election campaign HQ.
Johnson took office on 4 May 2008. His first tweet came on 8 May 2008 ('Setting up social marketing accounts!') - although it's not entirely clear what username the account used when it was created. In January 2009, though, he changed that username to MayorOfLondon. And the account has been quoted since at least May 2009 in official City Hall press releases, as his official account. Or in the case of that May 2009 press release, 'the Mayor's Twitter site'.
Before today's change, the URL associated with the account was http://www.london.gov.uk/ - and the biography read:
City Government for Greater London under the auspices of the Mayor of London
Could it have sounded more official?
(Something similar has happened to his Facebook account too; facebook.com/borisjohnson is now adorned with BackBoris2012 logos, and contains no history prior to 17 March 2012. And yes, that Facebook URL has similarly been promoted in the past as his official presence.)
In response, there's a statement on the BackBoris website:
As some of you may have noticed, earlier today Boris changed the name of his Twitter account from @MayorofLondon to @BorisJohnson. While the name of the account may have changed, rest assured that the account is still - and has always been - controlled by Boris.
No City Hall resources will be used to update or maintain the account - that would be against the rules. Given we're now in the official election period, this change is being made so there can be no question of Boris using official resources to campaign.
Of course, those who no longer wish to follow the account are welcome to "unfollow" at any time.
Of course, the issue isn't whether future City Hall resources will be used; it's that past City Hall resources have already been used to build up a significant following. And the last line is somewhat ill-advised, in my opinion.
I'd be very interested to find out from people at City Hall - or indeed, from HM Government's Deputy Director of Digital Engagement, Emer Coleman who used to be City Hall's head of digital projects - as to whether City Hall thought it 'owned' the account on behalf of the office of Mayor.
If the account was always personal, Boris should have used his personal name. By using the name of his elected office, the natural assumption is unquestionably that you are following the individual in his/her elected capacity - as was the case with the Prime Ministerial Twitter account.
Here's a tip. If you're working in a government web team, I strongly advise you to get something in writing confirming exactly who owns any Ministerial accounts - rapidly.
Update: a climbdown of sorts. Boris has tweeted:
To be clear- @borisjohnson will only be used for discussing mayoral duties. To follow me on the campaign trail, follow @backboris2012
'As he entered the campaign he was determined to ensure there was no confusion between him as Mayor and him as a candidate, and therefore changed the name of his Twitter account.
'He did not expect this openness and honesty to have created such hysteria.
'So in case there is even one Londoner who has a problem with what he did, he will not use that account for the campaign and instead can be followed from the political front on @BackBoris2012.'
Has he reverted to being @MayorOfLondon? No. But the username hasn't been abandoned - someone, and you have to hope it's someone close to Boris and/or City Hall, has bagged it. Hopefully for safe keeping. We don't want this happening again, do we.
Updated update: Somewhat inevitably, Boris has - pardon the pun - backed down. He's now reverted to using @MayorOfLondon as his account name, and the BorisJohnson account has gone blank again.
I've just started work on a project to build a first intranet for a small UK government entity. I've been waiting for ages for an opportunity to put BuddyPress, the semi-official WordPress add-on which promises a 'social network in a box' experience, to the test... and this is it.
It's still early days in the thought process - but the plan is to make heavy use of BuddyPress 'groups', to generate a personalised real-time view of activity in the areas in which you have a specific personal interest. Each team or department would be a group. Each cross-departmental project would be a group. There might also be groups based on physical location, social activity, union membership and so on. Some would be mandatory (eg 'all staff'); some would be open for anyone to join; some would be invite-only, or totally hidden.
The BuddyPress 'activity stream' filters itself automatically according to each signed-in user's group memberships; so your homepage (tbc) view would consist only of updates - news, forum discussions, events, document uploads, new members etc - from the groups you belong to. No two users' views would be identical. It's easy to see how powerful this could be; and in a post-Facebook world, it shouldn't be an unfamiliar concept.
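The underlying mechanic is easy to sketch: every activity item carries the group it came from, and a user's stream is simply the items from the groups that user belongs to. A minimal Python sketch of the concept - the data shapes and function names here are illustrative, not BuddyPress's actual (PHP) API:

```python
# Illustrative model of a group-filtered activity stream; the shapes
# and names are mine, not BuddyPress's (PHP) API.

def personal_stream(activity, memberships, user):
    """Return the activity items from the groups a user belongs to,
    newest first - so no two users' views need be identical."""
    groups = memberships.get(user, set())
    items = [item for item in activity if item["group"] in groups]
    return sorted(items, key=lambda item: item["time"], reverse=True)

activity = [
    {"group": "all-staff",  "time": 1, "text": "Welcome aboard"},
    {"group": "comms-team", "time": 2, "text": "New press release"},
    {"group": "union",      "time": 3, "text": "Branch meeting Friday"},
]
memberships = {"alice": {"all-staff", "comms-team"}}
```

A user with no memberships simply sees an empty stream; the mandatory 'all staff' group guarantees everyone a baseline view.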
Anyway... I started preparing wireframes yesterday, and hit an immediate question. What should go in the 'logo' space, reserved by convention in the top left corner?
Most intranets I've had the misfortune to use in the past have had names. But I wondered, did people actually use those names when referring to them? When asked 'where can I find that document?', would people generally answer: 'On the intranet.' or 'On [insert name here].'? Personally, I'd instinctively say the former myself; but after 17 years in this business, I'm used to the fact that I'm not 'normal'.
So I asked Twitter. And to be honest, I was surprised by the response.
Almost without exception, people responded that yes, their intranet did have a name... ranging from the fairly dull ('Cabweb' at the Cabinet Office) to the fantastic ('Narnia' at the National Archives!) to the quite unfathomable (one digital agency chose, er, 'Agnes'). And yes, people used the name in common parlance.
One or two people reported failed attempts to name their intranet: but the names they mentioned - '[organisation name] Online', or 'The Hub' - seemed very generic. It's almost as if people will make an effort to use the name, if you've clearly made an effort to make one up. If the name seems half-heartedly conceived, it shouldn't come as a surprise that the staff don't buy into it.
I'm not claiming any scientific validity for these results; but I'm left in no doubt that I'm going to have to think up a name.
The next phase of the gov.uk beta programme was opened last night: a six-week public testing phase for the 'Whitehall' information, now renamed 'Inside Government' (complete with tautological URL). Ten departments are covered initially, including all the obvious online big-hitters such as Health, BIS, Defra, FCO and DFID.
It looks very much like the rest of the gov.uk platform - as you'd expect, with a Global Experience Language - so it feels more like an extension than an enhancement. This is most striking with the individual department 'subsites': a unique 'spot colour' aside, and with an unexpected exception made for the MOD crest, all look identical and carry the same navigation. Departments aren't going to recognise these as 'their' sites - but that's kind of the point.
It's far too early to make definitive judgments about the presentation, not least because the team admit it's much more unfinished than previous previews. It's hard, therefore, to decide what's deliberately minimalist, and what's just 'not done yet' - and therefore, hard to offer helpful criticism. A lot of the pages feel very plain, probably too plain. In particular, I'm not fond of the very 'boxy' presentation of many pages: see the main News or Publications pages as good examples. I just don't find my eye being guided anywhere, and I don't get any sense of significance. But maybe they just haven't been 'done' yet.
Writing on the GDS blog, Neil Williams describes a 'custom publishing engine properly tuned to the needs of multiple users and publishers across Whitehall, and built specifically for the kinds of things governments produce. ... On average, publishing to GOV.UK was 2.5 minutes faster than WordPress and 11 minutes faster than Directgov,' he claims: I've already taken him to task on that one.
As a website, it's what they said it would be, and it looks like we knew it would look. So it doesn't feel like much of a leap forward, and could actually be quite a tough sell around Whitehall. But this part of the gov.uk project isn't about a website. It's about redefining how government departments see themselves, present themselves, and talk about what they do. And that's w-a-y more difficult than building a website.
Puffbox's longest-standing working relationship in Whitehall is with the Wales Office; it was there, don't forget, that the whole WordPress-in-government thing started back in late 2007. We moved them on to a multisite setup just before the 2010 general election; and we're seeing the benefits, through sites like the one we launched in November for the Commission for Devolution in Wales.
They're about to start a round of public engagement events, and they asked us if we could add a Google Map to the site... which, of course, is bilingual, English and Welsh. It's not rocket science these days, but it's probably the smoothest implementation I've done, and I thought it might be worth sharing.
We've defined 'event' as a custom post type, non-hierarchical (ie more like posts than pages), with a full set of fields. It gives the 'more info' pages a nice URL, and keeps them nicely self-contained, with benefits for both admin interface and template access.
We've then added a 'metabox' to the 'edit' screen, for the various elements which define an event: basically date, time and location. When you click into the 'Event date' box, you should get a popup jQuery-based calendar - but if you don't for some reason, or if you're a keyboard wizard, you can still enter it manually. We've left the 'time' field freeform: we didn't plan to do anything too clever with the event times, and besides, times are often rather vague.
I'm quite pleased with how we're doing the location. We ultimately want two things: a text-based name, which should make sense to humans rather than computers; and an exact geolocation, ideally latitude and longitude, for the machines. So, looking down the page, first thing you come to is a text search box. If you know the address, particularly if you have a postcode, you can enter it here; then click 'find on map'. This sends the query to Google, and makes a best-guess for the precise location, indicated by the crosshair hovering over the centre.
Google's guesses are usually pretty good, as you'd expect. But you can fine-tune them by dragging the map around - even to the specific building. And every time the map moves, whether via the search or via dragging, the coordinates update automatically.
The text name and the coordinates are saved separately - which means, once you've pinpointed your venue, you can then go back and edit the text-based name, to make it less of a search query, and more of a human-friendly description.
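For illustration, here's a minimal Python sketch of the two halves of the location feature - building a Google Geocoding API query from the free-text search box, and storing the venue name and coordinates as separate fields. On the real (PHP) site the geocoding happens client-side via Google's Maps JavaScript API, and the field names below are invented:

```python
from urllib.parse import urlencode

# Hypothetical server-side sketch; the live site does this client-side
# with Google's Maps JavaScript API, and these field names are mine.

GEOCODE_ENDPOINT = "https://maps.googleapis.com/maps/api/geocode/json"

def geocode_url(query):
    """Build a Google Geocoding API request for a free-text address
    or postcode (a real request would also need an API key)."""
    return GEOCODE_ENDPOINT + "?" + urlencode({"address": query})

def save_event_location(meta, venue_name, lat, lng):
    """Store the human-friendly venue name and the machine-friendly
    coordinates as separate fields, so either can be edited without
    touching the other."""
    meta["event_venue"] = venue_name
    meta["event_lat"] = round(lat, 6)   # six decimals is sub-metre accuracy
    meta["event_lng"] = round(lng, 6)
    return meta
```

Keeping the fields separate is what lets the editor tidy up the text name afterwards without nudging the pin.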
That gives us enough data to put the markers on the map - with accuracy down to a few metres if you're so inclined! - and to generate some meaningful text content too, in the form of a table and stand-alone page. And yes, we've got all the info in both English and Welsh - although this site predates our work on Babble, so it uses WPML. (I say 'all': it turns out, Google Maps doesn't do Welsh.)
Like I say, it's not rocket science. But it's always a joy when you can hand what is actually quite complex functionality over to a client, and it just works*.
Since last spring, Mr Wheatley and I have been working with Guardian columnist Timothy Garton Ash, in his role as Professor of European Studies at St Antony's College, Oxford, on a project called Free Speech Debate. It's been the main focus of my attention for the last six months or more - which explains why the blog has been rather quiet of late.
As the name suggests, it's a site for editorially-led discussions on issues of freedom of speech, in today's socially-networked world. And as you might expect from someone so well-known and well-connected, the Professor has managed to secure contributions from a host of famous names in the field, from around the world - from Jimmy Wales to Max Mosley, not to mention numerous writers, and the odd Nobel Prize winner.
We've done similar 'editorially-led discussion sites' for numerous clients in the past, but never anything on this scale. You see, one of the site's 'ten draft principles' includes the right to speak in your own language. So we had little choice but to publish in multiple languages. And yes, that includes the difficult ones too.
The site went live initially in English-only. But in the last week, we've rolled out German, Turkish, Spanish and Chinese... with Arabic, Farsi/Persian, French, Hindi, Japanese, Portuguese, Russian and Urdu to follow shortly.
Our original plan was to use the WPML plugin for WordPress: we knew it had weaknesses, but it was the best of a disappointing bunch. However, for reasons I won't go into here, we subsequently decided to write our own plugin, based on modern WordPress features such as custom post types and taxonomies. It's called Babble; and I'm delighted to say, as soon as we've tidied it up a bit, we'll be open-sourcing it.
The way we've implemented Babble on FSD, you enter each post (etc) in the site's default language first, then - using a menu added to the Admin Bar - you can 'create' a translation in any of the languages you've enabled on your site. All the metadata - taxonomies, comments on/off, and so on - are inherited from the default language; so all you need to do is translate the text, and hit Publish. You'll even see the WordPress admin interface in the appropriate language as you're translating.
Comments are merged across translations: which means you'll potentially have a discussion thread made up of comments in different languages. Problem? No. We've implemented Google Sectional Translate, to let you translate each comment instantly into the language of the page currently being displayed (via ajax).
The entire site, in every language, is being generated by the same WordPress theme, be it a left-to-right language like English, or a right-to-left like Arabic. Building bi-directional code has left my head spinning on numerous occasions, I can tell you - and prompted me to write, in effect, my own direction-agnostic CSS framework. If you think that sounds easy enough.... go ahead and try it. Then think about the implications of showing - in some circumstances - content in both right-to-left and left-to-right languages simultaneously on the same page.
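The core trick of any direction-agnostic framework is to stop thinking in 'left' and 'right' and start thinking in 'start' and 'end', resolving them per language. A toy Python sketch of that mapping - our actual framework is CSS, of course, and these names are illustrative:

```python
# Toy illustration of direction-agnostic layout: express rules in
# logical 'start'/'end' terms, and resolve them to physical sides
# per language direction. (Names are illustrative, not our CSS.)

def physical(side, direction):
    """Map logical 'start'/'end' to 'left'/'right' for a given text
    direction ('ltr' or 'rtl')."""
    mapping = {
        "ltr": {"start": "left", "end": "right"},
        "rtl": {"start": "right", "end": "left"},
    }
    return mapping[direction][side]

def resolve_rule(property_name, direction):
    """e.g. 'margin-start' becomes 'margin-left' for English,
    'margin-right' for Arabic."""
    prop, _, side = property_name.rpartition("-")
    return "%s-%s" % (prop, physical(side, direction))
```

The hard part, as noted, isn't the mapping itself - it's mixed-direction pages, where a single resolution per page no longer suffices.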
So much for the translation of page content: what about the stuff that's inevitably hard-coded into the theme? For that, we've used GlotPress - the translation management web app, as seen at translate.wordpress.org. To be completely honest, it doesn't yet feel like a finished product: one bug in particular derailed us for a day or more, and I consider myself very lucky to have found a solution. But when it works, it's excellent.
There's a dedicated section for audio/video content, powered by a custom post type. In fact, this is probably the code which gives me the most pleasure: to post a video, all the editor needs to do is paste in the URL of its YouTube page, and we do (nearly) everything else automatically - even going as far as pulling the video's preview image into WordPress, and attaching it as the post's Featured Image. For audio clips, we're doing a combination of HTML5 and Flash embedding of MP3 files, to cover as many browsers as possible.
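The URL-to-thumbnail step is simple enough to sketch: YouTube serves predictable preview images for each video ID, so once you've parsed the ID out of the pasted URL, the Featured Image is one HTTP fetch away. A Python sketch of the idea - the real plugin is PHP, and this only covers the two common URL shapes:

```python
from urllib.parse import urlparse, parse_qs

# Sketch of the URL-to-Featured-Image idea; the real plugin is PHP,
# and this handles only standard watch URLs and youtu.be short links.

def youtube_id(url):
    """Extract the video ID from a watch URL or a youtu.be short link."""
    parsed = urlparse(url)
    if parsed.hostname == "youtu.be":
        return parsed.path.lstrip("/")
    return parse_qs(parsed.query).get("v", [None])[0]

def thumbnail_url(video_id):
    """YouTube publishes predictable preview images per video ID."""
    return "https://img.youtube.com/vi/%s/hqdefault.jpg" % video_id
```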
That's not to mention the seamless MailChimp integration on the WordPress register page. Or the voting mechanism. Or the multi-server configuration, in preparation for some anticipated traffic peaks. Or the live webcasting. Or the connections into Sina Weibo, LiveJournal and Mixi, as well as Twitter and Facebook. Or... To be honest, we've been working on it so long, there are probably a few custom features in there that I've completely forgotten.
It's unquestionably the most ambitious project I've ever taken on: and seeing it launch this week has prompted feelings of pride, anxiety and relief in almost equal measure. I now know more about quotation marks, soft hyphens and non-Latin typography than I probably ever wanted to know.
Oh. Next week, we have our first scoping meeting for Phase Two. Gulp.
The Telegraph Media Group began embracing WordPress two and a half years ago: first its blogs were migrated over, then its My Telegraph community. They then began embracing WordPress people, hiring BuddyPress core developer Paul Gibbs, and hosting London WordPress meetups.
Now they've gone a stage further: releasing a WordPress plugin in the company name. Expire User Passwords has obvious applications in a more corporate environment: it's a zero-configuration plugin which you simply install and forget about. Until you reach the 30-day expiry point, when you're prompted to renew your password.
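I haven't seen inside the plugin, but the core check presumably amounts to little more than a date comparison - a hypothetical sketch in Python:

```python
from datetime import datetime, timedelta

# Hypothetical sketch of a 30-day expiry check; the actual plugin's
# internals (stored in PHP, against WordPress user meta) may differ.

EXPIRY = timedelta(days=30)

def password_expired(last_set, now=None):
    """True once the stored password is 30 or more days old."""
    now = now or datetime.utcnow()
    return now - last_set >= EXPIRY
```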
It's available from the WordPress repository, where it's owned by Paul and a new Telegraph user account. Or alternatively, they've just started making use of a Telegraph GitHub account which they seem to have registered two years ago.
Well done, Team Tele. Great to see a large corporate giving back to the WordPress community. I'd love to know how they got over the inevitable concerns about plugin support, liability and so on.
I came away from this year's UKGovCamp with an uncomfortable sense of there being an 'us' and a 'them'.
The day opened with Dave Briggs declaring the event was different because, among various examples he quoted, it didn't have a keynote address. The day concluded with a keynote address by a senior Cabinet Office civil servant, who proceeded to tell us what his team of hired specialists were going to do.
But the 'us and them' was even more apparent in the first session I attended, led by the Cabinet Office's Liam Maxwell, on the subject of open standards. The substance of the presentation was:
- we think open standards are very important;
- we're doing lots of very important things, none of which we can talk about;
- but we'd value your input when the time comes.
I voiced a certain amount of frustration in the questions which followed, so it won't surprise Liam if I say it all felt thoroughly unsatisfying.
Having said that, I did - and do - have some sympathy. Open standards are commercial dynamite: software lock-in is worth £££millions to the big vendors. Enough for those vendors to put up a hell of a fight, in defence of an unsustainable and #unacceptable status quo. And to extend my metaphor just one step further, Liam and his colleagues were keeping their powder dry.
The aforementioned time for our input has now come: the Cabinet Office has opened its consultation process, with Liam asking for 'as much feedback from the IT community as possible... There's a lot of strong opinion on this subject,' he says, 'so I'm urging people to take this opportunity and let us know what they think.'
The consultation 'document' is online, and it's been done on WordPress.
The interactive part of the site comes in three pages of questions, two of them very long and very scary, powered (by the look of it) by a bespoke plugin. At the very top is a declaration about required fields which may not be quite what they meant: based on the error message displayed following a blank submission, it looks like only name and email address are actually required, plus an answer to at least one question. And if there's an asterisk anywhere, I've yet to find it.
The exercise itself is all rather semantic, and the language inevitably technical. It goes way over my head, to be perfectly honest. But my feelings on open standards are easily summarised:
As open as possible, as standardised as possible, as soon as possible.
Based on my experience in the Civil Service, it's that final point which is probably most important. I've been scarred by past experiences - notably around the Government Category List and eGMS, which both took several years, went through numerous iterations, and yet seemed to deliver no tangible benefits. (Correct me if I'm wrong.)
This time round, hopefully, things are different. The 'cloud computing' narrative has been widely accepted; and implicit in that is the belief that government's needs are not unique. Government should be looking to embrace standards that are already being widely adopted - and where there are any (perceived) deficiencies, it should play a part in their development.
Exactly how it does that, frankly, is up to smarter people than me.
In a single sentence, Stephen Hale's latest blog post encapsulates the sheer joy of moving from a classic old-style CMS to WordPress.
By switching out Stellent for WordPress as our primary content management tool, we changed the processes by which web content was created and published. Editors no longer needed the same in-depth knowledge of the CMS to publish content, it was possible to publish more quickly, and it was much easier for us to devolve the act of publishing. The day-long CMS training course for new editors was replaced with a 1 minute (I timed it) session showing staff how to click on “add new” and type in a box.
From what I hear, the GDS training course for those publishing on the new unified platform is going to take a _little_ longer than that.
I blogged earlier today about Saul Cozens and his 'v0.1 alpha' WordPress plugin for embedding gov.uk content via WordPress shortcode.
The great news is, Saul has uploaded it to a public repo at GitHub, meaning it's now:
- dead easy for you to download, and keep up to date
- possible for you to fix, enhance and generally improve it
Saul has very kindly - foolishly, some might say - given me commit privileges on it, and I've done a bit of work on it this evening: a bit of error handling / prevention, adding basic parsing of gov.uk's multi-page 'guide' content (including any videos!), and general housekeeping.
In other words, it's now less likely to simply fail on your page. It's likely to fail in more complicated ways instead.
There's one substantial catch: and this is an appeal for help.
Gov.uk content is written in 'govspeak', an extended flavour of Markdown which adds a number of extra formatting options, to create things like information and warning 'callout' boxes. And whilst there are PHP-based libraries for Markdown, which we can bolt on easily, there's nothing instantly WordPress-friendly for this new govspeak.
Yet. If you know a bit of Ruby, if you've got a bit of spare time, and if you want to help expand the reach of gov.uk's content to charities, community groups, local government, etc etc... now's your chance.
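To give a flavour of what a port would involve: as I understand govspeak's syntax, information callouts are wrapped in ^...^ and warnings in %...%. A crude Python sketch - a real implementation would need govspeak's full grammar, not a couple of regexes, and the class names here only approximate its output:

```python
import re

# Rough sketch of govspeak's callout extensions, as I understand them:
# ^text^ becomes an information callout, %text% a warning. The class
# names approximate govspeak's output; a faithful port would need the
# gem's full grammar, not a couple of regexes.

CALLOUTS = [
    (re.compile(r"\^(.+?)\^"),
     '<div class="application-notice info-notice"><p>{}</p></div>'),
    (re.compile(r"%(.+?)%"),
     '<div class="advisory"><p>{}</p></div>'),
]

def render_callouts(text):
    """Replace govspeak callout markup with HTML blocks."""
    for pattern, template in CALLOUTS:
        text = pattern.sub(lambda m, t=template: t.format(m.group(1)), text)
    return text
```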
If you fancied one of those £73,000pa developer jobs, I bet it would look great on your application.
Saul Cozens has done a wonderful thing. He's written a WordPress plugin which allows you to integrate content from the new gov.uk site within WordPress pages, via a WordPress shortcode referencing the gov.uk page you want to embed.
It pulls in the corresponding JSON data - which is really just a case of adding .json on the end of the URL - and plonks it into your WordPress page. So far, so not tremendously complicated.
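The URL arithmetic really is that simple - a Python sketch (the function name is mine, not the plugin's):

```python
from urllib.parse import urlparse, urlunparse

# The gov.uk trick in one function: the same path with .json appended
# returns the page's data. (Function name is mine, not the plugin's.)

def json_endpoint(govuk_url):
    """Turn a gov.uk page URL into its JSON data endpoint."""
    parts = urlparse(govuk_url)
    return urlunparse(parts._replace(path=parts.path + ".json"))
```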
Here's the good bit. No, sorry, the fantastic bit. Not only does it plonk the text in, it can also plonk forms into place. And keeps them active as forms. Yes - actual, working forms.
The screenshot above is taken from my test server (no offence, Saul, but I'm not putting a v0.1 alpha plugin on my company site!) - but it shows me successfully embedding the Student Finance Calculator 'quick answer' form within my current blog theme, and sending data back and forth. Sure, the CSS needs a little bit of work... but Saul's concept is proven.