New data reveals gov web spend, usage & satisfaction

There’s a huge amount of information to digest in COI’s ‘Reporting on progress: government websites 2009-10’, published this morning. It lists, for virtually every government department, an assessment of staff numbers, staff and non-staff spending, page views and unique users, and, where available, outcomes of user surveys and assessments of accessibility and standards compliance.

Inevitably, there are some scarily large numbers contained within. For example:

  • BusinessLink, one of government’s three super-sites, quotes a £35,000,000 spend on ‘non-staff costs’ – accounting for 27% of the total spend as outlined in the report.
  • There’s no hint of the super-sites approach leading to economies of scale. BusinessLink, Directgov and NHS Choices spent £4m, £5m and £6m respectively on ‘design and build’, way beyond the biggest-spending ministerial departments (FCO and DH).
  • HMRC appears to have 111 people working at least half their time on its hmrc.gov.uk website, costing £7,500,000.
  • Across all departments quoted in the report, we appear to be paying £23,840,000 per year for web hosting.
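
Some of those bullets imply further figures worth a back-of-the-envelope check. As a sketch (the derivations are mine, not the report’s): if BusinessLink’s £35m non-staff spend really is 27% of the total, and HMRC really has 111 people on its site at £7.5m, then:

```python
# Back-of-the-envelope checks on the figures quoted above.
# These derivations are illustrative; only the inputs come from the report.

businesslink_non_staff = 35_000_000  # £, quoted non-staff costs
share_of_total = 0.27                # quoted as 27% of total spend

implied_total = businesslink_non_staff / share_of_total
print(f"Implied total reported web spend: £{implied_total:,.0f}")  # roughly £130m

hmrc_staff_cost = 7_500_000  # £, quoted
hmrc_headcount = 111         # people at least half-time on hmrc.gov.uk
print(f"Implied cost per HMRC web staff member: £{hmrc_staff_cost / hmrc_headcount:,.0f}")
```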

However, despite COI’s best efforts, I’m still not convinced that the numbers are directly comparable. On hosting, for example, many departments quote £0 – but I’m pretty sure they’re paying for it somewhere. I’m not aware of too many departmental sites built on Blogger, WordPress.com or Geocities.

Some of the most encouraging news comes from the customer satisfaction reports for certain sites – although it’s a pity these numbers cover only half the departments in the study, with HMRC and BusinessLink being obvious omissions. The much-derided Transport Direct claims 1.2 million unique users in an average month, with a net customer satisfaction rate of +84%, scoring particularly highly for ease-of-use and design (!). DFID scores +79, Directgov scores +73, as does the MOD.

Other departments, sadly, don’t fare so well. DWP and Transport both show negative net customer satisfaction: -8% and -1% respectively, with very high percentages of people finding ‘none of what I wanted’. I wonder if those measures are fair on them, though – it seems odd when Transport Direct and (I’m guessing) JobCentrePlus, now a major part of Directgov, are doing so well. And it must be a bit embarrassing for COI to rank so low in its own study (12% net satisfaction), in an area where it is tasked with setting best practice.
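
For anyone unfamiliar with the metric: a ‘net’ satisfaction score is presumably the usual survey measure – the percentage of satisfied respondents minus the percentage of dissatisfied ones – which is how a heavily used site can still land in negative territory. A minimal sketch (the function and the example distributions are mine, not from the report):

```python
def net_satisfaction(satisfied_pct: float, dissatisfied_pct: float) -> float:
    """Net satisfaction: % satisfied minus % dissatisfied.

    Neutral responses count for neither side, so they drag the
    score towards zero without making it negative.
    """
    return satisfied_pct - dissatisfied_pct

# Hypothetical response distributions showing how a site could
# score -8% while another scores +84%:
print(net_satisfaction(38, 46))  # -8 (many found 'none of what I wanted')
print(net_satisfaction(90, 6))   # 84
```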

Like it or not, the raw traffic numbers are likely to be the main source of amusement. Predictably, the super-sites come top on all measures; but there’s a suspiciously strong showing for The National Archives, whose opsi.gov.uk site appears to claim more than 1m unique users every month. Again, BusinessLink’s numbers stand out, with much lower traffic levels than its fellow super-sites. There’s also wide variation in page views per visitor and monthly visits per unique user, which might merit further investigation.

As with any dataset, it’s a mixed picture. The biggest questions, I think, are over the £23m hosting bill – unquestionably an understatement, given the number of departments that quoted zero for hosting – and the value for money of BusinessLink.

But as with any dataset, there’s a huge risk of misinterpretation of its contents – and I wouldn’t necessarily guarantee that any of the above analysis is either true or fair. Data is good at asking questions, but rarely gives clear answers.

There’s a press release from the Cabinet Office; but to be honest, I wouldn’t bother with it.

Disclaimer: I do web stuff for lots of different bits of government. Many of the departments named above are past or present clients.

8 thoughts on “New data reveals gov web spend, usage & satisfaction”

  1. Completely agree that each department will be counting different things in its numbers (hosting, internal staff, contract staff, tendered work), will almost certainly be counting ‘visitors’ differently and will be surveying differently. I have seen it happen in govt. a number of times.
    On top of that, there’s no analysis of what it would cost to not have each website (enquiry units, paper pushing etc) – which would be pretty hard to measure anyway (so – er, let’s not bother, eh?)
    Particularly when a website offers an online service (tax returns, say), it’s not useful to think of it as a ‘website’, but as a service provision. How much, for instance, would it cost to process X million paper tax returns? I’m going to hazard a guess that it’s cheaper online.

    I’m in full agreement on the ‘super sites’ too. There’s a happy medium to be reached somewhere between a mess of micro sites vs a few overly bloated monsters…

  2. This information has been collected pretty much every year since 2001 or so, via a parliamentary round-robin question. I used to use it as the source for regular reports I put across government on how little we knew about how many websites we had and how much they cost. This is the first time, though, I’ve seen it presented in such a comprehensive way (they’ve got far more data than the PQs ever surfaced, which is a good thing).

    The problem with the cost numbers is that departments with an outsource contract don’t pay per website, or even per server – costs will be allocated in all kinds of odd ways. So you’re right, the hosting numbers are certainly wrong – and probably by a factor of 100% or even 200%, at a guess.

    As Matt says in the previous comment, the other numbers will have different methodologies too – but with reports like this, COI will drive the standardisation of approach that will ensure apples-to-apples comparisons.

  3. Can’t tell you how excited I am to have a comment – in fact, two! – from Alan Mather. (For those who don’t know Alan, there was a time when he was the e-gov blogosphere.)

    Of course you’re right, Alan. On one level, we’ve known the worst of these figures for some time. And on the other, the new data probably isn’t directly comparable. Not yet. You have to assume this is the start of a process.

  4. The figures, though interesting, are so wildly out of kilter with each other that I can’t help but feel each dept. is reporting very different things. For instance I note that the FCO’s staff number is ‘not feasible’ to calculate, and the other costs don’t seem to tally with http://tinyurl.com/2dawc8v.

    Based on those departments I know, I’m pretty sure that hosting figure covers a multitude of sins, including staff – it’s not what I’d call a plain ‘hosting’ cost, as you might pay to a hosting company. Not sure I believe all the ‘strategy and planning, £0; testing and evaluation, £0’ entries, either – even though it feels that way sometimes. Quite a few ‘content provision, £0’ entries too. So to me, the whole thing fails a sanity test.

    A breakdown of total IT spend per dept. would be an interesting one. So would office furniture 🙂

  5. Can’t tell you how pleased I am that my place in the egov blog world has been taken over by people who actually know what they’re talking about. I do want to give credit, though, to my friend John Gotze, who was the first to blog on egov, several months before me (sometime in 2001).

  6. Agree with Matt that a breakdown of total IT spend would be interesting, and something that has long been sought after. The questionnaires are now out there. I see people every day struggling to get at how much any given department spends on software, servers or networks, because their invoices are often not broken down like that. If you thought reconciling COINS was hard, the IT spend breakdown is going to be even harder.

  7. Alan – yes I’m sure that’s wishful thinking on my part!
    The reason I mentioned it is that the _really_ big ticket items such as http://tinyurl.com/22pr5sj (to name but one) are often not even visible to the public, and doing these better is where the real savings are to be made.
    Whenever a ‘website’ is actually part of some much bigger process/platform then a ‘cost of website’ figure is not that informative – what’s really required is a meaningful analysis of costs and benefits. Doing that would not be easy or cheap and it seems that most journalists, commentators and even ministers aren’t that interested in (or capable of understanding) anything more than headline figures.

Comments are closed.