The project started in 2009 and posed the question: how might cities evolve in open data environments? He’s not interested in efficiency and transparency – he’s a bit suspicious of them. They’re about control. He’s more interested in openness – and an ecology of open data that allows you to do cool stuff.
They targeted useful data, and showed that useful things could come out of it – which led to DataGM, a project to free up Greater Manchester’s public data.
But they were faced with challenges – they didn’t want it to just be an inwards thing within local authorities. The innovation argument wasn’t taking hold, and a market wasn’t opening up. Do cities like Brighton and Manchester have the scale to build useful things out of open data exchanges? Maybe not. Now they’re looking at CitySDK – a European project for open data markets based on standardised civic data.
Does open data lead to an open society? Maybe. You need data literacy, so citizens can make more use of it. It’s the preserve of an elite right now. Data arts is one approach to this. It can help demystify data. Emoto – a data visualisation for London 2012.
What they’re working towards is a Digital Public Space. The Creative Exchange is a step in that direction.
If we don’t shape the future we want, we’ll get the future we deserve.
Bill Thompson, head of partnership development, archive development at the BBC
What is the nature of this shared hallucination we’re all about to engage with? Where the physical and virtual space merge… We’ve seen closed data cities. Facebook is one. Do we have a good vision of an open data city? What will that liminal space we occupy be like? We’re at the start of the process of building those cities.
We’re at the stage of identifying the swamp, cutting down the trees and putting logs into the swamp – to make a comparison with Venice.
Bill spent 15 years as a freelance hack, but he was seduced by an offer to help with the BBC’s archive. What’s the most they could get out of the stuff that the BBC forgot to throw away? He got to play with the archive – boxes of documents, records marked “not to be played”. The BBC was set up as a cartel between six manufacturers of radios, to sell more radios. The government required everyone to have a licence for it, to regulate it – and then decided to nationalise it. It needed to be looked after. Since then the BBC has been there to act in the public interest – but the detail of how it has done it has changed. Not everything has been kept – until about 1980 the BBC viewed the magnetic tape as more valuable than the programmes, and wiped them.
In that archive they may have footage of you as a child. Or of your parents. But can you find it now? No.
A lot of it is on paper – and there are now plans to have it digitised. There’s an enormous amount of stuff in the BBC no-one outside knows exists, so how can people ask for it? That’s his group’s job.
And the archive isn’t just “old stuff” – it’s everything that it’s recorded about itself, up until the show that just finished transmitting. A lot of the thought has been about “outputs” – complete shows. But there will be tracking shots of buildings that no longer exist, or dead people’s voices within those programmes. It’s more interesting to think of it as a collection of data – frames, chunks of programmes. If they can be digitised and catalogued, it becomes a data repository with an API. The BBC becomes a massive factory for making cultural product that people who understand RDF and XML can make use of.
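The idea of treating the archive as catalogued chunks rather than whole “outputs” can be sketched very roughly. This is a hypothetical illustration only – the segment records, subjects, and query function here are invented, not a real BBC API:

```python
# Minimal sketch: once programme segments are catalogued, the archive
# behaves like a queryable data repository. All identifiers and records
# below are invented for illustration.

from dataclasses import dataclass, field

@dataclass
class Segment:
    programme_id: str        # identifier of the parent programme
    start: float             # offset into the programme, in seconds
    end: float
    subjects: set = field(default_factory=set)  # catalogued subjects for this chunk

CATALOGUE = [
    Segment("prog/001", 0.0, 42.5, {"London", "architecture"}),
    Segment("prog/001", 42.5, 90.0, {"interview"}),
    Segment("prog/002", 10.0, 55.0, {"architecture", "demolition"}),
]

def find_segments(subject):
    """Return every catalogued chunk tagged with the given subject."""
    return [s for s in CATALOGUE if subject in s.subjects]

for s in find_segments("architecture"):
    print(s.programme_id, s.start, s.end)
```

With a catalogue like this exposed behind an API, a tracking shot of a demolished building becomes findable on its own, independent of the programme it appeared in.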
Wouldn’t it be great if you could match BBC film or photos with local authority data about buildings? Wouldn’t it be great if you could match politician data with every appearance they’ve made on the BBC? And then you can find out how often they contradict themselves…
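The cross-matching idea amounts to a join on a shared key between broadcast records and an external dataset. A toy sketch, with all names and fields invented for illustration:

```python
# Hypothetical sketch: join a list of broadcast appearances against an
# external politician dataset keyed on a shared name. The data is made up.

appearances = [
    {"politician": "A. Example", "programme": "Question Time", "date": "2011-03-10"},
    {"politician": "A. Example", "programme": "Newsnight", "date": "2012-01-05"},
    {"politician": "B. Sample", "programme": "Today", "date": "2011-07-22"},
]

politicians = {
    "A. Example": {"party": "Independent", "constituency": "Somewhere"},
}

def appearances_for(name):
    """All broadcast appearances for one politician, enriched with their record."""
    record = politicians.get(name, {})
    return [dict(a, **record) for a in appearances if a["politician"] == name]

for row in appearances_for("A. Example"):
    print(row["programme"], row.get("party"))
```

The hard part in practice isn’t the join itself but agreeing on the shared identifiers – which is exactly what standardised, catalogued open data would provide.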
They want to bring all this into the Digital Public Space, while respecting the copyrights that exist. At least the catalogue can be there, and then the content as the rights issues are resolved. They’ve been working with partner organisations for a couple of years now. Lots of public cultural organisations – the British Museum, the BFI – are having similar ideas.
But… the BBC doesn’t know what it has broadcast. So they’re creating Genome, by scanning and digitising the Radio Times – the best record of what the BBC planned to broadcast, back to 1923. It’s been written about extensively on the BBC blogs.
They’re building a prototype digital public space navigator with partners, to try and guide user journeys through the digital public space. This is a way of proving internally that it makes sense to work with those partners.
BBC Redux is iPlayer on steroids – with an API. It records the whole digital multiplex, so they can identify problems. Each show has an individual URI, which allows them to build tools on top of it. Snippets allows you to do a full text search of five years of BBC programmes, based on subtitles data. But the rights are complicated – because the BBC doesn’t own many of the programmes it broadcasts.
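The Snippets idea – full-text search over subtitle data, with each programme addressed by its own URI – can be sketched as a simple inverted index. The URIs and subtitle lines here are invented; the real system indexes years of broadcasts:

```python
# Toy sketch of subtitle-based search: build an inverted index from words
# to programme URIs, then look words up. All data below is invented.

from collections import defaultdict

subtitles = {
    "redux/prog/2010-06-01/one": "the history of venice and its lagoon",
    "redux/prog/2011-02-14/two": "open data and the modern city",
    "redux/prog/2012-05-09/one": "venice biennale art special",
}

index = defaultdict(set)
for uri, text in subtitles.items():
    for word in text.split():
        index[word].add(uri)

def search(word):
    """Return the URIs of programmes whose subtitles contain the word."""
    return sorted(index.get(word.lower(), set()))

print(search("venice"))
```

Because every result is a URI, a hit can link straight back into the recorded programme – which is what makes building tools on top of Redux possible.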
TheSpace.org – a joint BBC/ACE digital art intervention launching in May 2012. There’s an underlying data structure under all the art works – so it’s catalogued properly. Every piece of work will have a data feed as well as an asset feed. They’re using ACE funding to try out BBC open data theories, which will feed into the digital public space.