Here’s a nice example of a reporting video shot on a phone from Auntie Beeb. Note how he uses the phone’s mobility to give a sense of the environment, and how keeping himself in shot as the central point helps the audio stay acceptable.
This blog is eleven years old today.
But really, who cares on a day when:
- Mail Online took over Metro Online
- Flipboard bought Zite
- BBC Three heads to being an online-only channel
It’s interesting tracking the relative ages of those things, though. BBC Three is less than a month older than this blog – it was launched on the 9th February 2003. While Metro newspaper dates back to 1999, the website appears to have launched in 2004 – making it younger than this blog. Flipboard and Zite are both whippersnappers, both around three years old.
So, I need to face it. This blog, while not even a teenager, is old. But while it may be old, at least it has stamina…
Photo by Martin Snopek, and used under a Creative Commons licence
James Harding on journalism today:
But these new formats are creating exciting competitors to the well-known 2 minute 15 item on a news bulletin or the 450 word article in a newspaper. Look at Now this News, which delivers the news in 6-, 15- and 30-second videos, easily shareable and chiefly for mobile; or take a look at Geofeedia, which enables you to follow a story on Twitter by the location of the tweeter; or use Touchcast, which creates a Minority Report-style smorgasbord of interactive screens on the iPad, and imagine the possibilities of creating your own multi-media news story; and watch a vlogger such as Philip DeFranco or JacksGap and it is clear that, thanks chiefly to Bill Gates, Steve Jobs, Sergey Brin and Larry Page, the modern computer has more or less put the combined power of the TV studio and the newspaper printworks in the hands of any imaginative individual – and to dizzying effect.
So where does the opportunity for the professional journalist lie?
Yes, breaking news channels, websites and tweets need to be fast, but slow, disciplined and meticulous investigations as well as considered and patient analysis mark out the very best newsrooms. Whether it has been The Sunday Times’ long pursuit of Lance Armstrong, Channel 4’s dogged investigation into Plebgate, the Mail’s tireless campaign on Stephen Lawrence, they have excelled thanks not to speed but time. And, in my experience, whether it has been coverage of child sex grooming, the family courts, adoption or tax avoidance at The Times or, more recently, showing first-hand the assault on civilians in northern Syria or exposing the bloody work of the Military Reaction Force in Northern Ireland, these projects have always taken longer than expected and been better for it.
Fascinating. If only there was a clear business model for slow, investigative journalism – but that’s not really something the BBC has to worry about…
The whole speech is well worth reading – the stuff on how the BBC is using social media verification is well worth a look, for example. Anyone come across audio or video of it yet?
Fascinating blog post exploring how the BBC is experimenting with linked data:
After producing a long list of possible ‘problem spaces’ we prioritised four areas to explore:
- Location and linked data. How might we use geolocation and linked data to increase relevance and expose the coverage of BBC News?
- Events and linked data. How might we make more of BBC News ‘events’ using Linked Data?
- Politics and linked data. How might we better contextualise and promote BBC’s Political coverage online using linked data?
- Responsive Desktop. How might we overcome older browser challenges to get BBC News’ responsive service to desktop browsers?
So the question was ‘how might we tag the BBC News archive with linked data and expose this data source for prototyping?’
Data journalism is one thing, using the structured data that should be inherent in your content to present it in new, interesting and (most of all) useful ways to site visitors is quite another. It’s the beginning of serious rethinking how we present news in the digital era.
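To make that idea concrete, here’s a minimal sketch of tagging content with linked-data concepts and then pulling back related coverage by concept. The URIs, titles and structure below are invented for illustration – this is not the BBC’s actual ontology or API.

```python
# A toy sketch of concept-tagged articles: each article carries "about" URIs,
# and inverting those tags gives a concept -> coverage index of the kind
# linked data makes possible. All identifiers here are hypothetical.
from collections import defaultdict

articles = [
    {"id": "/news/1", "title": "Budget analysis", "about": ["http://example.org/concept/economy"]},
    {"id": "/news/2", "title": "Derby report", "about": ["http://example.org/concept/football"]},
    {"id": "/news/3", "title": "Eurozone update", "about": ["http://example.org/concept/economy"]},
]

def build_index(items):
    """Invert article -> concept tags into a concept -> article-ids index."""
    index = defaultdict(list)
    for item in items:
        for concept in item["about"]:
            index[concept].append(item["id"])
    return index

index = build_index(articles)
print(index["http://example.org/concept/economy"])  # ['/news/1', '/news/3']
```

The same inversion is what lets a site surface “all our coverage about X” automatically instead of hand-curating topic pages.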
Charlie Beckett makes an insightful point about last night’s Panorama covering the BBC’s multiple failures to expose Jimmy Savile as a paedophile and sexual predator:
Likewise, let’s put the Newsnight controversy in perspective. The Newsnight debacle is familiar to anyone who’s seen editors under pressure make unfortunate decisions in a culture where risk is rarely rewarded and where ‘brave’ is a term of mockery. I doubt there was any direct attempt by senior bosses to kill that film.
It’s very easy to go hunting for people to blame in a situation like this. Far more often it’s the culture to blame – but a culture is still made up of people. A lot of people. In many ways, that’s worse.
British journalism has a lot of questions to ask of itself.
How are news organisations dealing with stream publishing? Kathryn Corrick directs the flow…
Jason Mills, editor, web for ITV News: the ITV site is built on a stream. It shows that you don’t have to be a station to have a news channel.
Raju Narisetti, managing editor, Wall Street Journal Digital Network: It’s a flowing list of content – you name it, it’s in the stream. We’re doing more of it because their competition is not the other sites – it’s readers’ time, their one non-renewable resource. Streams give them more for their time. And people are used to seeing discrete periods of time for events – why can’t news coverage happen in the same way?
Patrick Heery, UK editor, BBC News website: Streaming news has always been part of their operation – but Ceefax is closing. So they’re doing it in new ways. They’re mixing up video, content from correspondents and the best of the user content. They’re hugely popular with the audiences. They’ve reorganised the newsroom to bring the Twitter writers into the heart of the newsroom.
Pete Clifton, executive editor, MSN: They can now switch to liveblogging when they need to – either doing it themselves or through the Press Association. They need to pick their battles, though. They know they’re good at entertainment, so they target those occasions. They should also think about innovative things, to balance their lack of scale. They’ve started doing a live trending blog on the front page, driven by signals from social media. They want to make live sports coverage more interactive and involve the audience. But simplicity can be the key on a live page, especially on mobile.
Ben Schneider, senior director and general manager for CoveritLive, Demand Media: It’s difficult to make sense of the vast firehose of information coming at you. That’s why you turn to the big players for a story. They want to bring those streams into one place where they can be shaped by the journalist or editor.
Raju: Google wasn’t really indexing the stream – it didn’t see it as a single piece of content. Plus, 40% of their visitors go straight to the home page. What do they do when both a stream and a popular branded blog are covering the same event? They run the stream on the page, and a link to the blog as well.
Jason: Not that I can think of yet. Their stream isn’t an add-on – it is the site. Can it be done with all stories? Pretty much. Stories develop. We just open-source our journalistic notebook. But your publishing tool has to be very fast – it was built by Made by Many.
Ben: There was a liveblog covering pro-Obama issues. Their automated Twitter hashtag importing accidentally started pulling in anti-Obama comments. They quickly removed them, but they realised that unfettered access may not be the answer.
Pete: Accidental obituary releases. Large “Hardon” Colliders.
Patrick: It’s more complicated than it should be to start up live pages.
Tips from the audience on B2B liveblogging:
- Lots of prepublicity
- Work in co-operation with the audience
- Open a dialogue
- Use tools that work in low bandwidth.
Pete: Pin the key points to the top, so people can get a quick picture of what’s happening.
Ben: It’s very contextual to the event: photos are vital to an Apple event, for example.
What’s the next big innovation in liveblogs?
Ben: What is everyone asking for? One is more data. That’s a theme for everything. And how can they engage people more? We need more intuitive ways of filtering through massive amounts of content.
What’s the influence of sport on liveblogging? What might emerge from the Olympics?
Jason: We didn’t look at sport, we looked at how people consumed news in general; the Arab Spring etc.
Raju: Sports has been less of an influence to us – for us, it’s been the way markets are covered. The elements that make for a good stream are sometimes not available for sport because of rights issues – the video and the audio.
Pete: The sports people were pioneers in showing how you can write copy in a way that compels people even without video. Not everyone can do it. Sport really pointed the way at the BBC.
Ben: Sports is a huge part of what CoverItLive is used for. The Olympics presents a unique challenge. Coming from across the pond, there’s a delay effect (for the US audience). They want time-shifted streams. Give them the opportunity to see it again.
Everybody is building live platforms – have any of you figured out the monetisation piece? We know users are super-engaged, but we struggle with metrics.
Conrad Quilty-Harper: Engadget used sponsorship – they knew how many people would be coming. There should be more people liveblogging from the field. People today want to sit in offices doing it – not at the event. (I beg to differ, sir – I’m typing this in the field. ;-))
Raju: Time-span becomes a relevant measure again. People spending more time on the site means more ad impressions, which are good for us.
Jason: Two models: banner ads and sponsorship. The tagging enables sponsors to sponsor certain parts of the stream.
Pete: The story we tell advertisers is people coming back to the site and staying longer – it’s not a specific sell, but live is part of that. If we can bring all the live elements together, it will be a great place to look for sponsorship.
Chris Hamilton, BBC: Is the article dead at the hands of live digital streaming?
Ben: No. But it is certainly secondary if not tertiary. But there will always be the case where people need to rebuild context around something.
Pete: No, you have to offer the choice. Some people just want a well-written, concise version of what occurs.
Patrick: No. There are lots of live football reports – but a match report at the end of it.
Raju: I would be very cautious about streams that people just watch rather than engage with – because then your business model goes away.
Jason: Our audience doesn’t distinguish. They don’t mind. We are looking at different ways of telling stories using the stream.
A packed and hot room for a panel on the current state of publishing on mobile. Katie King back in the chair.
Kate Milner, mobile product manager, BBC News
Tablets and mobile are changing how people are accessing BBC News content. Traditionally, they’d focussed on the lunchtime peak of desktop. But tablets are bringing us huge traffic peaks in the evening, and mobile in the mornings. They’ve been on mobile for two years – 12m app downloads globally. People expect better services from them in apps – but it’s a complicated landscape. Browsers are getting more capable, and the number of devices people are using is growing.
They’re shifting to responsive HTML5 web design – the website automatically adapts to show more content as screen size increases. As the device gets faster, they can offer better quality video. They update the site’s codebase every two weeks. They can customise by capabilities – or can do it by geolocation on mobile devices. They’re working on richer advertising for outside the UK, and continuing to optimise for tablets. They’re working their way up to the desktop, and will completely replace the existing site at some point. They know for sure, thanks to responsive design, that their site will just work on the newly-announced Nexus 7.
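The “show more content as screens grow” logic boils down to breakpoint tiers. A toy sketch of that decision, with breakpoint values and tier names invented for illustration (they are not the BBC’s actual breakpoints):

```python
# Pick a layout variant from the viewport width, as a responsive design would.
# Breakpoints and tier names are hypothetical examples only.
def layout_tier(width_px: int) -> str:
    """Return a layout variant for a given viewport width in pixels."""
    if width_px < 600:
        return "single-column"   # phones: headlines only
    if width_px < 1024:
        return "two-column"      # tablets: headlines plus summaries
    return "full-desktop"        # desktops: full index with images

print(layout_tier(480), layout_tier(800), layout_tier(1280))
```

In a real responsive site these tiers live in CSS media queries rather than server code, but the principle – one codebase, content scaled to the device – is the same.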
They’re not abandoning apps – the marketing opportunity around big events cannot be ignored. They see spikes of downloads around big news events.
Robert Shrimsley, managing editor of FT.com;
The story of the FT leaving the Apple app store has been often told. Their fundamental principle is that they want to be available everywhere their readers are. They’re not in a hurry to commit to being available through applications like Flipboard or Zite – but hope to do it. It might not be their optimal way of delivering the content, but if it’s what the readers want, they want to deliver it that way if they can within their business model.
They mine data religiously. They have so many dashboards that it’s staggering. They almost have a data overload situation. There’s an advertising benefit as well, as they target ads. But they can also customise experiences and target stories.
The iPad app has changed the audience’s relationship with the paper – they now treat it as a weekend read, too. So they’re changing what they do to adapt to that. The Daily is failing – he thinks it’s because it’s form over function. The Week made the mistake of updating daily. Their raison d’être is weekly. And The Economist is a digital representation of the magazine and nothing else. You can learn from all of this. The challenge is to make the product they have give the best experience they can on new platforms. The one core difference is on functionality – you need to make sure it’s up to snuff. Make it easy to share, e-mail and comment.
They see the iPad version as a hybrid. They produce a dynamic version which is up to date with their US rivals. Their newsrooms internationally don’t just own their market – they own their time zone, and can update the content in the app during their “awake” period. Focus on your core purpose and everything else will take care of itself.
Subhajit Banerjee, mobile editor, Guardian
32% of their daily traffic comes through mobile. As so many people have said today, different devices at different times of day. Subhajit was a bit hijacked here – many of his slides had appeared at the business model session this morning, or paralleled earlier in the session.
Interestingly, though, there’s a dramatic swing to mobile at weekends, which has not been discussed before.
- Best products for different times of the day
- Editing for multiple platforms
- Understanding the user
Four visions of data journalism, moderated by Kathryn Corrick, digital media consultant.
Bella Hurrell, specials editor on the BBC News website
The BBC specials team produces a whole range of added-value content for the BBC website. They’re becoming part of a visual journalism team at the BBC. Data journalism can be long slow projects, but not all of them. You should pick subjects that have a shelf life – road traffic accidents, unemployment, that sort of thing. Update the data and people will keep coming back. Make tools they will want to keep using. Build sharing into it.
They did a project plotting road traffic fatalities through FOI requests – the map was the most popular element, because it allowed people to understand what the situation is where they lived. They also visualised some of the most interesting data – for example, bikers are 21% of fatalities, compared to 1% of traffic. They liveblogged every accident on one day to bring publicity to it, to help amplify the data and give it more life. It was really popular and followed nationally. Their military deaths in Afghanistan page gets traffic every time there’s a new death. Their unemployment tracker gets updated monthly and gets a steady stream of traffic.
Visualisation helped bring a dry subject like the Eurozone debt web to life. They had comments open, and responded to the issues raised. “People really appreciated it”. The key seems to be a double-whammy of personally applicable information that is also globally relevant.
Claire Miller, senior reporter and data journalist, Media Wales
From global to very, very local… The bread and butter of what Media Wales do is government data. The focus is still stories for the paper, so they’re reacting to what is released by government, and finding stories in that. Beyond the day to day, it’s a lot of FOI data used to create stories. With FOI you can get the data you want at the level you want. For example, she asked exactly when and where all the parking tickets were handed out in Wales. They visualised it using Tableau. A&E visits by location, not surprisingly, increased the nearer you lived to the hospital, and that showed up well on a map. Mapping empty homes allows quick identification of hotspots.
With open data, more and more stuff is being published, so there’s lots of potential.
People look for specific things, as uses of the data store show – local elections, the Olympic torch relay and sport. Education bubbles along all the time… They ended up making their own Olympic Torch map, because they couldn’t embed the official one easily. It went crazy. It was the most popular thing on the site. Wales lacks the easy access to school performance information available elsewhere in the UK. Media Wales gathered everything they could find, pulled it into one app, and let people access it.
And anything with rugby in it is popular…
Damian Kimmelman, CEO, Duedil
Everything that consumes electricity will inevitably be connected to the internet. And that means it will leave a data trail. And they are, in his words, “data whores”.
Duedil is a site for examining the state of companies using available data (Martin did an excellent write-up of their Hacks/Hackers talk). They’ve had acquisition offers of £20m – and turned them down. They’re still seed funded. And they’re still finding new ways of making data more interesting and useful. They’re planning on launching a facility called lists. Create your lists of types of companies, and use the data to find new ways of tagging, categorising and analysing the companies. But they need more information. Mapping the companies around your social graph – will that show whose companies have changed dramatically over the last few years?
Hierarchy of needs for data: it needs to be clean – deduplicated and usable. It needs to be findable – and linked.
Users need to know the provenance of data – who touched it, who keyed it in? Did the accountant make a mistake? The more people touch data, the more imperfect it becomes. It’s important to understand the authority of a dataset.
James Ball, data journalist working for the Guardian investigations team
He’s a reporter, dammit Jim, not a designer. Whatever you’re trying to do – there’s a dataset you can buy, open, assemble or FOI. But that’s a bit like saying there’s someone who knows the key to your story – how do you find them? He wants to challenge the idea of “from data to story”.
There are all sorts of caveats when you’re using data from surveys and censuses. Investigating the stat used as the basis of a Diane Abbott comment piece led to the exposure of a bigger story – a disproportionate rise in young black male unemployment – which hit the front page. Sometimes readers will stimulate investigations – people claiming in comment threads that the vacancies advertised in job centres are not real, or are zero-hours contracts, or the like. They tried to scrape the relevant data from a government site – but it had protections in place. So he had to FOI it – and got the data, albeit heavily encrypted (but they phoned the password over). It was very messy, inconsistent data.
This wasn’t a story from digging around in data. This was questions from the readers and a comment piece which they could answer with data. Do you ring around your sources and ask them for a story? If you’re doing that, you’re doing something wrong. Don’t do that to data, either. Talk to humans, look at news, and then ask the right questions of your data. This argues for not having data journalists in silos. Don’t just keep them in offices looking at spreadsheets…
The project started in 2009 and posed the question: how might cities evolve in open data environments? He’s not interested in efficiency and transparency – he’s a bit suspicious of them. They’re about control. He’s more interested in openness – and an ecology of open data that allows you to do cool stuff.
They targeted useful data, and showed that useful things could come out of it – which led to DataGM to free up Greater Manchester’s public data.
But they were faced with challenges – they didn’t want it to just be an inwards thing within local authorities. The innovation argument wasn’t taking hold, and a market wasn’t opening up. Do cities like Brighton and Manchester have the scale to build useful things out of open data exchanges? Maybe not. Now they’re looking at CitySDK – a European project for an open data market based on standardised civic data.
Does open data lead to an open society? Maybe. You need data literacy, so citizens can make more use of it. It’s the preserve of an elite right now. Data arts is one approach to this. It can help demystify data. Emoto – a data visualisation for London 2012.
What they’re working towards is a Digital Public Space. The Creative Exchange is a step in that direction.
If we don’t shape the future we want, we’ll get the future we deserve.
Bill Thompson, head of partnership development, archive development at the BBC
What is the nature of this shared hallucination we’re all about to engage with? Where the physical and virtual space merge… We’ve seen closed data cities. Facebook is one. Do we have a good vision of an open data city? What will that liminal space we occupy be like? We’re at the start of the process of building those cities.
We’re at the stage of identifying the swamp, cutting down the trees and putting logs into the swamp – to make a comparison with Venice.
Bill spent 15 years as a freelance hack, but he was seduced by an offer to help with the BBC’s archive. What’s the most they could get out of the stuff that the BBC forgot to throw away? He got to play with the archive – boxes of documents, records marked “not to be played”. The BBC was set up as a cartel between six manufacturers of radios, to sell more radios. The government required everyone to have a licence for it, to regulate it – and then decided to nationalise it. It needed to be looked after. Since then the BBC has been there to act in the public interest – but the detail of how it has done it has changed. Not everything has been kept – until about 1980 the BBC viewed the magnetic tape as more valuable than the programmes, and wiped them.
In that archive they may have footage of you as a child. Or of your parents. But can you find it now? No.
A lot of it is on paper – and there are now plans to have it digitised. There’s an enormous amount of stuff in the BBC no-one outside knows exists, so how can people ask for it? That’s his group’s job.
And the archive isn’t just “old stuff” – it’s everything that it’s recorded about itself, up until the show that just finished transmitting. A lot of the thought has been about “outputs” – complete shows. But there will be tracking shots of buildings that no longer exist, or dead people’s voices within those programmes. It’s more interesting to think of it as a collection of data – frames, chunks of programmes. If they can be digitised and catalogued, it becomes a data repository with an API. The BBC becomes a massive factory for making cultural product that people who understand RDF and XML can make use of.
Wouldn’t it be great if you could match BBC film or photos with local authority data about buildings? Wouldn’t it be great if you could match politician data with every appearance they’ve made on the BBC? And then you can find out how often they contradict themselves…
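That kind of matching is just a join over a shared identifier. A toy sketch of linking politician records to archive appearances – every name, ID and URI below is made up for illustration, not real BBC or government data:

```python
# Hypothetical datasets: a politician register and a list of catalogued
# archive appearances keyed by politician ID. A join on that ID answers
# "every appearance this politician has made".
politicians = {"p1": "A. Politician", "p2": "B. Member"}

appearances = [
    {"politician": "p1", "programme": "/archive/programme/7", "date": "1987-03-02"},
    {"politician": "p1", "programme": "/archive/programme/9", "date": "1992-06-11"},
    {"politician": "p2", "programme": "/archive/programme/7", "date": "1987-03-02"},
]

def appearances_for(name):
    """Join the register to the appearance list on politician ID."""
    ids = [pid for pid, n in politicians.items() if n == name]
    return [a["programme"] for a in appearances if a["politician"] in ids]

print(appearances_for("A. Politician"))  # ['/archive/programme/7', '/archive/programme/9']
```

The hard part in practice isn’t the join – it’s cataloguing the archive and agreeing identifiers across organisations so the join is possible at all.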
They want to bring all this into the Digital Public Space, while respecting the copyrights that exist. At least the catalogue can be there, and then the content as the rights issues are resolved. They’ve been working with partner organisations for a couple of years now. Lots of public cultural organisations – the British Museum, the BFI – are having similar ideas.
But… the BBC doesn’t know what it has broadcast. So they’re creating Genome, by scanning and digitising the Radio Times – the best record of what the BBC planned to broadcast, back to 1923. It’s been written about extensively on the BBC blogs.
They’re building a prototype digital public space navigator with partners, to try and navigate user journeys through the digital public space. This is a way of proving internally that it makes sense to work with those partners.
BBC Redux is iPlayer on steroids – with an API. It records the whole digital multiplex, so they can identify problems. Each show has an individual URI, which allows them to build tools on top of it. Snippets allows you to do a full text search of five years of BBC programmes, based on subtitles data. But the rights are complicated – because the BBC doesn’t own many of the programmes it broadcasts.
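A subtitle-based search like Snippets can be sketched as an inverted index from words to programme URIs. The subtitle lines and URIs below are invented examples, not real Redux data:

```python
# Sketch: full-text search over subtitle data via an inverted index mapping
# each word to the set of programme URIs whose subtitles contain it.
def build_subtitle_index(subtitles):
    """Build word -> {programme URIs} from per-programme subtitle text."""
    index = {}
    for uri, text in subtitles.items():
        for word in set(text.lower().split()):
            index.setdefault(word, set()).add(uri)
    return index

def search(index, word):
    """Return the programme URIs whose subtitles mention the word."""
    return sorted(index.get(word.lower(), set()))

subtitles = {
    "/programme/abc": "the history of the Euston Arch",
    "/programme/def": "tonight we discuss the economy",
    "/programme/ghi": "the Euston Arch was demolished in 1962",
}

index = build_subtitle_index(subtitles)
print(search(index, "Euston"))  # ['/programme/abc', '/programme/ghi']
```

Because each programme already has a URI, search hits can link straight back into the recorded archive – which is what makes the subtitle stream so valuable as data.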
TheSpace.org – a joint BBC/ACE digital art intervention launching in May 2012. There’s an underlying data structure beneath all the artworks – so everything is catalogued properly. Every piece of work will have a data feed as well as an asset feed. They’re using ACE funding to try out BBC open data theories, which will feed into the digital public space.