A trade journal of a still-emerging field, written by Adam Tinworth.

Posts tagged community management

The Financial Times is using comments to engage in a constructive discussion around Brexit.

Lilah Raptopoulos, community manager at the FT:

“Creating a hub where it was clear that we were asking and listening really improved the quality of the comments that came out, because people had fuller ideas and thoughts, and they were more personal.

“We believe in moderated comment sections and I think they are one of the most direct connections we have with our readers.”

Compare and contrast with this news from IDG:

We’re no longer taking comments on our websites —including CIO.com, Computerworld.com, CSOonline.com, Greenbot.com, InfoWorld.com, JavaWorld.com, Macworld.com, NetworkWorld.com, PCWorld.com, and TechHive.com. Instead, we’re encouraging readers to interact with us on our social media outlets, such as Twitter, LinkedIn, and Facebook.

It’s the familiar message: “goodbye, comments, why don’t you all pop off to social media”. This industry’s willingness to hand its reader relationships over to Facebook lock, stock and barrel concerns me.

Killing the community manager

Jason Snell, a former IDG employee, celebrates the decision, but makes an interesting aside:

We used to have a dedicated community manager, but that position had been eliminated years before and editors were forced to act as moderators in their “spare time.”

And there’s an even more familiar tale: taking away dedicated community management resources undermines any effort to create a positive community experience for readers. As John Gallant, Chief Content Officer of IDG US Media, says in the announcement post:

Second, while we’ve always valued comments, we’ve also had to deal with the reality of managing spam and policing inappropriate comments—comments that don’t reflect the professional nature of our audiences and diminish the value of community interaction. Moving the discussion to social media obviates those issues.

So does hiring community experts like Lilah Raptopoulos. But that costs money. If you won’t put your money and time into reader relationships, you don’t care about reader relationships.

This is uncomfortable viewing. Men – volunteers – read out nasty and harassing tweets targeted at female sports journalists – to those journalists.

It starts off fairly light, but gets darker as it goes on, and the guys get really uncomfortable with what they have to read out.

An interesting way of showing the impact of internet abuse, once the distance between perpetrator and victim is removed.

The main lesson of Boaty McBoatface

Nat Torkington:

[…] you want opinions, but you also want committed opinions. Your poll/survey/vote will erect (or fail to erect) barriers to participation, and those barriers represent a measure of commitment. No barriers = lots of votes, but high risk of Boaty McBoatface. High barriers = few votes, but from those who care.

It’s basically another example of that classic measure of potential abuse in any online community: time to penis. Because, in any free-for-all community submission process, you’re always going to end up with a picture of a penis. To rewrite the above:

No barriers = lots of participation, but high risk of unsolicited penis

Trolling – hostile, provocative anti-social behaviour – is one of the biggest challenges to any large-scale online community – and that includes comment sections on mainstream publications.

The problem is far, far bigger in the online gaming world, though. And one of the biggest games in the eSports sector – League of Legends – suffers particularly badly. The game’s publisher – Riot – is fighting back with huge studies, conducted with academic rigour, and shared with the academic community:

“We let loose machine learning,” Lin says. The automated system could provide nearly instantaneous feedback; and when abuse reports arrived within 5–10 minutes of an offence, the reform rate climbed to 92%. Since that system was switched on, Lin says, verbal toxicity among so-called ranked games, which are the most competitive — and most vitriolic — dropped by 40%. Globally, he says, the occurrence of hate speech, sexism, racism, death threats and other types of extreme abuse is down to 2% of all games.

2% is still substantial, but the approach here is certainly one community managers across the journalism world could learn from.
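The Nature piece doesn’t detail how Riot’s system is actually built, but the shape it describes – a classifier scoring reported chat, then near-immediate feedback to the offending player – is straightforward to sketch. Here’s a minimal, purely illustrative version in Python; the toy word-list “classifier”, the thresholds and the function names are all my assumptions, not Riot’s pipeline.

```python
# Purely illustrative sketch of the feedback loop the Nature piece describes:
# an abuse report arrives, a classifier scores the reported chat, and the
# player gets feedback within minutes rather than days. The "classifier",
# thresholds and names here are assumptions, not Riot's actual system.

from dataclasses import dataclass
from datetime import datetime, timedelta


@dataclass
class AbuseReport:
    player_id: str
    chat_excerpt: str
    reported_at: datetime


def toxicity_score(text: str) -> float:
    """Stand-in for a trained classifier; returns 0.0 (clean) to 1.0 (toxic)."""
    toxic_terms = {"idiot", "uninstall", "trash"}  # toy word list, not a real model
    words = [w.strip(",.!?").lower() for w in text.split()]
    hits = sum(1 for w in words if w in toxic_terms)
    return min(1.0, 5 * hits / max(len(words), 1))


def handle_report(report: AbuseReport, now: datetime) -> str:
    """Score the reported chat and respond; the article suggests speed of feedback matters."""
    if toxicity_score(report.chat_excerpt) < 0.5:
        return "no_action"
    if now - report.reported_at <= timedelta(minutes=10):
        # Feedback inside the 5-10 minute window the article associates
        # with a 92% reform rate.
        return f"immediate_warning:{report.player_id}"
    return f"delayed_warning:{report.player_id}"


if __name__ == "__main__":
    report = AbuseReport("player42", "you are trash, just uninstall", datetime.now())
    print(handle_report(report, datetime.now()))  # immediate_warning:player42
```

If the quote is anything to go by, the interesting part isn’t the classifier at all – it’s the latency. The same warning delivered within ten minutes of the offence, rather than days later, seems to be what moved the reform rate.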

Twitter’s fundamental problem is that it has got seedy

Julieanne Smolinski:

Twitter is like a beloved public park that used to be nice, but now has a rusty jungle gym, dozens of really persistent masturbators, and a nighttime bat problem. Eventually the Parks Department might rip up the jungle gym, and make some noise about fixing the other problems, because that’s what invisible administrators like Twitter staff and municipal recreation departments tend to do. But if the perverts and the bats got to be bad enough with no recourse, you’d probably just eventually stop going.

Yes. That’s exactly it. And the problem is that the very presence of these people changes the behaviours of others. They become more uptight, vigilant and careful. It makes debate stilted and uncomfortable. And you live in quiet dread of being sealioned.

And this is insightful, too:

(Additionally frustrating is that everybody is complaining about the safety issues at the park, and instead of addressing them, the city installs a crazy new slide. What? Nobody was calling for that. What about the perverts? What about the bats?)

It does feel like Twitter is putting a lot of time and effort into making its service more appealing and easy to use for newcomers – while ignoring the major challenge for existing users.

Talking of services with two tiers of users, Twitter is making life nicer for its elite “verified” users:

Quality filter allows verified users to hide tweets in notifications containing threats, offensive or abusive language, duplicate content or that are sent from suspicious accounts, similar to the old “filtered” option users had previously. The renamed feature appears to be still rolling out.

Useful for depriving trolls of some of the response they’re looking for – namely “big” names seeing their bile. But as yesterday’s online fracas shows, non-verified users get abuse, too. Just a first step there, Twitter.

[via The Next Web]

Douglas Boulton, one of this academic year’s crop of Interactive Journalism students at City, has just finished a couple of weeks as Ben Whitelaw’s personal coffee table doing shifts on The Times’s community desk, and he’s shared his experiences:

I’m well aware of the bile that comments sections online are often dripping with, and honestly I was expecting my two weeks of moderating to be a fairly harrowing experience. Fortunately, you guys are alright, really. I don’t know if it’s something to do with the fact that The Times is a paywalled site, but by and large, 95% of you are respectful, rule-abiding, and most importantly, interesting in what you comment.

Not quite what I expected, either. One of the interesting things about The Times right now is that it’s one of the biggest experiments in building community behind a paywall, and that leads to some interesting side-effects. Maybe people won’t pay for the privilege of being arseholes online?

So please, when I give you a warning because you’ve libelled someone with your comment, relax for a minute and think of me sitting in a lonely office half way through a nightshift and a bit sweaty from my fifth cup of coffee, before you send me a furious email in which you call me a “jumped-up little c***.” Cheers.

Well, OK, apparently some of them will…

Once, long ago, when the world was dark, and I was stuck living in Lewisham, I was features editor of a magazine called Estates Gazette. We wrote about the world of commercial property, and one of the things I did was commission expert comment, including some features about property marketing and branding from one Kim Tasso.

She recently took Hazel and me to lunch (a brave thing to do with a toddler), and interviewed me in the brief gaps when my daughter was distracted by other things.

The result? Some thoughts on community development, content strategy and the commercial real estate business.

Worth a read, if you’re interested in the intersection of online community and B2B publishing amongst the professions…

What happens when journalists interact with the comments section?

Over a study period of 70 days, the TV station reacted to comments on its Facebook page in one of three ways: a prominent political reporter interacted with commenters; the station, using a generic station logo, interacted; or no one interacted.

The results showed that when a reporter intervened in the comment section, the chance of an uncivil comment – defined as obscene language, name calling, stereotyping and exaggerated arguments – declined by 15 per cent compared to when no one did so.

I’ve been teaching this as best practice for years now – based on experience and anecdotal evidence collected from friends working in full-time community management. Nice to see some research starting to emerge that backs up that experience.

Ouch:

The organizations that have the idea for a community, spend weeks selecting a platform, months developing it, and a year before they invite anyone to participate, tend to struggle…a lot. Typically they splutter along for six months before being mercifully cancelled.

I bet anyone who’s worked in community development within any sizable publisher is wincing right now.