The birth of real-time research

Social media offers the potential to research political events as they happen. Carl Miller and Bobby Duffy report back on their live analysis of the EU debates.

Early in April, Nigel Farage and Nick Clegg clashed over the bitterly contested political territory of Europe. For a long hour, in front of a prime-time television audience, they fought for that most precious of all currencies to politicians – the hearts and minds of the British public.

And as they spoke, many thousands spoke back. During the debate, people flocked to #europedebate on Twitter to follow its twists and turns together. Social media platforms like Twitter and Facebook are fast becoming our digital versions of the Greek Agora – central places of assembly and interchange, important points where we join our social and political lives to those around us and where we bring difficult issues to be publicly examined.

On debate night, CASM teamed up with Ipsos MORI to listen to the 65,090 chants, cheers and boos that flooded onto Twitter about the debate. Using technology developed by CASM, we sculpted algorithms to draw out the meaning from this deluge. We wanted to understand this new space, to see what it could tell us about what society thought, felt, feared and loved. So what did we hear?

We heard boos drown out cheers. From the beginning of the debate to the end, people used Twitter to boo and jeer at both contestants. The debate was not, at least on Twitter, a watershed victory for either Farage or Clegg; it was an opportunity for the gathered multitudes to hurl digital tomatoes at two mutually disliked public figures.

Source: Steve Ginnis, Ipsos MORI

Even the party faithful winced: 2,732 tweets came from people who, we judge, overtly identified with the Lib Dems on their public Twitter profile. 338 of these were boos – a little over 12 per cent. On the same measure, 1,680 tweets came from UKIP supporters, and 9 per cent of these were boos for Farage. You don’t get many words in a Twitter profile to tell the world who you are, and using some of them to mention a political party suggests a strong rather than casual link: not too far away from one in ten people booing their leader’s speech at a party conference.
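Spelling that arithmetic out is straightforward. The sketch below is an illustration only (the field names, keyword lists and sample data are invented, not the pipeline used on the night), but it shows one way to pick out self-declared supporters from profile text and work out what share of their tweets were boos.

```python
# Illustrative sketch only: field names, keyword lists and sample data are
# invented; they are not the pipeline used on debate night.
PARTY_KEYWORDS = {
    "libdem": ("lib dem", "libdem", "liberal democrat"),
    "ukip": ("ukip",),
}

def declared_party(profile_text):
    """Return the first party whose keywords appear in a Twitter profile, if any."""
    text = profile_text.lower()
    for party, keywords in PARTY_KEYWORDS.items():
        if any(keyword in text for keyword in keywords):
            return party
    return None

def boo_rate(tweets, party):
    """Share of tweets from self-declared supporters of `party` classed as boos."""
    from_supporters = [t for t in tweets if declared_party(t["profile"]) == party]
    if not from_supporters:
        return 0.0
    boos = sum(1 for t in from_supporters if t["label"] == "boo")
    return boos / len(from_supporters)

# Made-up mini-sample: one boo out of two Lib Dem supporter tweets, i.e. 50%.
sample = [
    {"profile": "Proud Lib Dem member", "label": "boo"},
    {"profile": "Liberal Democrat activist", "label": "cheer"},
    {"profile": "UKIP to the core", "label": "cheer"},
]
print(f"Lib Dem boo rate: {boo_rate(sample, 'libdem'):.0%}")
```

On the night, the real counts were 338 boos out of 2,732 tweets from self-declared Lib Dem supporters, which is where the ‘little over 12 per cent’ comes from.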

We heard personality trump politics: as Twitter commented on, elaborated, booed and (occasionally) cheered the debate, many more tweets focused on Clegg and Farage as people – on how they sounded and looked, whether they were sweating, and how they were standing – than on Europe or policy itself. This was less an opportunity for a sober analysis of Britain’s place in Europe than it was Wednesday evening’s entertainment – a spectacle not to be taken too seriously.

Figure 2: personality versus politics
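Both figures rest on the same basic step: assigning each tweet to a category. The algorithms built for the night are not reproduced here; the sketch below, with invented keyword lists, is only a crude illustration of what turning free text into countable categories looks like at its simplest.

```python
# Deliberately crude illustration: the keyword lists and scoring rule are
# assumptions for this sketch, not the algorithms used on debate night.
BOO_WORDS = {"boo", "awful", "useless", "embarrassing"}
CHEER_WORDS = {"brilliant", "well said", "great point"}
PERSONALITY_WORDS = {"sweating", "nervous", "smug", "tie", "voice"}
POLITICS_WORDS = {"immigration", "trade", "sovereignty", "jobs", "brussels"}

def score(text, vocabulary):
    """Count how many phrases from a vocabulary appear in a tweet."""
    text = text.lower()
    return sum(1 for phrase in vocabulary if phrase in text)

def classify(tweet):
    """Return (reaction, topic) labels for a single tweet."""
    boos, cheers = score(tweet, BOO_WORDS), score(tweet, CHEER_WORDS)
    reaction = "boo" if boos > cheers else "cheer" if cheers > 0 else "neither"
    topic = ("personality"
             if score(tweet, PERSONALITY_WORDS) >= score(tweet, POLITICS_WORDS)
             else "politics")
    return reaction, topic

print(classify("Boo! He looks nervous and he's sweating already #europedebate"))
# -> ('boo', 'personality')
```

Whatever the real algorithms look like, the output is the same kind of thing the figures above count: each tweet assigned to one of a handful of categories.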

But we learned something else too: we could hear the roars and chants of the digital arena in real time. With new technology, and working with leading academics, we could build the algorithms quickly enough to understand Twitter’s reaction to the debate as it unfolded. We were beginning to research and measure attitudes as soon as they were expressed, listening to Twitter as tweeters listened to the debate.
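What made the night different was doing that while the debate was still running. The sketch below simulates the stream and uses a one-line stand-in for the fuller classifier above, but it shows the basic shape of a real-time pipeline: consume tweets as they arrive and publish a running tally straight away.

```python
import time
from collections import Counter
from datetime import datetime

def simulated_stream():
    """Stand-in for a live Twitter feed; a real pipeline would read from the
    streaming API instead. The tweets here are invented."""
    for text in (
        "Boo, answer the question! #europedebate",
        "Well said on trade #europedebate",
        "Why is he sweating so much? #europedebate",
    ):
        yield text
        time.sleep(1)  # pretend tweets arrive once a second

def is_boo(text):
    """One-line stand-in for the classifier sketched above."""
    return any(word in text.lower() for word in ("boo", "awful", "useless"))

tally = Counter()
for tweet in simulated_stream():
    tally["boo" if is_boo(tweet) else "other"] += 1
    # Publish the running picture immediately, rather than weeks later.
    print(f"{datetime.now():%H:%M:%S}  boos so far: {tally['boo']}/{sum(tally.values())}")
```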

Social research, including the study of attitudes, has always lagged behind real events. There has always been a gap between beginning the research process and learning its outcomes. Questionnaires and surveys often take days, weeks, even months or years to design, conduct, collect and assess. Participants must be found and recruited, and interviews transcribed and coded.

In long-term and academic contexts, this isn’t a problem. But it has limited how useful social research can be on the front line, informing decisions and influencing actions. Snap polls, exit polls and rapid surveys have all been attempts to confront the problem of delayed results, to make research faster and better able to produce answers while they are still useful.

They have had some significant successes, especially around large, scripted events like elections, but overall research into society has typically been conducted retrospectively: to understand something after it has happened, to assess the impact of an intervention or the outcome of a phenomenon. It has been useful for drawing out the lessons learned, for evaluating, for telling us how to do it better next time.

The rise of digital society is changing all this. From social media – our blogs, tweets and Facebook posts – to the cloud of transport, crime, economic and environmental data we now (often unwittingly) produce every day, we are creating a new kind of instant history of ourselves, an immediate chronicle of our social, political and intellectual lives.

Alongside this, with more powerful computers, faster connections and more capable algorithms, we are getting better and better at collecting, analysing and understanding this data more or less instantly too. The time needed to measure a society that increasingly conducts its business through digital, online means is collapsing – from months and years to seconds and minutes.

This means something important: the growth of a new kind of real-time social research. We can now study society not just after the fact – post facto – but also during the fact – per factum. We can begin to observe societal change not after it has happened, but as it unfolds.

Of course, what we heard during the Clegg-Farage debate was different to a poll, and in a conventional sense much less trustworthy. Based on the tests we ran, our algorithms could tell the broad meaning of a tweet – whether it was a boo or a cheer, or whether it was about politics or personality – about seven times out of ten.
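That ‘seven times out of ten’ is itself a measurement. The usual check, sketched here with invented labels rather than the test data actually used, is to have people hand-code a sample of tweets and count how often the algorithm agrees with them.

```python
# Illustrative only: both sets of labels are invented, and a real hand-coded
# sample would run to hundreds or thousands of tweets.
hand_coded = ["boo", "boo", "cheer", "boo", "cheer", "boo", "cheer", "boo", "boo", "cheer"]
algorithm  = ["boo", "cheer", "cheer", "boo", "boo", "boo", "cheer", "boo", "cheer", "cheer"]

agreements = sum(1 for human, machine in zip(hand_coded, algorithm) if human == machine)
print(f"Agreement with human coders: {agreements / len(hand_coded):.0%}")  # 70% on this toy sample
```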

Even when the algorithms get it right, the link between what happens on Twitter and what happens in society in general is uncertain. The profile of Twitter users, for example, is wildly different from that of the public as a whole, with barely any older or lower social class users. More than this, the content is hugely unbalanced, with a small number of users doing a lot of the talking.

Source: Ipsos MORI

But in many ways that’s the point. We can’t judge this type of analysis in the same way as we would a representative survey. It is a new way of asking what people think, and it produces new kinds of answers. Rather than asking a small representative group a series of questions, we sat back and listened to the natural discursive flow of an engaged body politic. This new method ducks a number of problems that have dogged conventional social research (people often have to be paid to participate, and when they do they often say what they think the researcher wants to hear, or simply generate views on the spot), and runs into a body of new ones.