Tomorrow the 5th Internet Governance Forum begins in full and if last year is anything to go by there will be a lot of social media buzz around. Last year I was supporting a group of young people and Diplo Foundation fellows to be social reporters at the event – using blogs, Twitter and video cameras to capture and share discussions. This year, the focus is on trying to make sense of a sometimes chaotic event and an overwhelming amount of content.
Engage Remotely, Connect Locally – The Internet Governance Forum has an amazing distributed participation infrastructure which means people are joining in sessions from right across the world (over 30 remote hubs are registered!), logging into WebCasts and chats, and able to send questions into the physical sessions.
As Ginger explains, participants connecting via the WebCast can bring a new set of perspectives to reporting of what has gone on – able to monitor multiple workshops and to more easily track back over transcripts and notes. However, it can be tricky for remote participants to ask follow-up questions to speakers outside sessions, or to catch the mood of the event from the conversations in the corridors.
So: we’re going to experiment with creating small teams following particular themes – made up partly of people following the WebCast from their own countries, and partly of social reporters physically at the IGF. These groups will be able to work together on creating reports of sessions, and summaries of key issues relating to the IGF themes.
The process will raise some interesting questions about how to integrate online and offline participation in an event – and already a number of ideas around specific language reporting are emerging.
Social Reporting Aggregator – I spent a lot of last week messing around in the innards of a Drupal install to build a ‘Social Reporting Aggregator’ which is capturing all the Twitter messages around IGF (at least those tagged #igf10) and as many blog posts and video clips as I can track down.
All this social media is aggregated in near real-time, and, using various APIs and tag extraction, it is categorised and has metadata attached to it. I’ve scraped a copy of the IGF timetable and used that to build a hierarchical taxonomy of sessions onto which particular tags and categories can be attached. All of which means it should be possible to present back most of the social media discussion around a specific session, or around a theme.
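The core of that tag-to-session mapping can be sketched roughly like this. It's a simplified picture of what the Drupal taxonomy does, not the actual implementation, and the session identifiers and tag sets below are hypothetical stand-ins for the scraped timetable:

```python
import re

# Hypothetical taxonomy built from the scraped IGF timetable:
# each session id maps to the set of tags a tweet must carry to match.
SESSION_TAGS = {
    "ws123-openness": {"#igf10", "#ws123"},
    "ws207-access": {"#igf10", "#ws207"},
}

HASHTAG_RE = re.compile(r"#\w+")

def extract_hashtags(text):
    """Pull all hashtags out of a message, lower-cased for matching."""
    return {tag.lower() for tag in HASHTAG_RE.findall(text)}

def categorise(text):
    """Return the session ids whose full tag set appears in the message."""
    tags = extract_hashtags(text)
    return [sid for sid, needed in SESSION_TAGS.items() if needed <= tags]
```

Because each session can carry several tag sets, a crowdsourced tag that differs from the handout's suggestion can simply be added to the relevant session's entry.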
In this handout which will be on the Diplo stand at IGF I’ve suggested a pattern for tagging workshop content (and the aggregator is configured to work with this), but as @apisanty has already said “hashtags are not dictated from above, they rise from crowdsourcing”.
If sessions settle by crowdsourcing on a different tag from that in the handout, this is not a problem (as long as I spot it!) as the platform can have multiple ‘tags’ against any session. However, I observed with the APC today that adding an extra ‘session tag’ to the ‘event tag’ was only common practice amongst some Twitter users. How far to encourage such a practice, or how much just to sit back and watch whether it emerges (and cope if it doesn’t), is going to be an interesting question for the aggregation strand of social reporting. I’ve experimented with adding light structures to social reporting platforms before, but never with an event so big and diverse (and where it’s impossible to get anywhere near to reading all the content being generated), so I will be interested to see how the aggregator works and develops.
How any of this plays out and what issues come up remains to be seen. However, seeing how distributed participation in the IGF has developed over recent years to become embedded in the event – transforming in the process how a UN conference works and blazing a trail for new models of working – I’m pretty excited (though also very nervous) about what we might achieve!
Introducing social reporting to an event can bring many immediate benefits: new skills for those participating in the social reporting, increased opportunities for conversation at the event, and bridges built between those present at an event and those interested in the topic but unable to physically take part.
However, the wealth of content gathered through social reporting can also act as a resource ‘after the event’ – offering insights and narratives covering event themes, and offering contrasting and complementary perspectives to any ‘official’ event records that may exist.
Many of the tools I use when social reporting at an event have a certain ‘presentism’ about them. Newer content is prioritised over older content, and, in the case of dashboard aggregators like NetVibes, or services such as Twitter, good content can quickly disappear from the front page, or even altogether.
So, as we got towards the end of a frantic four days social reporting out at the Internet Governance Forum in Egypt earlier this year, I started thinking about how to make the most of the potential legacy impacts of the social reporting that was going on – both in the event-wide Twitterstream, and in the work of the young social reporters I was specifically working with.
Part of that legacy was about the skills and contacts gathered by the social reporters – so we quickly put together this handout for participants – but another part of that legacy was in the content. And gathering that together turned out to be trickier than I expected.
However, I now have a micro-site set up at http://igf2009.practicalparticipation.co.uk/ where you can find all the blog posts and blips created by our social reporters, as well as all the tagged tweets we could collect together. Over the coming weeks colleagues at Diplo will be tagging core content to make it easy to navigate and potentially use as part of online learning around Internet Governance. I’ve run the 3500+ twitter messages I managed to (eventually) aggregate through the Open Calais auto-tagging service as an experiment to see if this provides ways to identify insights within them – and I’ve been exploring different ways to present the information found in the site.
Learning: Next time set up the aggregator in advance
I didn’t start putting together the site (a quick bit of Drupal + FeedAPI, with the later addition of Views, Panels, Autotagging, Timeline and other handy modules) till the final day of IGF09, by which time over 50 blog posts had been added to our Ning website, and over 3000 twitter messages tagged #igf09.
Frustratingly, Ning only provides the last 20 items in any RSS feed, and, as far as I can tell, no way to page through past items; and the Twitter search API is limited to fetching just 1500 tweets.
Fortunately when it came to Twitter I had captured all the Tweets in Google Reader – but still had to scrape Twitter message IDs back out of there – and set up a slow script to spend a couple of days fetching original tweets (given the rate limiting again on the Twitter API).
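The ‘slow script’ was essentially a loop that paces itself to stay inside the hourly limit. This is a sketch of that pattern only: the endpoint URL and the 150-requests-per-hour figure are assumptions about the old v1 Twitter API, not something documented here, and the real script may have looked quite different:

```python
import json
import time
import urllib.request

# Assumed values: the long-retired Twitter v1 "statuses/show" endpoint and
# its rough unauthenticated hourly limit. The pacing logic is the point.
API_URL = "https://api.twitter.com/1/statuses/show/{id}.json"
REQUESTS_PER_HOUR = 150

def pacing_delay(requests_per_hour):
    """Seconds to sleep between calls to stay under an hourly rate limit."""
    return 3600.0 / requests_per_hour

def fetch_tweets(tweet_ids):
    """Fetch each tweet by id in turn, sleeping between requests."""
    delay = pacing_delay(REQUESTS_PER_HOUR)
    tweets = []
    for tid in tweet_ids:
        with urllib.request.urlopen(API_URL.format(id=tid)) as resp:
            tweets.append(json.load(resp))
        time.sleep(delay)  # e.g. 24s between calls at 150/hour
    return tweets
```

At one request every 24 seconds, 3000+ tweets really does take a couple of days to pull down.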
For Ning, I ended up having to go through and find all the authors who had written on IGF09, and to fetch the feeds of their posts, run through a Yahoo Pipe to create an aggregate feed of only those items posted during the time of the IGF.
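The date filter that Pipe applied is simple to express in code. A minimal sketch, assuming a flattened item format with an ISO-style `pubDate` field (the dates are the IGF09 event window in Sharm el-Sheikh, 15–18 November 2009):

```python
from datetime import date, datetime

# Event window for IGF09 (15-18 November 2009, Sharm el-Sheikh).
EVENT_START = date(2009, 11, 15)
EVENT_END = date(2009, 11, 18)

def during_event(item):
    """Keep only feed items published while the event was running."""
    published = datetime.strptime(item["pubDate"], "%Y-%m-%d").date()
    return EVENT_START <= published <= EVENT_END

def filter_feed(items):
    """Reduce an aggregate author feed to just the event-time posts."""
    return [item for item in items if during_event(item)]
```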
It would have been a lot easier if I had set up the Drupal + FeedAPI aggregator beforehand, and added new feeds to it whenever I found them.
Discoveries: Language and noise
I’ve spent most of my time just getting the content into this aggregator, and setting up a basic interface for exploring it. I’ve not yet had the chance to dive in and really explore the content itself. However, two things I noticed:
1) There is mention of a francophone hashtag for IGF2009 in some of the tweets. Searching on that hashtag now, over a month later, doesn’t turn up any results – but it’s quite possible that there were active conversations this aggregator fails to capture because we weren’t looking at the right tags.
2) A lot of the Twitter messages aggregated appear to be about the ‘censorship incident‘ that dominated external coverage of IGF09, but which was only a small part of all the goings on at IGF. Repeated tweeting and re-tweeting on one theme can drown out conversations on other themes unless there are effective ways to navigate and filter the content archives.
I’ve started to explore how @ messages, and RTs within Tweets could be used to visualise the structure, as well as content, of conversations – but have run up against the limitations of my meagre current skill set with R and iplot.
I’m now on the look out for good ways of potentially building some more intelligent analysis of tweets into future attempts to aggregate with Drupal – possibly by extracting information on @s and RTs at the time of import using the promising FeedAPI Scraper module from the great folk at Youth Agora.
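A first pass at that extraction – pulling @ mentions and RTs out of tweet text to build conversation edges for later visualisation – could look like the sketch below. This is my own illustration of the idea, not what the FeedAPI Scraper module actually does:

```python
import re

MENTION_RE = re.compile(r"@(\w+)")
RT_RE = re.compile(r"\bRT\s+@(\w+)", re.IGNORECASE)

def conversation_edges(author, text):
    """Return (author, other_user, kind) edges for a conversation graph.

    Users being retweeted are labelled 'rt'; anyone else who is
    @-mentioned is labelled 'mention'.
    """
    retweeted = set(RT_RE.findall(text))
    edges = [(author, user, "rt") for user in retweeted]
    edges += [(author, user, "mention")
              for user in MENTION_RE.findall(text) if user not in retweeted]
    return edges
```

Edges like these can then be fed to any graph tool (R, iplot or otherwise) to show who is talking to, or amplifying, whom.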
Questions: Developing social reporting legacies
There is still a lot more to reflect upon when it comes to making the most of content from a socially reported event, not least:
1) How long should information be kept?
I’ve just been reading Delete, which very sensibly suggests that not all content should be online for ever – and particularly with conversational twitter messages or video clips, there may be a case for ensuring a social reporting archive only keeps content public for as long as there is a clear value in doing so.
2) Licensing issues
Aggregation on the model I’ve explored assumes a licence to collect and share tweets and other content. Is this a fair assumption?
3) Repository or advocacy?
How actively should the legacy content from social reporting be used? Should managing the legacy of an event also involve setting up search and blog alerts, and pro-actively spreading content to other online spaces? If so – who should be responsible for that and how?
In the process of working with a diverse international group at an incredibly diverse and complex event, we gained many insights into social reporting for multiple knowledges – and I’ve tried to unpack some of my reflections and learning below:
Social Reporting for multiple knowledges
One of the great transformations brought about by online digital media is that just about anyone can now create and share rich media to offer their own view of events or issues – and this media can be published where much of the world’s population with an Internet connection will be able to see it. As Deirdre from St Lucia pointed out, it’s not long ago that getting more than one news channel’s coverage of even major events was near impossible.
The main sessions of IGF09 were well recorded, with UN Webcasting in video or audio from every session or workshop, and live transcripts of many sessions available. Formal write-ups of each session will be available in due course. However, with social reporting our goal was not to duplicate these formal records of the event, but was to offer each participant, and particularly the youth team and Diplo fellows (henceforth referred to as ‘the social reporting team’), the chance to report on elements of the event of interest to them. And to do that, we were using simple, near-instant, online social media tools.
The idea of multiple knowledges is of course a complex one, and has many layers – but at IGF09 our core focus was on just one element – supporting the capture and sharing of different perspectives on the event from different actors in the event.
Reflection 1: Train in techniques, as well as tools
Few of the social reporting team we were working with had used twitter, online video or blogging before as a reporting tool. Before the IGF got started, Pete & Dejan ran a short afternoon’s training with some of the social reporting team, explaining how tools like Twitter worked, encouraging team members to sign up for accounts, and getting participants to practice using Flip camera digital video recorders. They also introduced the team to the Social Reporting at the IGF handbook we had prepared.
However, whilst the handbook does offer a short introduction to the concept of social reporting itself, and mentions a few practical techniques for video interviews, it was only later in the week that we started to do more to demonstrate different techniques and to talk about ‘conceptual tools’ for creating social reporting content.
It would be worth exploring in more depth the range of different techniques (and templates) that can help new (and experienced) social reporters to capture multiple knowledges in their reporting – and to explore how best to train and equip social reporters to choose and use these approaches.
Reflection Two: Let reporters choose their tools – and then build up multi-tool use
A social reporter who is comfortable with many different digital tools, and who is covering a particular conference theme, may start by sharing some insight or quotes from a session by Twitter. They may follow up by catching the panelist who the quote came from, and asking them to share more of their views in a short video interview. They may then upload that video interview, keeping a copy on their computer to edit into a later remix, and when the video is available to view online, they would use Twitter again to alert others to the fact it has been published, actively alerting (by using the @username convention) anyone who expressed an interest in the earlier twitter messages on this topic. Later in the day, when things are quieter, they may embed a screen-shot of the original tweet, and a copy of the video, into a blog post in which they draw out a key message from the video, and link to other blog posts and websites which relate to the topic under discussion.
But – getting from no use of social media tools and no experience of social reporting – to that sort of platform-hopping mixed-media reporting in just a few days is a tall order. In fact, rather than trying to get new social reporters to be platform-hopping from the start, a quick show-and-tell, or hands-on demo of the different tools available, followed by an invitation to each member of the social reporting team to choose which tools they want to explore first, or which they feel most comfortable with, seemed to generate far better results.
Reflection Three: It helps to know your audience
It’s tricky to write when you don’t know who you are writing for. It’s a lot easier to carry out a video interview when you have a sense of who might watch it. And it’s often easier to allow yourself to be present in your own reporting when you know your main audience will be a community you are part of. We all present ourselves differently to different audiences, and so to capture multiple knowledges, it can be useful for a social reporting team to think about multiple audiences.
We didn’t get much time to explore with our social reporting teams who they saw as the audience for the content they were creating, nor to think about the different spaces the content could be published or aggregated to in order to reach out to different audiences – but I have a sense this could be a valuable additional part of training and preparation for social reporting. At first we found all the reporting was taking place in English, but we encouraged our social reporters to create content in whatever language they felt most comfortable with, or that they felt was most appropriate for the content in question.
There were a number of ‘remote hubs’ following the IGF via the web cast, and participating in discussions through Skype and Webex, and in our debrief we’ve reflected on how it may be possible to pair social reporters up with geographical or thematic remote hubs – giving each reporter a strong connection with a specific audience.
Reflection Four: Quick clips cannot capture all knowledges
The Internet Governance Forum is a complex event. Not only does it deal with some complex issues (socially, technically and culturally), but it also comprises a vast array of actors, from governments and industry, to individuals and civil society. As a non-decision-making body the spirit is neither of consensus, nor of conflict – and black and white statements of positions are rare. The presence of all different shades of opinion, and of the experience of actors from many different countries and contexts, appears to make IGF the ideal place to explore multiple knowledges. Yet at the same time, the complexity of context and content makes capturing the multiple perspectives on IGF in ‘social media snippets’ a challenge.
In video reporting, the social reporter needs to have reasonable domain knowledge in order to be able to ask questions that elicit insights from interviewees. In quick Twitter-based reporting, capturing the most relevant points without reducing them to soundbites can be tricky – or can lead to only the most ‘tweetable’, and not necessarily the most interesting or important, ideas being shared. In blogging, the lack of definitive positions to ‘side with’ in writing up a session or theme can mean the social reporter needs to pick a path through many subtly different perspectives and to express them in text.
Reflection Five: When the event ends, then things are just getting started…
On the last day of the IGF I hastily put together this ‘Social Reporting after IGF’ handout for our teams – as we realised it was important to make sure that, for the social reporters, the end of IGF09 was not necessarily the end of their use of social media tools to capture and share ideas. (I’ve also created a ‘Social Reporting’ group over on the Diplo Internet Governance network). Having invited many of the youth team, and the fellows from Diplo, to sign up with various online spaces, including Twitter, for the first time, we also had a responsibility to make sure they were aware of the implications of continued use of these tools.
But ensuring new social reporters know how they can continue to use social media tools to capture content and create networks is only part of the legacy of social reporting at an event. With the creation of a significant amount of content, there is some obligation upon us to do something with it.
So, over the coming weeks we’ll be thinking about ways to aggregate, archive and curate the content we gathered – and thinking about whether any content can continue to be used in useful ways over the coming year.
There is little point in equipping people with the skills to capture multiple knowledges, and doing some of that capturing, if the skills are left unused in future, and the content captured and the knowledges it expresses disappear entirely into Internet obscurity.
I am sure there are many more reflections and learning points from other members of the team – which they will undoubtedly share in due course.
To support our team of Social Reporters I put together this quick handbook, and I’ve tried to make the first few pages more ‘general purpose’ than just for IGF. It’s uploaded under Creative Commons of course, and I’ll put together a revised version based on how it goes down in the next few days.
I’ll try and post with a bit more background and more links to interesting goings on at IGF in the next few days – but for now – back to packing.
Don’t just buy. Do. As inscribed upon a banana by @oxfordsing, this ‘slogan’ captures the tension at the heart of the Fairtrade movement right now, and a thread running throughout the day. As Fairtrade reaches the mainstream, the connection between Fairtrade and activism, and the importance of linking Fairtrade with Trade Justice, can become diluted. Fairtrade is about building ethics into purchasing decisions, but it’s also about building ethics and justice into trading relationships. That was a point made well by People Tree founder Safia Minney who spoke of how we need to push companies to respect principles of Fairtrade throughout the production process, not just in the inputs they buy.
The challenge, for the Fairtrade movement, is being in the mainstream, being a set of standards, but also being the first step for people on a pathway to engagement with wider global issues.
Data, data everywhere
I spent a lot of the day in conversations about the digital dimensions of Fairtrade. After a morning including presentations from Dorothea and Ian of the Fair Tracing project, and inputs from Steve Bridger and Pete Cranson on social media and Fair Trade, we spent one of the afternoon open space sessions talking about ‘Fair Trade 2.0’.
Aside from discussions about how social media could lead to greater disintermediation of supply chains, we also discussed the transformation of trade from being ‘trade in stuff’ to being ‘trade in stuff + data’. That is, the move, led by retail giants like Wal-Mart, to enable every individual product in a supply chain to be tracked from origin to consumption, with vast collections of data on products and customers collected and created. How does Fairtrade, which is in one sense a very simple bit of data for consumers about the conditions in which a product was made, engage with this environment?
As firms may be compelled to collect more data on each product, to come into line with safety regulation, how can we ensure ethical information is embedded alongside the other data that may follow a product on its journey? And how can that information be made meaningful and useful to time-pressured shoppers? Are ethical criteria needed to account for fair trading in the data that might travel with a product – so it does not become a source of commercial exploitation? And can the rise of a data-rich supply chain be subverted in the cause of ethics?
There were quite a few experiments going on with the Fair Trade Futures conference. It was the first time many of the organising team had experienced any Open Space sessions, and for most delegates, the first time they had been at an event that was being actively digitally reported.
I was a little nervous about the digital reporting – as I’ve seen it work well at technology events, and in youth events, but I’ve not experienced digital / social reporting in action with a community for whom social media is not part of the everyday. Yet it worked fantastically. And by the end of the day, many delegates were won over to the potential of social media to help capture, curate and continue conversations started at an in person event.
There is still much to learn about how best to use social reporting to catalyse online community (for example, I would love to work out how best to equip delegates new to social media to try their own blogging and twittering from sessions, without spending too much time training them up, or distracting them from participation in face-to-face discussions), but the team from Amplified certainly demonstrated that we need to be adding digital dimensions to many more events outside the social media mainstream.
(Coincidentally, Amplified are currently setting up as a non-profit, able to marshal teams of digital reporters to all manner of events, so if you’ve got projects and events coming up that could do with an online edge, I would certainly recommend getting in touch with the Amplified team.)
Fair Trade Futures was the follow on event from a Fair Trade conference held in Oxford five years ago. I have a feeling it may not be quite so long before the next events are held here – and I would love to see an event taking place soon dedicated to the digital dimensions of Fair Trade. No plans yet… but if you might be interested, do get in touch…