Not all impact is positive or intended; it is therefore important to evaluate impact within the context of the situation, community, or project. It may also be difficult to be certain that an impact can be attributed to a specific video project.
- How do we connect or attribute impact to the use of video? In what ways can we plan and prepare for this in the life-cycle of a video project?
- What are possible ways to evaluate impact? Can you share examples from your work?
What would you say are the values that matter most when you are making a Video for Change project and why? By values I mean the principles or ethics you hold in the highest regard.
Some time back we asked 41 Video for Change practitioners this question in a survey. Here are the results.
The results supported the in-depth interviews we had carried out with Video for Change organisations. We found that most people strongly value:
1. Participation and Inclusion: Including and supporting the participation of under-represented and marginalised actors.
2. Power Analysis: Understanding and challenging existing power imbalances.
3. Risk Mitigation: Identifying and addressing risks for participants, target audiences and others who may be affected by the video.
4. Accountability: Being accountable to the groups or communities you are seeking to support.
These four values are now the cornerstone of the Impact Guide we are creating for Video for Change makers.
I'd love your feedback and thoughts.
Tanya
I'm going to cheat a little here (with apologies) and pull a section from the impact research working paper that Tanya Notley, Andrew Lowenthal and I worked on for the V4C network impact project. A section I worked on looked at new forms of Video4Change activism and how we apply some of the principles enumerated by Tanya above to them. It'd be great to talk about how we approach these questions in new forms of citizen journalism, from eyewitness media to curation to remix, etc.
"The latest developments that are changing and driving how video is being used for social purpose activism highlight some of the challenges and opportunities of using an ethically-driven Impact Pathways conceptual framework that emphasises the needs, aspirations, intentions, and safety of communities affected. For example, there are many challenges in applying bottom-up, participatory forms of accountability across the full spectrum of Video for Change initiatives and remix video is perhaps the most problematic of the forms of emergent activism for the ‘impact pathways’ approach. Drawing on what danah boyd (2008) has characterised as the properties of the social media and networked publics − namely, persistence, searchability, replicability, scalability and three related dynamics − of invisible audiences, collapsed contexts and the blurring of public private − we can see how content created in and for one specific audience, time and place is embedded with assumptions around purpose, visibility and privacy. Yet videos can have their context collapsed and their audience and visibility radically altered when they are replicated, remixed, and re- shared, and this can have both positive and negative impacts for participants in the original context.
We see this dilemma present in the human rights and social justice world as well as the world of everyday popular remix and meme-based viral video. An iconic example of the latter is the persistent patterns of remix and re-adaptation of the footage of ‘Star Wars Kid’, where personal footage of a teenager pretending to be a character from Star Wars was shared, publicised, and remixed (often disparagingly), and was viewed by over a billion people. In these cases, intention and context are removed in ways that were at the time unimaginable to the creator. In the Video for Change context, remix videos are increasingly being created by producers who are physically close to the issues, violence, or trauma they are exposing, and by those who are acting as remote or distant witnesses. One example of this type of practice is the work of Tamer Shaaban, an Egyptian student living in the US, who produced ‘The Most AMAZING video on the Internet #Egypt’. This remixed footage of the Arab Spring went on to become one of the most widely-shared videos of the Tahrir Square uprisings of early 2011 (Gregory and Losh 2012).
Another set of questions emerges when we consider the relevance or appropriateness of the Impact Pathways conceptual framework when videos created through acts of citizen witnessing and citizen journalism seek to share documentation of a crisis in ways that are not guided by prior strategy, or by an assessment of the ongoing ramifications of risk for communities depicted in or potentially affected by the video. This raises the question: are people creating a Video for Change initiative if their intentions are not clear, or if they are not guided by the ethical principles we have outlined, or if they see themselves as part of a loose collective or an ad-hoc contributor? Curiously, citizen witnesses and citizen journalists who record their own video footage or source it directly from primary or secondary sources – particularly when documenting acts of violence – must make a rudimentary decision about their Impact Pathway when they upload to commercial video-sharing platforms like YouTube. This is because these platforms will often exclude context-less content that breaks their rules on violent acts, hate speech and other forms of objectionable content, but will allow it to remain if context and presentation indicate that it falls within an educational or documentary context: that is, if in its presentation the sharer has made explicit that they want it to be used and seen as documentation, evidence, education, or news (Glenesk 2013).
The practices of video curation and archiving, both involving the aggregation of video content, have also emerged as Video for Change approaches in recent years, and these too offer challenges in terms of ensuring our Impact Pathways approach remains relevant. These kinds of video initiatives often take on the responsibility for assigning context, meaning, and distributive reach to citizens’ media, acts of citizen witnessing, and documenting. Examples of work in this field include the environment and social justice focused video sharing platform, EngageMedia (http://engagemedia.org); the WITNESS collaboration with Storyful and YouTube, ‘The Human Rights Channel on YouTube’ (https://www.youtube.com/humanrights); journalistic experiments such as ‘Watching Syria’ by the New York Times (http://projects.nytimes.com/watching-syrias-war); and many acts of individual curation that emerge as trusted sources in particular contexts, such as the ‘Only Mehdi’ YouTube channel during the Green Revolution in Iran in 2010 (https://www.youtube.com/user/onlymehdi). Each of the Video for Change initiatives cited here clearly aspires to use video to support social change, whether that involves changing minds and behaviours, changing structures such as policies and practices, or building movements or individual capacities. Yet these curating and archiving processes sometimes ignore, or do not know, the original creator’s intent. In such cases, the removal or addition of context can contribute to less than appropriate handling of video material. This too can challenge the ethical principles that underpin our Impact Pathways framework.
The ethical foundations of our Impact Pathway framework are particularly challenged in the case of so-called ‘perpetrator videos’ − a genre of videos shot by perpetrators of violence or rights violations − that have often been re-purposed and re-contextualized as evidence of both specific human rights violations and of general patterns of violations. In an analysis of the use of Egyptian police violence videos, Gregory and Zimmerman (2010) note how in a number of ultimately crucial cases, footage shot by policemen themselves, such as the el-Kebir case of torture, was collated, re-contextualized, and identified as human rights footage (not as entertainment or an attempt at humiliation) by bloggers like Wael Abbas and Noha Atef. Yet this same footage can also be found alongside footage from other contexts of police and state violence in videos like ‘Police Brutality − Police Get What They Deserve’, a remix video seen close to two and a half million times on YouTube before it was taken down; in this case specific incidents of police and military abuse are subsumed into a broad narrative that loses all connection to the specificity of each incident within it. In some of these perpetrator video remix incidents − for example, the notorious Squatgate incident in Malaysia − the individuals who were abused and violated in the videos requested that others stop circulating the footage (Padania, 2006).
Citizen witnessing videos and perpetrator videos (to an even greater extent) complicate requirements for informed consent and informed participation – two key ethical principles of Video for Change in the impact framework we have described in this paper. This is because informed consent is not considered a fundamental part of citizen-reporting methods, and in the case of perpetrator videos, the stripping of power and agency from the victim is precisely the point of the act of filming (for more, see Gregory 2010). Solutions to these situations are hard to find − and as noted above, may rest more in the vagaries of platform judgements on consent, or in the decisions made by a range of sometimes unidentifiable or unaccountable intermediary actors.
Adding to these concerns regarding shifting online structures and dynamics is the over-emphasis on the value of online environments, which can lead to widening disparities and the exclusion of marginalised voices. As we have noted, most of the world’s population are not able to easily view videos online. Initiatives that focus only on online distribution need to be evaluated in terms of who they include and exclude and what effect this has on overall impact."
How do we grapple with the potential power and 'impact' (positive and negative) of these new forms of video activism?
Sam
When we started our Video4Change research project (read more here) on creating and evaluating impact, we began by talking to Video for Change organisations around the world and documenting some case studies. Below I've provided links to a few of these case studies. What I like about them is that they represent diversity in terms of location and impact goals, but I also really appreciate that they are all low-budget and grassroots, and that they differ in terms of length, style and outreach strategy.
1. Video: Indian Railways Blind to Disability, Videomaker: Amol Lalzare, Organisation: Video Volunteers (India)
- See more at: http://www.v4c.org/en/video-volunteers-impact-case-study
2. Video: Surat Cinta Kepada Sang Prada (Love Letter to a Soldier), Organisation: EngageMedia (West Papua)
- See more at: http://www.v4c.org/en/content/engagemedia-video-change-impact-case-study
3. Video: How to Build a Fence in Hebron Video, Organisation: B'Tselem (Palestine)
If you’re looking for impact case studies, here are some other resources:
· 6 diverse case studies from the Center for Media & Social Impact: http://www.cmsimpact.org/resources/case-studies
· 20 case studies from Britdoc's Impact Field Guide: http://impactguide.org/library/
These are fantastic, thanks for sharing! I think the case study model is so helpful in sharing lessons learned with the field. I'd also like to share the BAVC (Bay Area Video Coalition) Impact Playbook: Best Practices for Understanding the Impact of Media. You can download it at: https://www.bavc.org/sites/default/files/resource/Impact_Playbook.pdf
Thanks Daniel,
I looked at the Impact Playbook some time back but I'm glad to have re-visited it now. One of the things I like is the way it begins by talking about the benefits of measuring impact. This is useful since there is a huge push by donors to focus on impact, but donor interest/focus could perhaps miss the point, or be quite narrow and divorced from the issue and on-the-ground reality. The Impact Playbook reminds me how important it is to start by asking: why do we want to measure impact? It seems like an obvious question but actually the 'why' could take you in very different directions in terms of what you prioritise/measure/evaluate and how. I also find the very broad definition used for impact useful:
"The simplest synonym for impact is ‘change.’ Every media project or story changes some aspect of the world. Impact is the sum of these changes."
I think this guide is very accessible and I like that it is simple, direct and practical. It provides really good starting-point questions like:
'What are my social change goals? What audiences do I want to reach? What metrics correspond to my audiences and goals?'
I guess some people will want a more in-depth guide or resource that also provides evaluation methods. I notice Egbert has provided links to one impact research/evaluation method (the Most Significant Change approach). I had not visited the Impact Field Guide for some time, but now that I have I can see they provide some great tips and links for connecting impact indicators with different evaluation methods. For example, for an indicator relating to how the audience has responded, they suggest the use of audience entry/exit surveys, vox pops and requesting audience email addresses for later follow-up. They also provide some useful tips for creating a good survey. See: http://impactguide.org/evaluating/evaluation-toolbox/
Cheers
Tanya
Hi Daniel,
The Impact Playbook is certainly a useful guide. I wrote up a review of it a year or so ago here:
https://www.v4c.org/en/content/dimensions-impact-reviewing-impact-playbook
Tanya has already mentioned some of its benefits; it serves as a great introduction to the space and much of its thinking.
One of the limitations I found (and I think this occurs in many of the more Western impact guides) is that the focus on impact is at the reception stage - ie within the context of distribution, screening, promotion etc. One of the elements we've tried to add with the Video for Change impact approach is to articulate multiple points of impact throughout the whole process - we refer to this as 'impact pathways', stages that run through the arc of the initiative - starting from planning and research, consultation, pre-production, shooting, editing, outreach and engagement, evaluation etc.
I don't mean to single out the Playbook here, as it's more indicative of an approach taken by many organisations that focus on impact at the reception stage. For many feature documentary makers and journalists that is the most logical approach, as the relationship to the subjects or community is usually different. For feature docs the creators of the film are less likely to be from amongst the affected group - though not always.
One of the defining features of the Video for Change approach - and both Cheekay and Lina touched on this - is that, in Lina's words, 'the creation and distribution of video is itself a social process'. It is also in the social nature of the methodology that much of the impact is created. Engagement and participation in the process of production and/or distribution by the actors directly affected by the issue at hand is a defining feature, rather than an add-on.
This isn't to say more traditional feature documentary or journalistic models aren't valid; the type of impact they generate, however, differs, as their emphasis is at the reception and engagement stage of a larger process. Video for Change initiatives, in my mind at least, see potential impact wherever a social relationship is in play.
This is also not to say that Video for Change initiatives only focus on process, though for some it might be the biggest element. At EngageMedia we try to strike a balance between participation of the actors directly affected and producing media that can resonate with audiences beyond their immediate horizons. It is a tension that is often difficult to balance, but it can yield great results.
Hi Brent - Thanks for getting us started with these questions.
In many media spaces, success and impact are often judged through primarily quantitative measures. Metrics are traditionally devised to capture numbers around the distribution of the final video, such as views, people reached, etc. However, if you have an advocacy goal, you know that there are many other qualitative measures of impact that will never be captured in a metric. These impacts might be things like how your video furthered advocacy efforts towards your goal (which influential* people saw or talked about your video/cause, other things that happened because of your video - positive or negative), the skills your collaborators learned in the process of creating the video, instances of community members or trainees going on to train others on using video, etc. These are all important "impacts" to capture alongside your traditional quantitative metrics.
(*"influential" meaning this individual has the power to affect the thing you want to change.)
At WITNESS we have spent the last couple of years examining and redesigning how we capture and evaluate these qualitative measures of success in our work training others on how to use video and creating video ourselves. First and foremost, this means paying attention to the world around us and looking for signs that what we are doing (training, developing resources, producing videos) is affecting the behavior of the individuals and groups we are looking to target (activists using video, policy-makers, etc.).
For each project we run, we are constantly looking for pre-determined "indicators" of success that fall in line with the desired outcomes we have outlined for the project. These could be stories of how something we have created is being used or repurposed, how a campaign or effort we supported is strengthened through the use of video, the production of videos by others as a result of our work, or any other sign of impact that happens as a result of our actions or a video we have produced. We capture these qualitative stories and our impressions of our progress towards our outcomes through a bi-monthly reporting system. Every six months we look back at our desired outcomes and our qualitative successes and evaluate if we are on track to meet our goals or if we need to pivot our strategy. It can be a bit of effort but we have found it invaluable to collect this type of information and make space in our schedules to reflect on our impact.
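For anyone curious what this kind of record-keeping can look like in practice, here is a minimal sketch in Python (entirely hypothetical code and data, not our actual reporting system): each qualitative story is tagged to a desired outcome, and at each review we pull the stories logged since the last check-in.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class ImpactStory:
    logged: date    # when the story was recorded
    outcome: str    # which desired outcome it speaks to
    story: str      # the qualitative observation itself

# Hypothetical log entries accumulated through bi-monthly reporting.
log = [
    ImpactStory(date(2015, 2, 10), "activists use video safely",
                "Trainee ran her own filming-safety workshop for 12 peers."),
    ImpactStory(date(2015, 5, 3), "policy-makers engage with footage",
                "Council member cited our video in a public hearing."),
]

def stories_since(log, start):
    """Collect the stories logged since the last six-month review."""
    return [s for s in log if s.logged >= start]

for s in stories_since(log, date(2015, 1, 1)):
    print(f"[{s.outcome}] {s.story}")
```

The point of structuring it this way is simply that the six-month review becomes a query over what you already captured, rather than a scramble to reconstruct the period from memory.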
I would love to hear more about how others are evaluating their impact!
Hi Sarah,
Thanks so much for sharing the WITNESS experiences and approach to evaluating your impact. I think your six-month checks are an excellent way to make sure activities are still in tune with the desired outcomes and meet the goals set out from the start. On the other hand, this method is quite time-consuming, requires quite a few resources and pre-supposes a long-term project (multiple years at least). Maybe not all Video for Change practitioners are involved in such long-term projects, and maybe not all have clearly defined their desired outcomes at the start. So I would like to share one other method that might be familiar to most of us: Most Significant Change.
Impact resulting from human rights or community empowerment initiatives is often evaluated against indicators set out by donor organizations or large NGOs, usually captured in a log-frame document complete with a Theory of Change. Unfortunately, in the field this translates into a focus and use of resources aimed at monitoring and evaluating impact along these pre-defined indicators. And although these indicators are good at monitoring impact and evaluating the changes that have occurred, they do not necessarily capture what's actually considered meaningful by the beneficiaries of an initiative. The Most Significant Change method makes this possible by letting beneficiaries decide what they thought was most 'significant'. In other words, it evaluates the impact of an initiative according to the priorities, needs and preferences of the beneficiaries.
Case study from West Java
I recall a project (Creative Communities) where we used participatory video to give voice to marginalized groups in villages in West Java. The goal of the project was to increase the participation of marginalized groups in village decision-making processes. A group of women from Ciharashas (Panembong village) decided to make a video on the importance of early childhood education, asking the village government for funds to hire one teacher and a space. Measured against the indicators set out, the project was a success. The video was made by the women themselves; they showed it to the village head, who took it up to the district level, where it was discussed. Hence they participated in decision-making processes.
But when we actually listened to the women themselves, they didn't see their 'participation in decision-making processes related to village development' as the most important thing. For them, being part of the video team was an opportunity to be out of the house and to encourage their husbands to be more active in the household. Some had never found the courage to ask their husbands to do something within the house (clean, cook, wash, entertain the kids, do shopping, etc.). Through their participation in the video process they found the courage to speak up to their husbands and demand their help, simply because they had something very important to do: make this video. So for them the project was about emancipation in the domestic sphere. That's where they experienced impact.
I hope this case study shows that we can evaluate our results even better, finding more impact or evaluating it in different ways, if we force ourselves to listen to beneficiaries instead of being overly focused on pre-set indicators. The Most Significant Change method does precisely this.
PS: Quite recently InsightShare published a new online manual on how to use Participatory Video within the Most Significant Change model. It's available here: http://www.insightshare.org/resources/pv-and-msc-guide
PS2: Unfortunately there is no English-language documentation of the Ciharashas women's videos. They were shot in Sundanese, the local language of the people in West Java, but if you're interested please contact me and I can share more about this initiative.
Dear Egbert,
Thanks for sharing InsightShare's brand-new toolkit! We definitely believe that Participatory Video combined with Most Significant Change (PVMSC) can be an approach to help measure impact, how and why change happened, and the contribution of a specific process.
In the past we helped War Child evaluate their community video work in South Sudan using PVMSC. It was a rich process, led by the youth and community members who were part of it, and it provided rich qualitative data for War Child to assess whether to scale the community video programme in South Sudan and extend it to other countries.
More recently we have been supporting UNICEF in Uganda to measure the use of diverse communication-for-social-change tools (like citizen journalism, theatre, radio programmes and listening groups) in peacebuilding. The PVMSC evaluation process has not just provided evaluation and impact data; it has acted as another programmatic intervention in itself, widening the ripple effects of peacebuilding programming. The communities where we worked immediately shared positive feedback with the local partner on how important it has been for them to feel consulted and heard through the process of telling their stories and sharing in discussions post-screenings.
We definitely see PVMSC as a good way to measure impact for Video for Change initiatives when you have access to the diverse groups involved in a specific project.
This is really interesting, as I know it can be difficult to measure qualitative data. If the video/project brings about an unexpected, yet positive, result, how do you factor it into your measurements?
This post is more about sharing a resource and asking others what they think about it. Participant Media is one of the leaders in the field of film and social change. They released "The Participant Index" for measuring the social impact of entertainment on its audience. They have different criteria points, or what they categorize as impact, at different levels of engagement, such as "Information Seeking", "Information Sharing", "Taking Individual Action" and "Encouraging Community Action". I found it very interesting that they also measured how emotionally affected audiences were by different films, and drew the conclusion that the greater emotional involvement someone has with a story, the more likely they are to move to social action. What do you think?
For more info, see the link below (and the two reports at the bottom, the second one is a good overview):
http://www.takepart.com/tpi
Dear Daniel,
I am just adding two additional resources here that I think relate to this topic and can be inspiring to use.
One is called The Fourth Act. They have designed tools that can collect "real-time audience insights that clarify the issues that matter the most and create story-dialogues that inform your narratives." I found their HARVIS tool (introduction video here: http://www.afourthact.com/harvis/) especially intriguing. It's a mobile web app that allows an entire audience to provide feedback instantly on something they view together. It can be of great use to evaluate whether the impact or emotional responses you were aiming for in your video actually come across.
Another is the Forum Theatre method of Boal (the Brazilian theatre director), where a theatrical scene is stopped and the audience is allowed to comment on what the next action of an actor should be, rewind the play a little, or even step on stage and act out the 'change' they would like to see. An interesting term he introduces in relation to this is the "spect-actor", meaning we are both "SPECTators" and "ACTORs". I think many Video for Change practitioners would love their audiences to be spect-actors. Forum Theatre principles can, with a little creativity, be applied in small-scale video screenings. For people dreaming of making their Video for Change videos more interactive, Boal's work is surely inspiring.
PS: I led a team that translated Boal's famous book "Theatre of the Oppressed" into Indonesian (bahasa Indonesia). Indonesian readers who are interested can contact me for a free copy of this book.
Hi Daniel,
I actually find TPI problematic, in that it is too focused on engagement and participation from the audience, and not on actual action or impact on the ground or in the subject community(ies). Unless your audience and subject community are one and the same, you're missing out on a rather large swath of analysis. What this means to me is that it propagates paternalistic structures of how we think of and create video for change by failing to take into account changes "on the ground." It also weights impact on the final artifact by failing to take into account the process of creation and distribution (and this is especially problematic for interactive media projects).
Again, not to go back to the same resource, but the way we drafted our impact report for WIDC was to rest primarily on qualitative evaluation, using narrative to explore the impact of our campaign -- and weighting the impact among our subject community and stakeholders first, and our general audiences second. This required an understanding of audience segments and a stakeholder analysis, which we had done prior to creating any assets for our campaign and which we then revisited when evaluating.
I'm really interested to explore the InsightShare framework Soledad mentioned here -- particularly as it incorporates MSC, which is a great methodology for both project design and evaluation. From what I've read here, it seems like it might be a good framework for taking all these aspects into account and could serve as a more robust model than TPI.
Impact changes over time -- that is, depending on when you do your assessment, the visible and measurable impacts of your video project change or evolve.
I would be keen to find out: when do you assess your impact? Have you had the luxury of doing multiple impact assessments over a period of time? If so, how did your video initiative's impact change over time?
The other idea to explore is relating metrics to impact. For example, in online video there's a huge focus on measuring hits. What kinds of meaning are we attaching to viewership in the context of impact assessment? Are we over-valuing such metrics and attaching meaning to them as a proxy for "impact"? Simply put, are we assuming that the more people see our videos, the more impact they have?
Or is there a more nuanced approach to this?
Hello all,
I'm going to jump into the conversation with one aspect of how my org Video Volunteers measures impact - what we call an 'impact video.'
To explain an impact video I have to first explain how we work: VV has a network of 188 Community Correspondents reporting on issues relating to human rights and corruption, and what we call in India 'entitlement programs', ie the programs the poor are entitled to by law but which so often don't reach them. We started this just as a reporting program, but then we saw - after maybe a year - that they were solving the issues too, ie getting what we call 'impact.' We asked ourselves how we could get more impact. We realized that for our Correspondents - who come from very disadvantaged backgrounds - the first priority is income. So we thought there would be a connection between getting impact and getting money. Secondly, we were interested in verification. Our Correspondents would call and say, 'Great news, the teacher is coming on time! My video was a success!' Once we had many of these statements coming in, we realized we wanted to verify them. And more selfishly, we realized we wanted to share them with the world. We wanted to say, 'Hey, we have this interesting model of making video in the community, and here's how they use these videos to make change.'
So we trained them to produce what we call 'impact videos.' Here's how it works:
1. A Community Correspondent (CC) calls and says, "I got an impact on that video I made about the terrible road in my village. The road has been repaired."
2. We ask the CC all the things we want to know: what was the situation before; what is the changed situation; what's the scale of the problem and the solution (ie number of people affected); what had the community done to solve it before and what was the result of that; what does the law say related to the issue / what is the community's right; what did the correspondent do, ie what steps did she take to get the impact. Once we have the narrative we tell them what they must film: evidence of the changed situation; statements from people - often the government official who helped make the impact happen - attesting to the correspondent's role in the whole thing; and visual evidence of the process - usually a community screening or a rally/meeting. Creating this short written narrative is both a training exercise - ie helping them articulate what they want to put in their video - and also a documentation exercise. After this conversation to establish the narrative, the Correspondent feels ready to go film.
3. The CC makes the video; we check the footage and make sure they have captured all the elements - or most of them, since they often forget a few. We pay them three times as much for the impact video as we did for the issue video - they get about $125 for the 'impact video.' This is in recognition that activism and using the video is often harder than making the video. So we're essentially incentivizing their activism.
We've therefore combined the process of getting impact with documenting impact. We've had 617 instances of impact in the last five years, which is a resolution rate of one in five, meaning one out of every five issue videos we've made has resulted in impact.
The downside of this - and it's a major one - is that we saw they were starting to shy away from issues like gender and forced evictions, where it is harder to get impact (and thus for them to earn money), and instead have made a lot of videos on infrastructure, corruption, health, education, water and sanitation, where it's easier. So now we're doing a lot more trainings on these 'softer' but harder-to-resolve issues to encourage more focus there.
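For anyone who wants to watch for this kind of skew in their own project, the underlying arithmetic is simple. Here is a minimal sketch in Python (with entirely hypothetical data - this is not VV's actual system) of computing a per-category resolution rate from a log of issue videos:

```python
from collections import Counter

# Hypothetical log: one (category, resulted_in_impact) record per issue video.
issue_videos = [
    ("water", True), ("water", False), ("water", False),
    ("gender", False), ("gender", False),
    ("infrastructure", True), ("infrastructure", True), ("infrastructure", False),
]

totals = Counter(category for category, _ in issue_videos)
resolved = Counter(category for category, impact in issue_videos if impact)

# A persistently low rate on a category like 'gender' flags exactly the
# kind of skew described above, before it hardens into avoidance.
for category in sorted(totals):
    rate = resolved[category] / totals[category]
    print(f"{category}: {resolved[category]}/{totals[category]} resolved ({rate:.0%})")
```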
Below is an example of an impact video.
Hello all,
From a novice perspective, I thought this "corporate side" of video was interesting. This is an article from Google promoting their own method (Brand Lift) of evaluating the impact of online video on a marketing campaign. Outside of 'engagement metrics' (clicks, likes, shares, views), they use Google searches to determine interest. There are two groups of users who would normally be target audiences: a control group that does not see the ad, and another group that views the ad. They then compare the two groups' Google searches to determine the impact of the video or ad. For more information see the video below.
Clearly, this method does not capture the full impact of a video (i.e. the impact of production), as many of you have discussed. Do you think it offers any ideas of value for social justice and social progress movements? Are there ways it could be improved to measure a more expansive social justice impact?
Really interesting question Mariah. This kind of A/B testing is actually pretty massive in the area of fundraising with (large, often international) NGOs, where emails with one message are tested against another in terms of the hits and donations they receive. When the results of A/B testing are known, the winning email is then sent out more broadly, or different emails are developed for different categories of people based on things like demographics and issue interest. So this is about fitting the message/actions to the 'audience' or participant groups. It's pretty top-down and personally I think a bit spurious! I'm told that large NGOs often buy large databases of email addresses to test on from organisations like change.org, which have collected info on our petition signing. So for me that's not far off (methodologically and ethically) from gleaning intelligence from search cookies as Google is doing here. It's easy to see the connection between marketing (which Google is promoting here) and fundraising (which NGOs do). Forgetting for a moment the cost of this Google intelligence (which I assume makes it prohibitive for most social change organisations and activists) and the ethical issues some of these organisations may have with cookie-based search tracking, it's worth asking: is this kind of testing still useful for social change video impact? It may be useful to know if your video inspires people to get more information or to join a network or email group, but on the whole I would think it's really limited if you want people to do more than click, join or donate.
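For those unfamiliar with the mechanics, the core calculation behind this kind of testing is very simple. Here is a minimal sketch with hypothetical numbers (real systems like Brand Lift layer significance testing and audience segmentation on top of this):

```python
# Hypothetical A/B lift calculation: did the exposed group search the topic
# more often than the control group that never saw the video?
control_size = 10_000       # users who did not see the video
control_searches = 180      # of those, how many later searched the topic

exposed_size = 10_000       # users who saw the video
exposed_searches = 260      # of those, how many later searched the topic

control_rate = control_searches / control_size
exposed_rate = exposed_searches / exposed_size

# Relative lift: the proportional increase in search rate attributed
# (under the test's assumptions) to seeing the video.
lift = (exposed_rate - control_rate) / control_rate
print(f"control {control_rate:.2%}, exposed {exposed_rate:.2%}, lift {lift:+.0%}")
```

As the sketch makes plain, all this can tell you is that more exposed people searched or clicked; everything beyond that - the deeper forms of impact we've been discussing in this thread - is outside its reach.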
Sam Gregory from WITNESS mentioned in an earlier post the different ways Video for Change can incite or contribute to change. Using a slightly different list here (which we've been using in our Video for Change and impact research and on v4c.org), we can suggest that Video for Change can have impact by doing any of the following:
1. Changing Structures
· The abolition or alteration of existing government, institutional or corporate policies or the creation of new ones (policy change)
· The abolition or alteration of existing laws or the creation of new ones (legal change)
· Altering the practices of governments, companies or institutions (practice change)
2. Changing Minds and Behaviors
· Altering individual or collective attitudes and behaviors (behavior change)
· Altering the way certain groups or issues are represented in the media and/or public sphere (representational change)
3. Building Movements
· Creating relationships, either by building or sustaining them (relationship change)
· Supporting new interaction and dialogue, by creating new spaces for communication (discourse change)
4. Building Capacities
· Increasing people’s knowledge/skills/access to knowledge and information (changing capacities)
I guess you could say this kind of testing by Google can play a small role in assessing some indicators for a few of these forms of social change, but it would be very limited. In our work at v4c.org we've come up with about ten qualitative and quantitative methods people might like to use to assess impact. Maybe we could list these here if that's useful? (Egbert or Andrew?)
It would be great to hear more about people's impact assessment methods. Looking forward to reading the new posts on this now too!
cheers
Tanya