The Media Map Project website is finally out of the beta testing phase. We have gone live! We would like to invite our readers to visit and explore our new site. Please play around with the updated datasets; map them, use the charts, and create graphs. We love feedback, so please write to us about your experience using the site, your ideas, and especially your compliments and complaints.

Keep visiting us regularly, as we have a lot more research coming your way before the end of the year: several interesting case studies on various countries, research papers, and other publications will appear shortly.

Thank you for your continued support and let us know what else you would like to see from us in the future.

By Ericha Hager; Internews Center for Innovation and Learning Intern

The second annual Mashable Media Summit was the place to be for media enthusiasts and technology junkies. This gathering, which took place last weekend at The Times Center in New York City, brought together over 300 professionals from a wide swath of the media landscape to learn how new technologies are shaping the future of journalism, redefining the boundary between consumers and producers, and encouraging the design of new business models and revenue streams. Founders, presidents, CEOs, COOs, CCOs, SVPs, and holders of every other powerful acronym shared the media secrets of the leading lifestyle publications, television networks, and social media outlets.

Coming from the non-profit media development sector, I felt a little out of my game. Most of the time when I engage in conversations about innovative trends in media, the topics of choice are SMS text messaging, interactive voice response, or data visualization platforms. This conference focused on the exact opposite user demographic from the one typically found in the regions where Internews works. I was a little dubious after the opening plenary listed touch screens as one of the top five trends to watch in 2012. I mean, it shouldn’t matter to a media development organization that 70% of viewers are using “second screens” (smartphone, computer, or tablet) to multitask while watching television, or that brands are now starting to become media companies instead of merely advertising through them, right?

Not necessarily. While not all of the topics discussed at the Summit are applicable in this field of work, it is still important to be aware of where the future of media is headed. The new platforms and slick technologies discussed at the event are just now hitting the market, but they will be considered old in no time. The rate at which technology is developed and disseminated is astonishing. As one speaker pointed out, many digital electronic devices are subject to one of the many variations of Moore’s Law, which holds that technological capability grows exponentially: roughly every 18 months, devices become twice as capable as their predecessors while costing half as much.
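The arithmetic of that doubling rule is easy to sketch. The little function below follows the 18-month variation described above; the numbers are purely illustrative, not drawn from any device data:

```python
# Illustrative sketch of the "every 18 months: 2x capability, half the cost"
# variation of Moore's Law mentioned above (hypothetical numbers only).

def project(years: int, capability: float = 1.0, cost: float = 1.0,
            doubling_months: int = 18):
    """Project (capability, cost) forward, doubling capability and
    halving cost once per `doubling_months`-month period."""
    periods = (years * 12) / doubling_months
    return capability * 2 ** periods, cost * 0.5 ** periods

cap, cost = project(3)  # 3 years = two 18-month periods
print(f"After 3 years: {cap:.0f}x the capability at {cost:.2f}x the cost")
```

Run out over a decade, the same rule yields a roughly hundredfold capability gain, which is the sense in which today's slick technologies will be considered old in no time.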

As these advances occur, technology is becoming more widespread and affordable than ever before. At this point, more people have access to cell phones than to clean drinking water in much of Africa. We can only assume that smartphones will overtake simple phones in this region sooner rather than later. Tablets, lightweight and easy to transport, will be next on the horizon. Being aware of what is coming down the digital technology pipe, and anticipating ways to integrate it into existing forms of traditional media, will increase the effectiveness and timeliness of projects. The more the media development sector can stay ahead of the curve and have projects hit the ground running, the better it will be able to improve the flow of information and open up new channels of communication in the areas that need it most.

By Tara Susman-Peña; Director of Research of The Media Map Project

Data Without Borders’ inaugural DataDive took place October 14–16, 2011 in New York.  The event brought together three NGOs with more data than they had the capacity to deal with and approximately 70 volunteers to work with that data.  Volunteers’ skills clustered around statistics, visualization, and computer programming (or hacking, if you’re a cool nerd).

A map of our collective skill sets:



So what is a DataDive exactly?  I went there to learn just that (I don’t think I quite pass for a cool nerd, sadly).

Day 1 Kickoff:

Three NGOs present their datasets and projects.

1) NYCLU – the NY chapter of the ACLU.  Their project uses data from the NYC police – what’s called “stop, question, frisk” data that the police produce to track their own activity.  They need help shrinking file sizes through data recoding and cleaning, and visualizing the data, including geographic mapping of incidents and demographics.  They also want help brainstorming ideas about what to explore in the data.

2) MiX Market – a source for data on microfinance around the world.  They need scraping for data on microfinance in Africa.  I feel a pang when I learn what scraping is – writing computer code to automatically convert data files from HTML or PDF into Excel, code that then updates the output automatically when new data is added.  Oh, the time we have spent manually converting such files (i.e. copy, paste, fix formatting, and proofread) for the Media Map website!  MiX Market also wants help with data cleaning and, finally, with analysis of key issues for microfinance.
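For the curious, here is a minimal sketch of that kind of scraping using only Python's standard library. The HTML table is a made-up stand-in for a microfinance data page; a real scraper would download the page and cope with much messier markup:

```python
# A minimal scraping sketch: pull the rows out of an HTML table so they
# can be saved as CSV instead of copy-pasted by hand. The table here is
# an invented stand-in for a MiX Market-style data page.
from html.parser import HTMLParser

class TableScraper(HTMLParser):
    """Collect the text of each <td>/<th> cell, one list per <tr> row."""
    def __init__(self):
        super().__init__()
        self.rows, self._row, self._in_cell = [], [], False
    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag in ("td", "th"):
            self._in_cell = True
    def handle_endtag(self, tag):
        if tag == "tr" and self._row:
            self.rows.append(self._row)
        elif tag in ("td", "th"):
            self._in_cell = False
    def handle_data(self, data):
        if self._in_cell:
            self._row.append(data.strip())

page = """<table>
  <tr><th>Country</th><th>Active borrowers</th></tr>
  <tr><td>Kenya</td><td>1500000</td></tr>
  <tr><td>Ghana</td><td>320000</td></tr>
</table>"""

scraper = TableScraper()
scraper.feed(page)
print(scraper.rows)
```

The appeal is exactly the one described above: once written, the same code re-runs unchanged whenever new data is posted, replacing the copy-paste-proofread cycle.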

3) UN Global Pulse – the Executive Office of the UN Secretary-General’s innovation initiative.  This group serves for the Secretary-General a function parallel to the one The Center for Innovation and Learning serves for Internews.  They propose two projects: one, to “see” fertilizer in Uganda using NASA satellite imagery, to try to understand farmers’ practices and the impact that fertilizer has over time.  The other, a fascinating global survey (but participants were sworn to secrecy until the UN announces results in early November).  Stay tuned on that one.

Day 2:

A patchwork summary of the day can be found on the Data Without Borders wiki.

Participants start to trickle in at 8am.  Each group makes an informal and brief presentation about their data, and people form groups according to their interests, arraying themselves throughout the Green Spaces loft in Tribeca.  I work with the largest group, which is tackling the NYCLU data; it requires huge amounts of cleaning.  One challenge: there don’t seem to be any publicly available statistics on crime in NY at the level of granularity the NYCLU wants (to be able to investigate the degree to which police stop, question, and frisk people with certain demographic characteristics, out of proportion to the degree to which those demographics commit crimes).  Several people create maps that look at interesting aspects of the data.  For example, one map shows the “percent of blacks stopped to percent of blacks in the population, by precinct, 2010” (numbers on top indicate the precinct number; numbers on the bottom are the ratio). The map shows parts of Brooklyn and Staten Island.
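The ratio behind that map is straightforward once the data is clean. A toy sketch with invented figures (real values would come from the stop-and-frisk records and census demographics):

```python
# Hypothetical numbers: percent of stops that were of black residents,
# divided by the black share of each precinct's population. A ratio of
# 2.0 means black residents were stopped at twice their population share.
stops = {"Precinct 75":  {"black_stops": 820, "total_stops": 1000},
         "Precinct 120": {"black_stops": 450, "total_stops": 1000}}
population = {"Precinct 75":  {"black_pct": 41.0},
              "Precinct 120": {"black_pct": 30.0}}

ratios = {}
for precinct, s in stops.items():
    stop_pct = 100 * s["black_stops"] / s["total_stops"]
    ratios[precinct] = stop_pct / population[precinct]["black_pct"]

for precinct, ratio in sorted(ratios.items()):
    print(f"{precinct}: {ratio:.1f}")
```

The hard part, as the group discovered, is not this division but assembling clean stop counts and matching population figures at the precinct level in the first place.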

A stroll around the loft to look at the progress of other projects finds most laptops displaying inscrutable (at least, to me) computer code.  Saturday saw a lot of bonding, a goodly supply of Red Bull, communal head scratching, hard work, time wasting, brainstorming, asking for and getting help, creativity, learning and sharing, and a fair amount of progress, especially given the short time period.  People come and go, with some lasting until 2am.

Day 3:

The participants from each team present what they accomplished over the weekend.

Each team was given a substantial leg up toward their goal, but there was plenty of work left to do.

Some overall observations:

  • Might be better to identify and select volunteers according to a range of skill sets, organize more beforehand and set up lines of communication in advance
  • Perhaps a bit more structure would help, and clearer information about how the weekend was organized from before we even walked in the door
  • Volunteers could have gotten much further in their analysis and visualization if the NGOs came to the event with clean data
  • It was helpful to have a data expert in the inside of the NGO with identified research questions
  • Lines of communication within a large team were not always clear, so there was potential for people to work at cross purposes or with unaligned goals
  • Would be good to hear from the NGO afterward (and throughout their research project) about what was useful, what not
  • Some interest amongst volunteers to continue on after the event – what is the best way to facilitate this?
  • Additional recommendations from the NYCLU team

DataDives coming soon to San Francisco, and beyond.

By Amy Chen; Masters student at Columbia University’s School of International & Public Affairs

Earlier this year I joined a team of student consultants assisting Internews in their Media Map Project. Our objective was to create a report outlining the role of Monitoring and Evaluation (M&E) in donor decisions. Over the course of four months we interviewed 23 individuals from the media development field, focusing our efforts on donors: private, public, and multilateral. This post, however, is not going to outline the results of that project (for those interested, the report is available online), but recount my experience on the other side, facing the implementation challenges of media development M&E.

My experience in media development M&E took place in Bhutan this summer, but to understand the lens through which I view this experience, it is important to note I spent the first half of my summer conducting an endline evaluation in Chiapas, Mexico. Removed from the ivory tower, I spent six weeks traveling from village to village, collecting data for a randomized controlled trial (RCT). Unpredictable obstacles arose frequently, requiring us to make decisions that sacrificed procedures designed by academia to maintain consistency across 30 surveyors. Though my experience in the villages made me confident the program could have positive effects, I doubted the RCT could accurately reflect its potential impact.

Already frustrated by my experience in Mexico, I arrived in Bhutan a bit uneasy. When my flight touched down I didn’t know what to expect; I didn’t even have a clear idea of what my role at the media development Civil Service Organization (CSO) would be. One staff member informed me I had been pitched as an “expert” on research, brought in to bolster the CSO’s research component. I immediately began to doubt my qualifications for the internship, having had very limited experience conducting field research.

I arrived in the office the next day, still jetlagged and unsure of myself. After a quick orientation to the NGO, I sat down for a basic training on qualitative research methods, led by two American lecturers conducting research at the local college. As I acclimated to the pace of the office, I took on my first assignment: transcribing a recent focus group conducted by the program officers. The hour-long recording taught me about Bhutanese youth culture and the culture of the NGO.

Though the entire staff had college degrees, and some even post-graduate degrees, none had been trained in M&E basics. The first 20 minutes of the focus group centered on a subject that, though interesting, failed to address the key concerns and questions of the moderators. My first two days in the office left me relieved and concerned: relieved I could contribute some M&E and research knowledge, but concerned that my limited experience put me ahead of individuals required to incorporate M&E into their projects. Over the next month and a half, I learned the strengths of my coworkers. All could incorporate M&E elements into their assignments, if only they had the opportunity to learn them.

Several donors had mentioned in their interviews the lack of M&E training available to implementers, but I had not realized the pervasiveness of the problem. Despite my frustrations conducting M&E this summer, I still believe it’s an important component of development work; but the experience has made me question the extent to which M&E can capture the outcomes of a program, and the implications of an increased emphasis on M&E results.

By Sankalpa Dashrath; Research Associate, The Media Map project

The image populated my Facebook newsfeed on Diwali morning. It was being ‘liked,’ commented on, and sent around in mass emails. It soon grew into the feel-good story du jour; the consensus was that good had triumphed over evil and light had chased away the darkness. Diwali, the Indian festival of lights, and its glittering brightness could be spotted from space this year. At least that’s what everyone was saying. A satellite image, purportedly released by NASA, showed a pretty, colorful map of India as seen from space on Diwali night.

As I watched the image go viral over the internet, I wondered about its source. A quick check on the internet confirmed my suspicions. This was a solid case of data misrepresentation. The image was not from NASA but from the website of the National Geophysical Data Center (part of the U.S. Department of Commerce). And no, it wasn’t a single snapshot of a specific night, but rather more than a decade of data imagery overlaid into a single composite visual.

Someone somewhere had misinterpreted the image, and lots of people everywhere were repeating the fallacy, making it grow stronger by the minute. The truth was lost amongst the hundreds of people who wanted to believe this ‘lie.’ There are many unfortunate lessons to be learned from this story.

As the world grows increasingly impatient and attention spans decrease, data visualizations are replacing detailed reports. But data visualizations are incomplete snapshots. They do not require viewers to really think about the information they contain. They are subject to interpretation and can be misleading, as evidenced by the Diwali debacle.

Social media allows half-truths to become solidified in people’s minds. Repetition by large numbers can validate an untruth. Google’s algorithms don’t check facts; they simply provide the results with the most hits. Again, repetition by large numbers trumps accuracy. The reason stories go viral on social media is that everyone wants to post ‘breaking news’, so the traditional checks on data sources are neglected. Speed becomes the barometer of newsworthiness.

Lost in the intellectual debates is the sad fact that these seemingly innocuous mistakes mask unpleasant facts. In this case, the areas on the Diwali image colored blue and green were visible from space between 1992 and 1998 but cannot be seen any longer. So the picture is now darker, literally and metaphorically. While the wired world celebrates this fallacy, the situation for many un-connected Indians is getting worse, not better; The New York Times reports that, “Almost half of the Indian population has no access to the electricity grid, and many more people suffer hours without power”. Meanwhile, India’s ranking in the UNDP’s Human Development Index has steadily dropped from 126th in 2006 to 128th in 2007/08 and to 134th in 2010. But instead of analyzing factual data from reputable institutions, the connected world is busy celebrating a fake picture with no source.

A version of this blog also appears on The Morningside Post http://themorningsidepost.com/2011/10/the-truth-about-lies/

By Ericha Hager; Internews Center for Innovation and Learning Intern

I recently attended a US State Department “Tech@State” conference. These daylong events occur quarterly and focus on different technology topics; this one addressed growing trends in data visualization. Data visualization, the visual representation of data, has been cited as an incredibly effective and efficient way to transfer knowledge and meaning when done well. People absorb information from visual representations of data much faster than from reading the data alone. In our increasingly fast-paced and visually stimulated world, the ability to catch people’s attention in a quick, direct, and meaningful way is crucial to conveying messages and gaining support for causes.

The data visualization industry is becoming robust as data visualization becomes a cornerstone of any modern website. Companies and organizations are looking for stimulating and interactive ways to display data, and apparently many of them are willing to pay for it. I have to admit that for the most part the Tech@State Data Visualization day seemed more like an opportunity for competing data visualization firms to self-promote and pitch their products. Most of the morning panelists showed examples of data visualization that would completely deplete any medium-sized company’s or organization’s budget. The rest of the day was primarily a show-and-tell of websites that have been restructured around a strong data visualization element, without any tips or tricks on how to do your own data visualization in-house. As a newcomer to the field, I walked away with no new skills to test out, which was rather disappointing for a full day’s attendance.

One valuable thing about Tech@State is that it gave me a better perspective on what Internews’ Center for Innovation and Learning’s Media Map Project could achieve through well-developed data visualization. The Media Map website contrasts with the other websites I saw at Tech@State in that one of its primary purposes is data presentation. Media Map is designed to harness the power of data visualization: a clear, easily navigable, well-organized site that can display large amounts of information, cross-referenced to draw conclusions about how media develops across countries and over time, and how media relates to development and to governance. A few key aspects of data visualization highlighted at the conference are present on Media Map: the user controls how the data is presented and which variables to test for; the datasets are available for download; and supporting text augments the visualizations. While Media Map has integrated many successful aspects of effective data visualization, I think there is room to expand and change as people use it as a resource and identify ways it could become a more impactful research tool.

Data visualization can be very powerful, and it is important that people develop the ability to critically process it. Data itself is very influential and, as I mentioned previously, data visualization allows people to ingest information at a much higher rate. This means people need to become data literate to some degree, because the presentation of data can be misleading, data can be dirty, and the data itself can be inaccurate or simply false. As with all information, it is important to be aware of all influences, circumstances, and omissions. Overall, I think that data visualization is very impactful, and I look forward to the development of more universally accessible tools so everyone can leverage its utility.

The Media Map Project invites you to participate in the Media Map Challenge.

The object of the Media Map Challenge is for participants to visualize one or more of the raw datasets made available for download on the Media Map website (www.MediaMapResource.org) to compellingly illustrate a story about media in an original way.

Participants are welcome to use any of the forms of visualization found on the Media Map website (global map, scatterplot, and bar chart), and are also encouraged to use other ways to visualize their chosen data, including through free web visualization programs (such as Many Eyes, Swivel, Google Data Explorer), or by creating an original static or moving design.  Original drawings, photographs, or short animations or videos (to a maximum of 5 minutes) are also welcome.  At least one dataset made available on the Media Map website must be used.  Additional datasets from the Media Map website or other sources may also be used to explore relationships between media and development.

Submissions should be sent to MediaMap@internews.org by midnight EST on August 30, 2011.  Submissions should include the visualization, with the file labeled Last name_First initial_Name of visualization, a CV, and a short description of the visualization including the dataset(s) used, an explanation of the visualization, and any relevant statistical or technical details (explanation no more than 1 page, 12pt font).  One entry per person is allowed.  Employees of Internews, The World Bank Institute, and the Bill & Melinda Gates Foundation are ineligible to participate.

Winners will be chosen by the Media Map project team for their clarity, thoughtfulness, and compelling use of data to tell a story about media around the world. The winner will be announced before October 31, 2011, and will receive a $1,000 prize.  The winning visualization will be featured on the Media Map website.

Just published: a welcome new contribution to the cause of measuring media and its impact on development.  The volume of scholarly papers, Measuring Press Freedom and Media Contributions to Development: Evaluating the Evaluators, was edited by Monroe Price, Susan Abbott and Libby Morgan. The book is described as “bringing together a variety of viewpoints and perspectives on evaluating media assistance, Measures of Press Freedom and Media Contributions to Development offers a critical reflection on the theories and tools of measurements that are used by the academic, donor, and civil society communities.”  The book promises to significantly advance current thinking on how best to measure media and its impacts.

For a shorter, more policy-focused take on the volume, look at John Burgess’ report for The Center for International Media Assistance based on this book called Evaluating the Evaluators: Media Freedom Indexes and What they Measure.

Tara Susman-Peña is Director of Research, The Media Map Project

The Media Map Project was in Mali in February to research the impact of donor-funded media development interventions over the last 20 years.  While we’re still preparing our findings on that topic, I wanted to share some observations.  Some striking characteristics of Mali’s media landscape are directly related to the discussions we have been having on this blog about the advantages and limits of using Freedom House’s Freedom of the Press Index as the go-to dataset to stand in for media development.  Freedom House gives Mali a “Free” rating.   Our field research supports this assessment.  Our desk research also shows that good empirical evidence already exists that supports a strong connection between the media and development.  In a soon-to-be released report outlining how quantitative data has been used to measure the health of a country’s media sector in the context of economic development, Sanjukta Roy writes, “The academic literature,  through theoretical models and empirical testing, has validated the role of the media in facilitating good governance and favorable developmental outcomes.”  The index that these studies have overwhelmingly used to measure the media sector?  Freedom House’s Freedom of the Press Index.

But does that mean that we can assume that a country with a Free Press rating will have similarly high ratings in governance and development indexes?

Unfortunately, no.  But given the way that the academic literature has emphasized the Free Press measurement, it’s hard not to assume that a Free Press rating also implies a healthy media sector and positive development milestones when you narrow the focus to the country level.

Sadly, in Mali, that is not at all the case.  Mali has been free from colonialism for 50 years and is celebrating its 20th year of democracy.  Mali is also one of the ten least developed countries in the world, by various measures of development.  Mali has a notably higher mortality rate, a lower level of education, and worse environmental challenges than sub-Saharan Africa as a whole, and indeed than low-income countries taken as a group.  The government of Mali is democratic, but not transparent about its activities.

What about the media in Mali?  Yes, Mali’s press is free.  But our trip to Mali vividly illuminated that the rating “Free,” accurate though it may be, cannot be taken to represent the health of Mali’s media sector.  What we saw was a media system characterized by disorganization, hyperbolic and sensationalized reporting, and an overall lack of professionalism.  Not a single journalism school exists.  A harsh libel law is still on the books and is enforced from time to time.  Most media outlets, whether private or community, cannot sustain themselves as businesses, and so brown envelope journalism is alive and well.  Almost 75% of citizens are illiterate, limiting the types of media most people can access.  Many Malians also lack the ability to discern the difference between good and poor quality information.  No comparable, good-quality data on the media market or on media use is collected consistently.  Thus, it is difficult to assess the media across the country and over time from either a business perspective or from the citizens’ perspective.

It’s not all bad news. Recently the Minister of Health had to step down because the media exposed the misappropriation of international grant funds going on in his office.  Community radio is alive and well, with over 250 stations spread throughout much of the country.  Mali has been connected to the Internet since 1996 (though to put that in context, the number of internet users is still far below even the average for sub-Saharan Africa, itself a low-scoring region), and mobile penetration is on the rise.

Media has received a lot of support from donors over the last 20 years, but seems to be less of a priority now that Malian democracy is reasonably effective.  Clearly, though, the media system still has a long way to go.

So what to do?  What would be the best approach to leverage Mali’s Free Press into a developed media system?

Tara Susman-Peña is Director of Research of the Media Map Project.

The trials and tribulations of finding reliable proxies for media development continue! As I discussed in my last post, I’ve been searching for some way to quantify media development by combing through data that other, larger organizations have already collected. I’ve been using the Media Sustainability Index’s 5 criteria as a template. The unfortunate problem with the MSI is that it covers only a limited number of countries, and for a limited number of years. As I discussed in my previous post, this makes conducting statistical analysis difficult.

I thought it would be interesting to share some of the information that I’ve been gathering. I’m very receptive to your comments or ideas, and it might be useful for you to know exactly how hard we are working to get good data! I drew from a variety of datasets, most notably the World Bank Development Indicators, the Global Competitive Survey, the Global Integrity Report, and the Institutional Profiles Database. So, without further ado, I present some tentative proxies that may get at media development.

MSI Criteria 1: Legal and social norms protect and promote free speech and access to public information.

This may be one of the easier criteria to satisfy, as several organizations are interested in measuring legal and social norms. Reliable measurement is always an issue when trying to measure something as amorphous as ‘norms’; however, survey data and expert opinion are really the only feasible ways to get at some of these issues. Most notably, the Freedom House Freedom of the Press index covers most countries for several years and clearly gets right to the heart of free speech. Additionally, their Freedom in the World data analyzes civil and political liberties, which are tied to social norms promoting freedom of speech.

The Global Competitiveness Report measures indicators that may serve as proxies: intellectual property rights, antimonopoly policy, burden of regulation, and auditing and reporting standards. Some of these may also fit into MSI Criteria 4 and 5, but having variables overlap doesn’t lend itself to rigorous analysis, so I keep them here.

The Institutional Profiles Database also covers some of MSI 1. It has measures for freedom of association, freedom of movement of people, and emulation of neighboring countries.

The Global Integrity Report quantifies Rule of Law.

MSI Criteria 2: Journalism meets professional standards of quality.

Unfortunately, I have yet to come across anything that even begins to address this criterion. Journalists’ salaries may be one way of looking at the issue, though that data can be difficult to obtain as well. I have some leads on relevant websites, so stay tuned. Ultimately, this may be the most difficult of the 5 criteria to approximate using other sources.

MSI Criteria 3: Multiple news sources provide citizens with reliable, objective news.

Global Integrity Report has some interesting data here. They ask about ‘credible media sources’ and ‘public access to information.’

The World Bank Development Indicators count how many newspapers there are per 1,000 people, which at least gives us an idea that there are newspapers. They also track internet, radio, and television use. Sadly, they tell us nothing about the quality, or even the number, of news sources.

MSI Criteria 4: Independent media are well-managed businesses, allowing editorial independence.

There’s a lot of information about businesses in general, but less about media as a business. Most of the proxies I found deal with business as a whole; by extension, a good business environment may also mean a good media business environment, although that may be stretching it a little. Perhaps a good way to put fears to rest is to see whether business environment correlates with freedom of speech.

The Global Competitiveness Report asks about ethical behavior of firms, and the procedure to start a business. Additionally, Institutional Profiles Database also asks about the ease of starting a business, and the World Bank Development Indicators has data on the number of new businesses registered and the ease of starting a business. The Global Integrity Report also examines business and regulation.

MSI Criteria 5: Supporting institutions function in the professional interests of independent media.

This last criterion can be taken extremely broadly. I included the Global Competitiveness Report’s ‘Electricity supply’ and the World Bank’s ‘Energy Use’ in this category because stable electricity is an important asset for media development. As most communication travels via radio waves or television (and the internet!), a steady source of electricity may be an important variable. What may be interesting to look at is whether electricity supply actually affects media development, because there may be no causal relation at all.

The Global Competitiveness Report also includes a ‘Technological readiness’ measure, and the World Bank has a measure for Science and Tech Research and Development spending. Again, this may be stretching the idea a bit too far, but there will be time to refine the variables as the project moves along.

Another important institution that supports media development is education. The World Bank has data on literacy rates per country, which may be a useful proxy for the quality of education. Likewise, the Global Competitiveness Survey measures ‘Quality of Education.’ One would hope that high levels of education help lead to professional standards, quality media reporting, AND an interested audience.

More work should be done on finding other supporting institutions. Rule of Law and fair judiciaries may also be proxies to include in this section. As the project develops, some indicators may switch around, and we may throw some out altogether. It’s about the process, right?

Each of these datasets has its own unique benefits and drawbacks. Like the MSI, most do not span every country, and the number of country-years is generally low, meaning the datasets cover only a few years. However, by compiling this list, we’re getting a little closer to having a wide range of data to help us better measure media development. Unfortunately, large-scale, reliable datasets spanning 30 years are hard to find (and where they exist, they may not cover the necessary areas!), so piecing together something useful is a challenging endeavor.
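The compilation problem can be made concrete: when each dataset is a sparse panel keyed by country and year, merging them keeps only the overlap, which shrinks fast. A toy sketch with invented values:

```python
# Two hypothetical indicator panels keyed by (country, year). Only the
# country-years present in BOTH survive an inner merge; sparse coverage
# compounds quickly as more datasets are stacked on.
msi = {("Mali", 2008): 2.1, ("Mali", 2009): 2.3, ("Kenya", 2009): 2.6}
fh  = {("Mali", 2009): 26, ("Kenya", 2009): 61, ("Ghana", 2009): 27}

merged = {key: (msi[key], fh[key]) for key in msi.keys() & fh.keys()}
print(sorted(merged))  # only ("Kenya", 2009) and ("Mali", 2009) remain
```

Each source here covers three country-years, but the merged panel covers only two; add a third sparse dataset and the usable sample shrinks again, which is why a 30-year, all-country compilation is such a challenge.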

Kim Johnson is a Master’s Student at the School of International and Public Affairs, Columbia University.



Internews
Empowering local media worldwide

1640 Rhode Island Ave. NW, 7th Floor
Washington, DC 20036
www.internews.org (w)
+1 202 833-5740 (p)
+1 877 347-1522      
+1 202 833-5745 (f)