Blog | June 15, 2016 | Indra de Lanerolle
The phrase “It’s not about the tech” has become a truism in the overlapping landscapes of ICT for Development (ICTD), ‘civic tech’ and ‘technology for transparency and accountability’ (T4T&A) - as well as neighbouring fields of social and technical innovation.
It embodies a number of appealing ideas: first, that anyone working towards social change should focus on the social — on the people who are meant to benefit by that change; and second, that however powerful technologies may seem, human agency matters — choices are made by people, not the machines they invent.
The phrase may also have gained support as a corrective to some technology evangelists' over-enthusiasm and overly optimistic promises about the impact of technological innovation.
But a truism should be true.
Sometimes it really is about the tech
To question it, we titled our recent report on tool selection ‘Sometimes, it is about the tech’. In the course of our research, we interviewed thirty-eight people in Kenya and South Africa who had been responsible for choosing technologies when implementing transparency and accountability initiatives. We were interested in the processes they went through in choosing the particular tools that they used.
We found that, in many cases, people are not choosing the tech well and are not choosing the right tech. Interviewees recognised, in retrospect, the importance of the technology choices they had made - though many also said that at the time, they were much less aware of how much those choices might affect the outcomes of their initiatives.
Most would have chosen differently if they were in the same position again.
“Technology is neither good nor bad; nor is it neutral”
We found many examples of the ways in which digital tools, and choices of tool, affect the success of ‘civic tech’ initiatives. Certainly it matters how technologies are used. But each specific tool also embodies assumptions and ideas in its design that affect how it is, and can be, used.
This is not a new discovery. As the technology historian Melvin Kranzberg put it: “Technology is neither good nor bad; nor is it neutral”. Lawrence Lessig made a similar case when he argued that: “Code is Law”. He pointed out that software — along with laws, social norms and markets — can regulate individual and social behaviour. Laws can make seatbelt use compulsory, but car design can make it difficult or impossible to start a car without a seatbelt on.
Technology choices in initiatives that aim to improve transparency and accountability may have powerful effects. If an initiative uses WhatsApp as a channel for citizens to report corruption, the messages will be strongly ‘end-to-end’ encrypted - regulating the behaviour of governments or other actors if they seek to read those messages (though metadata about who messaged whom, and when, will still be visible). If an initiative uses Facebook Messenger, content will not be encrypted in the same way. Such decisions could affect the risks users face and could also affect their willingness to use a tool. Other applications - YouTube and Vimeo, for example - may differ in how much data they consume, which can increase costs for users and, in turn, affect use.
The initiatives’ purposes and the tools in our study varied greatly. Some were focused on mobile or online corruption reporting, others on public service monitoring, open government data publishing, complaints systems or public data mapping and budget tracking. The tools they used included ‘off the shelf’ software, open source software developed within the ‘civic tech’ community, bespoke software created specifically for the initiatives as well as popular social media platforms.
Across this variety we found that fewer than a quarter of the organisations were happy with the tools they had chosen.
Many encountered technical issues that made their tool hard to use for its purpose, while half the organisations discovered that their intended users did not use the tools to the extent that they had hoped (a trend that was often linked to specific attributes of the tool).
De-coding tools can be challenging
Choosing between the many technologies available is not always easy, though. Differences are not transparent, and the effects of those differences - and their relevance to the aims of the initiative - may be uncertain. Many of the people we spoke to had very limited technical knowledge, experience or skills, and this constrained their ability to de-code the differences between options.
One of the most common frustrations interviewees reported was that the intended users didn’t use the tool they had developed. This uptake failure is common not only in the ‘civic tech’ fields we examined in our research. It has been noted since at least the 1990s in the business world as well as in the development world. In the IT departments of large corporations, ‘change management’ techniques were introduced to answer this problem - changing the work practices of employees to adapt to the introduction of new technologies. In ‘civic tech’, however, the users are rarely employees who can be instructed or even trained. The tech choices must be adapted to the people we hope will use them, not the other way around.
Try before you buy
So what should those working in the field do about improving tool selection? We developed six ‘rules’ for better tool choices grounded in our research findings. Possibly the most important recommendation we make is to urge people to test or ‘trial’ technologies before making a final selection. This might seem obvious, but in our sample, it was rarely done.
Testing in the field offers the opportunity to explore how a specific technology and a specific group of people interact. It often brings issues to the surface that are initially far from obvious, and exposes explicit or implicit assumptions about a technology and about its intended users. Most importantly, it focuses our thinking on where tools belong: in the hands of people. Human agency does matter. As our research suggests, people are at least as complex and surprising as any of the tools they have built. But tools are built by people too.
Find an online guide to tool selection here.
About the author
Indra de Lanerolle runs the Network Society Lab, University of Witwatersrand, Johannesburg, South Africa
This blog was originally posted on the Making All Voices Count website