Computational Journalism


Introduction

Success in modern journalism demands both media literacy and digital literacy. In some ways, computational journalism is the ultimate combination of modern data and technology with traditional journalistic techniques for gathering and disseminating information. Computational journalists immerse themselves in numbers, data, and digital tools, and they use highly developed skills to draw out and make sense of the stories inside them, often surfacing insights into systemic problems.

According to media scholar Nicholas Diakopoulos, computational journalism is “the application of computing and computational thinking to the activities of journalism, including information gathering, organization and sensemaking, communication and presentation, and dissemination and public response to news information, all while upholding core values of journalism such as accuracy and verifiability.”

Although this still-developing branch of journalism includes what is known as “computer-assisted reporting,” computational journalism is much more expansive than merely applying technology to the journalistic process. Computational journalists use computers and their vast affordances to analyze information: aggregating it, automating its collection and processing, drawing correlations within it, and applying a variety of other techniques. For this reason, computational journalism shares a number of subfields with computer science, including information retrieval, content analysis, visualization, and personalization. Computational journalists themselves are usually skilled in numerous programming languages and familiar with a variety of data analysis software. These journalists also use algorithms to automate journalistic processes or decrease the workloads and costs associated with them.

Computational journalists also employ what is known as “computational thinking,” or the ability to think mathematically and strategically in order to efficiently and creatively solve, automate, or otherwise rethink problems. Journalists in this subfield are highly digitally literate, and they use computational thinking to create and apply tools that make journalism more efficient and make audiences more invested. A day in the life of a computational journalist might include scraping government sites for public records data, creating a script to analyze those data for key patterns, and then applying programming and design skills to a data visualization or digital tool that makes sense of the big picture for news audiences.
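To make that day-in-the-life workflow concrete, here is a minimal sketch in Python of the scrape-analyze-summarize pattern described above. The URL, page structure, and column layout are hypothetical placeholders rather than a real government site; an actual project would adapt the parsing to the specific source and respect its terms of use.

# A minimal sketch of the scrape-analyze-summarize workflow described above.
# The URL and HTML structure are hypothetical; a real public-records site
# would require its own parsing logic (and a check of its terms of use).
import csv
from collections import Counter

import requests
from bs4 import BeautifulSoup

RECORDS_URL = "https://example.gov/public-records/evictions"  # hypothetical endpoint


def scrape_records(url: str) -> list[dict]:
    """Download a page of public records and parse each table row into a dict."""
    response = requests.get(url, timeout=30)
    response.raise_for_status()
    soup = BeautifulSoup(response.text, "html.parser")
    records = []
    for row in soup.select("table#records tr")[1:]:  # skip the header row (assumed layout)
        cells = [cell.get_text(strip=True) for cell in row.find_all("td")]
        if len(cells) >= 3:
            records.append({"date": cells[0], "zip_code": cells[1], "case_type": cells[2]})
    return records


def summarize(records: list[dict]) -> Counter:
    """Count filings per ZIP code, a simple pattern a reporter might investigate."""
    return Counter(record["zip_code"] for record in records)


def export_for_visualization(counts: Counter, path: str = "filings_by_zip.csv") -> None:
    """Write the summary to CSV so it can feed a chart or interactive graphic."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["zip_code", "filings"])
        writer.writerows(counts.most_common())


if __name__ == "__main__":
    counts = summarize(scrape_records(RECORDS_URL))
    export_for_visualization(counts)
    print(counts.most_common(5))  # the five ZIP codes with the most filings

Running a script like this would produce a small CSV file of filings per ZIP code, the kind of summary a reporter might hand off to a graphics desk for a map or chart.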

Recent examples of data and computational journalism include coverage of corruption within secret offshore financial organizations, an examination of the link between race and city infrastructure in the U.S., and a national look at evictions during the COVID-19 pandemic. All of these stories involved gathering a substantial amount of data and, perhaps more importantly, analyzing, organizing, and presenting that information using approaches that thoughtfully blend journalism with other fields. Notably, appreciation of data journalism and its value to newsrooms and news audiences has grown in recent years, driving increased demand for data journalists at local and national news outlets.

Computational journalism comes with the same high ethical standards as traditional journalism, such as verification and transparency, to ensure that the process of gathering, analyzing, and disseminating information to the public is accurate and inclusive. Because it often relies on making sense of large datasets, data journalism is a notably collaborative subfield. Journalists often create datasets, scripts, and other tools of the trade, and then share those resources openly with one another and with the public. For example, The New York Times launched a short in-house program to teach its journalists data skills and later published the course materials online as an open resource. You can also find the actual code powering projects by outlets ranging from FiveThirtyEight to The Marshall Project to The Washington Post on the code-sharing platform GitHub.

Key Takeaways

  • Computational journalism covers both “the application of computing and computational thinking” to various journalistic activities, including information gathering, sensemaking, and information dissemination. Such an approach maintains an allegiance to core journalistic values like verification and transparency.

  • Computational journalists are often skilled in numerous programming languages and familiar with a variety of data analysis software. However, such journalists will often work in teams with non-programmers to produce high-impact journalism.

  • Data journalism is a growing and particularly collaborative genre of journalism. Journalists often share (or make open-source) datasets, scripts, and other tools of the trade with each other and with the public.
