4 lessons from Computation + Journalism 2013

Full disclosure: The DeWitt Wallace Center for Media and Democracy, which houses the Reporters’ Lab, was a sponsor of the 2013 Computation + Journalism Symposium.

Maneesh Agrawala, associate professor of electrical and computer engineering at the University of California-Berkeley, and Mo Tamman, Reuters data journalist, discuss the use of exploratory data analysis during a session at the 2013 Computation + Journalism Symposium at Georgia Tech Jan. 31. | Photo courtesy of Georgia Tech

On the surface, Georgia Tech’s Technology Square Research Building may seem like an odd choice for a gathering focused on the future of journalism.

Its steel-and-glass facade is meant to evoke the high-tech work under way inside by faculty and students in the electrical engineering and computer science departments. Yet last week, some of these same researchers joined other technologists, academics and reporters for a discussion about the influence and potential of computational techniques on the journalism industry.

In more than a dozen sessions at the 2013 Computation + Journalism Symposium, speakers covered a range of topics from the origins of “precision journalism” to the implications of Google’s Project Glass. Here were my big takeaways.

1.) Good data journalism applies the scientific method

Phil Meyer, professor emeritus at UNC-Chapel Hill and author of Precision Journalism, told the audience his techniques were about applying the scientific method to the journalism process. | Photo courtesy of Georgia Tech

When Phil Meyer was a reporter covering the race riots in 1967 for Knight newspapers, the world’s fastest supercomputer had around 1/500th the processing power of today’s iPhone.

Yet many of Meyer’s techniques — which eventually won the staff of the Detroit Free Press a Pulitzer — would be recognizable to today’s computer-assisted reporters.

“The beginning of precision journalism wasn’t about computers, it was about applying the scientific method to journalism,” he told the crowd during a conversation with Georgia Tech Professor Irfan Essa.

Meyer, a professor emeritus at UNC-Chapel Hill who authored the seminal textbook Precision Journalism, said this commitment to finding the “truth about the facts” has become even more important in an age overwhelmed by information. By testing hypotheses with data, reporters can become the kinds of sources audiences can trust and build market value along the way.

As Essa and Meyer pointed out, however, it’s not enough to get information into readers’ hands; we have to get it into their heads. To accomplish that, Meyer said data journalists will need to borrow strategies from narrative journalists.

Nick Lemann, dean of Columbia’s Graduate School of Journalism and New Yorker staff writer, reiterated that need in a session later in the day.

Good narrative, Lemann said, has the power to engage audiences in a fundamentally different way than basic news stories — an observation backed up by brain science.

“I’d like to hold the line on narrative,” Lemann said. “Narrative has power over the human mind that’s incredible.”

2.) Acknowledge the value divide between technologists, journalists

Journalists have gotten pretty good at articulating the kinds of tools they want from technologists, be they academic researchers or commercial developers. At the Reporters’ Lab, we’ve even got our own wishlist of sorts.


But communicating the workflow requirements of a better Web scraper or more reliable entity extraction software takes more than just a technical vocabulary. If journalists are to capitalize on the work being done in the fields of computer and information science, we must acknowledge the value divide highlighted by Maurice Tamman, data journalist and editor at Reuters.

“The problem you have in journalism is that you can’t be wrong,” Tamman said. “They’ll sue your ass.”

During his session with UC-Berkeley electrical and computer engineering Professor Maneesh Agrawala, Tamman said journalists must be able to use data analysis techniques to make declarative, accurate statements even with messy, unstructured data. Although developers and computer scientists are sometimes willing to put up with this noise and the accompanying uncertainty because it’s part of the business, journalists have to be more picky.

To hedge against inaccuracy, Tamman tells his team that every data analysis it performs must be done “to the benefit of the defense” before it’s published.

This value divide is far from a deal breaker; it just requires honest conversations and an understanding about what each side expects.

3.) Computational journalism tools need better incentive structures

Duke Professor Jay Hamilton and Emily Bell, director of the Tow Center for Digital Journalism at Columbia, explain the challenges posed by the changes in the media economy. | Photo courtesy of Georgia Tech

There’s a fundamental conversation news organizations are often unwilling or unable to have with those who can build reporting tools: What are journalists willing to give in return?

Academics need funding for graduate students.

Developers need startup capital.

And judging by an industry largely devoid of skunkworks, there’s little evidence to suggest newsrooms are making the investments necessary to create solutions to big reporting problems that drive up the costs of doing business.

However, as UNC-Chapel Hill Professor Ryan Thornburg pointed out on Twitter, there is investment in other areas.

Consider some of the tools suggested at CompJ:

  • In his session with Tow Center Director and Columbia Journalism Professor Emily Bell, Duke Professor Jay Hamilton said he’d like to see Narrative Science use its computational muscle to build and sell a “lead generator” — something that could tell a reporter that public official X has a Y percent chance of being corrupt.
  • Tamman, despite working at one of the most sophisticated data companies in the world, said he’s still looking for a system to automatically extract names from documents, visualize their corresponding social networks and tie in to other structured data.

For tools like these, more page views and increased revenue are only positive externalities: they may lead to — but do not guarantee — higher-quality journalism in the long run.
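Tamman’s wished-for system doesn’t exist off the shelf, but its core idea — pull names out of documents, then link the ones that appear together — can be sketched. The snippet below is a toy illustration, not a newsroom-ready tool: the regex “extractor” is a naive stand-in for real named-entity recognition, and the co-occurrence counts stand in for a proper social-network model. All function names and the sample documents are invented for this example.

```python
from collections import defaultdict
from itertools import combinations
import re

def extract_names(text):
    # Naive stand-in for real named-entity recognition:
    # grab runs of two capitalized words (e.g. "Jane Doe").
    return set(re.findall(r"\b[A-Z][a-z]+ [A-Z][a-z]+\b", text))

def cooccurrence_network(documents):
    # Count how often each pair of names shows up in the same document;
    # the counts become edge weights in a simple social network.
    edges = defaultdict(int)
    for doc in documents:
        for pair in combinations(sorted(extract_names(doc)), 2):
            edges[pair] += 1
    return dict(edges)

# Hypothetical sample documents standing in for a real document dump.
docs = [
    "Jane Doe met John Smith at city hall.",
    "John Smith later emailed Jane Doe and Alice Jones.",
]
network = cooccurrence_network(docs)
# Pairs that co-occur more often get heavier edges — here,
# ("Jane Doe", "John Smith") appears in both documents.
```

A production version would swap the regex for a trained entity extractor and feed the edges into a graph library for visualization — but even this sketch shows why “tie in to other structured data” is the hard part: the names have to be disambiguated and matched before any linking is trustworthy.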

As former New York Times data artist in residence Jer Thorp pointed out during his session with Columbia’s Lemann, newsrooms can’t just expect to hold hackathons and get what they want in exchange for pizza and coffee (although this is a great start).

If we want the kind of software capable of upgrading journalism’s watchdog potential, we need better incentives to entice top developer talent and better collaboration to ensure existing tools don’t gather dust.

4.) Journalism has plenty to offer other disciplines

Fortunately, there are already a few promising experiments with these new incentive structures.

Knight-Mozilla’s OpenNews project has Code Sprint grants available to fund solutions to “specific, repeatable journalistic problems” — a practical example of the “extended hacks” Thorp suggested. There’s also innovative work being done at academic-affiliated projects like the Brown Institute and the Knight Lab (not to mention our own prototypes).

Some of that work was on full display Thursday evening, when researchers, developers and journalists demoed a collection of tools to both tackle journalism problems and enhance the presentation of stories.

These projects, at least to me, are proof positive that the journalism industry has plenty to offer researchers and developers in other disciplines.

According to Lemann, that value may be the ability to engage with the public. To Hamilton, it might be the “positive spillover” that defines watchdog reporting from an economic perspective.

In both cases, we’re talking about the potential to make an impact.

That can be an enticing prospect, regardless of your background. And judging by the attendance at CompJ and its high-tech location, quite a few people agree.

About Tyler Dukes

Tyler Dukes is the managing editor for Reporters' Lab, a project through Duke University's DeWitt Wallace Center for Media and Democracy. Follow him on Twitter as @mtdukes.