Expertpedia

Newsroom staffs have shrunk by one-third over the past decade, even as competition with leaner, more profitable websites has intensified. The remaining reporters are forced to produce more stories with less time to prepare for them.

Reporters increasingly find themselves covering unfamiliar topics, raising a real risk of errors, misleading stories or the kind of pack journalism in which everyone interviews the same source regardless of relevance or real expertise.

University PR departments and think tanks also prey on overworked reporters. The most aggressive pounce on major news stories, offering reporters interviews with their faculty whether or not they are relevant to the story at hand. Harried reporters, pressed for time, often accept.

My project, Expertpedia, aims to be a newsroom tool that helps reporters quickly locate the most relevant experts on a given topic.


1. Users would enter a search phrase on the site. Expertpedia would then run it through Google Scholar. The author of the first result to pop up, the most-cited academic paper, would become the first person on our list.

2. That person’s name would then be rerun through Google Scholar along with the original search phrase to call up all the relevant academic papers he/she has written on the topic.

3. The author’s name would then be run through a Google News search to surface op-eds and analyses he/she may have written in the mainstream media.

4. The site would then trawl university websites to pull up the author’s contact information.

5. There would also be space for authors to be rated by Expertpedia users on their ability to explain their ideas well on TV, radio and in print. (A sketch of how these five steps might chain together follows the list.)
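To make the flow concrete, here is a minimal Python sketch of how the five steps might chain together. Everything in it is an assumption for illustration: search_scholar, search_news and find_university_contact are invented placeholders for whatever Google Scholar/News scraping or API layer the real tool would use, and the Expert record is just one way the results might be organized.

```python
# Hypothetical sketch of the Expertpedia pipeline. The three search
# helpers are invented placeholders, not real APIs.
from dataclasses import dataclass, field


@dataclass
class Expert:
    name: str
    papers: list = field(default_factory=list)   # step 2: Scholar hits
    press: list = field(default_factory=list)    # step 3: op-eds, analyses
    contact: str = ""                            # step 4: university page
    ratings: list = field(default_factory=list)  # step 5: user ratings


def search_scholar(query: str) -> list[dict]:
    """Placeholder: return Google Scholar hits for `query`, each as a
    dict like {'title': ..., 'authors': [...], 'citations': ...}."""
    raise NotImplementedError


def search_news(query: str) -> list[dict]:
    """Placeholder: return Google News hits for `query`."""
    raise NotImplementedError


def find_university_contact(name: str) -> str:
    """Placeholder: trawl university websites for `name`'s contact info."""
    raise NotImplementedError


def build_expert(query: str) -> Expert:
    # Step 1: take the lead author of the top (most-cited) Scholar result.
    top_hit = search_scholar(query)[0]
    expert = Expert(name=top_hit["authors"][0])
    # Step 2: rerun Scholar with the author's name plus the original query.
    expert.papers = search_scholar(f'author:"{expert.name}" {query}')
    # Step 3: pull mainstream-media op-eds and analyses via Google News.
    expert.press = search_news(f'"{expert.name}" {query}')
    # Step 4: look up contact information on university websites.
    expert.contact = find_university_contact(expert.name)
    return expert
```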

[Mockup 2]

Expertpedia is not a database that needs curating or updating, like some other websites that aim to bring experts to journalists. Instead, it operates more like a search engine.

There are some problems here. One is timeframe: a paper about engine failures in WWI-era fighter planes might have more citations simply because it has been out longer than newer ones (one possible mitigation is sketched below). Also, some scientific papers have more than a dozen authors, leaving no obvious single expert to pick in step 1.
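One possible mitigation for the timeframe problem, offered only as a sketch rather than a settled design: rank papers by citations per year instead of raw citation counts, so an old paper can’t win purely on age. The citation_rate function is hypothetical.

```python
# Hypothetical age-normalized ranking: citations per year since
# publication, so a long-lived paper doesn't outrank newer work
# just by accumulating citations for decades.
from datetime import date


def citation_rate(citations: int, year_published: int) -> float:
    """Citations per year since publication (minimum age of one year)."""
    age = max(date.today().year - year_published, 1)
    return citations / age


# As of 2025: a WWI-era paper with 500 citations scores
# citation_rate(500, 1925) = 5.0, while a recent paper with 120
# citations scores citation_rate(120, 2020) = 24.0 and ranks first.
```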

Using Google News to find op-eds would only turn up recent ones, not older but still possibly relevant pieces.

Also, how would we get contact information for people who have retired from academia? How would the site deal with people who share common names, and could bad ratings for one academic accidentally tarnish another? How do we prevent PR departments from gaming the ratings system? (One possible safeguard against the name-collision problem is sketched below.)
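One possible safeguard against the name-collision problem, again just a sketch: key every rating to a (name, affiliation) pair rather than to the name alone, so scores for one “J. Smith” can’t bleed onto another. The rate function and the in-memory store are hypothetical.

```python
# Hypothetical ratings store keyed by (name, affiliation) so that
# academics who share a name keep separate rating histories.
from collections import defaultdict

# (name, affiliation) -> list of 1-5 scores from Expertpedia users
ratings: dict[tuple[str, str], list[int]] = defaultdict(list)


def rate(name: str, affiliation: str, score: int) -> None:
    """Record a 1-5 rating against a specific (name, affiliation) pair."""
    if not 1 <= score <= 5:
        raise ValueError("score must be between 1 and 5")
    ratings[(name, affiliation)].append(score)


# Two different J. Smiths stay separate:
rate("J. Smith", "MIT", 5)
rate("J. Smith", "Ohio State University", 2)
```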

And the initial Google Scholar searches are not always relevant. The fourth entry on a search for plane crash experts is somehow Mercer Mayer, the author of such children’s books as “Frog, Where Are You?” (A crude filter for this problem is sketched below.)
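A crude first pass at this relevance problem, sketched under the assumption that the Scholar-scraping layer can attach an author affiliation to each hit: drop results whose lead author has no academic affiliation on record, which would filter out a children’s-book author while keeping university researchers.

```python
# Hypothetical relevance filter: keep only Scholar hits whose lead
# author has an academic-looking affiliation. The 'affiliation' field
# is assumed to come from the (placeholder) scraping layer.
ACADEMIC_HINTS = ("university", "institute", "college", "school", "lab")


def looks_academic(affiliation: str) -> bool:
    return any(hint in affiliation.lower() for hint in ACADEMIC_HINTS)


def filter_hits(hits: list[dict]) -> list[dict]:
    """Drop hits with no academic affiliation on the lead author."""
    return [h for h in hits if looks_academic(h.get("affiliation", ""))]
```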


Nonetheless, I am confident these hurdles could be overcome in development, leaving reporters with a valuable time-saving tool and the public with better, more relevant information.

2 thoughts on “Expertpedia”

  1. Hi Ravi, please let me know when it is ready to test. I would like to contribute to your testing.

  2. Ravi, I like the idea a great deal. As we discussed in class, other efforts have entered the space, but they’ve quickly been co-opted by PR forces – see Profnet. I think this is a tool that’s really designed with reporters in mind. As a result, it raises questions of who will end up paying for it – I can’t imagine reporters being willing to pay much, but perhaps if it becomes part of the suite of tools any professional newsroom needs, there’s a revenue stream there. I think you would want to be careful about letting sources pay, as that’s likely to taint the network in one way or another.

    There are countless questions about how an algorithm would work best. I think you took some good first steps in showing how you could retrieve pieces of an entry – the really interesting next step would be to assemble some manual queries for a couple of real-world requests – i.e., pick a few breaking stories, think about who you’d want as a source, and walk through a process to run Google Scholar and Google News searches to surface a list of experts, then see how your system did in meeting your needs. The next step is to do the same thing for someone else and react to his/her feedback. All of this will end up shaping the prototype and wireframes, and bringing you closer to something you can propose for funding.

    I think this is something that’s a few turns of the crank away from something fundable, but I think you could get closer by tuning the algorithm yourself, manually, before turning it into software. Would be excited to see this go further.
