WORKSHOP: Media2012 & ScraperWiki / Hacks and Hackers (2011)
Since the phone-hacking scandal engulfed Rupert Murdoch’s news empire, the methods journalists use to find stories have been brought into question. Luckily, a new set of tools exists to help ambitious newshounds break new ground without getting on the wrong side of the law. ScraperWiki is one such tool: an online platform for gathering and sorting data so that journalists, researchers and curious citizens can better understand the issues of the day. This workshop served as an introduction to ‘scraping’ and focused on the forthcoming Olympic Games.
ScraperWiki is a place where a community of programmers comes together to sift information. It is for people who know that ‘copy and paste’ from the web does not scale. As a data intermediary platform, it transforms source information into a structured database that can be interrogated, analysed, mapped and scheduled to refresh automatically as the source changes.
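To make the idea concrete, here is a minimal sketch of what a scraper does: parse structured information out of raw HTML and load it into a queryable database. This is an illustration using Python's standard library only, not ScraperWiki's own API, and the Olympic venue data below is invented for the example.

```python
import sqlite3
from html.parser import HTMLParser

# Hypothetical snippet standing in for a fetched web page (no network call
# here); a real scraper would first download the page with urllib or similar.
PAGE = """
<table>
  <tr><td>Archery</td><td>Lord's Cricket Ground</td></tr>
  <tr><td>Rowing</td><td>Eton Dorney</td></tr>
</table>
"""

class TableScraper(HTMLParser):
    """Collects the text of each <td> cell, grouped into table rows."""
    def __init__(self):
        super().__init__()
        self.rows = []        # finished rows, as tuples
        self._row = []        # cells of the row being read
        self._in_td = False   # are we currently inside a <td>?

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag == "td":
            self._in_td = True

    def handle_endtag(self, tag):
        if tag == "td":
            self._in_td = False
        elif tag == "tr" and self._row:
            self.rows.append(tuple(self._row))

    def handle_data(self, data):
        if self._in_td:
            self._row.append(data.strip())

parser = TableScraper()
parser.feed(PAGE)

# Store the scraped rows in a structured, queryable database --
# the step that 'copy and paste' never gives you.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE venues (sport TEXT, venue TEXT)")
db.executemany("INSERT INTO venues VALUES (?, ?)", parser.rows)
for sport, venue in db.execute("SELECT * FROM venues ORDER BY sport"):
    print(sport, "->", venue)
```

Once the data sits in a database rather than on a web page, the interrogation, analysis and mapping described above become ordinary queries; scheduling the fetch-and-parse step to rerun is what keeps the database in step with the source.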
Supported by MANET