Crowdsourcing of Research Information Selected Sites Professor Beth Evans Brooklyn College Library May 17, 2013

Upload: myspacelibrarian

Post on 16-Dec-2014



TRANSCRIPT

Page 1: Crowdsourcing Site Screenshots

Crowdsourcing of Research Information

Selected Sites
Professor Beth Evans
Brooklyn College Library
May 17, 2013

Page 2

Users Add Content

Page 3

“In New York, this harnessing of collective wisdom was on display on October 22 at the New York Public Library for the Performing Arts, where the public was invited to contribute to Wikipedia’s articles on musical theater. The six-hour event, which NYPL called “Wikipedia! The Musical!” attracted dozens of people who made use of the library’s special collections to add and update entries to the open-source site. At any given moment between noon and 6 p.m., an average of 20 people plugged away at their laptops and researched materials from the library’s closed-stacks collection of newspaper clippings, videotapes, manuscripts, correspondence, sheet music, stage designs, programs, posters, and photographs.”

http://americanlibrariesmagazine.org/news/11012011/libraries-tap-crowd-power

http://editathon.org/

http://www.nypl.org/events/programs/2011/10/22/wikipedia-musical

Page 4

http://www.projectnoah.org/

http://eol.org

Page 5

Users Work With Content to Produce New Content

Page 6

“Faced with a tough data analysis challenge as he struggled to answer questions about how the immune system works, Dr. Ramy Arnaout of Beth Israel Deaconess Medical Center took an unusual step. He went beyond his circle of Harvard colleagues and beyond the expertise of fellow biologists; he turned to software programmers scattered around the world who had little expertise in the life sciences.

The result: A deeply biological problem — analyzing the makeup of genes that produce proteins involved in the immune system’s ability to identify microbes — could be rapidly and efficiently answered by a community of more than 400,000 computer programmers who try to solve competitive coding challenges posted on TopCoder, a platform used by big companies such as Google, Intel, and Facebook.

The case study offers evidence that academic research may benefit from adopting approaches more often employed by businesses, which already use prizes and crowdsourcing to solve basic problems that stymie their own research teams.”

http://www.bostonglobe.com/business/2013/02/11/crowdsourcing-innovation-harvard-study-suggests-prizes-can-spur-scientific-problem-solving/JxDkOkuIKboRjWAoJpM0OK/story.html

https://www.innocentive.com/

http://www.topcoder.com/

Page 7

Christian Terwiesch, “An Introduction to Operations Management”
https://www.coursera.org/course/operations
Student users solve problems presented in a MOOC on Operations Management. Their solutions, in turn, are added to the course as examples and enhance the course offered through the University of Pennsylvania.

Page 8

Reader Sourcing
http://www.readersourcing.org/
A conceptual model, not put into practice on this particular site. Reader Sourcing proposes using reader opinions in place of referees to vet scholarly papers. Each reviewer would be rated, “so that judgments from those who have proven to be good reviewers do count more than those who should not be trusted. Such a rating is implicitly and dynamically generated by the system, through the continuous comparison of the judgments expressed by the readers on each paper with its current score; providing - or having provided - correct (wrong) judgments will therefore lead to higher (lower) reader ratings, hopefully generating a virtuous circle.”
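The feedback loop Reader Sourcing describes could be sketched roughly as follows. This is an illustrative assumption, not the site's published model: the class name, the 0–10 score scale, and the multiplicative rating update are all invented here to show the idea of weighting judgments by reader trust and adjusting that trust by agreement with a paper's current score.

```python
class ReaderSourcing:
    """Toy sketch of a reader-rated review system (hypothetical model)."""

    def __init__(self):
        self.ratings = {}    # reader -> trust weight, starts at 1.0
        self.judgments = {}  # paper -> list of (reader, score on a 0-10 scale)

    def paper_score(self, paper):
        """Current score: the judgments weighted by each reader's rating."""
        votes = self.judgments.get(paper, [])
        if not votes:
            return None
        total = sum(self.ratings[r] * s for r, s in votes)
        weight = sum(self.ratings[r] for r, _ in votes)
        return total / weight

    def judge(self, reader, paper, score):
        """Record a judgment, then update the reader's rating by comparing
        the judgment against the paper's score before this vote."""
        self.ratings.setdefault(reader, 1.0)
        current = self.paper_score(paper)
        self.judgments.setdefault(paper, []).append((reader, score))
        if current is not None:
            # Close agreement raises the rating; disagreement lowers it,
            # so trusted readers gradually count for more.
            error = abs(score - current) / 10.0
            self.ratings[reader] = max(0.1, self.ratings[reader] * (1.1 - error))
```

A reader who consistently agrees with where a paper's score converges ends up with a higher weight than one who votes against the consensus, which is the "virtuous circle" the quoted passage hopes for.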

Page 9

Users Transcribe, Decode or Translate Content

Page 10

http://www.ucl.ac.uk/Bentham-Project/

http://menus.nypl.org/

Page 11

http://www.digitalkoot.fi/index_en.htm
Digitalkoot was a joint project run by the National Library of Finland and Microtask to index and make searchable the library's archives of Finnish history. Where automation failed to recognize individual words in scanned documents, volunteers made the fixes by playing the games Mole Hunt and Mole Bridge.
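The underlying workflow can be sketched as a crowd-consensus step: an uncertain OCR word is shown to several players, and a correction is accepted once enough independent answers agree. This is an assumption about how such a pipeline typically works, not Microtask's actual implementation; the function name and thresholds are invented for illustration.

```python
from collections import Counter

def accept_correction(answers, min_votes=3, min_agreement=0.6):
    """Return the word the crowd agrees on, or None if players
    have not yet converged on a correction.

    answers: the strings typed by different players for one OCR word.
    min_votes: minimum number of independent answers required.
    min_agreement: fraction of answers that must match the top answer.
    """
    if len(answers) < min_votes:
        return None
    word, count = Counter(answers).most_common(1)[0]
    return word if count / len(answers) >= min_agreement else None
```

Requiring both a minimum number of votes and a minimum agreement fraction guards against a single player's typo being written back into the index.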

http://perspektiv.bf.dk/Nyheder_BPR/2011/FagligeTermer.aspx

Librarians in special libraries in Denmark have set up a LinkedIn group where members try to translate library jargon into more comprehensible language. Many of the group members need to explain the workings of classification or indexing to a boss or co-workers in a company where they are the only library professionals.

Page 12

http://transcribe.lib.warwick.ac.uk/yt/index.php/Main_Page

“We are looking for volunteers to translate the documents from Yiddish to English.”