Massimo Poesio just sent me a pointer to the following awesome web application:
- Phrase Detectives (the AnaWiki annotation game)
Annotation Game
Annotation games (aka “games with a purpose”) were popularized by Luis von Ahn’s ESP game, though I first heard about them through David Stork’s Open Mind project.
Unlike Mechanical Turk, the games approach tries to make the task somewhat fun and competitive. Making the users “detectives” seems like a thin veneer of “fun”, but they maintain the metaphor beautifully throughout the whole site, so it works.
As with many games, Phrase Detectives pays out in leader board bragging rights and cash prizes rather than directly for work completed as on Mechanical Turk.
Phrase Detective Tasks
The really brilliant part is how they break the coref annotation task into four easy-to-answer questions about a single highlighted phrase.
- Not Mentioned Before: yes/no question as to whether the referent of the highlighted phrase was previously mentioned in the text
- Mentioned Before: highlight previous mention of a given phrase
- Non-Referring: pick out non-referential noun phrases (like the expletive “there” in “there is trouble brewing”)
- Property of Another Phrase: pick out other phrases that describe someone already mentioned (e.g. attributives or appositives)
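The four tasks above amount to a small annotation schema: every answer is a label on a highlighted phrase, and two of the tasks additionally point back at an earlier span. A minimal sketch in Python (the names and span representation are my assumptions, not Phrase Detectives’ actual data model):

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional, Tuple

class TaskType(Enum):
    NOT_MENTIONED_BEFORE = "not-mentioned-before"  # yes/no: is this a first mention?
    MENTIONED_BEFORE = "mentioned-before"          # highlight the previous mention
    NON_REFERRING = "non-referring"                # expletives like "there"
    PROPERTY_OF_ANOTHER = "property-of-another"    # attributives/appositives

@dataclass
class Annotation:
    task: TaskType
    phrase_span: Tuple[int, int]                   # (start, end) offsets of the highlighted phrase
    # Only MENTIONED_BEFORE and PROPERTY_OF_ANOTHER point back at an earlier span.
    antecedent_span: Optional[Tuple[int, int]] = None

# A first-mention answer needs no antecedent; a back-reference records one.
first = Annotation(TaskType.NOT_MENTIONED_BEFORE, (0, 8))
link = Annotation(TaskType.MENTIONED_BEFORE, (42, 45), antecedent_span=(0, 8))
```

Chaining the Mentioned Before links transitively is what recovers the coreference clusters from these per-phrase answers.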
The site also has nice, clean, easy-to-follow graphics, and appears to still have users after two years.
Adjudication Phase
OK, they call it the “Detectives Conference”, but the idea is that you get to vote yes/no on whether someone else’s answer is right. This is a good idea, widely used on Mechanical Turk, because it’s easier to check someone’s work than to create it from scratch.
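The check-someone’s-work pattern boils down to simple vote aggregation. A minimal sketch (the minimum-vote count and majority threshold here are my assumptions, not the site’s actual rules):

```python
def adjudicate(votes, min_votes=3):
    """Accept an answer if a strict majority of at least min_votes voters said yes.

    votes: list of booleans, one per reviewer.
    Returns True (accept), False (reject), or None (not enough judgments yet).
    """
    if len(votes) < min_votes:
        return None
    yes = sum(votes)
    return yes / len(votes) > 0.5

adjudicate([True, True, False, True])  # → True (3 of 4 agree)
adjudicate([True, False])              # → None (too few votes so far)
```

The asymmetry this exploits is the usual one: producing an annotation requires reading and searching the whole preceding text, while verifying one requires only checking a single proposed link.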
Read All About It
It was developed by academics, so there are at least as many papers as contributors:
- Chamberlain, Jon, Massimo Poesio and Udo Kruschwitz. 2008. Phrase Detectives: A Web-based Collaborative Annotation Game. I-KNOW.
- Chamberlain, Poesio and Kruschwitz. 2008. Addressing the Resource Bottleneck to Create Large-Scale Annotated Texts. ACL workshop.
- Poesio, Kruschwitz and Chamberlain. 2008. ANAWIKI: Creating anaphorically annotated resources through web cooperation. LREC.
- Kruschwitz, Chamberlain and Poesio. 2009. (Linguistic) Science Through Web Collaboration in the ANAWIKI Project.
Coreference Annotation
There are “expert”-annotated within-document coref corpora for the MUC 7 and ACE 2005 evaluations (available from LDC, who charge an arm and a leg for this stuff, especially for commercial rights).
LingPipe does within-document coreference and we’ve worked on cross-document coreference.
More Like This
As soon as you find one of these things, you find more. Check out:
- Sentiment Quiz (Facebook app to do sentiment annotation)
I’d love to hear about more of these if anyone knows any.