Dr. Kyusong Lee
Postdoc, Language Technologies Institute, CMU
- Tuesday, October 30, 2018 - 12:00pm
- Gates Hillman 6501
Robust testing of (spoken) dialog systems requires large numbers of real users who can try the system out. Crowdsourcing offers a way to recruit such users, but it demands considerable time and effort from designers, who must familiarize themselves with the platforms and learn how to design and implement suitable tests. Researchers (the requesters) must also overcome several difficulties: connecting the dialog systems under assessment to the crowdsourcing platform, paying workers a fair wage, assessing the quality of the workers' output, and obtaining solid final results. DialCrowd makes system assessment through crowdsourcing easier by providing tools, templates, analytics, and quality controls that allow designers and researchers to quickly implement high-quality, effective recruitment and testing. This 'crowdsourcing for dummies' approach saves research time and resources and helps ensure high quality in research and results.
Lunch will be provided. If you plan to attend, please sign up here:
For the Crowdsourcing Lunch Seminar, please go to:
To subscribe to the Crowdsourcing Lunch Seminar mailing list: