Carnegie Mellon University at CSCW 2017
Carnegie Mellon University's Human-Computer Interaction Institute is headed to Portland, Oregon, for the 20th ACM Conference on Computer-Supported Cooperative Work and Social Computing. CSCW is the premier venue for presenting research on the design and use of technologies that affect groups, organizations, communities, and networks. HCII faculty and students will be presenting four papers, including two Honorable Mentions, during the 2017 CSCW conference.
[[{"fid":"3101","view_mode":"preview","type":"media","link_text":null,"fields":{},"attributes":{"alt":"Honorable Mention Icon","class":"panopoly-image-thumbnail media-element file-preview"}}]]Honorable Mention: Shaping Pro and Anti-Social Behavior on Twitch Through Moderation and Example-Setting
HCII authors include Ph.D. student Joseph R. Seering, Newell-Simon Professor of Human-Computer Interaction Robert E. Kraut, and HCII and H. John Heinz III College Associate Professor Laura A. Dabbish.
This paper reviews the potential for online communities to be supportive, cruel, or anywhere in between. The authors analyzed millions of messages sent on the video streaming platform Twitch in an effort to determine how effective different techniques are at encouraging positive behavior and discouraging negative behavior. Their research showed clear patterns of imitation and deterrence, as well as the influence of certain types of users over others. Read the paper for a detailed review of this research project and the practical applications of the authors' results.
[[{"fid":"3101","view_mode":"preview","type":"media","link_text":null,"fields":{},"attributes":{"alt":"Honorable Mention Icon","class":"panopoly-image-thumbnail media-element file-preview"}}]]Honorable Mention: Fruitful Feedback: Positive Affective Language and Source Anonymity Improve Critique Reception and Work Outcomes
HCII authors include Associate Professor Laura A. Dabbish and Ph.D. student Felicia Ng, as well as former HCII faculty member Steven P. Dow, now of UC San Diego.
Receiving feedback isn't always the most enjoyable experience. In this paper, the authors study online feedback, including the tone of the language and the power relationship between the source and the receiver, to provide design implications for future platforms. The authors conducted an online experiment in which they manipulated the language and source of feedback on a writing task. They found that positive language reduced annoyance and frustration and improved post-feedback work quality. Likewise, anonymous feedback was preferred to feedback from either a peer or an authority figure. To learn more about the online experiment and recommendations for future feedback systems, read the complete paper.
HCII Associate Professor Laura Dabbish and HCII Project Scientist and visiting scholar Mariah Tomprou partnered with other researchers on this collective intelligence paper.
Collective intelligence is a group's capacity to perform tasks and is a key factor in successful collaboration. Dabbish, Tomprou and their team used physiological synchrony as an indicator of coordination and rapport in collective intelligence for computer-mediated teams. In this project, 60 dyads completed the Test of Collective Intelligence while wearing physiological sensors. The researchers found that synchrony in facial expressions was associated with collective intelligence, and that synchrony in electrodermal activity was indicative of group satisfaction. View the results in detail and the implications for online collaboration in the full paper.
Team Dating Leads to Better Online Ad Hoc Collaborations
HCII authors include Newell-Simon Professor of Human-Computer Interaction Robert Kraut and former HCII faculty member Steven Dow.
Dating within a team is often frowned upon, but could team dating lead to better online collaborations? That's what Kraut, Dow and Ioanna Lykourentzou of the Luxembourg Institute of Science and Technology wanted to know in their project, "Team Dating Leads to Better Online Ad Hoc Collaborations." The researchers studied team dating through two online experiments. In the first experiment, workers from a crowd platform completed a task independently, discussed it with three other workers, and evaluated their team date interactions. Workers then selected their preferred teammates based on average ratings of these interactions. In the second experiment, the researchers replicated the individual and team tasks and created teams either by honoring pairwise preferences or by partnering workers at random. The results show that teams performed better when formed according to workers' preferences than when randomly assigned. For more information about the implications for ad hoc collaboration, read the full paper.
More Papers From Carnegie Mellon University:
The Impact of Assistive Technology on Communication Quality Between Deaf and Hearing Individuals
Authors: Jan Gugenheimer, Ulm University; Katrin Plaumann, Ulm University; Florian Schaub, Carnegie Mellon University; Patrizia Di Campli San Vito, Ulm University; Saskia Duck, Ulm University; Melanie Rabus, Ulm University; Enrico Rukzio, Institute of Media Informatics
Photo Sharing in the Arab Gulf: Expressing the Collective and Autonomous Selves
Authors: Norah Abokhodair, University of Washington; Adam Hodges, Carnegie Mellon University Qatar; Sarah Vieweg, Qatar Computing Research Institute
[[{"fid":"3101","view_mode":"preview","type":"media","link_text":null,"fields":{},"attributes":{"alt":"Honorable Mention Icon","class":"panopoly-image-thumbnail media-element file-preview"}}]]Algorithmic Mediation in Group Decisions: Fairness Perceptions of Algorithmically Mediated vs. Discussion-Based Social Division
Authors: Min Kyung Lee, Carnegie Mellon University; Su Baykal, Google Inc.
Girls Rule, Boys Drool: Extracting Semantic and Affective Stereotypes from Twitter
Authors: Kenneth Joseph, Northeastern University; Wei Wei, Carnegie Mellon University; Kathleen M. Carley, Carnegie Mellon University
Putting the Pieces Back Together Again: Contest Webs for Large-Scale Problem Solving
Authors: Thomas W. Malone, MIT; Jeffrey V. Nickerson, Stevens Institute of Technology; Robert J. Laubacher, MIT; Laur H. Fisher, MIT; Patrick de Boer, University of Zurich; Yue Han, Stevens Institute of Technology; W. Ben Towne, Carnegie Mellon University
Crowd Guilds: Worker-led Reputation and Feedback on Crowdsourcing Platforms
Authors: Mark E. Whiting, Carnegie Mellon University; Dilrukshi Gamage, University of Moratuwa; Snehalkumar (Neil) S. Gaikwad, Massachusetts Institute of Technology; Aaron Gilbee, Independent Researcher; Shirish Goyal, Stanford University; Alipta Ballav, Independent Researcher; Dinesh Majeti, University of Houston; Nalin Chhibber, Independent Researcher; Rajan Vaish, Stanford University; Michael S. Bernstein, Stanford University
What Makes a Strong Team? Using Collective Intelligence to Predict Team Performance in League of Legends
Authors: Young Ji Kim, Massachusetts Institute of Technology; David Engel, Massachusetts Institute of Technology; Anita Williams Woolley, Carnegie Mellon University; Jeffrey Yu-Ting Lin, Magic Leap; Naomi McArthur, Riot Games; Thomas W. Malone, Massachusetts Institute of Technology