Reflections on Agile Research University

Kurt Luther (left) and students from Northwestern University’s Design, Technology, and Research (DTR) course

Since our Crowd Lab builds and studies crowdsourcing systems, I often think about how our research might apply to ourselves. Research labs can have surprising overlap with online labor markets like Amazon Mechanical Turk and Upwork, in both positive and less flattering ways. Shared characteristics might include transient workers, mixed levels of experience, diverse motivations, technology-mediated interactions, and even geographic distribution (my own group is split across two campuses).

I became more aware of these similarities as my own lab grew. After a few grants came through around the same time, I was suddenly fortunate to have funding to recruit more grad students, and quickly. Yet I was also turning away good candidates for undergraduate research because I felt I was at capacity advising five or even ten students per semester. These are good problems to have, but I wanted to make the most of my opportunities. I wondered: How might crowdsourcing research help me scale up my “crowd”?

Fortunately, some exciting new research suggested answers. Other crowdsourcing researchers, namely Haoqi Zhang and Michael Bernstein, have formalized their lab-building efforts into successful research enterprises, like Agile Research Studios (ARS) and the Stanford Crowd Research Collective, whose findings benefit the broader academic community.

When Haoqi and colleagues published their first paper on ARS at CSCW 2017, I was excited to adapt some of their techniques for my own group. I had already begun integrating agile methods into our project meetings and hired a part-time scrum master. ARS proposed a more comprehensive computational ecosystem, with a strong emphasis on both research and education. I was intrigued, and over the summer and fall of 2017, our lab became more ARS-like.

During this time, Haoqi and I had a few email and in-person conversations about how I was fitting ARS into the unique context of my lab and university. He invited me to visit Northwestern as part of his Agile Research University program. It was an opportunity to see firsthand how his students and colleagues put ARS into practice. I gratefully accepted.

Over a few days in November, Haoqi and his Delta Lab co-directors, Liz Gerber, Matt Easterday, and Nell O’Rourke, immersed me in the ARS experience. I attended lab meetings and project meetings, sat in on the interdisciplinary Design, Technology, and Research (DTR) undergraduate course, participated in pair research sessions, and asked lots and lots of questions. It was an intense, but illuminating experience.

I arrived at Northwestern feeling like I had a pretty solid grasp of the ARS process. I’d read the paper multiple times, implemented some of its components, and discussed it with Haoqi and his PhD student, Leesha Maliakal. However, I found that being there in person allowed for a much deeper understanding.

While Haoqi’s paper clearly outlined the mechanics of ARS, it’s harder to describe the culture that Delta Lab members and the DTR course cultivated. I can at least share some observations that impressed me. I watched students enthusiastically partner up with others whom they didn’t know well, but were eager to help. I saw students vigorously, yet politely, debating the pros and cons of a design alternative. Students bought lunch for their classmates and reflected together on their progress, with remarkable courage and vulnerability.

I discovered that Haoqi and his colleagues had made many small, but important decisions to foster this culture at their institution. Lab meetings begin with playful, positive exchanges. Breathing exercises during class breaks build trust and encourage reflection among students. Haoqi lets students borrow his office for meetings, and its spare, configurable furniture serves a wide variety of needs. File cabinets become impromptu seats, and students gather around an external monitor for a quick demo or critique.

I also noticed how the design decisions enabling ARS, both curricular and environmental, mutually reinforced one another. Students in ARS perform well because they’ve been selected from a large pool and passed a rigorous interview process, but this screening process is scalable because it’s delegated to students. Students tackle ambitious projects requiring diverse expertise because DTR is an interdisciplinary course that draws from multiple majors. Students benefit from peer mentoring and organizational memory because they get course credit for enrolling multiple semesters.

During breaks and over meals, Haoqi and I talked about ways I could bring some of what I’d seen and learned back to my lab at Virginia Tech. We agreed that adaptation was essential. Each institution and group of researchers is unique, and brings its own affordances and constraints. We also agreed that a gradual ramping-up was the best way to proceed, so we talked about prioritizing different pieces of the ARS model.

I left inspired and with detailed plans for the next semester, which I’ve recently implemented. These included a more flexible use of physical space, a shift in meeting structures to encourage undergraduate mentoring, and new digital resources for helping students with long-term planning. We’ll approach this semester’s changes as we have in the past: as an experiment, subject to iterative design and formative evaluation. Based on our past success with ARS techniques, and my experience visiting Northwestern, I’m optimistic.

With their focus on culture as well as process, attention to detail, and systems thinking, Haoqi and his team have created something special. I look forward to capturing some of that magic and building new connections between my crowdsourcing research and my own work practice.

Paper accepted for CHI 2018

CHI 2018 logo

Our full paper on CrowdLayout, a system that uses crowdsourcing to design better layouts of biological network visualizations, was accepted for the CHI 2018 conference in Montreal, Canada. The acceptance rate for this top-tier HCI conference was 26% (of 2,590 submissions). Congrats to Crowd Lab alumni Divit Singh and Lee Lisle, and collaborator Dr. T.M. Murali, on this accomplishment.

Notable Paper Award at HCOMP 2017

Adam Kalai, Jeff Nichols, and Steven Dow presenting Notable Paper Award to Kurt Luther at HCOMP 2017

Our paper on crowdsourced image geolocation and diagramming won the Notable Paper Award at HCOMP 2017. Congrats to Crowd Lab alumni Rachel Kohler and John Purviance, co-authors of the paper, on this recognition. In the photo above, Dr. Luther receives the award certificate on behalf of his co-authors from Adam Kalai and Steven Dow (HCOMP 2017 co-chairs) and Jeff Nichols (Awards committee).

You can read the award-winning paper, Supporting Image Geolocation with Diagramming and Crowdsourcing, in the online proceedings.

Second GroupSight workshop at HCOMP 2017

People sitting around a poster board with sticky notes on it

Dr. Luther, along with Danna Gurari (UT Austin), Genevieve Patterson (Brown University and Microsoft Research New England), and Steve Branson (Caltech), co-organized the second GroupSight Workshop on Human Computation for Image and Video Analysis at HCOMP 2017.

The workshop featured two keynote speakers, Meredith Ringel Morris (Microsoft Research) and Walter Lasecki (University of Michigan), along with seven paper presentations, a poster session, break-out groups, and a sponsored lunch. At the conclusion of the workshop, Dr. Luther handed out the Best Paper and Best Paper Runner-Up awards.

For more details, check out the Follow the Crowd blog post written by Dr. Luther. There will also be a short write-up in AI Magazine.

GraphSpace published in Bioinformatics journal


Our work on GraphSpace, an online hub where scientists can share network data, was published in Bioinformatics journal. The title of the article is, GraphSpace: stimulating interdisciplinary collaborations in network biology. The authors include Dr. Luther, his collaborator Dr. T.M. Murali, and Crowd Lab alum Divit Singh.

Here’s the abstract for the article:

Networks have become ubiquitous in systems biology. Visualization is a crucial component in their analysis. However, collaborations within research teams in network biology are hampered by software systems that are either specific to a computational algorithm, create visualizations that are not biologically meaningful, or have limited features for sharing networks and visualizations. We present GraphSpace, a web-based platform that fosters team science by allowing collaborating research groups to easily store, interact with, layout and share networks.

This work was partly funded by our NIH Big Data to Knowledge grant.

Paper accepted for GroupSight workshop at HCOMP 2017

Our paper on GroundTruth, a system that allows experts to collaborate with crowds on image geolocation tasks, was accepted for the second GroupSight workshop at HCOMP 2017. Congratulations to Crowd Lab alumni Rachel Kohler and John Purviance, co-authors on the accepted paper.

Here’s the abstract for the paper:

Geolocation, the process of identifying the specific location where a photo or video was taken, is an important task in verifying evidence for investigations in journalism, national security, human rights, and other domains. However, experts typically perform geolocation work as a time-consuming, manual process. This paper introduces GroundTruth, a web-based system that leverages the powerful vision system of crowd workers to support experts in image geolocation tasks. We describe the technical contributions of GroundTruth and present preliminary results from an evaluation with expert geolocators and novice crowds.

For details, please check out the paper and the corresponding video.