Summer Schooled on Human-Centered AI

Posted August 6, 2019

The following story, written by Neha Kumar, an assistant professor in the Sam Nunn School of International Affairs and the School of Interactive Computing, was originally posted on Medium. The original article can be found here.

The summer school on Expanding the Horizons of Human-Centered AI (HCAI), sponsored by the Association for Computing Machinery's Special Interest Group on Computer-Human Interaction (ACM SIGCHI) and held at the India Habitat Centre in New Delhi, concluded on July 27. In the school's last four days, we journeyed through several diverse perspectives on AI, on human-centeredness in AI, and on the challenges and opportunities that designing for human-centered AI entails. These perspectives emerged from, and evolved throughout, 16 thought-provoking sessions on topics spanning AI Systems and Design, Human-Centered Machine Learning (HCML), AI & Social Good, and Critical Perspectives on HCAI.

The presenters in these 16 sessions came from diverse disciplinary backgrounds and various institutional commitments. There were researchers from Microsoft Research (Jacki O’Neill, Kalika Bali, Amit Sharma, and Mohit Jain), IBM Research (Danish Contractor), and Google Research (Divy Thakkar); representatives from non-profits such as Haiyya, Love Matters India, and MakerGhat; academics from universities in India (Naveen Bagalkot from Srishti, Janaki Srinivasan from IIIT-Bangalore, and Aaditeshwar Seth from IIT Delhi) and the United States (Rosa Arriaga and Munmun De Choudhury from Georgia Tech, and Tapan Parikh from Cornell); and representatives of research institutes focused on AI and social good/policy, such as Wadhwani AI (Jerome White) and Tandem Research (Urvashi Aneja).

In addition to these presenters, there were approximately 50 participants, who came from institutions all over, including IIIT-Delhi, Georgia Tech, the Centre for Internet and Society, Microsoft Research, the University of Washington, and more.

This incredible diversity among participants made for a wildly stimulating summer school, challenging several beliefs and assumptions, inviting endless questions, and filling one with the confidence that as long as we have so many brilliant, talented, and thoughtful minds able and willing to ask the hard questions, there is hope for our future(s).

Each day featured a 30-minute “Madness” session, in which participants were asked to introduce themselves to the room using one slide describing where they had come from (their backgrounds and current affiliations) and where they wished to go (for example, exploring the value of HCAI in menstrual health education). The questions they shared ranged from the complex, such as “How can different communities (designers, researchers, social scientists, industry) come together in praxis to design inclusive, responsible AI?”, to the provocative, such as “Do we need AI?”, and they led to many rich discussions.

Not all, perhaps not even some, of the participants’ questions were answered to anyone’s satisfaction, but that was not our goal to begin with. As the days progressed, we hoped that participants would become adept at digging deeper, learning to ask more nuanced and complex questions, particularly in their own areas of work and walks of life.

What came across throughout was the stance that HCAI technologies must be designed in user-centered ways, but that these users are not only the direct or immediate users of systems; they also include those impacted in other, less immediate ways, for example people who are excluded from the data infrastructures that AI technologies draw on, whether for social, economic, cultural, or political reasons. Even as we focus on taking ecological perspectives to design, we must remain conscious of tendencies towards technological fetishism, as markets inevitably have their way with us.

Taking a holistic view of who and where the human(s) in HCAI are, we learned to recognize and question the larger systems in which these humans are embedded, and to investigate their relationships (both vertical and horizontal) with others within these complex systems. We learned to question the distance and impact of these relationships as well, as we experienced for ourselves how hard it is to design for ourselves, and why it is even harder (albeit in a very different way) to design for others far removed from us. These design activities brought us to imagine the futures we are progressing towards, and the ones we might attempt to shape and influence in our varying capacities and affiliations. They also cautioned us to remain conscious of who we speak for, and where we speak from.

It was helpful to ground the questions these exercises raised in concrete examples presented by several speakers, such as the community networks of Gram Vaani, the pest detection work in cotton farming that Wadhwani AI is exploring, and chatbots in health and agricultural contexts. These examples emphasized the importance of factoring in diverse stakeholder perspectives, but they also brought us to discuss more practical questions, such as how ethnography and participatory design perspectives might be empowered to shape such interventions, and how gender diversity might be one very obvious place to begin. Our focus in these four days may have been on projects and research taking place in the Global South, but the discussions we had were inarguably relevant to the HCAI discourse emerging in Northern locations as well.

Where we, the #xhcai summer school participants, go from here, only time will tell. It is clear, though, that there were many conversations participants were burning to engage in more deeply. Even I have not had sufficient time to reflect on all the ways in which the boundaries of my mind have been stretched in these few days. I must add that a summer school truly offers an incredible format and a safe space in which to question everything, and to find common ground with others so very different from ourselves.

I am confident that many horizons were expanded, even if we began by struggling to define what human-centeredness meant for us all in the first place. I am also quite convinced that the dimensions along which they were expanded may take their time to emerge, but emerge they will, as participants — individually or (hopefully) in collaboration — work towards translating the lessons they have learned into their own diverse contexts.

Related Media

Neha Kumar with participants of the Association for Computing Machinery's Special Interest Group on Computer-Human Interaction's 2019 summer school program in New Delhi.

Contact For More Information

Rebecca Keane
Director of Communications
rebecca.keane@iac.gatech.edu
404.894.1720