
Fubon Center Doctoral Fellow Research

Augmenting or Automating?

Breathing Life into the Uncertain Promise of Artificial Intelligence | Kevin Lee


Kevin W. Lee is a PhD candidate in organization theory at NYU Stern. His research concerns the dramatic transformations in work and organizing that we have witnessed across today’s economy. He has paid special attention to how people living through these transformations define what is valuable and important, focusing on how this has informed their ability to let go of the past and embrace an uncertain future.

Organizations developing artificial intelligence (AI) have enormous power and influence over the future we collectively face. While many affected communities – whether organized around race, gender, or class – often have had no say in AI’s design, one type of community has had distinctive access to and presence within technology development organizations: occupational communities. Members’ experience with and expertise in their communities’ work uniquely qualify them to build AI technologies that can perform that work. Yet little study has explored how these people relate to and approach developing AI technologies. Past scholarship has predominantly assumed that occupational communities are a source of solidarity, and that members will push back against threats to their community’s work and the value of its craft. The literature thereby suggests that such people will work to ensure that the technologies they are developing will not automate, or substitute for, their communities, and will instead augment, or complement, them.

However, my investigation of these people surfaced findings that were puzzling given this theoretical backdrop. I have been following a set of developers within an organization building an AI that composes music, all of whom have primarily identified as members of the occupational community their technology will affect: music composers. To understand their lives within their organization – a startup I have anonymized as Reverb – I have been using ethnographic methods, an approach with roots in anthropology in which the researcher comes to understand a group’s culture by spending time with its members and experiencing their way of life first-hand. To understand their lives within the broader occupational community, I have been comparing how they talk about it with how music composers unaffiliated with the company talk about it in my in-depth interviews with them, asking primarily about their work and what is at stake with the advance of technologies like AI. In so doing, I have been able to study how the developers navigate the tensions that arise between their memberships in their organization and in their community.
Through such study, I have discovered that Reverb’s developers, though they initially acted in ways consistent with what the literature might predict, eventually diverged from these behaviors. At first, they set out to ensure alignment between their organization and their community, working to make sure that their AI technology would augment music composers rather than automate them. To do so, the developers engaged in what I call “reflexive imagining”: they filtered prospective technological futures and features through the lens of what they might want from the technology themselves, assuming that their own background as composers gave them insight into what their broader community might value and want. Specifically, they appreciated technologies that “collaborated” with them by doing work they did not want to do, freeing up time for more valued work instead, and assumed that their community would want the same from the AI they were developing. These beliefs were selectively consistent with what the community actually believed: while not representative of everyone, they drew on a set of established and prevalent beliefs about the role technologies should play in compositional work.

However, though Reverb’s developers had professed a dedication to augmenting their community, they and their organization eventually shifted toward automating these people, creating misalignment between what their organization intended the technology to do and what they had felt their community might want. Feeling increasing pressure from investors to produce a return on investment, Reverb discovered that composers were not using its AI and proposed targeting video content producers instead, therein “competing” with the market for human-composed stock music: a form of music used extensively in the background of videos, and from which some composers derived artistic value. And while this conflicted with the developers’ initial intentions, they justified their organization’s shift as still aligned with what their community wanted. Again engaging in “reflexive imagining,” they devalued stock music, arguing that it was work that neither they nor their community wanted to do, and that automating it would allow composers to do more valued work. They thus positioned it in terms consistent with how they had always talked about work their technology could defensibly take over, restoring alignment between their organization’s technology and what their community would want. These beliefs, again, were only selectively consistent with what their community actually believed.

My study has implications for how occupational communities may chart the future of work, especially when positioned within organizations developing technologies that can threaten their craft. It reintroduces the age-old insight that work may be differentially valued across a community – that some types of work are considered more valuable than others – and places this insight in the context of how we decide what to hold onto and what to let go of on the frontier of the future. In particular, my study uncovers how unvalued work may be a source of vulnerability in occupational communities, suggesting that its automation may not elicit the kind of resistance we have come to expect from occupations facing disruptive technologies. Moreover, my study indicates that we need to look closely at who represents occupational communities within organizations, and at how and to what extent their values align with what the community cherishes. Occupations are “imagined” communities: members relate to them primarily through their own local experiences, and the resulting imaginations of what these communities might want, shaped by those experiences, may not represent the desires of all their members. Looking closely at what those within organizations value will be crucial to understanding who and what within their communities they will be willing to leave behind: an important part of forecasting what futures they will accommodate and produce.