Will Artificial General Intelligence Replace Me?

We have been having conversations with each other and with colleagues for the last few months about what Artificial General Intelligence will mean for higher education. The picture is opaque at best, but we at least want to lay out some pieces to initiate the conversation. So far, Ray Schroeder has written a few pieces for Inside Higher Ed that appear to be nearly the entire corpus of thought on the topic. We appreciate his engagement with the topic, but also recognize there is a lot more to think through.

Definitions. This is a contested space. Most commentators and researchers approximate AGI as a system with intelligence at or near human cognition across the majority of human knowledge work. When we imagine this sort of intelligence, we tend to anticipate it being economically viable rather than a single system costing millions of dollars to run; AGI is not just an achievement of computation but also of efficiency. Some people reject the term altogether, or are critical of the discourse around the concept, but we will use the most common definition as a starting place for considering what roles, if any, higher education will play in a world where affordable AGI exists.

What will students need to know? It will be different than it is today in almost every area. We are already seeing movement in spaces like computer coding, accounting, and law as employers anticipate having more of the work done by AI or an AI-empowered workforce. It is easy to scale this up to other professions like business development, marketing, social work, and almost everything with writing or content production at its core. It is less clear in other areas like health care and education. We can certainly anticipate AGI as an enhancer of diagnostic work and evaluation in health care and of K-12 education, providing the kind of individualized tutoring that drives learning gains.

There are still humans in this picture, but they are probably working differently. In conversations about the future of coding work, some researchers point toward skills like clear communication becoming more important than the technical aspects of coding: we will need to tell models what we want more than we will need to know how to do the work ourselves.

It is possible to scale this observation out and expect that more fundamental skills like critical thinking, ethics, and communication will become central in an AGI world. We can also imagine that, at least in the short term, some disciplinary knowledge will remain important. For instance, if you want to work with an AGI to develop new therapeutics for a disease, it will be important that you, not just the AGI, understand what has already been tried. Otherwise we will be like mathematicians working in isolation, completing proofs over the course of a lifetime that were actually resolved centuries before in a book we didn't read.

How will they learn it? We think this is the more difficult question for us, because it might not involve us. Personally, I am really good at teaching this senior-level course on Freedom of Speech. I have developed some expertise in the space despite not having a legal background and have improved my strategies for helping students express themselves in writing about a difficult topic area. Am I better at teaching the class than a cohort of elite constitutional scholars, new media experts, and embodiments of the founding fathers? Probably not. That is the kind of dynamic made possible by AGI, which will be able to aggregate multiple fields of knowledge from different time periods and customize an instruction protocol for each individual student.

I record videos for students about content, but also videos that explain assignments, identify common trouble spots, and pull together threads from across the semester. How does that compare with an AGI that creates content for each individual student, catering to their experiences and level of understanding, and drawing on examples relevant to them from a profile aggregated from their work in the class and from the open internet? There are a few possibilities.

  1. We are still in the picture, but have a different role. If we are present in this space, we are reviewing AI-generated curriculum and instruction as well as AI feedback on student work. We are like a course supervisor overseeing the most qualified assistants imaginable.

  2. We are the assistants. With AGI instruction, it is possible humans will fill niche cases and provide the human interaction needed to make sure learning is happening. In this view, part of learning is having another human who cares about you and your development.

  3. We are not in the picture at all. AGI builds curriculum, assessments, and instructional aids, and evaluates student progress, providing the instantaneous feedback and always-available presence that students have often desired and we have (understandably) never been able to provide.

We have a set of recommendations for this uncertain future so that institutions can position themselves well. 

  1. Start the conversation on your campus. These conversations will look different at regional comprehensives, research institutions, and community colleges, but all of us need to think about how we fit and what roles we might play.

  2. Set your institution up for rapid change. During COVID, some teaching and learning changes that normally took months or years were reduced to a single email exchange with the respective college dean. Making this sort of agility a permanent feature of higher education equips us to change quickly when the moment calls for it, and that will be more important than ever as we approach a world with AGI.

  3. Strengthen relationships with off-campus partners. We are already seeing changes in what employers are looking for and in what participation in civic life looks like. If we want to stay in front of the "what do they need to learn?" question, this needs to be an ongoing dialogue rather than an occasional check-in.

  4. Finally, and this is the hardest one for academics: we need to check our own hubris in these conversations. It is already on display, as every workshop I run has someone in it who is sure that the current version of AI cannot do their "thing" because it is so special. Rarely is that the case, and it will likely never be the case in an AGI world. Higher education institutions are collections of some of the smartest and most creative people in the world; we have to harness this attribute rather than letting it hold us back.
