6 things you need to know about AI for the Fall

By Zach Justus and Nik Janos

According to the Pew Research Center, as of May 2023 only 14% of Americans had used ChatGPT. Awareness skews higher among those with more formal education, but overall exposure is still quite low. In light of this reality, we are offering a guide to what you need to know if you are headed into the Fall 2023 semester with little exposure to artificial intelligence in higher education. We will also publish a follow-up on what you can do right now to prepare yourself.

1. Generative Artificial Intelligence is here, it is powerful, and it can do a lot of the things we have been asking students to do. We started by having ChatGPT respond to quick-write-style prompts, but we have since been exposed to the broad capacity of these programs. They can code, create slide decks, edit photos and videos, solve complex math and science problems, and complete multiple-choice tests. With adequate prompting, they can even write from specific human perspectives. The more advanced version of ChatGPT performed quite well on a series of standardized tests.

[Image: a robot in a classroom]

2. Students are already using these programs to complete homework and a variety of other tasks. Commentators widely expect usage to expand significantly as ChatGPT is integrated into Microsoft Office products and Bard into the Google suite. Ignoring these technologies is not a viable response.

3. Use of AI programs is an in-demand job skill. Many of the careers we have been preparing students for now list AI prompting and use as a preferred skill, if not a requirement. The ChatGPT Report podcast often reminds listeners, “AI is not coming for your job, but someone who knows how to use AI is coming for your job.” There do seem to be some exceptions where large groups of workers are being laid off and replaced with AI, but at a minimum we need to prepare our students to exist in this new world.

4. The AI detection software we have access to is pretty terrible. We have a series of YouTube videos about some of these products. The broad consensus is that these tools do not produce actionable information: they often fail to detect AI-written work and also falsely flag student work as AI-generated. We are preparing a more robust rundown of these technologies.

5. Development of these tools is unpredictable. GPT-4 was released only a few months after the GPT-3.5-powered ChatGPT launched, and the difference was staggering. That said, there is some recent evidence that the program is getting worse at some specific tasks. It is impossible to know what this landscape will look like in six months.

6. There are legitimate privacy concerns with all of these platforms. Some institutions are barring instructors from requiring their use because how the data can be used is opaque.

We highlight ongoing developments on this site. We also recommend Ethan Mollick's blog (higher-ed focused) and The ChatGPT Report podcast (more generalist). We look forward to hearing from and working with you as these developments unfold.
