How is Artificial Intelligence Impacting Higher Education?
Assistant Dean for Programs and Assessment Sharon Stoerger discusses educational uses of AI, how it is transforming student assessment, and approaches educators are taking to prepare students for STEM careers.

Since November 30, 2022, the day OpenAI launched ChatGPT, educators at institutions of higher education have been concerned about its impact on students and faculty alike. Would students use ChatGPT to write papers? Could ChatGPT be useful in any scenario?

On February 15, 2024, OpenAI provided another jolt to the higher education community when it announced the launch of Sora, which enables users to create impressive videos from just a few text prompts.

To better understand how AI is impacting the higher education landscape, at Rutgers as well as across other American colleges and universities, SC&I spoke with Assistant Dean for Programs and Assessment Sharon Stoerger, who explains the ways AI is presenting both opportunities and challenges to educators.

SC&I: Does AI have the potential to be a valuable tool for educators?
SS: Education policymakers currently face several key struggles in shaping the curriculum, including the need to address varied levels of unfinished learning, concerns about the ethical use of AI, and the potential for AI to exacerbate disparities. This work often requires balancing the competing needs of diverse stakeholder groups. It may involve debates about standardization across courses and programs versus the benefits of an individualized approach that addresses the needs of each student. Given the demand for workplace preparation, finding ways to teach foundational knowledge along with critical skills for future careers is challenging. Much attention has been paid to STEM education, yet as AI evolves, societal demands for financial literacy, digital fluency, and socio-emotional learning are gaining importance.

With a close eye on these demands, educators, in general, are dedicated to preparing students for success in their academic institutions and in the workplace. Technology is changing at great speed, and keeping pace with those advancements can be difficult. We are working to keep the curricula current and aligned with dynamic job markets so that students are prepared for unknown future careers. The jobs our students may move into in the next few years may not currently exist, yet we must prepare them to expect the unexpected. At the same time, we also have to ensure that we are integrating any technology into our curricula in a responsible and ethical manner.

This work requires resources. Budgets are tight, and effectively developing and implementing any curriculum changes requires funding for instructor training, materials, and technology. Equipping instructors with the knowledge and skills to implement a new curriculum demands ongoing support and professional development. Time to do this work is critical, and educators are already juggling multiple demands. As this suggests, time is a very limited resource.

AI does have the potential to be a valuable tool for educators, including ways to maximize their time and boost productivity. There are many ways AI might assist in the education of our students. One is through the analysis of student data to personalize the experience and tailor learning pathways based on individual needs and strengths. AI-powered tools can also assist instructors in creating engaging activities and assignments, as well as providing grading and feedback assistance. AI could likewise be used to develop curriculum by identifying trends, analyzing learning outcomes, and recommending best practices for instructional design. Because these tools can analyze large datasets, AI has the potential to help educators identify gaps in the curriculum, note areas for improvement, and inform curriculum choices.

While AI tools could be beneficial to educators, it is important for us to remember that the technology is not a savior. These tools are not neutral, and the biases present in the data and algorithms can perpetuate inequalities in education. We have seen the problems associated with proctoring tools and the ways in which they put certain student groups at a disadvantage when completing their work. Growing numbers of students struggle with mental health issues, and the use of technologies that work against them can exacerbate these problems, be detrimental to their learning, and derail their future success.

In addition to problematic proctoring technologies, the use of emotion AI raises many red flags. Emotion AI tools are being implemented in selected corporate settings to assess attention and engagement in real time, and the ways this technology could be misused – intentionally or unintentionally – are troubling. We already recognize the problems with AI detectors and the ramifications for students who have been inaccurately accused of academic integrity violations. Even if used with good intentions, the risks of emotion AI and its ilk greatly outweigh the benefits. As a student and as an instructor, I would not feel comfortable being in a class with these tools. Start with some very basic questions: who has access to this information, and how are they using it?

Given that student engagement is a concern on many college campuses, I am certain that emotion AI vendors will be coming to our doorsteps very soon, or are already in talks with ed tech providers. Will this soon become an integrated feature in Zoom, PowerPoint, our learning management system, or other tools? Currently, we have the ability to enable the AI Companion in Zoom. In addition to providing us with a summary of the meeting, does the company plan to add attention tracking and engagement to the report documentation it produces? That could happen soon, with or without any announcement.

Furthermore, questions about privacy and ethics also need to be addressed. Transparency is lacking among the AI companies, and we do not know what data they are using to train their models, nor do we know what they do with our inputs. In addition to teaching students about the content in our curriculum, we also need to provide them with the skills needed to be responsible users of AI.

While AI could assist educators in shaping the curriculum, this technology could also make the work more complex and challenging in unexpected and unintended ways.

SC&I: How might AI change how educators assess student academic performance?
SS: Educational uses of AI present a number of opportunities, as well as challenges. The rapid emergence of recent AI tools has brought to light the need for us to rethink the way we assess students. Assignments and activities that can be completed quickly by AI, such as responding to questions on standardized tests, might not encourage students to activate higher order thinking skills. These forms of assessment may have privileged memorization, recall, and rote learning rather than the application of knowledge in practical ways. Thus, instructors are attempting to move toward authentic assessment where students apply their knowledge to address real-world problems, scenarios, and situations. These applications can be powerful for students. The skills and strategies obtained through these activities can be used beyond the classroom. Not only can this approach discourage misuse of AI, but it can also promote learning and engagement.

Students learn in different ways. To address diverse needs, instructors are providing multiple options for students to demonstrate learning. Examples include presentations, simulations, and other creative projects. These representations put ownership of the learning in the hands of the students, who can adopt an approach that capitalizes on their strengths and/or shores up their weaknesses. Authentic and personalized assessments can provide a more accurate picture of their progress and identify areas for support.

While there are many benefits associated with authentic assessments, they are difficult to scale. They take considerable effort to scaffold, and providing feedback on these types of assessments can be very time-consuming for instructors. AI may encourage more instructors to adopt this assessment strategy, as the technology can assist them in creating these learning opportunities, along with any necessary scaffolding.

Grading authentic assessments in a timely manner can be another challenge. With AI, students may be able to obtain immediate feedback on their work, identify gaps in their knowledge, and have access to additional resources. AI might also open up the possibility of continuous and formative assessment throughout the learning process. The final exam, often a high-stakes assessment, would no longer be needed or relevant. Instead, the focus could be on continuous learning, which could encourage students to identify and develop their own strategies for learning. AI tools could assist in all of these activities and more. It might even enable instructors to view assessment as a pedagogical act that can enhance the relationship they have developed with their students.

Yet, there are strong arguments to be made for using AI with caution. Despite output that can seem magical at times, AI does struggle with certain tasks. Moreover, AI tools can perpetuate biases based on those present in the training data. Using student data for assessments involving AI raises ethical concerns about privacy and transparency. These tools can also hallucinate, promote misinformation, and violate the rights of content creators. In other words, they can do real harm.

Use of these tools involves many unknowns, so human judgment is critical and must be part of any use of AI in our pedagogical practices. Being able to critically evaluate AI tools and their output is key. Further, this technology should complement, not replace, people and their activities. We must be intentional about our use of this technology. This requires careful planning among educators and technologists to ensure that AI-informed assessments are fair, effective, and ethically sound.

SC&I: How can educators best prepare students for careers in STEM given the rapid changes already resulting from AI?
SS: The knowledge economy has emphasized the importance of technology, and we have seen this unfold at our institutions through the demand for STEM subjects. Our students have viewed these subjects as the ticket to lucrative and stable jobs. But in recent days, many tech companies have announced layoffs involving thousands of employees. At the same time, we have seen AI advance at a rapid pace. Students are taking notice of what appear at first blush to be two opposing forces and have been questioning what this means for them and their future careers.

AI will certainly impact the work they thought they might do during their STEM career, but the vision for the types of knowledge and skills they will need to weather the tech-induced changes is unclear. Some have suggested that we are moving into a “relationship economy” where a greater emphasis will be placed on people skills and the humanities, alongside STEM subjects. Regardless of the label placed on this era, we need to tap into the value we, as humans, bring to our work, our creations, those things that cannot be generated by an AI tool in seconds. It is often the human aspects that are missing from technology discussions. At the same time, we need to learn how to work effectively with AI, and these collaboration skills will be highly valued.

AI is on the verge of transforming life for society. As AI evolves, so too will our need to adapt to those changes to survive and thrive. Educators often talk about lifelong learning, and this is going to be a crucial mindset to embrace in a world that is shaped by AI. Many good things could happen, and many bad things could happen. Some of these things could happen tomorrow. Ultimately, we need to imagine AI as a human-centered practice – one that includes conversations about professional ethics.

SC&I: Are other states or universities changing their curriculum in the wake of ChatGPT’s launch in 2022?
SS: It is important to remember that AI is not new, and it got its start at the university. This was before anyone was making – or attempting to make – money with it. Our students are savvy and have been using AI-powered tools for years. In fact, many educators have been using these “older” instances of AI, as well. Commercial tools, like Grammarly and Turnitin, are a couple of examples. When ChatGPT was launched in November 2022, the adoption rate exploded. It was easy to use, and the output was seductive. By December of that year, student-occupied computers at the library often displayed ChatGPT windows on the screens. We knew our students were actively using AI or were tempted to experiment with it. The question of what to do about it became a burning one.

Some states, such as Ohio, have launched AI toolkits to encourage instructors to use AI in the classroom. We are having conversations with our campus community at many different levels: one-on-one conversations with instructors, workshops and presentations, meetings with partners across the institution, and discussions with our national and international colleagues to determine next steps. At the individual instructor level, we have provided statements they can add to their syllabi and course materials to communicate appropriate uses of AI in their courses. Uses range from no AI to selected forms of AI use to full AI. This varies from instructor to instructor and from course to course. For students, this can be confusing, as the policies about appropriate AI use may vary. Thus, we strongly encourage instructors to talk with their students about AI use.

AI is being incorporated into many tools we use on a daily basis. Many of the software tools we use to write papers, present content, and analyze data are or will soon be AI-powered. We are gaining access to AI-powered search engines, like Perplexity, which could transform the way we “google” information. Learning management systems in use on many campuses are leveraging AI to provide a personalized learning experience for students, as well.

Instead of taking a wait-and-see approach, campuses are experimenting with AI technologies in innovative ways; some have even formed partnerships with AI companies. The University of Michigan has joined forces with Microsoft to develop Maizey, its own generative AI. Arizona State University is collaborating with OpenAI, the company that launched ChatGPT, and will use the technology, in part, to augment teaching and learning. Ferris State University has been working on two virtual “students” who are enrolled this spring in selected hybrid courses. In this experiment, the virtual students will complete assignments and activities and interact with their classmates. In the fall, undergraduates at the University of Pennsylvania will be able to major in a new degree program – the Bachelor of Science in Engineering in Artificial Intelligence. This is the first AI undergraduate degree for a school in the Ivy League, and it will likely not be the last.

Many of these conversations surrounding AI are similar to the ones educators were having with the emergence of the internet or even Wikipedia. Today, these technologies are simply tools in our teaching and learning toolkits. In a few years, educators may view AI in the same light, or perhaps we are on a completely different journey. Educators and their students must be part of the trajectory and prepared for whatever the future of AI brings.

Discover more about the School of Communication and Information on the website.

