A.I. and Academic Honesty

Image: A.I. as the new HCI.

A.I. Machine Learning. Large Language Models (LLMs).

These technical terms are no longer news to many practitioners of higher education. Why? Because A.I. (“AI”) is already having a huge impact on classrooms and learning activities. In 2023, students gained free and easy access to AI-powered tools like ChatGPT to complete writing assignments and related coursework.

There are already dozens of AI-driven writing applications. Soon there will be thousands. What does this mean for the future of learning?

AI is already transforming the classroom.

The use and misuse of AI tools is now part of the classroom conversation. There is no avoiding it. In some classes, using AI may play an important part in coursework; in others it may be frowned upon or expressly forbidden. Unless there are campus-wide policies (which seem unlikely at the moment), each faculty member will determine what is best for a particular subject.

Of course, many learning applications already use machine learning to create learning experiences: personalized and adaptive learning systems offer students different learning paths, automated assessment and analytics support teacher feedback, and software always promises administrative efficiencies. But AI writing tools are having a much greater impact, and more immediately. Why? Because AI-influenced writing defies our old methods of assessment.

If a student turns in a well-organized and seemingly well-written assignment, we might assume they spent time on it. But no longer. If they used an AI writing tool, the assignment may have been largely fabricated by AI. This is the basis of a skeptic’s stance toward allowing AI: How can I know what (or how) the student really thinks?

However, a faculty proponent of AI might respond: Using AI to think and write is the future, and students need to practice using it. For this faculty member, reading and reviewing AI-influenced writing is part of the job. A “traditionalist,” on the other hand, may see AI-influenced writing as obscuring the possibility of accurate assessment. How can the concerns of both perspectives be assuaged? The answer seems to lie in knowing exactly what is being assessed; viewed this way, using AI tools may not be so different from using and citing any other research aid.

Academic honesty is key.

In this environment, faculty expectations for academic honesty will take on heightened importance. In fact, I think we will get to the point (sooner rather than later!) where degrees are accompanied by something akin to an “honesty rank.” Just as a credit score rates creditworthiness, an honesty rank will assign a rating to a student’s knowledge worthiness.

Perhaps ironically, it will be AI-driven tools that help faculty assess students’ knowledge worthiness and rank their honesty against their peers. As a feature of an academic credential, this will help academic communities and employers better understand an individual’s educational background and their ability to use both their mind and the AI-driven tools available to them.

This is just one way I can imagine educators adapting to this new landscape of writing, researching, and answer-seeking. The conversation on campus is really just beginning, and it is an exciting time for charting the future of learning. ChatGPT offers us the following reminder:

While these potential transformations are promising, it’s essential to approach the integration of AI in higher education with careful consideration of ethical, privacy, and security implications to ensure the responsible and effective use of these technologies.

– ChatGPT 3.5

(Now, that’s not wrong, but is it really apropos?)