
Dear Commons Community,
Artificial-intelligence tools — including generative AI — will now be integrated into Canvas, a learning-management platform used by a large share of the nation’s colleges, its parent company announced on Wednesday.
On the Canvas platform, faculty members will be able to click an icon that connects them with various AI features aimed at easing instructional workloads, such as a grading tool, a discussion-post summarizer, and a generator for image alternative text.
Canvas’s parent company, Instructure, has also partnered with OpenAI, the maker of ChatGPT, allowing instructors to use generative-AI technology as part of their assignments.
The announcement comes amid a still-tense debate about AI’s place in the classroom. While many instructors in academe are skeptical of the technology, some universities have embraced it; starting in the fall, for instance, Ohio State University will require all its graduates to be “AI fluent,” as reported by The Chronicle of Higher Education.
“We firmly believe that AI will not replace educators, but educators actually need to understand how to use these tools,” said Ryan Lufkin, vice president for global academic strategy at Instructure. “We’ve moved beyond the age where educators can simply not use technology in the classroom. The modern student expects it; they’re on their phone all the time. We need to meet them where they are, and if we’re not doing that, we’re failing them, essentially.”
The discussion summarizer can synthesize all discussion posts submitted in a class for the instructor to review. The feature also includes a model that can evaluate a student’s submission against a rubric and give a score. And a rubric generator can “take the existing content and create a first pass of a rubric,” said Lufkin.
A search tool lets users look up keywords or terms to find where they appear in lecture notes or PowerPoint slides — something that the University of Central Florida had already been building, Lufkin said. Instructure’s AI software, IgniteAI, also contains features that can create model quizzes or flashcards for students to review.
And there’s a Super SmartGrader feature that can provide first-pass feedback on submitted assignments.
Instructors can choose to create assignments paired with existing large language models, including Gemini and Microsoft Copilot. Educators can instruct the LLM to adopt a persona, such as a key historical figure the class is studying, that students can then converse with in a chatbot. Instructors can monitor and view a transcript of those conversations, and the software will provide a summary of the interaction that includes analytics on source usage and time spent on the assignment, key learning indicators, and areas for improvement. The software can also offer suggested feedback.
Students have continued to expand their AI usage; a recent study estimated that around 79 percent of students use generative AI as a learning aid. But Instructure officials say they are thinking about how best to integrate AI into the instructional side amid fears that faculty jobs are at risk.
Part of the necessity for keeping a “human in the loop” is the unreliability of AI: it can make up, or “hallucinate,” information, or make incorrect and incomplete statements. It is vital, therefore, for educators to manually check AI-generated feedback and make adjustments, Lufkin said.
But “because faculty are overworked, their behaviors may or may not match the expectations” of learning-management platforms that have integrated AI, said Dylan Ruediger, principal for the research enterprise at Ithaka S+R. Since early 2023, his team has been studying higher education’s adoption of generative AI in teaching, learning, and research.
“We have seen a lot through the research we’ve done that institutions don’t necessarily have a good handle on how people are actually using the tools that are available to them,” Ruediger said. “This may help because it sounds like they can track it now, but it also is difficult to know exactly what is useful about this technology when you don’t know how people are using it. So it’s hard to know whether it’s filling a demand or creating one.”
Edward Watson, who heads the Institute on AI, Pedagogy, and the Curriculum at the American Association of Colleges and Universities, was more bullish about the Canvas updates. The tools might alleviate heavy workloads that lead to grading fatigue and make grades more consistent, Watson said.
All of the new Canvas features are optional. Instructors can opt out of using AI entirely. But as AI becomes more integrated into the educational experience, “it’s likely going to get more difficult for faculty and students to understand when they’re using AI and when they’re not, to establish or maintain policies and boundaries around it,” said Ruediger.
Some worry that integrating AI into the grading and course-planning process might eliminate the need for adjunct instructors and could expand class sizes. Ruediger noted that in online courses, there is already “a level of technology mediating between instructors and their students,” a gap that might only widen and make it more difficult for students to know if their instructor has personally evaluated their work.
But by assisting the grading process, AI could allow instructors to reinvest their time in directly mentoring students, Watson said.
In a conversation with a group of students in December, Watson said, one student recounted a professor telling the class that they graded papers from 7 p.m. to 2 a.m. The student hoped her paper had been graded in the earlier hours, before the professor grew exhausted or was simply ready to be done.
“How might we re-envision what it means to be a professor,” Watson asked, “in an era where AI might take over some of the more tedious administrative parts of the course?”
This is an important step forward in integrating AI into teaching and learning.
Tony