2025 brought even more urgency to the question, “What is the role of educators when it comes to AI?” In early 2025, students reported that AI was creating uncertainty about their future job prospects and noted a lack of formal AI integration in their studies (Risepoint, 2025). Even as students voiced the need to understand AI, they wasted no time experimenting with it.
A 2025 study published in Education and Information Technologies examined university students’ attitudes toward AI and indicated that more than half of the students in the study (61%) used AI, with 93.8% using ChatGPT (Morell-Mengual et al., 2025). Students in the study cited inaccurate or irrelevant responses in the AI output, which surfaces two issues: 1) the need to help learners interpret and validate AI output, and 2) the need to design assessments so that instructors are not devoting time to determining whether output came from the student or from AI.
As complaints about student use of AI continue to grow, newer and more sophisticated AI tools that support specialized academic work continue to arrive on the scene. Whether students want to leverage AI as a study partner or accelerate their scholarly work with AI-powered research assistants, 2025 has highlighted the need to understand the role of a professor in today’s AI-enabled world (Risepoint, 2025).
So perhaps the question is not “what is the role of educators when it comes to AI?” Perhaps the question is “how do we support students as they build the bridge between conceptual knowledge and application in an AI-enabled world?” The answer requires each of us to act as a learner-centered, AI-savvy professor, a role whose definition has changed dramatically since late 2023 as AI technology and adoption strategies have evolved. Instead of wondering what your next step is, here are three practical 2026 goals that busy professors like you can use to get on track with AI quickly.
Apply Your Institution’s AI Guidelines
The road to supporting students begins with your institution’s AI strategy. In 2025, some universities moved from encouraging AI experimentation to adopting formal AI strategies, a shift that directly affects how professors support students (Corp, 2025; The Ohio State University, 2025). For example, if your institution has formally adopted closed-system tools that give learners an equitable, level playing field for using AI, you are free to begin coaching learners on what types of information they are allowed to use with the AI tools provided (Harvard University Information Technology, n.d.).
Even if your school has adopted a set of AI technologies, you may still need to learn which AI tools are prohibited so that learners don’t use them by mistake (George Mason University, Information Technology Services, n.d.). As with guiding students on using the internet, you cannot protect them from everything, but leveraging the advice of your IT, library, and learning center colleagues can save learners from installing AI tools that have been banned because they lack security or privacy features (Youngkin, 2025).
Take Action Now
Simple actions you can take to move this goal forward:
- Visit your school’s website and search for “AI policy,” “academic honesty,” or “generative AI.”
- Message someone from your Center for Teaching & Learning: “Do we have any AI guidelines, recommendations, or workshops?”
- Email your librarian and ask: “Does our library offer guidance on ethical or effective AI use for faculty and students?”
Explore AI On Your Own and Identify Real-World Examples Used In Your Field
It would be difficult to guide a student on how to use the internet if you had never used it yourself. Take time to use AI tools, explore, and experiment; the fastest way to understand the pros and cons of AI is to build these tools into your workflow. Paid versions tend to be much more capable than free ones (and be wary of the fine print!). For example, paid tools can dramatically speed up your scholarly work, but you should understand who holds the rights to your content afterward. This is one reason it helps to team up with your school’s experts first. They have often vetted AI tools in advance and can tell you which ones are officially adopted as a “closed system,” meaning you can usually use them with course content without the risks that come from putting proprietary school information into free AI tools that use your inputs to train their models.
You will also want to stay up to date on how AI is reshaping your industry. If you are short on time, consider integrating exploration of real-world AI use into class time with students. This creates more transparency around AI and moves you further away from the idea of trying to “catch” students using it.
In 2026, agentic AI will likely become part of the mainstream discussion, since these tools offer to complete tasks on your behalf (OpenAI, 2025). An agentic browser (e.g., OpenAI’s Atlas, Opera’s Neon, Perplexity’s Comet) can sit right next to a student’s online homework and automatically read the assignment, search the internet, and suggest or even complete answers, all without the student switching tabs or doing the research themselves. Take time to consider how students might use agentic browsers and, just as important, do your own research on how AI is being used in real workplace practice in your discipline.
Take Action Now
Simple actions you can take to move this goal forward:
- Subscribe to an AI-focused newsletter in your field. When you see a promising use case, run a small experiment to see whether it truly changes practice or is mostly hype.
- Experiment with an agentic browser on low-stakes work so you can anticipate student use (note: be sure to follow your school’s AI policy before proceeding).
- Run a quick in-class “AI audit” of your field. Ask students to find one example of AI being used in your profession, then share and discuss benefits, risks, and ethical questions.
Redesign Written Activities So They Clearly Support Students’ Success
Just as important as staying current on AI capabilities and use cases in your field is keeping up with research on how students actually learn. This year, MIT’s Media Lab released “Your Brain on ChatGPT,” a study that examined how students’ brains respond when writing with and without AI tools. The researchers found that when AI is introduced too early in the learning process, students show lower brain activity and weaker recall, underscoring the need for well-designed authentic assessments that position AI as a collaborator rather than an answer machine (Peters, 2025). While this is only one limited study, it is a useful reminder to stay informed about when AI should and should not be used for writing, especially since current tools still frequently hallucinate. The burden on professors will be to interpret the value of AI-enhanced learning in relation to the underlying concepts, particularly in written work. If we have learned anything from the past few years of student AI use, it is that reports of discussion boards where “bots are talking to bots” have increased dramatically.
As we reconsider the role of written deliverables, their value will likely remain high throughout 2026, since writing is still one of the clearest ways for learners to show how they are thinking about a topic. Academic research is also firmly rooted in written articles that demonstrate organization of ideas, use of evidence, and logical reasoning. In addition, written communication has long been viewed as a core professional skill, so it is unlikely to disappear anytime soon, even if AI can now produce fluent text. Yet the challenge remains: how do we help learners build AI literacy without allowing AI to do the thinking for them?
Take Action Now
Simple actions you can take to move this goal forward:
- If students can use AI, ask them to critique the output for weaknesses as part of a transparent writing process. Pair the final written deliverable with a reflection on what AI gave them, what they used or changed, and why.
- Leverage new strategies for creating online discussions, such as using multimedia, asking for highly contextualized responses, and requiring course-specific evidence (Discussion Boards in an AI World, 2025).
- Test an AI tool on your own written assignments and review what it produces (follow your school’s AI policy). If you cannot distinguish it from student work, redesign the task so students demonstrate the objective in additional ways, such as a short ‘teach the topic’ video in combination with their written submission.
Working on the three goals outlined above can help you feel more confident and supported heading into your next term. You will have drawn on the expertise and guidance of your campus support teams, given yourself permission to experiment with AI so you better understand its capabilities and real-world uses in your field, and spent intentional time reworking written deliverables, which remain one of the most common AI-related challenges for professors in 2025. Together, these goals will put you in a strong position to support your students and yourself as AI continues to evolve next year.
Create Your Action Plan
Copy and use the prompt below in your favorite AI tool to begin building your action plan now!
“Faculty 2026 AI Action Plan
Role & Purpose
You are an AI and higher education expert supporting me. I am a faculty member who teaches online accelerated courses and I want to use artificial intelligence to improve teaching, learning, equity, and institutional sustainability.
Your task is to create a personalized 2026 AI Action Plan for me based on my context, readiness, discipline, and students.
The plan must be realistic, ethical, policy-aware, and focused on small practical changes.
Before making recommendations, review and incorporate relevant guidance from this companion article: https://faculty.risepoint.com/the-professors-ai-action-plan/
Step 1: Gather Faculty Context
Ask me to respond to all of the questions below.
Institutional Context
- Does your institution have an official AI policy? (Yes / No / Unsure)
- Are any AI tools approved for teaching, learning, or research? (If yes, list)
Professional Development
- How often do you engage in AI-related professional development?
- Regularly / Occasionally / Rarely / Never
Curriculum & Teaching
- How is AI currently integrated into your teaching?
- Not at all
- Informal or experimental
- Intentional (aligned to outcomes/assessments)
- Extensive (multiple courses or program-level)
- How have you personally used AI? (Select all that apply)
- Course design or revision
- Teaching or facilitation
- Assessment or feedback
- Research or scholarship
- Administrative/productivity tasks
- Not yet
Leadership & Advocacy
- How comfortable are you advocating for responsible AI use and student ROI?
- Not comfortable / Somewhat / Comfortable / Very comfortable
Students & Discipline
- Student level(s): Undergraduate / Graduate / Both
- Discipline(s) taught:
Step 2: Create the 2026 AI Action Plan
Using my responses, generate a three-goal plan aligned to the academic year:
- Spring 2026
- Summer 2026
- Fall 2026
Each goal should include:
- A clear focus area
- 2–4 concrete actions
- Responsible and transparent AI use
- Explicit connections to student learning value and workforce relevance
Differentiation by Readiness (Required)
Adapt the plan based on my experience:
🟢 Early-Stage or Hesitant
Focus on:
- AI literacy and policy awareness
- Low-risk, high-impact uses (e.g., administrative tasks, course development support)
- Clear expectations for students
- Confidence-building and guardrails
🟡 Moderately Experienced
Focus on:
- Intentional curriculum integration
- AI-informed assessment redesign
- Teaching students responsible AI use
- Participation in communities of practice or other professional development
🔵 Advanced
Prioritize:
- Piloting AI-enabled course or curriculum models
- Exploring agentic AI (tutors, simulations, workflow agents)
- Advancing equity and accessibility (UDL, inclusive design)
- Mentoring peers or contributing to institutional strategy
Output Format
Present the plan as:
- Faculty AI Readiness Snapshot (brief synthesis)
- 2026 AI Action Plan
- Spring goal + actions
- Summer goal + actions
- Fall goal + actions
- Responsible AI Commitments (3–5 guiding principles)
- Next Small Step (Next 30 Days) to build momentum
Tone & Constraints
- Use clear, supportive, non-judgmental language
- Avoid hype or fear-based framing
- Do not assume institutional resources that were not stated
- When policies are unclear, recommend alignment and consultation, not experimentation
- Never suggest that student or other sensitive data be placed into AI tools”
References
- Risepoint. (2025, June). Voice of the online learner 2025 (14th ed.). Risepoint. https://risepoint.com/wp-content/uploads/2025/06/Voice-of-the-Online-Learner-2025.pdf
- Morell-Mengual, V., Fernández-García, O., Berenguer, C., et al. (2025). Characteristics, motivations and attitudes of students using ChatGPT and other language model-based chatbots in higher education. Education and Information Technologies, 30, 22257–22274. https://doi.org/10.1007/s10639-025-13650-1
- Risepoint. (2025, September 18). Students use ChatGPT, do they know about Study Mode? Faculty eCommons. https://faculty.risepoint.com/students-use-chatgpt-do-they-know-about-study-mode/
- Risepoint. (2025, November 17). AI as a research partner. Faculty eCommons. https://faculty.risepoint.com/ai-as-a-research-partner/
- Corp, S. (2025, February 24). CAI announces new generative AI projects to support teaching and learning. Center for Academic Innovation, University of Michigan. https://ai.umich.edu/blog-posts/generative-ai-projects-teaching-learning/
- The Ohio State University. (2025, June 4). Ohio State launches bold AI Fluency initiative to redefine learning and innovation. Ohio State News. https://news.osu.edu/ohio-state-launches-bold-ai-fluency-initiative-to-redefine-learning-and-innovation/
- Harvard University Information Technology. (n.d.). Guidelines for the use of generative AI tools at Harvard. Harvard University. https://www.huit.harvard.edu/ai/guidelines
- George Mason University, Information Technology Services. (n.d.). AI toolkit. https://its.gmu.edu/help-support/ai-toolkit/
- Youngkin, G. (2025, February 11). Executive Order Number Forty-Six (2025): Banning the use of DeepSeek artificial intelligence on state government technology. Office of the Governor, Commonwealth of Virginia. https://www.governor.virginia.gov/media/governorvirginiagov/governor-of-virginia/pdf/eo/EO-46.pdf
- OpenAI. (2025, October 21). Introducing ChatGPT Atlas. OpenAI. https://openai.com/index/introducing-chatgpt-atlas/
- Peters, T. (2025, August 29). Thinking with AI: What a viral MIT study reveals about AI and the learning brain. Risepoint. https://risepoint.com/insights/thinking-with-ai-what-a-viral-mit-study-reveals-about-ai-and-the-learning-brain/
- Risepoint. (2025, October). Discussion boards in an AI world. Faculty eCommons. https://faculty.risepoint.com/discussion-boards-in-an-ai-world/