Artificial Intelligence Offers Opportunities and Challenges for Teaching: Perspectives from the Drake Institute

Implications of recent and rapid advances in the development and availability of generative artificial intelligence (AI) systems such as ChatGPT are resounding across the landscape of teaching and learning. From discussions of academic integrity to efforts in course design, AI is quickly becoming an embedded element of the teaching and learning process that requires the acknowledgement and attention of instructors, instructional designers, and academic leaders. 

The Drake Institute for Teaching and Learning recognizes the emerging opportunities and challenges that many instructors at Ohio State face in understanding, anticipating, and responding to AI in the context of teaching and is committed to supporting the needs of instructors in this space. The Institute also recognizes the highly complex and evolving nature of this issue as one that will ultimately require significant contributions from a large and diverse group of collaborators and stakeholders committed to excellence in teaching and student success in higher education. 

In response to current concerns and questions of instructors at Ohio State, the Drake Institute, first and foremost, recommends continued reliance on known, evidence-based approaches to instruction in guiding decisions around embracing or otherwise addressing AI in the process of planning, implementing, and evaluating teaching and learning. To this end, the Institute offers the following broad suggestions for instructors and warmly invites questions, conversations, and consultations around these ideas: 

  1. As with most efforts to redesign and iterate on instruction, instructors are encouraged to first reflect on course goals and objectives for student learning before determining whether and how generative AI applications can be effectively integrated into instruction. This “backward design” approach (Wiggins and McTighe, 2005) can help clarify objectives, support alignment across class activities and assessments, and improve student learning and academic performance.    
  2. When goals and outcomes suggest an opportunity to effectively infuse AI into the learning process, plan active learning approaches using evidence-based implementation strategies (Nguyen et al., 2021; Andrews et al., 2022) that support student motivation and equity while minimizing potential student resistance. Ideas to get started include: 
    1. Establish and communicate clear expectations around the use of AI in the course, broadly, as well as within specific assignments and activities. 
    2. Intentionally discuss with students the purpose of planned activities with justification for, and reasoning behind, the infusion or exclusion of AI tools. 
    3. Approach the facilitation of active learning with a focus on expressing enthusiasm for the learning, creating an inclusive and welcoming learning environment, and enhancing motivation and excitement, particularly around AI as a tool for learning. 
  3. Evaluate and revise design elements of your course with transparency around AI at the forefront. From syllabus construction (Wheeler, Palmer, and Aneece, 2019), to assignment design and delivery (Winkelmes et al., 2016), to assessment practices (Balloo et al., 2018), taking concrete steps to improve the transparency of purpose and expectations associated with various aspects of instruction can significantly improve experiences and outcomes for all students. A simple but potentially impactful first step is making small changes in the way assignments are presented to your students through the Transparency in Learning and Teaching, or TILT, framework (Winkelmes et al., 2016). This framework establishes new norms for introducing students to assignments before they begin any work, using a basic three-step process: 
    1. State the purpose: For assignments that leverage or limit use of generative AI tools, compose a short narrative on, and discuss with students, the purpose of the assignment, the skills that students will practice, and the knowledge that the assignment is designed to support. Clarify how the assignment connects to future coursework or work in the discipline and, where relevant, how AI might be expected to impact those areas now and in the future. 
    2. Define the task involved: List or describe the steps that a successful student would take in completing the assignment. This might involve explicitly describing when and how AI tools should and should not be used to support progress on the assignment. This step can transform the experience of students who might otherwise lack the experience and knowledge needed to confidently begin the work assigned in the course. 
    3. Establish and engage students in understanding criteria for success: Create, or when possible co-create with students, a rubric or checklist that establishes the scope and quality of submission necessary to be successful on the assignment. Involve considerations of AI in this process and seek student input or feedback on policies regarding AI, particularly as they relate to the assignment criteria. Consider allowing students an opportunity to apply the criteria for success to sample work and reflect on what they learned from the experience. While this adds a step to the assignment process, the potential for improving student outcomes through this type of exercise may be well worth the added investment. 
  4. Incorporate attention to instructional practices that involve AI in efforts to evaluate and reflect on teaching effectiveness. When instruction is revised or redesigned using strategies and approaches such as those described above, the impact of such changes on student learning and experience will not be immediately clear. Understanding the implications of course and instructional redesign efforts related to AI will require intentional steps of evaluation and reflection. Fortunately, courses, by nature and design, offer a variety of data sources that can be used to measure and evaluate the impact of teaching. Instructors are encouraged to review and reflect on these data to help inform changes to instruction over time. Several ideas to help in the evaluation and reflection process are outlined below. 

    1. Review student performance on AI-aligned summative assessments (exams, projects, papers, etc.). Summative assessments are those used to measure student learning after the learning has occurred and provide direct measures of student learning outcome achievement. For areas of the course in which instruction has either leveraged, limited, or otherwise explicitly addressed AI, student performance on relevant summative assessments or assessment items can serve as one line of data to help instructors better understand the effectiveness of the instruction provided.  

    2. Review formative assessment data related to AI. Formative assessments include activities and engagements that provide a measure of progress in learning while the learning occurs (e.g., quizzes, in-class activities, student reflections). Some such exercises used in the course might incorporate the use of generative AI tools. Monitoring student progress through a review of student submissions to these formative assessments can not only help to inform future changes to instruction but also create opportunities to provide students with supportive feedback. 

    3. Collect and consider mid-course student feedback on instruction. Student surveys or Institute services, such as the Small Group Instructional Diagnosis (SGID), provide clear opportunities for students to share their own thoughts and perceptions of what is, and what is not, working to support their learning in the course, and can be adjusted to explore issues and questions surrounding AI as a teaching and learning tool. The SGID service is available to all instructors teaching at Ohio State and provides valuable student feedback through a focus group-style effort during the term. To request an SGID for your course, simply e-mail the Drake Institute. To learn more about the SGID service and other opportunities for collecting and reviewing student feedback, visit the Drake Institute's webpage on Using Feedback to Improve Teaching.

    4. Reflect, personally, on perceptions of teaching effectiveness. When incorporating new approaches to instruction, such as those that involve AI, take time to step back and reflect on the experience from an instructional perspective. How did the students respond? What did you observe during the instruction? Personal perspective, while not something to rely on alone, can serve as a valuable line of data in the larger effort to evaluate the impact of instruction.

The ideas presented above are offered in hopes of supporting instructors in navigating the new opportunities and challenges that emerge from increasing access to generative technologies and tools that leverage AI. These approaches are designed to foster motivation, engagement, equity, and academic integrity. However, they in no way eliminate risks of, or opportunities for, unethical and inappropriate student application of AI in course contexts. The Institute encourages instructors to work with the University Committee on Academic Misconduct (COAM) to address any instances of suspected misuse that might constitute academic misconduct. 

The presence of generative AI tools is and will continue to be a reality that must be acknowledged and addressed with care. The Drake Institute is committed to sharing evidence-based strategies for addressing issues and challenges involved in the teaching and learning process and to working with all who teach and support teaching at Ohio State in advancing instructional excellence. 

For questions or to request a consultation around teaching, please visit the Drake Institute website or e-mail us. 


Literature Cited:

Andrews, M., Prince, M., Finelli, C., Graham, M., Borrego, M., & Husman, J. (2022). Explanation and facilitation strategies reduce student resistance to active learning. College Teaching, 70(4), 530-540.

Balloo, K., Evans, C., Hughes, A., Zhu, X., & Winstone, N. (2018). Transparency isn't spoon-feeding: How a transformative approach to the use of explicit assessment criteria can support student self-regulation. Frontiers in Education, 3.

Nguyen, K.A., Borrego, M., Finelli, C.J., DeMonbrun, M., Crockett, C., Tharayil, S., Shekhar, P., Waters, C., & Rosenberg, R. (2021). Instructor strategies to aid implementation of active learning: A systematic literature review. International Journal of STEM Education, 8(1).

Wheeler, L.B., Palmer, M., & Aneece, I. (2019). Students’ perceptions of course syllabi: The role of syllabi in motivating students. International Journal for the Scholarship of Teaching and Learning, 13(3).

Wiggins, G., & McTighe, J. (2005). Backward design. In Understanding by Design (2nd ed., pp. 13-34). Pearson Merrill Prentice Hall.

Winkelmes, M., Bernacki, M., Butler, J., Zochowski, M., Golanics, J., & Weavil, K.H. (2016). A teaching intervention that increases underserved college students’ success. Peer Review, 18(1/2), 31-36.