
Generative AI Tools FAQ

The recent evolution of AI tools, such as ChatGPT, DALL-E 2, and GitHub Copilot, is impressive, as is the associated volume of media coverage. Are such tools an opportunity for or a threat to teaching and learning? Is it appropriate for students to use these tools? Do instructors need to change their approaches? Tools like these, and the important questions they raise, are not necessarily new, and the answers are debatable and highly contextualized by disciplinary norms, course learning objectives and assignments, and students’ level and background. Consequently, we acknowledge that recent developments can generate both instructor enthusiasm and concerns.

In response to inquiries from CMU colleagues, we compiled a list of frequently asked questions. We based our responses on evidence-based and inclusive teaching strategies, CMU policies, and the current state of technology tools. Furthermore, we approached these questions with two beliefs in mind. First, we acknowledge that both perceived opportunities and challenges can be daunting, yet we believe that you can and will be effective teachers amidst these developments. Second, we do not assume that students will automatically make academically dishonest choices solely due to the emergence of new technologies. We hope this resource will help instructors think deliberately and intentionally about the evolution of AI tools and other technologies as they arise. If you’d like to talk to an Eberly colleague about your teaching context, please email us at eberly-assist@andrew.cmu.edu. We’d also like to hear from you if you have had an encounter or use case that can inform this resource and our support for instructors.

FAQs

FAQ 1: What are generative AI tools and what can they do?

Artificial Intelligence (AI) tools can generate art, computer code, or natural language that may often be difficult to distinguish from human work. Using a range of parameters, you can prompt a given tool to instantly respond to simple and complex requests, like writing comparative essays, producing the steps for some math problems, writing or fixing bugs in code, generating an image, and writing emails.

Perhaps the most well known new AI tool is ChatGPT, a “chatbot” that can produce writing and code, including common assignment forms such as reading reflections and research essays, and can generate and/or fix code. If it cannot fulfill a request, it will respond with clarifying questions. Because ChatGPT is not a “mind” but a model trained on existing writing, its results may sound convincing yet be factually incorrect, and the tool cannot be used to assess the validity or sourcing of the information it has generated. (More reliable research alternatives are collected in this document from CMU Libraries.) ChatGPT is being updated rapidly, and many other AI chatbots are coming on the market, including some specialized for programming languages.

It is important to know that students may use these tools in more creative ways than simply generating a full, final product. Anecdotal evidence from students across the country suggests that students are using AI tools to look up information, create study guides, brainstorm possible topics or approaches to problems, write outlines or pseudocode, and polish or correct the English grammar of their own written work. 

Because AI tools can generate natural language, functioning code, or artistic products, they can be concerning for instructors who rely on assessing these forms of student work. It is important to continue to be intentional in your teaching choices, because being (only) reactive isn’t sustainable in our technologically advancing world. You can take this opportunity to think deeply about your courses and the kinds of knowledge, skills, values, and attitudes you want your students to develop. As with any new technology (such as graphing calculators, search engines, or Wikipedia), the fundamental principles of how learning works, and what we can intentionally do to facilitate learning, remain largely the same. If you would like to discuss your courses in light of emerging AI tools with an Eberly Center consultant, please contact eberly-assist@andrew.cmu.edu.

FAQ 2: Does the university have an academic integrity policy that addresses generative AI tools, and what should my course policy be?

The university’s policy on academic integrity is not changing in response to generative AI tools. According to CMU’s Office of Community Standards and Integrity, CMU’s academic integrity policy already implicitly covers such tools, as they may be considered a type of unauthorized assistance. However, this policy is intentionally designed to allow instructors to define what is “authorized” vs. “unauthorized” and what constitutes “plagiarism” and “cheating.” We recommend that instructors carefully examine their own policies and make these distinctions clear to students, both in writing and verbally.

Please see these example syllabus policies, which include policies prohibiting generative AI use, policies encouraging it, and blends of the two.

Note that expectations vary between instructors: what is considered authorized assistance in one course may not be acceptable in yours. Clarity is key precisely because expectations, norms, and policies differ across instructors.

We recommend that you adopt an academic integrity policy that considers the following:

  • Whether AI tools are considered authorized or unauthorized assistance, and in what circumstances.
  • How students should cite assistance from, or content and ideas generated by, either AI tools or humans.

Additionally, consider doing the following:

  • Engage students in transparent discourse about the rationale behind your policies, learning objectives, and assignments.
  • Discuss academic integrity with students, including its importance and your expectations. Don’t assume that students know what is and is not acceptable in your context; norms change across disciplines, cultures, and courses. Sometimes this is referred to as “the hidden curriculum.”
  • Include improvement across assignments, rather than performance alone, as a component of how you calculate grades.
  • Talk to your students about AI tools as they relate to all of the above.

For more general help, see this resource on crafting your own academic integrity policy.

FAQ 3: How do I talk with my students about the use of AI tools?

Each semester, students take multiple courses, all of which may have different expectations and policies. It is important to talk with your students about your own expectations so they don’t make incorrect assumptions. If your policies are not explicitly stated, your students may not be able to reliably determine which kinds of AI use are authorized in your course, which matters especially in cases of suspected violations of academic integrity. See FAQ 2 for what you can consider as you write an academic integrity policy.

There are several ways to start a conversation. We encourage you to be curious and have an open mind when discussing AI tools with students, rather than assuming or focusing on the worst-case scenario.

Communicate the purpose of your assignments and why they will benefit your students’ learning.
Knowing why they are doing an assignment can increase students’ motivation. How does the assignment connect to your course’s learning objectives and to the world beyond your course? Describe the skills you want your students to practice and/or the knowledge you’d like them to gain through completing an assignment. Alternatively, for larger assignments, you could ask students to think about the purposes of the assignment in small groups or as a class before you share your perspective. You can also give students time to reflect on how the assignment connects to their personal or professional goals or values.

Convey confidence in your students' ability.
Let them know your goal is to support their learning. Discuss your course learning objectives and why they matter. Provide positive encouragement that students can succeed with effort, good strategies, and support (see Promoting a Growth Mindset).

Talk about academic integrity early on and why it's important.
Define what plagiarism, unauthorized assistance, and cheating look like in the context of your course, because students may assume another course’s policies are the same as yours or may have a different cultural understanding of academic integrity. Provide examples of what kinds of work are and are not appropriate. Use AI tools, like ChatGPT, as an example and discuss the ways (if any) in which they can be appropriately used in the context of your course and discipline.

Ask students about their experiences with AI tools generally.
Have they heard of them? Do they know what they do? For many, these are exciting, fun tools. Acknowledging that can provide a way to connect with your students. If students are familiar with AI tools, what kinds of prompts have they plugged into them and what did they think of the responses? Try playing with an AI tool yourself using both class-related and non-class-related prompts so you can also share about your experiences. 

Be transparent about why AI tools are concerning or exciting to you in the context of your course.
This is an opportunity to explain how your assignments are structured to help students develop key skills and expertise, and how the use of AI may disrupt or enhance this process, either helping or hindering student development in the short and long term. Articulate for your students the inherent inequities that arise when some students are generating their own work for the class, while others are automating that labor. Being transparent about the purpose of your policies around academic integrity and assignment guidelines helps students understand why they are beneficial, rather than arbitrary. 

Give students multiple opportunities or means to ask questions about academic integrity.
Starting from the perspective that students do not want to cheat, allow students to ask questions about academic integrity and AI tools without judgment. This can be as simple as inviting questions using any of the above approaches and by encouraging students to contact you outside of class or in office hours. Remind students that it’s better for them to ask you well before an assignment is due than to operate from a place of uncertainty and anxiety as they are trying to complete it.

FAQ 4: How might I design or adapt my assessments in light of AI tools?

Regardless of the technological environment, the first thing to consider in assessment design is always whether what you are having students do aligns with what and how you want them to learn. Be transparent with your students about why your assessments are designed to support their learning and help them develop the skills and patterns of thought that they will want to rely on in their future professions. Additional structure that illuminates the how of their assessments, like grading criteria, evaluation rubrics, and assignment briefs, will often make it easier for your students to engage in the work.

Scaffold assessments:
Break your assessments into smaller pieces that build on each other. The final product can then be a culmination of the prior components, each of which has had the chance to benefit from formative feedback and low-stakes evaluation. Alternatively, consider requiring multiple drafts and valuing improvement across drafts in response to your feedback. Provide in-class time for students to work on these components, but allow them to expand on or refine their work outside of class as well. Rather than focusing on the product alone, scaffolded assessments like this prioritize the process of generating the final deliverable. Students are less likely to turn to quick solutions if they have already put in considerable time and effort, and gained enough expertise to feel confident in their own ability to do quality work in a timely fashion.

Schedule assessments to balance workload:
Students may turn to AI tools if they are feeling stressed, overwhelmed, unsupported, or out of time. Even if they are motivated and engaged, external pressures and incentives can often lead them to make choices that save time rather than enhance their learning. Consider timing assessments to take place outside of typical exam weeks (see also Assign a Reasonable Amount of Work). Prioritize preserving students’ breaks (for wellness) and give a longer time horizon for due dates. Build in some in-class time for students to work on assignments or projects.

Focus on process:
Ask students to explain their process and reflect on their own learning. This could look like: 

  • a reflection checklist or rubric

  • a list of specific steps they took, what they could have done differently, and why

  • annotations on an assignment or deliverable justifying the creative choices they made (or a separate deliverable reflecting on and referencing specific aspects of their previous work)

You might consider assessing students on how much they have improved, rather than on a single instance of performance as traditional exams and final papers often do. This could mean awarding additional points to students who are able to articulate their mistakes, why each mistake was made, and how to avoid similar mistakes in the future.

Design assessments to make learning visible through connections:
In his 2013 book Cheating Lessons: Learning from Academic Dishonesty, James Lang defines “original student work” as that in which students “create an original network of connections.” This network can be made from various sources that the student is uniquely positioned to curate (e.g., information presented in the course, other courses in their curriculum, their personal experiences, and external sources they’ve encountered). It is also helpful for learning: connections between prior knowledge and new course content make students more likely to remember, and be able to apply, the new information in the future. This idea can be leveraged in assessment design, where the emphasis is less on the originality of the ideas students are generating (countless scholars have already analyzed the same poem or written a similar line of computer code, and their work is out there) and more on how students relate these disparate ideas to one another. This can be accomplished by reframing assignment descriptions and rubric criteria, as well as considering the types of deliverables that best align with the learning objectives and allow students to demonstrate the original network they’ve created (see also Designing Aligned Assessments). Remember that providing an environment of positive support, which instills in your students the confidence to generate their own unique and successful ideas, can go a long way in promoting students’ motivation.

Provide choice, creativity, and flexibility for assignments:
Students may turn to plagiarism or AI tools when they lack motivation to complete assignments. One way to increase the value perceived by students (thus increasing their motivation to complete them authentically) is to provide more choice on the assignment deliverable. For instance, if your goal is to assess how students synthesize, evaluate, and communicate about multiple sources, some students may choose to write an essay, while others could demonstrate those same skills in a video or by designing an infographic, as long as the deliverable demonstrates the required learning objectives. Consider the component skills your assignment is targeting and what competencies students must demonstrate. Then design an assignment prompt that includes these skills, but which allows more choice in what the final product looks like. Finally, design a rubric with criteria that are agnostic of the form of the deliverable.
Click here for additional examples and considerations for designing assessments that allow for student choice.  

Avoid over-reliance on hand-written deliverables, in-class evaluations, or oral exams and presentations:
We do not recommend drastically changing your assessments to exclusively or excessively rely on the aforementioned approaches as a reaction to concerns regarding generative AI tools. While one or more of these approaches may appear to be a simple solution, these changes could raise more difficulties than they solve, particularly for reasons of equity and inclusion. For example, some of these approaches may inadvertently disadvantage English language learners or students requiring accommodations for disabilities (see also FAQ 5 below). Prioritizing student success means providing an environment where everyone has an equitable opportunity to demonstrate their capabilities. Timed, hand-written exams, for example, may disadvantage students who know the material well, but are unable to hand-write their answers quickly. Oral presentations may put extra stress on students with anxiety, who then are faced with additional challenges which their peers do not face. Does completing a writing assignment during a class session, without the ability to adequately revise while drafting, authentically and fairly assess written communication skills across students? We suggest reflecting on who will be advantaged or disadvantaged by particular assessment choices and how well they align with your highest priority learning objectives. Ultimately, a mix of assessment approaches and providing support for student success maximizes equity and inclusion.

Don’t necessarily redesign all assessments to focus on the perceived, current limitations of AI tools: 
All new technologies have limitations. However, limitations change over time as technologies are refined by their developers, and the capabilities of AI tools have evolved rapidly. A tool may now (or soon) perform well on a task on which it performed poorly six months ago. For example, natural language generators, like ChatGPT, are trained on historical data that must exist and be available to the tool online prior to training. When it was originally released, ChatGPT was not particularly good at responding to prompts about current events. It also struggled to cite peer-reviewed literature accurately, could not leverage data protected by paywalls, such as JSTOR articles, and could not reference classroom discussions. Nevertheless, AI technologies evolve just as the data they train on and generate evolves. Therefore, some of the limitations of ChatGPT described above have changed or will change over time. Consequently, designing assignments around any current limitations of an AI tool may be a temporary solution, but not a sustainable one. Instead, consider some of the strategies discussed previously.

FAQ 5: What equity and inclusion considerations do AI tools raise?

Unauthorized assistance, cheating, and plagiarism create inequities; all are unfair to students who do the work themselves. However, some approaches to preventing academic dishonesty, or students’ use of AI tools, may inadvertently create inequities or marginalize some students. All teaching strategies have pros and cons, so we recommend that you consider potential implications for equity and inclusion.

Designing assessments:
Some assessment strategies directly support equity and inclusion. These include providing student choice, when appropriate, in assignment deliverables or topics; varying the type of required assessments or deliverables; strategically leveraging low-stakes assessments; scaffolding high-stakes assignments to include milestones and drafts; and more (see Centering DEI in Teaching, Creating a Welcoming Classroom Climate, and suggestions above). However, other strategies may disadvantage certain students. For instance, some commonly discussed strategies to eliminate the use of AI tools include intentionally shifting assessment designs toward hand-written deliverables, in-class evaluations, or oral exams. Relying exclusively or excessively on these approaches may prevent English language learners or students with disabilities requiring accommodations from fully demonstrating their learning. Additionally, in-class writing or other time-limited assessments may not align well with learning objectives, and adopting such approaches may result in assessing students’ speed more than their true competency. Consider whether speed is a high-priority learning objective or a fair assessment criterion across your students. For additional support on determining how assessments may impact students with disabilities, or on making appropriate accommodations for CMU students, please contact the Office of Disability Resources.

Considerations for choosing resources:
Requiring students to purchase particular texts or resources (e.g., the newest editions of textbooks, subscriptions to educational cases or newspapers, or sample data) in order to work around the training data of certain AI tools may disadvantage students with limited resources and cause undue financial burdens. Consider how you can provide such resources for free through Canvas or University Libraries. Additionally, if you are encouraging use of AI tools, consider whether or not they are digitally accessible to all learners. Please carefully weigh the legal considerations of requiring students to use AI tools in your courses (see FAQ 8 below).

AI tool output may be biased:
AI tools will reproduce any latent biases in the data on which they were trained. Depending on the prompt, AI tool output can directly harm underrepresented or marginalized students via microaggressions. Relying on these tools can also inherently bias student work toward mainstream, existing ideas when the data they are trained on is biased or not representative of underrepresented or marginalized identities. Consequently, some applications of AI tools for education may be at odds with efforts to center diversity, equity, inclusion, and belonging. Careful consideration must be given to how to still engage with marginalized ideas or viewpoints.

FAQ 6: How might I incorporate AI tools into my course to support learning?

Remember that AI tools are web resources and, like other such tools, may not always be accessible to you or your students. Before planning any activities or assignments using these tools, ensure you and your students can go online and successfully and equitably access them (see also FAQ 5 above). Additionally, carefully consider the legal considerations of using or requiring AI tools in your course (see FAQ 8 below).

As with any new technology, there are often exciting avenues for new or enhanced learning experiences. It is important to be transparent with your students about the purpose you have in mind. Let them know the best way to approach the technology to maximize their learning. Try connecting this purpose to one of your existing learning objectives. If you would like to integrate AI tools into your course, here are some ideas:

Explore the limitations:
Let your students explore the capabilities and limitations of AI generation. Guide them on big questions surrounding what defines things like communication and interaction. For example, if ChatGPT writes your emails for you, are you really communicating? Have your students think about the nature of the data an AI tool pulls from and its intersections with ethics. For example, what is the range of “inappropriate requests” and why? What might your students want to change about an AI tool to make it more useful for their lives? What does it mean to create with or without AI assistance? How might the use of an AI tool enhance equity or create inequities?

Spot the differences:
Prepare a class session where students attempt to identify differences between two pieces of writing, art, or code: one created by their peers and the other created by an AI. In advance, choose a set of prompts to provide to small groups of students to input into the AI. For example, ask students to request a paragraph, email, or poem from ChatGPT in a particular style or from a certain perspective, addressed to a specific audience on a topic. Next, ask each group to write their own response to a different prompt and collect them. Then match the student- and AI-generated responses to the same prompt and give each pair to a group, making sure you don’t give students the prompt they wrote on. Challenge your students to identify differences in tone, clarity, organization, meaning, style, or other relevant disciplinary habits of mind, and to determine which sample was AI-generated.

If you’re teaching a math or computer science course, input some homework problems and have your students critique where the AI succeeds or fails (and how its solution could improve), or articulate alternative solutions; a sketch of one possible exercise follows below. Can your students determine whether code was written by humans or an AI?
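For instance, here is a minimal, hypothetical sketch of such a critique exercise in Python. The function names and the grade-averaging scenario are invented for illustration; the “AI-style” draft stands in for whatever output your students actually collect from a tool.

    # Hypothetical critique exercise: students review a plausible but subtly
    # flawed solution, of the kind an AI tool might generate, then propose a fix.

    def average_grade(grades):
        # Plausible AI-style draft: correct on typical input, but students
        # should notice it crashes (ZeroDivisionError) on an empty list.
        return sum(grades) / len(grades)

    def average_grade_revised(grades):
        # One possible student revision: handle the empty-list edge case.
        if not grades:
            return None  # or raise a descriptive error, per class discussion
        return sum(grades) / len(grades)

    print(average_grade([88, 92, 79]))    # 86.33... -- looks correct
    print(average_grade_revised([]))      # None -- edge case handled
    # average_grade([]) would crash, which is exactly the kind of gap
    # students can be asked to find, explain, and repair.

Students could then grade both versions against your rubric, explain why the edge case matters, and compare their own fix with what the AI produces when asked to correct its own code.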

Facilitate discussion:
Have students prompt ChatGPT to generate discussion questions for the next class session, then have students create their own responses to those questions. ChatGPT can also generate follow-up questions and responses of its own, and students can continue their discussions with AI assistance. This approach to discussion facilitation could work well in small groups first, with a large-group debrief afterwards. It helps students engage with and learn about the topic while fostering and sustaining discussion, but it will also bring up interesting secondary questions. For example: Will the small groups have all learned and discussed the same things? Different things? Did ChatGPT lead some groups off topic?

Language prompts:
Assign a topic and let your students come up with different ways to input it into ChatGPT. Then task them with writing the same thing, but in a different way. Ask your students to explain their decisions. How might they change the language? Why? What rhetorical strategies could make it sound better, worse, more beautiful, more parsimonious, or more confusing? Have your students take on the role of an instructor and “grade” ChatGPT on its output. 

Generate samples for students to critique:
Have your students enter your assignment prompt into an AI tool. Then ask them to use your grading criteria/rubric to evaluate the output that the AI tool generates. This can be a helpful way to provide “sample” work to your students who may be looking for examples or curious about what a “good” and “bad” version of the deliverable looks like. You can also include your own comments and critiques and use the AI-generated output like you would use an example of a past student’s work. This approach not only enhances transparency of grading criteria, but also helps students practice and get feedback on necessary skills.

Have fun:
Have ChatGPT write an academic integrity policy forbidding its use. Ask it to write an email to students’ pets. After requesting that it write in another language, compare the output to other translation algorithms. Input an unsolvable math or coding problem into an AI tool. Be creative! Regardless, talk about what it means to do things “the human way”: have your students make a list of all the things they would rather do than have an AI do for them, then have them ask an AI tool to write up that list and compare!

See additional ideas for classroom learning activities leveraging AI tools that generate code or text and considerations for responsible use of AI tools (created by colleagues in the Heinz College of Information Systems and Public Policy). 

FAQ 7: Can I reliably detect whether students used an AI tool?

The Eberly Center recommends extreme caution when attempting to detect whether student work has been aided or fully generated by AI. Although companies such as Turnitin are beginning to offer AI detection services, none have been established as accurate. In addition to false positives and false negatives, detection tools may often produce inconclusive results. A detection tool can provide an estimate of how much of a submission has the characteristics of AI-generated content, but the instructor will need to use more than just that number to decide whether the student violated the academic integrity policy. For example, an instructor will need a plan for how to proceed if a tool estimates that 34% of a submission was moderately likely to be AI-generated. Even very strong evidence that a student used AI may be irrelevant unless you have a clear academic integrity policy establishing that the student’s use of the AI tool constitutes “unauthorized assistance” in your course. Furthermore, research suggests that the use of detection tools may disproportionately impact English language learners.

Until (and after) robust and stable AI detection tools are available, we recommend that you consider the variety of instructional design and teaching strategies provided in this resource. You might first want to consider if AI output will pose a problem for your teaching and learning context. Start by trying out a few applicable tools using your assignment prompt to see if you need to make any adjustments (see FAQ 4 above).

FAQ 8: What are the legal considerations for using AI tools in my course?

The university vets teaching technologies for pedagogical value, compliance with both the Family Educational Rights and Privacy Act (FERPA) and the Americans with Disabilities Act (ADA), security, and stability. Before using any technology tool or app, including AI tools, ensure that its use falls within the university’s legal guidelines.

For more information or help finding a vetted tool that fits your and your students’ needs, please contact eberly-assist@andrew.cmu.edu to schedule a consultation. 

Can I require students to use an AI tool to complete an assignment?
Because AI tools are not vetted through the university and are not compliant with FERPA by default (student work, which is part of their academic record, would be shared with third-party individuals and/or platforms), students cannot legally be required to create an account to use an AI tool for course assignments or work. Note that this is, and always has been, true for any third-party, unvetted platform or app! Therefore, if you plan to use such tools in your course, you will need an alternative plan for any student who does not want to create an account. For example:

  • Original plan: Individually, students enter a prompt question into ChatGPT and analyze the response.
  • Alternative 1: With a partner (in case some students do not have an account), students enter a prompt question into ChatGPT and analyze the response.
  • Alternative 2: Students generate the prompt language, the instructor inputs it into ChatGPT and shares the responses with the class, and students then analyze the responses.

If you would like to discuss ideas or alternative assignments for your course(s), please contact eberly-assist@andrew.cmu.edu to schedule a consultation.

Can I prohibit the use of AI tools in my course?
Legally, yes. Make sure to be transparent about why you are doing so (see FAQ 3 for examples on how to talk with your students about the use of AI tools), and remember that detection is extremely limited so enforcement may be both difficult and unreliable (see FAQ 7 for more explanation).

What kinds of student data or work can be shared with an AI tool platform (and by whom)?
The main concern with sharing student work relates to students’ privacy rights. FERPA protection begins after an instructor accepts an assignment for assessment and grading. If you are permitting the use of generative AI in your course, students are allowed to (but, again, should not be required to) submit their own work into an AI tool. However, you should set up an assignment workflow where students export their course work from the tool (e.g., saving the work as a PDF), which they can then submit for review and grading. The instructor should not need to access the tool to see or grade the submitted work.

Some instructors may be interested in using an AI tool to grade student work. If you choose to do so, every effort must be made to anonymize the student’s work by not connecting that work to any Directory Information on the student (see the FERPA guidance for examples of Directory Information). If you would like to discuss technological tools to assist with grading, please contact eberly-assist@andrew.cmu.edu to schedule a consultation.

For more on the use of AI tools for various academic contexts, please see the Guidance Memorandum from University Contracts (current as of Summer 2023).