Artificial Intelligence

At Davidson, faculty, staff and students work together to make sure artificial intelligence is used responsibly in teaching, learning, research and administrative work.

AI tools—like generative AI, retrieval-augmented systems, agent-like AI and similar technologies—raise important ethical, professional, and legal issues in higher education. Deep engagement with AI will help our students grapple with these issues and prepare them to lead and serve in an AI-driven future. 

What Is Generative AI?

Generative AI is an approach to producing text and images by learning from existing sources. It works by ingesting a vast and diverse set of content—websites, books, images, code, videos—and building an incredibly large and complex mathematical model to represent, reproduce and riff on patterns found in those sources. These “foundation models” can be fine-tuned for specific tasks.
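
The core idea of "learning patterns from existing sources" can be illustrated with a deliberately tiny sketch, written here in Python as an illustrative toy rather than anything resembling how production foundation models are built: it only counts which word follows which in a short, made-up sample text and then generates new text from those counts. Real foundation models learn far richer patterns from vastly more data using neural networks, but the underlying idea of learning and reproducing patterns found in existing sources is the same.

```python
# Toy illustration only: learn which word tends to follow which in some
# source text, then reuse those learned patterns to generate new text.
import random
from collections import defaultdict

# Hypothetical sample text standing in for "existing sources."
source_text = (
    "generative ai learns patterns from existing sources and "
    "generative ai produces new text from those patterns"
)

# "Training": count which word follows which in the source.
followers = defaultdict(list)
words = source_text.split()
for current_word, next_word in zip(words, words[1:]):
    followers[current_word].append(next_word)

# "Generation": start from a word and repeatedly pick a plausible next word.
word = "generative"
output = [word]
for _ in range(8):
    if word not in followers:
        break
    word = random.choice(followers[word])
    output.append(word)

print(" ".join(output))
```

Running the sketch prints a short, plausible-sounding word sequence assembled entirely from patterns in the sample text, which is, at an enormously larger scale and with far more sophisticated models, what generative AI systems do.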

AI tools launch, evolve and change at a fast pace. Engaging with AI as a tool for teaching, learning, and productivity requires a critical eye, an open mind, and a willingness to experiment and adapt.

Thinking Critically About Using AI

As with any tool, when used properly, generative AI may help you learn and accomplish routine work, including writing and summarizing documents, analyzing data, developing websites and presentations, and writing code. Generative AI can also save time on tedious administrative tasks and generate images, audio and video; sufficiently advanced systems can even act autonomously on the user’s behalf.

Potential Uses

  • Streamline repetitive tasks: Generative AI can be helpful in duplicating content, transcribing text from images, converting file formats and comparing lists.
  • Review your administrative writing: Proofread emails and documents, get suggested edits to your writing, and format documents.
  • Organize new information: Use tools to summarize existing articles and documentation, search the web for relevant information, and format it in a document you can easily review.
  • Decision-making assistant: Generative AI can help you explore ideas and organize your thinking by creating lists of options and pointing out questions you may want to consider.
  • Review large documents: Upload a document and ask the tool to help you find information you are looking for within that document.

Key Concerns

  • Privacy and intellectual property: Unless you are using a licensed generative AI tool, any information you provide in a chat or post on a publicly accessible site may be used to train future models and may appear in output to other users. Tools you use with your Davidson login - Gemini, Amplify, or Davidson-paid ChatGPT/Claude licenses - do not use your chats to train models.
  • Hallucination & Bias: By nature, generative AI is reliant on the patterns it learns from training. This means that its sense of “truth” is limited. Information provided by AI can be biased, misleading, or just plain wrong. Don’t assume what AI is telling you is factual or without bias. Review and revise as necessary any content that is generated by AI tools.
  • Monotony: Although AI can speed up human activity and mimic human output, it cannot replace human ingenuity and creativity. Stories or essays written by AI can be shallow, derivative, and flat.
  • Cognitive loss: Emerging evidence suggests that using AI may affect or impair human cognition. College is a time to think, experiment and create; to find your own voice; and to learn to express your own thoughts. Overuse or inappropriate use of AI may mean risking all you came here to accomplish.
  • Carbon footprint: Training large language models consumes substantial energy and carries a large carbon footprint, and there are sustainability considerations around the use of generative AI.
  • Research replacement: For all of these reasons, researchers should think of generative AI as one tool in a much larger research toolbox rather than as a one-stop shop for answers. If you are seeking fact-based answers to questions, always use an array of vetted information sources to build knowledge rather than relying on generative AI to provide knowledge-like answers. Librarians can help you discern the difference and guide you to reputable sources.

AI Tools for the Davidson Community

Davidson’s Google Workspace account includes access to Google Gemini. If you are logged in with your Davidson account, Gemini provides the same data privacy protections already in place for tools like Google Drive and Google Docs. Public, internal, or restricted data may be used in Gemini. Confidential data may not be used without explicit permission from T&I Information Security. 

Gemini includes a chatbot with generative text capabilities, image generation, and the ability to work directly with documents from your Google Drive. Gemini can be used as an app or in your browser. NotebookLM is also available to Davidson users.

Getting Started with Gemini

Amplify is a platform for engaging with generative AI, available at amplify.davidson.edu. Originally developed by Vanderbilt University, Amplify is built for higher ed, by higher ed, and provides access to Claude and ChatGPT models with the same appropriate data protections for most types of Davidson data. Public, internal, or restricted data may be used in Amplify. Confidential data may not be used without explicit permission from T&I Information Security.

Amplify is a good solution for those who are familiar with generative AI, want to create assistants that can be shared with any Davidson community member, or want to select a different model for each query they submit. An instructional video series on getting started with Amplify is available: Instructional Video Series [Davidson Login Required].

Getting Started with Amplify 

In some instances, Davidson faculty or staff may have access to additional AI tools such as Claude or ChatGPT. Public, internal, or restricted data may be used with Davidson-licensed versions of those tools. Confidential data may not be used in any AI tool without explicit permission from T&I Information Security. Free versions of those tools may only be used with public data. Email ai-innovation@davidson.edu if you have questions about your specific tool.

AI and the Classroom

AI tools bring new, unique ways to engage with information. While we recognize the value AI brings in many areas, its use in a classroom setting should always be guided by faculty and with the broader goals of learning in mind. At Davidson, AI use is guided by the larger values of the Honor Code. The trust and integrity that underlie the Honor Code extend to how we utilize AI as a campus community. 

Davidson students are expected to use generative AI and LLMs in a class only when and how their use is permitted by the stated policies of a specific class and/or the parameters for a given assignment. Don’t make assumptions about the acceptable use of AI in courses. If norms and expectations for a specific course are not clear to you, ask your instructor for clarification.

When AI use is permitted in your classwork, be open and clear about your use of AI, and explicitly acknowledge how it contributed to your work.

Ideas for Using AI in Learning

  • Orient you to a topic, provide background and explain complex concepts

  • Brainstorm multiple approaches to consider in studying a subject matter

  • Challenge you with questions to interrogate your understanding of a topic

  • Critique your approach to a topic

Adapted from: The Student Guide to Artificial Intelligence, licensed under a Creative Commons Attribution-NonCommercial-ShareAlike (CC BY-NC-SA) license. Courtesy of Elon University.

Protect Confidential Data

Students

Private or confidential data, or data belonging to another individual, should not be entered into tools that will use the information to train their AI models. AI tools should not be used to cheat, steal, plagiarize, or harm others. Review, and revise as necessary, any content generated by AI tools. Davidson-provided platforms like Amplify and Gemini do not train their models on your data if you log into them with your Davidson credentials.

Staff and Faculty

Davidson data should not be entered into unauthorized tools. Davidson-owned data that are classified as Internal or Restricted may not be used in AI tools, unless that tool was approved and purchased by T&I, such as Gemini, Amplify, or a Davidson-paid ChatGPT/Claude license. Confidential data may not be used in any AI tool without explicit permission from T&I Information Security. Review what types of data fall into which category in the Data Security Policy or the Data Categories Overview Article.

Davidson College AI Innovation Initiative

The AI Innovation Initiative is designed to enable innovation and experimentation with generative AI and to build institutional experience as new use cases for teaching, learning, research, campus life and administration emerge. The initiative hosts frequent jam sessions, training opportunities, and other ways to engage collaboratively with colleagues across campus.

Learn more about the AI Innovation Initiative [Davidson Login Required]