Sunday, May 11, 2025

Scaling Impact: Using AI to Create Just-in-Time Faculty Development Resources

When I first began experimenting with AI as a tool for faculty development, I wondered if it could truly capture the nuanced needs of our teaching community. After all, professional development has always been a deeply human endeavor – built on relationships, contextual understanding, and responsive design. Could a tool like Claude really help bridge the persistent gap between what faculty need and what our limited resources can provide?

The answer, I've discovered, is both more complex and more promising than I initially imagined.

The "PD Resource Creation" project emerged from a practical challenge: how to provide high-quality, contextualized professional development resources for faculty across diverse disciplines at Eastern Kentucky University without overwhelming our small faculty development team. The traditional approach – developing each resource from scratch through extensive research and writing – simply wasn't sustainable given the breadth of needs identified in our faculty needs assessment.

Bespoke Resources

The key has been creating a process that leverages AI's capacity for synthesis and organization while preserving the essential human elements that make professional development meaningful. This requires a careful balance – providing enough structure for the AI to produce valuable content while maintaining space for human expertise and judgment.

The process involves the following steps (a rough code sketch of the first three appears just after this list):

  1. Contextualization: Providing Claude with detailed information about EKU's specific context, including student demographics, regional characteristics, and institutional priorities

  2. Expert Guidance: Sharing established frameworks and research-based best practices in the specific area of teaching and learning being addressed

  3. Disciplinary Tailoring: Adapting resources for specific colleges and disciplines by incorporating relevant examples and applications

  4. Human Review: Having faculty experts review and refine the AI-generated resources, adding nuanced insights from their practical experience

  5. Faculty Feedback: Collecting user feedback on the resources to continually improve both the content and our prompting approach
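
For readers who want to see the mechanics, here is a minimal sketch of what the first three steps can look like when driven through the Anthropic API rather than the chat interface. The institutional summary, framework notes, and model name below are placeholders for illustration, not our actual prompts or configuration.

```python
# Minimal sketch: steps 1-3 as a single API call (placeholder prompts, not our actual ones).
# Requires the anthropic package (pip install anthropic) and an ANTHROPIC_API_KEY in the environment.
import anthropic

client = anthropic.Anthropic()

# Step 1: Contextualization -- institutional context the model should assume throughout.
institutional_context = (
    "You are helping the faculty development center at a regional public university. "
    "Many students are first-generation and place-bound; institutional priorities include "
    "retention and inclusive pedagogy."  # placeholder summary, not EKU's actual profile
)

# Step 2: Expert guidance -- the research-based framework the resource should draw on.
framework_notes = "Key principles and citations from the relevant framework go here."

# Step 3: Disciplinary tailoring -- the audience for this version of the resource.
discipline = "nursing"

response = client.messages.create(
    model="claude-3-5-sonnet-latest",  # placeholder; use whichever model you have access to
    max_tokens=2000,
    system=institutional_context,
    messages=[{
        "role": "user",
        "content": (
            f"Using these framework notes:\n{framework_notes}\n\n"
            f"Draft a one-page teaching resource on developing student self-advocacy skills "
            f"for {discipline} faculty, with concrete, discipline-specific examples. "
            "Flag any claims that would need a citation."
        ),
    }],
)

draft = response.content[0].text  # Steps 4-5 (human review and faculty feedback) happen outside the code.
print(draft)
```

In practice we iterate on these prompts considerably, and steps 4 and 5 remain entirely human work.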

What's fascinating is how this process mirrors what we know about effective teaching. Just as students benefit from clear structures and expert guidance but need space for autonomy and personal meaning-making, our AI partnership works best when we provide clear parameters while leaving room for creative synthesis.

Analyzing the Results: What Works and What Doesn't

The project has yielded some remarkable successes, particularly in creating comprehensive resources on topics like "Developing Student Self-Advocacy Skills" and "Creating Supportive Learning Environments." These resources have provided faculty with research-based foundations, practical implementation strategies, and discipline-specific adaptations that would have required weeks of development time using traditional methods.

However, challenges remain. The AI sometimes produces content that sounds theoretically sound but lacks the nuanced understanding that comes from lived classroom experience. It may over-generalize research findings or suggest approaches that don't account for practical constraints. And perhaps most significantly, the resources sometimes lack the authentic voice and emotional resonance that can inspire faculty to try new approaches.

This is where human expertise remains irreplaceable. Our subject matter experts play a crucial role in refining these resources, adding concrete examples from their own teaching, adjusting recommendations based on what they know works in our specific context, and infusing the more technical sections with a sense of purpose and possibility.

Implications for Faculty Development Practice

This project has significant implications for how we approach faculty development in resource-constrained environments:

  1. Democratizing Access: By significantly reducing the time required to create comprehensive resources, we can address a wider range of faculty needs and disciplinary contexts

  2. Just-in-Time Support: Faculty can access specific guidance when they need it rather than waiting for scheduled workshop offerings

  3. Consistency with Customization: We can maintain consistency in quality and approach while tailoring resources to different disciplines and teaching contexts

  4. Sustainable Scaling: The approach allows our small team to have a broader impact without proportional increases in workload

  5. Continuous Improvement: Each iteration helps us refine our prompting approach, creating a virtuous cycle of improvement

Perhaps most importantly, this approach allows our human facilitators to focus their energy on the aspects of faculty development that genuinely require human connection – coaching conversations, observational feedback, community building, and addressing the emotional dimensions of teaching challenges.

Questions for Further Exploration

As this project continues to evolve, several questions guide my thinking:

  1. How do we effectively balance efficiency with authenticity in AI-assisted resource development?

  2. What are the ethical implications of using AI in creating content that guides teaching practice?

  3. How might we better capture and incorporate the tacit knowledge of experienced educators into these resources?

  4. What faculty development needs remain best addressed through fully human-created approaches?

  5. How does the use of AI in resource creation model the critical digital literacy we hope to develop in our students?

The journey of exploring this partnership continues to both challenge and inspire my thinking about what's possible in faculty development. I'm curious how others in faculty development roles are navigating similar questions and what insights you might share from your own experimentation with these emerging tools.

AI Use Acknowledgment: I acknowledge that I have used Generative AI tools in the analysis of professional development resources created with AI assistance. This blog post represents my own critical reflection and evaluation of this process, supported by AI-assisted analysis of patterns and themes across multiple resources developed through the project.

Saturday, May 10, 2025

Beyond Coding: Using AI to Enrich Qualitative Analysis of Faculty PD Needs

One of the persistent challenges in faculty development work is making sense of qualitative data from needs assessments. When I recently distributed a professional development survey to faculty across departments, I received nearly 200 responses with rich, nuanced open-ended comments. The quantity was overwhelming, but the potential insights were too valuable to reduce to mere word counts or simplistic themes.

This scenario presents a familiar tension in qualitative analysis: how to honor the complexity and nuance of faculty voices while still identifying actionable patterns? Traditional coding approaches, while methodologically sound, are extraordinarily time-intensive. Meanwhile, basic word-frequency analyses miss the contextual richness that makes qualitative data so valuable in the first place.

With these constraints in mind, I decided to experiment with using Claude, a large language model, as a collaborative analysis partner rather than simply a data processing tool. The results challenged my thinking about qualitative analysis in surprising and productive ways.

The Collaborative Analysis Process

Rather than approaching Claude as a replacement for human analysis, I used it as an initial sense-making tool and thought partner. The process unfolded through several iterative phases:

  1. Data preparation: I anonymized all responses, removing identifiable information while preserving contextual elements like department type and faculty rank that might inform interpretation (a minimal sketch of this step appears after the list).

  2. Initial pattern recognition: I asked Claude to identify recurring themes, tensions, and outlier perspectives across the responses without imposing predefined categories.

  3. Dialogic refinement: Instead of simply accepting Claude's initial analysis, I engaged in a back-and-forth process, questioning categorizations, asking for evidence, and pushing for more nuanced interpretations.

  4. Counter-narrative exploration: Critically, I asked Claude to identify perspectives that might be missing or underrepresented in the data, and to highlight tensions between dominant narratives.

  5. Synthesis and verification: Finally, I returned to the raw data myself to verify key themes and ensure the analysis reflected the actual responses rather than algorithmic artifacts.
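
To make the data-preparation step concrete, here is a minimal sketch of the kind of scrubbing pass I mean. The file name, column names, and patterns are hypothetical, and no automated pass replaces a human read-through, since identifying details rarely follow tidy patterns.

```python
# Minimal sketch of the anonymization pass (hypothetical file and column names).
# It strips obvious identifiers but keeps the contextual fields (college, rank) used in interpretation.
import re
import pandas as pd

df = pd.read_csv("needs_assessment_responses.csv")

EMAIL = re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b")
TITLED_NAME = re.compile(r"\b(?:Dr\.|Prof\.|Professor)\s+[A-Z][a-z]+\b")  # crude catch for "Dr. Smith"-style mentions

def scrub(text) -> str:
    text = EMAIL.sub("[email]", str(text))
    text = TITLED_NAME.sub("[colleague]", text)
    return text

df["response_clean"] = df["open_ended_response"].apply(scrub)

# Keep only what the analysis needs: context plus scrubbed text, no names or IDs.
df[["college", "rank", "response_clean"]].to_csv("responses_anonymized.csv", index=False)
```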

The goal throughout was not to automate analysis but to enrich it—to use the model's pattern recognition capabilities to identify connections I might otherwise miss, while maintaining my responsibility for interpretive decisions.

What Emerged: Beyond Binary Needs

What surprised me most was how this approach revealed complex tensions in faculty needs that binary survey questions would have obscured.

For instance, a traditional coding approach might have simply tallied mentions of "time constraints" as a barrier to implementing new teaching approaches. But the collaborative analysis revealed a more nuanced reality: many faculty were actually expressing something more complex than mere time scarcity.

They described a tension between their intrinsic desire to innovate pedagogically and institutional reward structures that didn't recognize that work. As one faculty member put it, "It's not really about not having hours in the day—it's about what those hours are valued for."

This distinction has significant implications for how we design professional development offerings. Rather than simply making workshops more "efficient" (the solution to a time scarcity problem), we need to address the deeper tension between faculty values and institutional incentives.

Other rich patterns emerged as well:

  • Faculty across ranks expressed a desire for more engagement with learning science research, but wanted it translated into disciplinary contexts rather than presented generically
  • Early-career faculty sought technical skill development, while mid-career faculty more often requested community and reflective practice opportunities
  • Nearly a third of respondents described a tension between student expectations and evidence-based teaching approaches

Each of these insights suggests specific directions for our professional development programming that wouldn't have been as clear from a more traditional analysis.

Methodological Reflections: Trust and Verification

As with my classroom ungrading practices, I found that approaching this analysis through a lens of "trust, not compulsion" yielded the richest results. I trusted the AI to help identify patterns but didn't compel it to fit responses into predetermined categories. Simultaneously, I maintained a critical stance, consistently verifying suggested patterns against the raw data.

There were limitations, of course. The model occasionally overweighted particularly eloquent responses and sometimes missed subtle disciplinary differences in how faculty expressed similar needs. These limitations reinforced the importance of the human analyst in the process—the partnership worked precisely because it combined different analytical strengths.

Implications for Faculty Development Practice

This approach to needs assessment analysis has immediate implications for how we design professional development offerings:

  1. Moving beyond binary needs: Instead of designing workshops based on surface-level needs, we can address the deeper tensions faculty experience between competing values and constraints.

  2. Differentiating by career stage: The analysis revealed distinct developmental trajectories that suggest tailored programming for different faculty career stages.

  3. Creating disciplinary translations: Faculty consistently expressed desire for learning science principles translated into their specific disciplinary contexts.

  4. Addressing institutional tensions: Many faculty needs can only be addressed through institutional policy changes, not just better workshops.

Perhaps most importantly, this approach models a commitment to truly hearing faculty voices rather than simply tallying their "votes" for predefined options. It's an approach to professional development needs assessment that aligns with the same principles I advocate for student assessment—prioritizing rich understanding over reductive measurement.

Questions for Further Exploration

As I continue to refine this approach to qualitative analysis, several questions remain:

  • How might we combine this approach with more traditional methods to maximize both depth and methodological rigor?
  • What are the ethical considerations of using AI tools in analyzing faculty responses about their professional needs?
  • How can we most effectively share these complex findings with institutional decision-makers accustomed to more simplified data presentations?
  • What does this approach suggest about how we might analyze student feedback in our courses?

I'm particularly interested in how others are experimenting with similar approaches to qualitative analysis in educational contexts. Are you finding ways to leverage AI tools while maintaining intellectual integrity and methodological soundness? What tensions or opportunities have you encountered?

I'd love to hear your experiences as we collectively reimagine how we understand and respond to faculty professional development needs.

AI Use Acknowledgment: I acknowledge that I have used Generative AI tools in the preparation of this blog post, specifically Claude. The AI tool was utilized in analyzing qualitative response data from faculty surveys and helping to structure this narrative about the process. I have verified the accuracy of all representations of the analysis process and findings through comparison with the original survey data. This analysis represents my own critical thinking and evaluation, supported by AI-assisted data processing and narrative development.

Sunday, February 16, 2025

Rethinking Assessment: Self-Determination Theory and Ungrading in an Online Social Psychology Course

In Spring 2025, I implemented a novel approach to teaching PSY 300 Social Psychology that prioritized student autonomy and intrinsic motivation. Drawing from self-determination theory (SDT), which emphasizes autonomy, competence, and relatedness as key drivers of motivation, the course was designed to give students unprecedented control over their learning journey.

Course Design

The course structure departed from traditional grading methods by adopting an "ungrading" approach, where students propose their own midterm and final grades based on their engagement and learning. Rather than imposing strict deadlines, the course used "Maximum Benefit Completion Dates" - suggested timelines that optimize learning without penalty for later completion.

Four core components structured the learning experience:

  1.  Engagement with course materials through Perusall, a social annotation platform
  2.  Collaborative group work on shared documents
  3.  Regular reflection through learning journals
  4.  Optional, repeatable quizzes for self-assessment and learning reinforcement

The quizzes were explicitly framed as learning tools rather than evaluation instruments. Students could take them as many times as they wished, with no impact on their grade unless they chose to include their quiz engagement in their grade justification.

Midterm Reflection Process

At midterm, students were asked to reflect deeply on their learning experience through several prompts:

  • How they benefited from engaging with course texts on Perusall
  • How they benefited from collaborative group work
  • How they benefited from the learning journal process
  • Their progress toward course learning outcomes
  • How the course affected their life
  • External factors influencing their engagement

Based on this comprehensive reflection, students proposed their own midterm grades, supporting their proposals with specific evidence of their engagement and learning. This process encouraged students to think metacognitively about their learning journey and take ownership of their academic progress.

Analysis of Midterm Reflections

Using Claude.ai to analyze the midterm reflection data revealed fascinating patterns in how students approached self-assessment. The grade distribution showed the following (a quick way to verify counts like these yourself appears after the list):

  • A: 67 students (57.8%)
  • B: 36 students (31%)
  • C: 11 students (9.5%)
  • D/F: 2 students (1.7%)
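
For anyone who wants to double-check an AI-computed distribution like this against their own records, a few lines of Python (or a spreadsheet pivot table) are enough. The file and column names below are hypothetical.

```python
# Minimal sketch for verifying a self-assigned grade distribution (hypothetical file and column names).
import pandas as pd

grades = pd.read_csv("midterm_self_assessments.csv")["proposed_grade"]

counts = grades.value_counts()
percents = (counts / counts.sum() * 100).round(1)

for grade in ["A", "B", "C", "D", "F"]:
    if grade in counts.index:
        print(f"{grade}: {counts[grade]} students ({percents[grade]}%)")
```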

Key findings emerged from the analysis:

  • Students who reported high engagement with course materials typically assigned themselves higher grades
  • Those facing significant personal challenges often chose more moderate grades despite demonstrating meaningful learning
  • Students consistently considered both their engagement level and learning outcomes when self-assigning grades
  • Even students assigning themselves lower grades frequently reported significant learning benefits

Perhaps most notably, the flexible course structure appeared to encourage honest self-assessment. Students were remarkably candid about their participation levels and challenges, while still acknowledging their learning achievements.

Course Impact

The data revealed that removing traditional grading pressures had profound effects:

  • Students with anxiety disorders reported being able to learn without crippling stress
  • Those balancing work, family, and health challenges appreciated the flexibility to engage meaningfully despite life circumstances
  • Many students noted deeper engagement with the material when freed from grade pressure
  • The course structure fostered genuine peer learning and community building
  • Students used the optional quizzes as true learning tools rather than stress-inducing evaluations

This experience suggests that combining SDT principles with ungrading can create a learning environment that supports both academic achievement and student wellbeing. The high level of self-reported learning, even among students assigning themselves moderate grades, indicates that removing traditional grading structures may enhance rather than diminish educational outcomes.

The midterm reflection process itself served as a powerful learning tool, helping students articulate their growth and identify areas for improvement in the second half of the course. Their detailed reflections demonstrated remarkable self-awareness and a genuine focus on learning rather than grade achievement.

AI Use Acknowledgment:

I acknowledge that I have used Generative AI tools in the preparation of this blog post, specifically Claude.ai. The AI tool was utilized in the following ways:

  • Analyzing qualitative student reflection data to identify patterns and themes
  • Calculating grade distribution statistics from the dataset
  • Helping to structure and organize the findings into a coherent narrative
  • Assisting in clear articulation of course design principles and outcomes

I have verified the accuracy of all statistical data through multiple reviews of the original dataset, and all interpretations of student experiences have been carefully checked against the original reflection texts. This analysis represents my own critical thinking and evaluation of the course outcomes, supported by AI-assisted data processing.

This research was conducted with attention to student privacy; all data was analyzed in aggregate form with no identifying information.

Monday, February 13, 2023

More fun with ChatGPT

Nearly every time I use ChatGPT I come away amazed at its capabilities. Many others have noted ChatGPT's limitations, and they are significant. But that said, check out the prompts I gave it recently (I have not included the responses from ChatGPT for space considerations, but also so you, dear reader, could experience the joy I did when seeing the magic happen right before your very eyes).

I gave ChatGPT just the abstract of an article and these prompts:

  • write an informal blog, this time including the catchy title, an interest-generating or attention-grabbing opening, a section to introduce and define the topic you will address, a section with example that clarifies and illustrates the topic that you are presenting, and brief statement that ties together the topic and invites the reader to think about it further. make it 500 words long 
  • what profession would benefit from knowing about this article? 
  • write a formal letter to a social worker about this article, including suggestions for how they could apply this information in their profession

I co-teach a Psychology of Acting class. Check out these prompts:

  • write three scenarios that demonstrate the power of eye contact in theater
  • what are the best ways for actors to learn their lines
  • what the best ways for actors to get inside the role they are playing
  • what are the major theories of acting, and provide examples of famous actors for each theory
  • what are the hardest challenges for new actors, and how can they overcome them?

I can't claim that ChatGPT's responses to every prompt were scintillating or even correct. But as a starting place they were fantastic. In one class I really do suggest that students write letters to actual people who they think would benefit from the information in articles they find. They often struggle with this task. But if ChatGPT can give them at least a first draft of a letter, they should be able to revise, tailor, and strengthen the letter with less difficulty. What if, with the aid of ChatGPT, students became adept at, proficient in, or even mastered this type of communication? Then the real purpose of the activity could shift from developing those skills to affecting the real world.


Imagine if AI could help us add more meaning to our assessments. Certainly a better outcome for incorporating AI into education than many others I've recently read about.

Tuesday, January 10, 2023

VERY short blog about assessment

 I just saw this tweet:

My intention is to use collaborative grading instead of "ungrading" when describing my practice to students and colleagues. Soooo much better because collaborative grading captures the dynamic more clearly -- I will work with students to determine their grade. Collaborative grading also establishes from the get-go a more equal relationship between instructor and student. No one really knows what ungrading means until you explain it (and sometimes not even then).

Now I just have to decide if I will re-do my syllabi for this spring semester.

[HT to Robert Talbert for encouraging this Captain's log...]

Friday, December 23, 2022

Another blog about ChatGPT

I bet you are already aware of ChatGPT. I've been watching the posts, tweets, freakouts about it over the holiday vacation. (Here's a great resource about ChatGPT, HT to Bryan Alexander https://bryanalexander.org/future-of-education/resources-for-exploring-chatgpt-and-higher-education/) I finally started playing around with it. I started by asking ChatGPT to summarize an online article, then write a series of tweets about the article, then write a sonnet(!) about the article. It did all three tasks more than adequately. BTW, all I had to do was paste in the link to the article, not the entire article itself.

The article I asked ChatGPT to summarize was from Boston Review. I wondered if ChatGPT could summarize journal articles just by including the DOI in my request. No deal. It told me:

I'm sorry, but I am unable to access or summarize specific articles as my training data only goes up until 2021, and I do not have browsing capabilities. Is there anything else I may be able to help with?

Anyone know how to get it to summarize journal articles?

Then I decided to turn to asking ChatGPT to describe something without giving it a source. I simply asked it to "describe what ungrading is". Here's what it said:

Ungrading, also known as alternative grading or competency-based grading, is a teaching and assessment approach that focuses on students demonstrating their understanding and mastery of material rather than on traditional grading systems that rely on grades, such as A, B, C, etc. In an ungrading system, students are given the opportunity to demonstrate their knowledge and skills through various forms of assessment, such as projects, presentations, essays, or exams, and they are given feedback and guidance along the way to help them progress and improve.

Ungrading can be implemented in a variety of ways, depending on the goals and context of the course or program. Some common features of ungrading include:

  • A focus on learning and mastery rather than grades
  • Multiple opportunities for students to demonstrate their understanding
  • Individualized feedback and support to help students improve
  • A flexible and personalized approach to assessment
  • An emphasis on self-assessment and self-directed learning

Overall, ungrading is designed to provide a more holistic and learner-centered approach to education, allowing students to progress at their own pace and focus on their own learning goals.

This certainly gets the gist of ungrading correct, and remember my request was very basic. So I went further: "write a description of ungrading as if you are explaining it to a student."

Ungrading is a different way of thinking about grades and assessment in school. Instead of earning grades like A, B, C, or D, you will be working towards demonstrating your understanding and mastery of the material. This means that you will have multiple opportunities to show what you know and can do, through things like projects, presentations, essays, or exams.

One of the key features of ungrading is that it is flexible and personalized. This means that you will have some control over how you demonstrate your learning and can choose the approaches that work best for you. You will also receive regular feedback and support along the way to help you understand what you need to do to improve.

In an ungrading system, the focus is on learning and growth, rather than just getting a good grade. This means that you have the chance to take your time and really understand the material, rather than just trying to get the right answer on a test. It also means that you will have to take more responsibility for your own learning, by setting goals and working towards them independently.

Overall, ungrading is designed to provide a more engaging and meaningful learning experience for students. It allows you to focus on what you are interested in and what you want to learn, rather than just trying to get good grades. So, it's a different way of thinking about school, but it can be very rewarding if you are willing to put in the effort and take control of your own learning.

This is pretty great. I would consider using this as a first draft for my syllabus. 

What about something a bit more sophisticated? "What does ungrading have in comon with self-determination theory?" [yes, ChatGPT can deal with typos.]

Ungrading and self-determination theory (SDT) are both approaches that focus on supporting individuals' autonomy and agency in their learning and development.

SDT is a psychological theory that explains how people's sense of autonomy, competence, and relatedness influence their motivation and well-being. According to SDT, people are more likely to be motivated and engaged when they feel that they have control over their own lives and are able to pursue their own goals and interests.

Ungrading is a teaching and assessment approach that aligns with these principles by giving students the opportunity to demonstrate their understanding and mastery of material in ways that are meaningful and relevant to them. It also emphasizes individualized feedback and support to help students progress and improve at their own pace.

Overall, both ungrading and SDT recognize the importance of supporting individuals' autonomy and agency in their learning and development, and both approaches seek to provide a more personalized and learner-centered approach to education.

Going for broke here: "describe a college course that combines ungrading, self-determination theory, problem-based learning, and wicked problems and is about climate change"  

A college course on climate change that combines ungrading, self-determination theory, problem-based learning, and wicked problems could be structured as follows:

  • Course Description: This course will explore the causes and impacts of climate change, as well as strategies for addressing and mitigating its effects. Students will work on real-world problems related to climate change, using a problem-based learning approach and applying their knowledge and skills to find creative and innovative solutions. The course will also use an ungrading approach, giving students multiple opportunities to demonstrate their understanding and mastery of the material and allowing them to take control of their own learning.

  • Learning Outcomes: By the end of the course, students will be able to:

  1. Understand the causes and impacts of climate change, including the scientific evidence and the social and economic consequences.
  2. Analyze the challenges and opportunities associated with addressing climate change, including the role of technology, policy, and behavior change.
  3. Identify and evaluate strategies for mitigating and adapting to climate change, including at the individual, community, and global levels.
  4. Develop and defend their own position on a specific aspect of climate change, using evidence-based arguments and critical thinking skills.
  5. Work effectively in teams to research and solve complex problems related to climate change, using a problem-based learning approach.

  • Course Structure: The course will be structured around a series of wicked problems related to climate change, which students will work on in small groups using a problem-based learning approach. Each problem will be open-ended and complex, requiring students to apply their knowledge and skills to find creative solutions. The course will also incorporate principles from self-determination theory, creating a learning environment that supports students' autonomy and allows them to pursue their own goals and interests.

  • Assessment: Assessment in the course will be based on an ungrading approach, with students given multiple opportunities to demonstrate their understanding and mastery of the material. This may include projects, presentations, essays, exams, or other forms of assessment, depending on the goals and objectives of the course. Students will also be given regular feedback and support along the way to help them progress and improve.

Still lots of details to work out if this course were to be actually delivered, but the structure, scope, and especially Learning Outcomes, could all satisfy various committees, IMO.

Generally, I have been less concerned about AI and ChatGPT replacing me. I thought that AI would certainly become assistants or tools for professors and many others. I still think that will happen. But I suspect that OpenAI (and others) already have a better version than ChatGPT, and that the rate of improvement might well outpace Moore's Law. How long before the student becomes the master?

Wednesday, February 2, 2022

"What is truth"

I propose the following question as a final exam for any graduate of the liberal arts, and probably anyone with a college degree:

Name as many disciplines as you can that have an established literature that addresses the question "What is truth?"

    • The more disciplines you name, the more points you get
    • More points if you can describe the content of the literature
    • The more detail you can provide, the more points you get
    • If you don't know the literature, you can hypothesize what the literature would be about
      • More points if your hypothesized literature matches what experts in the discipline report
      • More points if members of the discipline had not thought of your ideas but agree that they are worth pursuing
Maximum score answer: detailed description of the literature addressing this question from every discipline ever invented.

Truth is not just a topic of interest to philosophers, but to every discipline (prove me wrong!), and the true interdisciplinarity (or trans- or pan- or uni-disciplinarity) of the nature of truth is a vital insight. But where in our curricula would students learn this idea? Perhaps in an upper-level philosophy course, or if they happen to piece it together from all the courses they take. How would graduates from your institution do on this final exam?

If you agree with me that the universality of this question is important for students (people) to understand on a deep level, where should this topic exist in our curricula? For me the only answer is General Education. The hard part is figuring out where to put it -- in an existing course? In a new, stand-alone course? In every course? 
 
Thoughts?