Not That Kind of Doctor - Exploring AI and Generative AI in Education
TLTE
02/07/2024
Description
Artificial intelligence (AI) and generative AI are transforming education in ways we never imagined. In this episode of Not That Kind of Doctor, Guy and Nick are joined by Dr. Justin Olmanson to explore the intersection of AI and teaching and discuss how these technologies can be both powerful allies and complex challenges in the educational landscape.
🎓 What You'll Learn:
The distinction between AI and generative AI and how they each play a role in modern education (6:00)
How generative AI can be used as a "thinking partner" in lesson planning and instructional design (10:00)
The benefits and risks of using AI tools for creating and evaluating lesson plans (19:00)
Strategies for integrating AI into teaching without losing the human touch (24:00)
Ethical considerations and the emotional impact of AI-generated feedback on students (31:00)
Whether you're an educator, a graduate student, or someone interested in the future of education, this episode provides valuable insights into how AI can enhance teaching while maintaining the importance of human expertise. Join us as we dive into the world of AI in education and what it means for both teachers and students.
Like, comment, and subscribe for more episodes where we explore the challenges and opportunities in education with thoughtful discussion and practical advice. 🎓✨
#AIinEducation #TeachingwithAI #GenerativeAI #EducationTechnology
Artificial Intelligence - Not That Kind of Doctor with Nick Husbye and Guy Trainin
www.youtube.com/@tltenotthatkindofdoctor
Searchable Transcript
- [00:00:00.629](upbeat music)
- [00:00:09.900]Except I have nothing to grade until next Wednesday.
- [00:00:12.570]I'm sure there's gonna be a nightmare about that
- [00:00:14.010]at some point as well.
- [00:00:17.910]So, new semester,
- [00:00:20.310]new technology,
- [00:00:23.400]and we're here on Not That Kind of Doctor today
- [00:00:27.780]to talk about artificial intelligence, generative AI,
- [00:00:32.376]and, particularly, to be thinking about
- [00:00:34.620]some of the ways that
- [00:00:36.780]we can use
- [00:00:39.090]this generative AI in
- [00:00:43.200]our teaching and our research.
- [00:00:45.960]So, I'm not forgetting it this time.
- [00:00:48.600]I'm Nick Husbye.
- [00:00:49.470]I am an Associate Professor of Elementary Literacy Education
- [00:00:52.230]here, at UNL.
- [00:00:54.120]My co-host.
- [00:00:55.020]I'm Guy Trainin.
- [00:00:56.610]A professor here at UNL as well.
- [00:00:58.830]And we are joined by-
- [00:01:00.240]Justin Olmanson.
- [00:01:02.100]Same department, same institution.
- [00:01:05.550]Innovative Learning Technologies program.
- [00:01:09.360]And so, we're gonna be talking about, like,
- [00:01:11.820]artificial intelligence, and generative AI,
- [00:01:14.640]and ways that we are both leveraging it,
- [00:01:20.100]as well as being suspicious of it
- [00:01:24.330]within the work that we're doing and how graduate students
- [00:01:27.570]might be able to do that same thing.
- [00:01:30.270]So,
- [00:01:33.720]how are you feeling, like, what are your,
- [00:01:36.780]kind of, thoughts around generative AI?
- [00:01:40.968]And how are you using it in your teaching, in your research?
- [00:01:44.520]Just widespread.
- [00:01:45.780]Just generally.
- [00:01:47.610]Well, I think like a lot of people, I started out
- [00:01:50.040]interested in how I could escape tasks
- [00:01:57.277]that I found unpleasant, tedious,
- [00:02:00.300]or time consuming in a way
- [00:02:02.040]where I'd prefer to spend my time, my work time,
- [00:02:05.130]in a different way.
- [00:02:06.600]And so, like most people, I think that's where I started.
- [00:02:10.140]And then, because I do research in these areas,
- [00:02:14.400]because I've been interested in these areas for a while,
- [00:02:16.410]I started teaching a course in
- [00:02:17.730]artificial intelligence in education in 2016.
- [00:02:21.300]So, from then, I've been, sort of, in this area.
- [00:02:25.500]And, now, there's an explosion of possibility,
- [00:02:29.280]and uptake, and everyone's talking about it.
- [00:02:33.030]And so, that's exciting.
- [00:02:35.130]And it sort of feels like,
- [00:02:38.250]I should be farther ahead, I should be farther along.
- [00:02:42.360]Shouldn't we all?
- [00:02:44.147]In what way?
- [00:02:44.980]Tell me more.
- [00:02:45.813]Like, how could you envision yourself being further along?
- [00:02:49.140]Like, all the develop, like,
- [00:02:51.630]I like to design.
- [00:02:52.950]Right now, I've got several designs
- [00:02:56.340]that leverage artificial intelligence,
- [00:02:58.080]that leverage generative AI.
- [00:03:00.240]And we're on a third round of data collection for
- [00:03:05.220]an intelligent personal learning assistant
- [00:03:08.220]for undergrads, and then one for grads.
- [00:03:10.920]One that's based on them learning computational thinking,
- [00:03:14.520]computational literacies, and programming.
- [00:03:17.160]And then, at the undergrad side,
- [00:03:18.870]how do I navigate a course that has,
- [00:03:21.870]you know, 46 optional projects?
- [00:03:25.920]You know, in terms, you know, if we're in a TLDR age,
- [00:03:31.050]you know, how do you get students to, sort of,
- [00:03:33.450]engage with 40 different optional projects?
- [00:03:36.210]On one hand, they're excited 'cause they can find the things
- [00:03:38.528]that they want to do, because there's something like that.
- [00:03:41.033]And, on the other hand, you know,
- [00:03:43.110]it just feels overwhelming.
- [00:03:44.700]And so, logistically, our personal learning assistant
- [00:03:48.540]helps them sort of parse that all,
- [00:03:50.970]and gives them suggestions based on what their major is,
- [00:03:55.020]what they're interested in.
- [00:03:56.010]Are they interested in the maximum number of points
- [00:03:58.320]for the least amount of work?
- [00:03:59.880]Or are they interested in, you know,
- [00:04:01.590]their speech path or a future speech path,
- [00:04:04.140]and they want projects that
- [00:04:06.660]are devoted along those lines.
- [00:04:09.330]And then, we have another part of it,
- [00:04:11.580]where it supports them in their microteaching.
- [00:04:13.920]Where this is, for many students in my undergrad course,
- [00:04:17.700]it's the first time they're thinking about teaching.
- [00:04:20.580]And, a lot of times, they're like, "Well, this is it."
- [00:04:22.747]"If this doesn't go well,
- [00:04:24.780]I'm gonna have to change my major to Canadian Studies."
- [00:04:28.470]And a wonderful PhD by the way.
- [00:04:31.997]But, if that doesn't-
- [00:04:33.750]So, if it works for them, then they feel great.
- [00:04:36.240]And so, they put a lot of pressure on themselves.
- [00:04:38.700]And, sometimes, that pressure comes at three in the morning.
- [00:04:41.070]And so, having this, having
- [00:04:44.460]an agent or a bot just sort of hanging out,
- [00:04:46.650]and at three in the morning they can say,
- [00:04:48.367]"This is what I've got,
- [00:04:49.890]this is what I need to have, what am I missing?"
- [00:04:52.290]Our bot, you know, has all the
- [00:04:54.030]information about the project.
- [00:04:55.860]And, you know, whatever Reddit has said about teaching.
- [00:05:01.688]So, many students have found it surprisingly useful
- [00:05:04.320]in the research we just finished from the January term.
- [00:05:07.980]And so, I wanna now take a step back.
- [00:05:10.890]I know a lot of people do that,
- [00:05:12.090]but I think it's important that we
- [00:05:14.370]talk about AI and generative AI.
- [00:05:17.640]So, let's start there just to kind of lay the ground,
- [00:05:21.930]because AI has been with us for a long time, actually.
- [00:05:25.590]And, what I'm often surprised with, is
- [00:05:29.880]how many things in my life are AI driven
- [00:05:33.060]without me having to think about them as AI.
- [00:05:36.660]For example, I've been using grammar check.
- [00:05:40.410]If Grammarly wants to sponsor, we're here.
- [00:05:43.200]But I've been using it for a really long time,
- [00:05:46.200]and that's AI.
- [00:05:47.610]And it's gotten a lot better over time in multiple ways.
- [00:05:51.510]But there are lots of little pieces in my life
- [00:05:53.670]that aren't labeled as AI,
- [00:05:57.180]that are actually AI.
- [00:05:58.800]So, we started thinking differently when
- [00:06:00.840]generative AI showed up, when ChatGPT, really, showed up.
- [00:06:03.630]So, can you talk a little bit about
- [00:06:04.920]the difference from your perspective?
- [00:06:07.080]Sure.
- [00:06:09.690]Over 10 years ago,
- [00:06:12.180]I went to a conference,
- [00:06:13.830]an artificial intelligence and education conference,
- [00:06:16.320]international conference,
- [00:06:18.060]and I was amazed at what counted as AI.
- [00:06:22.980]Okay, so, if you have a series of if-then statements,
- [00:06:27.450]some would say, "Well, it's intelligent."
- [00:06:29.287]"It's making choices."
- [00:06:30.750]It meets the bar.
- [00:06:32.310]What you mentioned, in terms of Grammarly,
- [00:06:34.320]there's a long, rich, useful history of
- [00:06:38.040]natural language processing,
- [00:06:40.050]and it's coming from a linguistic side.
- [00:06:42.060]That's how I started in AI,
- [00:06:44.340]was thinking about things like WordNet.
- [00:06:47.310]It was a computational linguistics course.
- [00:06:50.070]And you could get quite a ways down the road,
- [00:06:53.370]in terms of offering supports.
- [00:06:56.580]I think what-
- [00:06:57.540]And, for instance, OpenAI's GPT has been around,
- [00:07:02.010]GPT-2, GPT-3, you know, it's been around for a while too.
- [00:07:05.310]But, until November of '22,
- [00:07:08.550]there was not really a sense that
- [00:07:11.250]it could sort of be a sidekick.
- [00:07:13.470]It could do some things, like I mentioned,
- [00:07:15.150]do some things for you.
- [00:07:16.650]Generate things that weren't trash
- [00:07:19.920]and didn't seem random.
- [00:07:22.140]That you could sort of, you know,
- [00:07:23.700]without too much finessing, turn it into something
- [00:07:26.940]that you could say, "Okay," you know,
- [00:07:29.647]"I collaborated with this thing and it was useful."
- [00:07:32.400]And I think, partially, that's because, you know,
- [00:07:36.520]as PhDs, we have a reasonable amount of
- [00:07:38.640]understanding about our fields.
- [00:07:40.740]And we know when we're getting into areas
- [00:07:42.930]where we don't know enough, right?
- [00:07:44.790]And so, I think it's, we're uniquely positioned
- [00:07:48.240]to sort of work with generative AI in our areas
- [00:07:51.180]or adjacent to our areas, because we sort of,
- [00:07:56.490]we have a sense as to how to navigate knowledge
- [00:08:01.128]and sort of understand, "I don't think that's,
- [00:08:03.480]I don't think that's going in the right direction."
- [00:08:05.340]But I think that's one of the major differences.
- [00:08:07.590]You can use AI
- [00:08:09.870]to predict the next purchase someone's going to make.
- [00:08:14.070]In the case of these two, probably Starbucks.
- [00:08:17.820]Or you can use generative AI to, sort of, write a report
- [00:08:23.850]as to, you know, based on some data,
- [00:08:26.650]what type of purchase they're gonna make and why.
- [00:08:29.370]And I think those,
- [00:08:31.710]those might be, you know,
- [00:08:33.780]in the ballpark of the differences.
- [00:08:35.670]One is synthesizing and generating information.
- [00:08:42.570]And the other is, sort of, just giving you information
- [00:08:46.530]and you have to do things with it.
- [00:08:48.930]Okay.
- [00:08:50.670]So, let's shift about,
- [00:08:52.650]talking about generative AI in teaching.
- [00:08:54.450]You've touched a little bit about
- [00:08:56.640]the use of generative AI in teaching.
- [00:08:59.640]And I wanna start with the conversations
- [00:09:01.860]I've had with my undergraduate students.
- [00:09:03.510]And that's an important part of the job we do,
- [00:09:06.480]whether it's graduate students or as faculty.
- [00:09:09.030]And that is-
- [00:09:11.880]I think that it's really important to give a very
- [00:09:15.150]consistent message within each class.
- [00:09:18.810]Because, right now, the sense I'm getting from
- [00:09:23.190]our undergraduate students especially,
- [00:09:24.840]but even from graduate students,
- [00:09:26.340]is the boundaries are blurry.
- [00:09:28.230]And that makes people very hesitant to use it effectively,
- [00:09:32.160]because they're not sure what the expectations are.
- [00:09:35.580]And they felt,
- [00:09:37.620]a number of times, almost whiplashed.
- [00:09:40.290]So, encouraged to use it in one course,
- [00:09:42.960]completely banned in another course,
- [00:09:45.450]and ambivalent in a third course.
- [00:09:47.520]And so, that causes a student
- [00:09:49.800]who doesn't want to get into trouble
- [00:09:52.620]to say, "You know what?"
- [00:09:53.797]"I'm not gonna touch this."
- [00:09:55.027]"At least for a while until they can figure it out."
- [00:09:58.770]Well, and I think, too, one of the things that's
- [00:10:00.990]important to highlight is
- [00:10:04.930]there's a variety of different entry points
- [00:10:07.050]into generative AI.
- [00:10:08.130]There's a lot of different ways to
- [00:10:10.590]access what generative AI is able to do
- [00:10:14.760]and utilize it well.
- [00:10:17.070]So, when Justin's talking about his chat bot,
- [00:10:19.770]that's very much from a coding perspective,
- [00:10:22.380]which pulls on that AI
- [00:10:25.680]in a slightly different way than
- [00:10:28.920]I pull on generative AI.
- [00:10:30.630]Like, utilizing generative AI as
- [00:10:34.410]a thinking partner, as
- [00:10:37.020]a way to help me
- [00:10:40.440]identify where gaps might be
- [00:10:42.990]as I'm thinking about, "This is my course."
- [00:10:44.887]"This is the outline of what my course is doing."
- [00:10:47.880]Help me think through, if these are my learning goals,
- [00:10:50.520]what might I be missing out on
- [00:10:52.290]given these characteristics of my students?
- [00:10:55.710]And it's super good at that, right?
- [00:10:58.410]But that's me kind of consuming
- [00:11:01.050]versus there's this production kind of bit of it, right?
- [00:11:04.410]Where you are creating chatbots that very much are
- [00:11:10.560]leveraging AI in slightly different ways.
- [00:11:14.280]And so, it feels like
- [00:11:17.160]that notion of, there's a variety of
- [00:11:19.560]different ways to come into it
- [00:11:22.260]and to use it effectively.
- [00:11:23.820]And, also understand that we don't quite know
- [00:11:26.160]what effective use looks like yet, right?
- [00:11:28.800]Like, there's some stuff that goes really, really well,
- [00:11:33.560]but we're still trying to figure out
- [00:11:35.970]how to best build queries
- [00:11:38.550]to get it to do particular things.
- [00:11:40.320]And I can only talk from like a consumption process.
- [00:11:44.430]I'm not creating bots, I'm not creating sites, I'm not,
- [00:11:47.190]I'm literally utilizing it to
- [00:11:50.580]help me think through some stuff, right?
- [00:11:54.690]But it's that notion of,
- [00:11:57.030]I'm asking it to help me think through.
- [00:11:59.100]Like, I'm using it as a tool to refine,
- [00:12:03.090]give nuance to,
- [00:12:06.901]and reposition my own thinking,
- [00:12:09.300]versus I'm not asking it to think for me.
- [00:12:11.940]Which,
- [00:12:13.410]sometimes, feels like
- [00:12:15.810]that's what the expectation of generative AI is,
- [00:12:18.720]is it will do the thinking for me.
- [00:12:20.550]I don't have to think, 'cause I'm gonna put it in,
- [00:12:23.640]it's gonna do this work for me.
- [00:12:27.850]Which, you know, I don't think it's there yet.
- [00:12:29.400]I don't think that it could do that without us
- [00:12:32.910]being able to
- [00:12:35.160]play with it a little bit, right?
- [00:12:36.900]Like, play with what it's giving us, in terms of output.
- [00:12:40.320]So, as you're thinking about
- [00:12:42.840]how you're using it right now, and that,
- [00:12:45.360]I'm struck by Justin's idea of like
- [00:12:48.750]how we interact with that knowledge.
- [00:12:51.180]Like, what are some of the ways
- [00:12:52.230]that you're interacting with it?
- [00:12:53.490]What are some of the ways that you're,
- [00:12:55.500]it's not just this passive thing, right?
- [00:12:57.390]It's,
- [00:12:59.309]we are doing particular things
- [00:13:02.220]with generative AI and the products that it's giving us.
- [00:13:05.610]How are you thinking about those heuristics,
- [00:13:08.490]in terms of when you realize,
- [00:13:10.747]"Oh, this is going in the wrong direction,"
- [00:13:13.110]and then how do you redirect?
- [00:13:14.610]Or do you?
- [00:13:15.443]Or is that the point where generative AI
- [00:13:16.890]just stops being productive?
- [00:13:21.300]I think on a-
- [00:13:22.410]So, to think about it,
- [00:13:25.530]sort of, at a wide angle,
- [00:13:28.890]some of the research I'm doing with
- [00:13:30.870]Gretchen Larson, a PhD student here,
- [00:13:33.600]and Aze Dehasani, also a PhD student in the ILT program,
- [00:13:38.210]we've broken it down into four different categories.
- [00:13:41.310]There's how do we think about generative AI
- [00:13:44.130]within academia in general?
- [00:13:47.100]And, you know, in society?
- [00:13:49.230]And then, how do we use it, like you described Nick,
- [00:13:52.410]like how is it as a,
- [00:13:55.500]as a helper?
- [00:13:56.340]As something that's generating information,
- [00:13:59.550]because we have a task at hand
- [00:14:00.900]and we wanna solve that problem.
- [00:14:02.370]We wanna overcome or address something.
- [00:14:04.770]And then, there's, how do we use generative AI to design?
- [00:14:11.100]And what does that look like?
- [00:14:12.270]Because it will help you write code.
- [00:14:14.518]Good luck if you don't know code.
- [00:14:18.030]I mean you will overdrive your headlights fast.
- [00:14:20.580]Just like, you know, if you're asking for advice.
- [00:14:23.520]You know, if your friend looks like they're gonna die
- [00:14:27.060]because they've been poisoned,
- [00:14:28.650]and you don't have access to anything else
- [00:14:30.330]except for ChatGPT, I understand why
- [00:14:32.430]you'd have access to other things, you know,
- [00:14:34.980]I would trust it if you're not a toxicologist.
- [00:14:38.130]But, you know, in many situations,
- [00:14:40.800]you know, especially when it comes to coding,
- [00:14:42.720]you can get in trouble quickly.
- [00:14:44.520]But, if you know what you're doing,
- [00:14:46.020]it can magnify, amplify what you're doing and,
- [00:14:49.680]you know, really move you down the line.
- [00:14:51.240]And then, the final way, the fourth way, is as a material.
- [00:14:54.840]Because generative AI itself is an API, or can be an API,
- [00:14:59.610]where you can hook it up to a flow and
- [00:15:02.119]it'll have things come through.
- [00:15:05.070]But, I think that, I think on all those levels,
- [00:15:07.470]on the three last levels, is an iterative process.
- [00:15:11.280]Can you tell me API and flow?
- [00:15:12.990]Break that down for me like I'm four.
- [00:15:17.697]An API, it's like, it's like if you went to-
- [00:15:24.540]So, if it's 1990 and you're looking for an apartment
- [00:15:28.380]in New York City-
- [00:15:29.310]As a 9-year-old.
- [00:15:31.170]Okay, 9-year-old, not four?
- [00:15:33.270]Not four. Okay.
- [00:15:35.400]If it's 1990 and you're looking for
- [00:15:37.230]an apartment in New York City,
- [00:15:38.730]you probably have to go to some office
- [00:15:41.460]where they have a big list of all the available apartments.
- [00:15:45.060]And then, you have to know
- [00:15:47.400]how to get to the different addresses,
- [00:15:49.680]and go there, or call, and find, and check it out.
- [00:15:53.370]Whereas, an API, you know, now, you would just,
- [00:15:56.790]you know, there'd be a website that would
- [00:15:59.100]drop all the
- [00:16:00.930]little pins on a map for you,
- [00:16:03.150]and you would set your price range,
- [00:16:05.370]and it would cull out the ones
- [00:16:06.990]that didn't fall into that price range,
- [00:16:08.760]or anything that didn't have a hot tub.
- [00:16:10.620]Whatever it was that you needed in an apartment.
- [00:16:13.530]And then, it would all be there.
- [00:16:14.790]In New York City? Well, yeah.
- [00:16:16.616]In New York City with a hot tub.
- [00:16:18.073]I mean, hey.
- [00:16:20.040]And so, an API is sort of a way to
- [00:16:24.750]ask for things in the background.
- [00:16:26.940]So, give me a map, and then I have a, you know,
- [00:16:30.600]I have this database of all these apartment listings,
- [00:16:33.270]or I have a website with all these apartment listings,
- [00:16:35.490]and each apartment has an address.
- [00:16:37.710]And so, you can use the API from the map
- [00:16:41.100]and the API from the, well,
- [00:16:43.440]let's say from the apartment listings,
- [00:16:45.330]and combine them, and it'll show where they are.
- [00:16:48.840]So, an API for generative AI would be OpenAI saying,
- [00:16:53.407]"Okay, for this amount, you get an account,
- [00:16:56.520]and you can ask it questions in the background."
- [00:16:59.190]Because maybe you want,
- [00:17:01.020]you want it to do something specific.
- [00:17:02.880]Like, you have a website where you're
- [00:17:04.830]talking about fantasy football,
- [00:17:06.330]and you're making recommendations to people.
- [00:17:09.240]And you only want the bot to come back
- [00:17:11.220]with fantasy football
- [00:17:13.680]responses and answers, and not answer anything else.
- [00:17:17.070]So then, in the background,
- [00:17:18.870]you're asking OpenAI the questions
- [00:17:22.170]they have for your technology,
- [00:17:24.750]but you're also putting something in saying,
- [00:17:26.377]"By the way, use this information
- [00:17:29.100]and don't answer the question
- [00:17:30.750]if it's not about fantasy football."
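For readers who want a concrete picture of that background call, here is a minimal sketch using OpenAI's Python client; the model name, the fantasy football instruction, and the question are illustrative placeholders, not code from the episode.

from openai import OpenAI

client = OpenAI()  # assumes an OPENAI_API_KEY is set in the environment

# The system message is the "by the way" instruction that scopes the bot.
system_msg = (
    "You are a fantasy football assistant. "
    "If a question is not about fantasy football, politely decline to answer."
)

user_question = "Who should I start at running back this week?"

response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model name
    messages=[
        {"role": "system", "content": system_msg},
        {"role": "user", "content": user_question},
    ],
)

print(response.choices[0].message.content)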
- [00:17:33.560]Hmm, okay.
- [00:17:36.060]And flow?
- [00:17:38.460]Is flow just the order of operations?
- [00:17:42.300]In this case.
- [00:17:43.710]Sure, how you'd, like-
- [00:17:46.410]How you would get-
- [00:17:47.610]So, you have to have a sense of, okay,
- [00:17:50.548]where's the irritant in the oyster coming from?
- [00:17:54.480]It's probably, in terms of generative AI,
- [00:17:56.640]it's usually coming from a person saying,
- [00:17:59.917]"I have this need."
- [00:18:01.320]Okay, then, you have this need, and then where does it go?
- [00:18:04.140]This is much better with a flow chart.
- [00:18:06.030]And so, that's where flow comes in.
- [00:18:07.920]Where, how does your, how is your data
- [00:18:09.780]going through the system,
- [00:18:10.950]and how is it changing at each step?
- [00:18:13.448]Okay.
- [00:18:15.300]Based on APIs that could be generative AI.
- [00:18:18.480]Based on natural language processing.
- [00:18:20.250]What are you doing to the information that's going through?
- [00:18:22.590]How are you making your decisions?
- [00:18:24.570]So, you can think of it kinda like a flow chart
- [00:18:26.160]or a decision tree, but with lots of data
- [00:18:28.770]coming in at different points.
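One way to picture the flow Justin describes is as a small pipeline: a request comes in, supporting data is attached, a decision is made, and an answer goes out. The sketch below is a toy illustration with made-up step names and a stubbed lookup, not the actual system.

def lookup_course_info(question: str) -> str:
    # Stub for whatever database or API supplies background data.
    return "project list and syllabus details"

def receive_request(text: str) -> dict:
    # Step 1: the flow starts with a person stating a need.
    return {"question": text}

def add_context(state: dict) -> dict:
    # Step 2: attach data pulled from another system or API.
    state["context"] = lookup_course_info(state["question"])
    return state

def decide_route(state: dict) -> dict:
    # Step 3: a decision point, like a branch in a flow chart.
    state["route"] = "projects" if "project" in state["question"].lower() else "general"
    return state

def answer(state: dict) -> str:
    # Step 4: in a real system, this is where a generative AI API call could go.
    return f"[{state['route']}] using context: {state['context']}"

# Data moves through the flow, changing at each step.
state = receive_request("Which optional projects fit a speech pathology major?")
for step in (add_context, decide_route):
    state = step(state)
print(answer(state))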
- [00:18:30.180]Gotcha, okay, that's helpful.
- [00:18:32.250]Cause, yeah, I have enough technical information
- [00:18:33.750]to be dangerous, but not enough to, like, be articulate.
- [00:18:38.400]So, we have bots, and, or API embedded into-
- [00:18:43.050]You notice that we have bots.
- [00:18:44.580]Nick, you don't have a bot.
- [00:18:46.170]We have.
- [00:18:47.250]The cheese stands alone, it's fine, whatever.
- [00:18:51.570]What are some ways you use AI in teaching?
- [00:18:56.430]So, I don't have a bot,
- [00:18:59.460]'cause, apparently, I was not invited to that club.
- [00:19:01.230]Thank you very much.
- [00:19:04.320]Rude.
- [00:19:05.910]So, I think,
- [00:19:08.970]if I'm thinking about
- [00:19:11.010]the ways that I've used generative AI,
- [00:19:13.980]in terms of
- [00:19:16.530]a thinking partner, again, helping me identify gaps,
- [00:19:20.700]but have also, in the course of the last,
- [00:19:23.640]last semester in particular,
- [00:19:25.380]really tried to help my students understand
- [00:19:28.770]the ways that they, themselves, could use generative AI
- [00:19:33.030]to support them in their teaching.
- [00:19:36.420]And so, one of the things that I have found
- [00:19:39.180]really, really helpful is
- [00:19:43.110]when they leverage a ChatGPT, or a Bard,
- [00:19:46.710]or something of that nature,
- [00:19:50.790]having them design a lesson plan, right?
- [00:19:54.300]Like, this is the instructional routine
- [00:19:57.030]that I think would work.
- [00:19:58.320]This is the content that I'm trying to teach them.
- [00:20:00.960]Whether it's letter-sound relationships,
- [00:20:02.700]or a comprehension strategy, what have you.
- [00:20:05.460]And then, feeding that lesson plan
- [00:20:09.600]to the particular portal,
- [00:20:13.740]giving them some characteristics for students.
- [00:20:16.290]So, feeding the generative AI assessment data,
- [00:20:21.030]and asking the generative AI to help them think through
- [00:24:24.180]where might this lesson break down?
- [00:20:26.160]Where might instruction get tricky for students
- [00:20:29.580]based upon what you know about them?
- [00:20:33.600]And that's been really interesting as
- [00:20:39.240]it's forced my students,
- [00:20:41.430]who are learning to be classroom teachers,
- [00:20:44.850]to think through and about
- [00:20:49.620]how to put that assessment data
- [00:20:52.140]into conversation with their teaching.
- [00:20:56.389]And the AI has helped them get a little bit better at
- [00:21:01.200]figuring out, "Oh, this is how I planned it,
- [00:21:03.450]this is what, how it's in my head."
- [00:21:06.570]But it's given them, kind of, some advice on how to
- [00:21:11.940]fix those spaces where,
- [00:21:14.850]pardon, instruction might break apart a little bit.
- [00:21:17.880]And so, that's been really interesting.
- [00:21:19.500]And there's something about
- [00:21:22.530]the feedback coming from
- [00:21:25.980]ChatGPT or Bard that, like, they're getting the kind of
- [00:21:29.400]feedback that I would want to give them,
- [00:21:31.140]but they don't see it as a personal attack from me,
- [00:21:34.050]as their instructor, which has been a nice,
- [00:21:36.510]kind of, side benefit of it.
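As a concrete illustration of that workflow, here is a small sketch of how such a prompt might be assembled before being pasted into ChatGPT, Bard/Gemini, or another portal; the lesson plan, assessment notes, and wording are made-up placeholders, not materials from the course.

lesson_plan = """Objective: students blend CVC words with short 'a'.
Routine: model blending, guided practice with letter tiles, independent word sort."""

assessment_notes = """Six students still confuse /b/ and /d/.
Two students read CVC words fluently and need an extension task."""

prompt = (
    "Here is a draft lesson plan:\n"
    f"{lesson_plan}\n\n"
    "Here is what I know about my students from recent assessments:\n"
    f"{assessment_notes}\n\n"
    "Based on this data, where might this lesson break down, "
    "and where might instruction get tricky for specific students?"
)

print(prompt)  # review the AI's answer against your own knowledge of the students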
- [00:21:40.020]So, trying to think through,
- [00:21:41.010]I've been trying to think through ways to
- [00:21:44.430]leverage AI,
- [00:21:47.430]generative AI, as kind of a partner
- [00:21:50.580]in the work that I do with teacher educators,
- [00:21:54.750]or with pre-service teachers,
- [00:21:57.750]in terms of this isn't going away, right?
- [00:22:01.317]And there's the real efficiency driven,
- [00:22:05.610]kind of, stuff around,
- [00:22:08.820]okay, you can use this to write your lesson plans,
- [00:22:11.220]but I also need you to know
- [00:22:14.130]when that lesson plan sucks.
- [00:22:17.400]When it doesn't actually reflect
- [00:22:19.080]the content that you need it to reflect.
- [00:22:22.290]And so, let's not start there.
- [00:22:24.540]Let's draft our own lesson plans,
- [00:22:27.780]but then let's feed it back into generative AI,
- [00:22:31.350]and see how that works.
- [00:22:32.760]So, that's been interesting to me,
- [00:22:35.100]because, well, and I don't have a bot.
- [00:22:37.890]I don't either.
- [00:22:39.480]You just said you had a bot.
- [00:22:40.710]No, I did not.
- [00:22:42.160]Oh, mean.
- [00:22:43.770]Okay, sorry, I did not mean to.
- [00:22:45.600]It's been a while, Guy, why are you mean to me?
- [00:22:49.230]It's been a while, so we gotta start somewhere.
- [00:22:52.949]I don't like how this is starting.
- [00:22:56.550]So, I wanna start-
- [00:22:58.170]So, my take is a little bit different.
- [00:22:59.820]And this is exactly the conversations
- [00:23:01.560]I think we all need to have, because,
- [00:23:04.650]if we are all looking at it a little bit different,
- [00:23:07.260]and we're all exploring it, we need to share the information
- [00:23:09.900]and we need to be very clear.
- [00:23:11.190]So, the first thing I think is very important
- [00:23:13.140]is to be very clear with our students
- [00:23:14.550]about the expectations.
- [00:23:16.230]How do we see this?
- [00:23:17.400]What do we want them and don't want them to use?
- [00:23:21.928]And include that in the syllabus,
- [00:23:23.370]or wherever we're sharing information
- [00:23:25.530]with all of our students, because
- [00:23:27.360]they are unsure, and because it's really different.
- [00:23:29.820]There are many of our colleagues who say,
- [00:23:32.437]"Do not use any of these things."
- [00:23:35.587]"I want you to be here, to be discussing,
- [00:23:39.330]to be reading, and doing things
- [00:23:41.310]just like we did in the 19th century."
- [00:23:43.830]And that's gotta be fine.
- [00:23:45.960]And, that's fine, that's a fine attitude,
- [00:23:48.150]it's just gotta be very clear.
- [00:23:49.500]And, if we are being
- [00:23:52.680]supportive and, also, want them to play with it,
- [00:23:56.460]and to experiment with it, and to do interesting things with it,
- [00:23:59.220]I think we need to be very clear.
- [00:24:00.330]So, that's the starting point.
- [00:24:01.920]The second thing is, I actually do use it
- [00:24:06.120]to ask them to generate some lesson plans,
- [00:24:08.790]and do some compare and contrast,
- [00:24:11.070]because it can change their role as teachers
- [00:24:14.880]from people who
- [00:24:17.706]are doing the grunt work of writing the lesson plans
- [00:24:20.280]to the ones who are, can be critical
- [00:24:23.550]and detail oriented towards that,
- [00:24:26.160]and ask the hard questions.
- [00:24:27.840]Because, when you are writing the lesson,
- [00:24:30.000]it is actually much harder to be critical about it.
- [00:24:32.940]Because, going back to your point about,
- [00:24:35.820]how do you respond to criticism, right?
- [00:24:37.860]It's hard to be critical about what you just spent your
- [00:24:40.980]blood, sweat, and tears creating.
- [00:24:43.800]Now, with enough time,
- [00:24:45.360]you can be really critical about your own work.
- [00:24:47.640]But, if you're doing this in real time
- [00:24:49.680]for a lesson that needs to happen in two days,
- [00:24:52.170]you're much less likely.
- [00:24:53.160]So, actually, generating lesson plans
- [00:24:55.710]has, also, an advantage of,
- [00:24:58.530]now I'm the critical voice seeing,
- [00:25:00.630]is this really what I need?
- [00:25:01.860]Does it answer what needs to happen in the classroom?
- [00:25:05.640]How can I make it better?
- [00:25:07.680]What are the biases that are embedded in there?
- [00:25:09.810]And I think it's actually easier if
- [00:25:12.030]some of it is created.
- [00:25:14.010]Also, because I think most, or many of our students
- [00:25:18.030]will actually end up teaching lesson plans
- [00:25:20.880]that they didn't necessarily write.
- [00:25:22.770]That are coming from
- [00:25:24.750]curriculum creators of one sort or another.
- [00:25:27.570]And so, they're just reinterpreting existing lessons
- [00:25:31.470]more than creating them from whole cloth.
- [00:25:33.570]So, that's an interesting and somewhat different issue.
- [00:25:36.510]And what's in the back of my mind is,
- [00:25:37.890]how long before these big curriculum providers
- [00:25:40.530]provide an API that helps
- [00:25:44.427]teachers create a version of their lessons
- [00:25:48.390]that are much more fitted to the needs of
- [00:25:51.660]their specific classroom, and maybe even-
- [00:25:53.110]Well, and we're seeing that now, right?
- [00:25:55.110]Like, one of the things that I think
- [00:25:58.470]is important
- [00:26:01.976]is that level of
- [00:26:05.580]engagement
- [00:26:06.750]with these
- [00:26:09.870]chat bots.
- [00:26:10.703]With these portals.
- [00:26:14.325]Like, one of the places I've been
- [00:26:17.130]really playing with is edua.ai, not sponsored.
- [00:26:20.970]But, like, it does exactly that.
- [00:26:23.340]You can feed it a lesson plan,
- [00:26:25.680]and then tell them characteristics of your class,
- [00:26:28.110]and it will differentiate.
- [00:26:30.870]And it will spit out the,
- [00:26:34.275]"Okay, here's the version of the assignment
- [00:26:36.150]for this group of students,
- [00:26:37.350]because they have these characteristics."
- [00:26:39.067]"Here's the stuff for this group of students."
- [00:26:43.005]And so, one of the things that I'm finding
- [00:26:45.660]really heartening about this is,
- [00:26:49.500]about these technological developments, is it's
- [00:26:52.200]allowing for
- [00:26:54.450]differentiation at a scale that doesn't come at
- [00:27:00.390]the cost of, "Oh, you need to spend
- [00:27:02.820]your entire weekend doing this," right?
- [00:27:05.220]Like, instead you can ask
- [00:27:09.180]a portal to do that kind of work.
- [00:27:11.970]It will design the resources.
- [00:27:13.680]You have to check them over, of course.
- [00:27:15.510]And this is where the, like,
- [00:27:17.460]one of the things that I am-
- [00:27:19.380]One of my personal arguments, I think in this age of AI, is
- [00:27:24.300]the level of expertise, the role of expertise,
- [00:27:27.870]is going to be more important than ever,
- [00:27:30.810]because you are going to have to have
- [00:27:34.620]a lens
- [00:27:36.300]to bring to these products
- [00:27:38.730]that are going to be generated.
- [00:27:40.770]And you're going to have to have a sense of
- [00:27:44.250]when is it going awry?
- [00:27:45.390]When is it not?
- [00:27:46.350]When is that, is that actually going to work, in terms of
- [00:27:50.580]the differentiation for the kids
- [00:27:52.260]that you're seeing in your classrooms, right?
- [00:27:56.547]And so, that's where,
- [00:27:58.140]like, that's why I start with the,
- [00:27:59.700]you're designing this lesson, because I need you to have an
- [00:28:02.014]internal understanding of how a lesson
- [00:28:04.830]is structured and how this instruction
- [00:28:10.357]is forecasted, right?
- [00:28:11.400]What does it feel like to plan for this type of work?
- [00:28:14.940]And then, how does that feel when you're interpreting
- [00:28:18.420]that same work that someone else has done.
- [00:28:20.550]And how can you reconfigure that?
- [00:28:23.910]And so, thinking about that,
- [00:28:26.940]that role of expertise, how do I give my students
- [00:28:32.280]enough background
- [00:28:34.920]to be dangerous, and enough background to be like critical
- [00:28:41.925]without having to cover it all.
- [00:28:43.110]'Cause you can't, right?
- [00:28:44.190]Like, the world is too complex,
- [00:28:46.350]but you have to have enough.
- [00:28:47.520]Like, I'm thinking about the coding part.
- [00:28:51.120]ChatGPT has been really helpful as I'm coding
- [00:28:55.016]within my Canvas sites.
- [00:28:56.160]But, again, I know just enough
- [00:28:59.040]to know what I'm asking for.
- [00:29:01.020]And just enough to
- [00:29:04.170]tweak as needed.
- [00:29:07.080]But, if I were to do like anything
- [00:29:09.360]fancier than
- [00:29:11.460]tabs with embedded da, da, da, da, da, da,
- [00:29:15.150]I'd, you know, have some problems.
- [00:29:18.240]But it gives me enough to
- [00:29:20.880]kind of go on.
- [00:29:23.355]It kind of feels a bit like
- [00:29:24.300]when the internet first came out.
- [00:29:26.010]Or when the internet-
- [00:29:27.157]"When the internet first came out."
- [00:29:28.620]Listen to me sounding old.
- [00:29:30.150]But, like, when
- [00:29:32.250]we realized the utility of the internet,
- [00:29:35.130]the widespread utility of the internet.
- [00:29:37.560]Where like it allows me to learn stuff I didn't,
- [00:29:40.110]and do stuff that I wasn't able to do before.
- [00:29:43.710]Like, how do I make these things happen?
- [00:29:46.230]You're making notes. You're making notes.
- [00:29:47.702]So, I wanna hear.
- [00:29:49.875]It was a good conversation.
- [00:29:51.090]Well, I usually use a medical,
- [00:29:53.610]or, rather, a musical analogy for this,
- [00:29:56.640]but maybe I'll also use a cooking one.
- [00:29:59.040]So, initially, I would say,
- [00:30:01.560]teaching is a game of chicken, right?
- [00:30:05.070]At some point, there's the moment when the teaching happens.
- [00:30:09.270]And that's the moment where, you know,
- [00:30:12.330]you're, the game of chicken comes in at the point where,
- [00:30:14.640]when can I start planning?
- [00:30:16.980]Because I know I'm, like,
- [00:30:18.270]I'm never gonna feel like it's perfect.
- [00:30:20.010]I'm never gonna feel like it's all done.
- [00:30:22.230]And so, you might have to give yourself
- [00:30:24.060]a certain amount of time.
- [00:30:25.260]But what are you doing in that time?
- [00:30:26.690]And how are you spending it?
- [00:30:28.680]So, the musical analogy might be, you know,
- [00:30:31.770]you're playing the oboe reasonably well, okay?
- [00:30:35.790]But, like, you have this,
- [00:30:37.830]you need this whole symphony to happen.
- [00:30:40.080]Or, you know, you're supposed to make this meal,
- [00:30:42.990]but you're spending most of your time
- [00:30:44.490]peeling carrots and potatoes.
- [00:30:46.740]And that's a frustrating thing.
- [00:30:48.060]And it's a necessary part of teaching,
- [00:30:50.130]where, at the beginning, you're grabbing from
- [00:30:52.380]other people's lesson plans, other people's curriculum,
- [00:30:54.630]because they're, like, because you don't have the content.
- [00:30:58.260]And it takes a long time to get, you know?
- [00:31:00.240]So, you can sort of chunk up, like, level up
- [00:31:02.610]from peeling potatoes and cutting onions
- [00:31:05.760]to sort of really thinking about
- [00:31:07.080]how all the flavors are blended together.
- [00:31:08.970]That's similar, you know, in terms of a musical analogy,
- [00:31:11.880]of going from playing the oboe, to leading the ensemble,
- [00:31:16.410]and then conducting the orchestra.
- [00:31:18.570]And that's something that generative AI
- [00:31:20.773]gives us a chance to sort of try our hand at,
- [00:31:24.330]to do a little sooner than we would've otherwise.
- [00:31:27.000]Especially when we've got people that are sort of
- [00:31:29.070]averse to human,
- [00:31:31.710]human given formative feedback.
- [00:31:35.070]In the research that we did with graduate students,
- [00:31:37.800]we had a student who was
- [00:31:41.010]more capable than I am at coding,
- [00:31:42.930]which isn't saying that much, but it's saying something.
- [00:31:45.780]And, in the interviews with this participant,
- [00:31:49.950]they said the, like, for them, it felt like a failure
- [00:31:53.010]if they had to go to office hours
- [00:31:55.050]to get help on a project.
- [00:31:57.144]But that's where generative AI came in for them.
- [00:32:01.110]So, they could hear the, you know,
- [00:32:02.730]they could hear the critical feedback
- [00:32:04.830]and it didn't feel real.
- [00:32:06.720]Or we were talking, before we came on camera,
- [00:32:10.620]about feelings.
- [00:32:12.180]And this idea that, you know,
- [00:32:14.160]if you can remove the feelings from it,
- [00:32:16.050]and maybe, that's partially what generative AI does,
- [00:32:19.290]is remove the affective piece that sort of makes us
- [00:32:22.200]sort of wince inside when someone says this isn't good.
- [00:32:25.770]Or this could be much better.
- [00:32:27.120]Or you forgot four things.
- [00:32:29.280]I think that might be, at least, less,
- [00:32:32.160]if not, gone for a lot of people,
- [00:32:34.410]when they're using generative AI,
- [00:32:36.360]that there's no human judgment attached to it.
- [00:32:38.850]It's just them, in their dorm at 2:00 AM
- [00:32:42.840]getting some help on a lesson plan that they've developed.
- [00:32:46.230]And they're iterating at a level
- [00:32:47.610]that's above peeling carrots and potatoes.
- [00:32:50.640]And I think that's really exciting
- [00:32:52.200]for teachers as well as other fields.
- [00:32:57.780]Yeah, I like that.
- [00:32:58.890]Because we both kind of talked about it in some way,
- [00:33:02.070]and it reminds me of work I've done many years ago on
- [00:33:05.820]kids practicing fluency, which is a totally different thing,
- [00:33:08.640]but it was the same thing.
- [00:33:10.290]And that is, when they had an automated response
- [00:33:13.860]that gauged, they were willing to practice
- [00:33:16.857]20, 25 times reading.
- [00:33:18.990]It was actually too many times reading the same text,
- [00:33:21.450]but they were willing to go back to it.
- [00:33:23.310]Whereas, when there was a person there, they wouldn't do it.
- [00:33:25.980]They would do it two times, and then they would shut down.
- [00:33:28.410]With the machine, listening to the same text,
- [00:33:31.260]offered the same way, they would go again,
- [00:33:33.600]and again, and again, and they would be
- [00:33:36.480]willing to vocalize and practice everything.
- [00:33:40.860]So, affect has,
- [00:33:42.210]that idea of affective filter, as a way,
- [00:33:45.677]it's a really interesting way to think about,
- [00:33:47.940]this is what the copilot can do for me in teaching.
- [00:33:51.510]Here's a voice, here's a coaching voice.
- [00:33:53.790]And the other piece is, of course,
- [00:33:55.170]that it's available when I need it.
- [00:33:57.870]That immediacy is really important.
- [00:33:59.850]I mean, we talk about this.
- [00:34:00.900]This can be late at night, or early in the morning,
- [00:34:03.840]or on the weekend.
- [00:34:04.673]And, suddenly, you don't have to bother anybody,
- [00:34:06.930]who will, if they're reasonable, not respond in real time.
- [00:34:11.310]You can get a response in real time and move on.
- [00:34:13.860]I can do, and I did what you talked about,
- [00:34:16.110]and that is, as I'm planning my own syllabus,
- [00:34:18.390]I asked ChatGPT to create a syllabus for a similar class.
- [00:34:21.570]And I just looked to see, do I have everything?
- [00:34:24.480]And I was like, "Yeah, I'm okay."
- [00:34:26.580]But, again, expertise comes in.
- [00:34:28.680]But, you know, I've asked for other syllabi.
- [00:34:31.740]And I've looked and compared.
- [00:34:33.060]But it was really nice to have that
- [00:34:35.550]immediate response to what I'm doing.
- [00:34:38.340]Yeah, I am curious about this, and this is a
- [00:34:41.460]topic for another time.
- [00:34:42.990]But, like, the lack of affectiveness
- [00:34:46.470]is like-
- [00:34:48.870]I'm curious about what that means for feedback, right?
- [00:34:51.990]Like, I'm curious-
- [00:34:54.803]When
- [00:34:57.270]something is-
- [00:34:59.490]'Cause when, I'm thinking about when I'm
- [00:35:01.200]personally using ChatGPT
- [00:35:04.440]as a thinking partner.
- [00:35:05.670]Like, it's usually in preparation of
- [00:35:08.850]going live with
- [00:35:11.910]something else, whether it's,
- [00:35:14.040]you know, an interview protocol.
- [00:35:16.743]Or whether I'm thinking through my syllabus.
- [00:35:18.840]Whether I'm thinking through my teaching, what have you.
- [00:35:22.680]And, like,
- [00:35:26.820]ChatGPT's not my stakeholder.
- [00:35:29.070]And so, that feedback's slightly different
- [00:35:32.460]in some way, shape, or form.
- [00:35:34.410]And it doesn't feel like that.
- [00:35:36.750]Like, it's nice to get that feedback,
- [00:35:38.100]but it's not necessarily feedback
- [00:35:39.540]that matters yet to me in my head,
- [00:35:42.840]as we're having this conversation.
- [00:35:44.340]And so, I don't know where I'm sitting on that,
- [00:35:47.340]but that was interesting to me.
- [00:35:49.050]Like, the-
- [00:35:51.150]If it's not-
- [00:35:56.034]If I am not engaged in feedback emotionally,
- [00:36:01.110]am I going to actually take that?
- [00:36:02.940]Am I gonna take that up?
- [00:36:04.290]So, I was thinking- As I'm thinking that-
- [00:36:06.510]I'm trying to think that through.
- [00:36:07.890]And I was thinking about it exactly opposite.
- [00:36:11.275]I mean, why are we not surprised?
- [00:36:12.450]Yeah, we are not surprised.
- [00:36:14.670]So, I was thinking about it as like,
- [00:36:17.100]I like it in some ways, because
- [00:36:19.525]I'm willing to take that critique.
- [00:36:22.230]My challenge is,
- [00:36:24.624]actually, the opposite.
- [00:36:26.880]And that is, some of our natural resistance may be good.
- [00:36:30.510]So, some of our natural resistance to feedback is bad,
- [00:36:33.840]because we're not listening.
- [00:36:35.160]But some of it is good, because, sometimes,
- [00:36:36.840]we need to stand our ground, and we're actually correct.
- [00:36:39.570]And the feedback is actually wrong.
- [00:36:41.670]And, maybe, with the lowering of the affect,
- [00:36:44.610]if you're like, "Okay, I'll take that
- [00:36:46.530]critique and I'll move on, because I need-"
- [00:36:47.963]Well, I don't know that it's actually critique.
- [00:36:50.504]And that might be what I'm trying to figure out.
- [00:36:53.580]I haven't thought about AI in this way before.
- [00:36:56.944]My point is, let's say it is critique,
- [00:36:58.770]or I'm accepting it as critique.
- [00:37:00.720]And, maybe, because there's no affective,
- [00:37:03.540]the affective filter is kind of lower,
- [00:37:06.000]I'm actually taking it in when I shouldn't,
- [00:37:08.010]when I should resist.
- [00:37:09.930]And so, I'm wondering about that.
- [00:37:11.982]Now, if it's something like coding,
- [00:37:15.300]there's a very objective test of if it worked or not.
- [00:37:19.470]But, if it's something like making a plan for a syllabus,
- [00:37:22.710]then it's squishier, it's harder to judge.
- [00:37:24.960]And there's definitely not an immediate response of,
- [00:37:27.167]did it work or did it not work?
- [00:37:29.430]So, that's where I'm a little bit,
- [00:37:33.180]I'm wondering about the boundaries.
- [00:37:36.480]Well, and I think asking questions.
- [00:37:38.550]So, like, out in the world,
- [00:37:42.720]around other future teachers or around colleagues,
- [00:37:46.890]there's a certain level, everyone has a certain level of
- [00:37:49.680]pressure to perform competence.
- [00:37:51.990]Right?
- [00:37:52.823]And so, in front of the bot, as long as you feel
- [00:37:55.860]like your transcripts are your transcripts, you know,
- [00:37:59.760]that feeling of performing competence diminishes.
- [00:38:03.960]So, that's something.
- [00:38:05.393]And so, like, that sense of vulnerability
- [00:38:07.980]when you ask certain questions, isn't there.
- [00:38:11.040]Or, at least, that's what our participants have implied.
- [00:38:14.250]That they don't feel as vulnerable when they're asking
- [00:38:17.613]certain questions than they would otherwise.
- [00:38:22.304]And, you know, you take away that vulnerability,
- [00:38:24.540]it's like when I was doing my dissertation study.
- [00:38:28.110]I was doing an ethnographic study at a elementary school.
- [00:38:31.680]I was there, and I cared about those teachers,
- [00:38:33.900]and they were great.
- [00:38:34.770]And I hung out for over a year.
- [00:38:36.870]And I didn't, I wasn't invested in the school
- [00:38:39.840]like my job depended on it
- [00:38:41.640]or those were my students in the same way.
- [00:38:44.340]And so, it was sort of interesting to say,
- [00:38:46.177]"Oh, if you have this ethnographic eye,
- [00:38:48.000]you can sort of see things you can't see
- [00:38:49.590]if you're too emotionally involved in what's going on."
- [00:38:52.920]And so, at 2:00 AM,
- [00:38:54.930]you're getting feedback that's, you know,
- [00:38:58.200]magnifying you.
- [00:38:59.490]Even if the feedback isn't as good as your own knowledge,
- [00:39:02.790]it can still be useful.
- [00:39:04.530]You know, I worked on a software development project
- [00:39:07.410]with someone who just happened to be
- [00:39:09.030]in the University of Texas library reading the newspaper.
- [00:39:12.150]And he would talk to me every day.
- [00:39:13.950]And, one day, I mentioned a project I was working on,
- [00:39:15.600]and he said, "Hey, can I be a part of that?"
- [00:39:16.837]"I don't know anything about technology,
- [00:39:18.840]and I'd love to sit in on your meetings."
- [00:39:21.000]And, you know, most of the time, it was sort of random.
- [00:39:24.360]But, sometimes, he would ask questions,
- [00:39:25.950]and it would be useful.
- [00:39:27.090]So, even if it's not up to our level,
- [00:39:30.690]in terms of what it knows,
- [00:39:31.890]it can still be that irritant in the oyster
- [00:39:34.800]that gets us thinking.
- [00:39:36.210]And so, removing some of that affect.
- [00:39:39.030]I mean, even when,
- [00:39:40.830]Guy, when you mentioned about the syllabus,
- [00:39:43.590]you qualified it afterwards by saying,
- [00:39:45.937]"I also looked at other syllabi," you know?
- [00:39:48.000]Yeah, yeah, yeah.
- [00:39:48.833]Because there's that sense of vulnerability that,
- [00:39:50.677]"Oh, you would have this question?"
- [00:39:52.710]And our students are having questions all the time in class
- [00:39:56.040]and they're swallowing them,
- [00:39:57.540]because they're performing competence.
- [00:39:59.460]And so, it's a nice sidebar,
- [00:40:02.130]whether it's a back channel during class,
- [00:40:04.650]or it's after class, where,
- [00:40:06.480]if they don't know things, they can look it up.
- [00:40:08.070]I mean, and you can do that already.
- [00:40:09.420]You can be like, "Oh, I forgot what a word was."
- [00:40:12.667]"I'm gonna look it up instead of raising my hand,"
- [00:40:14.970]like you would've done in 1990.
- [00:40:17.096]You know?
- [00:40:18.810]And so, in that area, I feel like it's useful.
- [00:40:22.740]And then, at least in the case of OpenAI,
- [00:40:27.060]with RLHF (reinforcement learning from human feedback), the way they trained from 3.5 to 4.0,
- [00:40:32.880]it's a lot more, some would say, nice or PC.
- [00:40:35.610]It usually starts with something positive.
- [00:40:37.860]Sort of going along the-
- [00:40:38.693]I find it really annoying, actually.
- [00:40:40.107]I'm not gonna lie.
- [00:40:41.700]I don't need it either.
- [00:40:43.260]But, I think, for some people, it's really helpful.
- [00:40:46.830]And that's, they think it's helpful at OpenAI.
- [00:40:50.460]And so, those sorts of things continue,
- [00:40:52.620]and it sort of softens if people are feeling,
- [00:40:55.230]you know, that they could be judged.
- [00:40:57.840]I think, eventually, you know,
- [00:41:00.210]chat histories are gonna leak.
- [00:41:02.430]And then, people might,
- [00:41:03.824]you know, be a little more concerned.
- [00:41:04.890]It's like, right now, a lot of people, like,
- [00:41:07.590]never want to be in one of those YouTube videos,
- [00:41:10.140]where it's, you know, police camera footage of
- [00:41:12.960]them doing something embarrassing,
- [00:41:14.850]because they don't wanna be that type of famous, you know?
- [00:41:17.520]Not that type of famous.
- [00:41:19.350]Like, the goal is to avoid that.
- [00:41:21.030]And so, I think we have a lot of students,
- [00:41:23.760]even though we feel like youth are out taking risks,
- [00:41:26.370]that are very risk averse in settings where they feel like
- [00:41:28.830]they're being watched, observed,
- [00:41:30.090]or their data's being collected, at least video data.
- [00:41:32.940]So, until that happens,
- [00:41:36.030]I feel like this is less affectively
- [00:41:42.090]dangerous, or they're gonna feel less vulnerable.
- [00:41:44.790]And, according to the research we've started,
- [00:41:48.420]that has some benefits.
- [00:41:51.120]And we haven't had students saying,
- [00:41:52.447]"Hey, I just," you know, "I thought whatever ChatGPT said
- [00:41:55.740]was great, and I moved on."
- [00:41:57.360]They've been concerned, one, they want to protect,
- [00:42:00.090]they've said they wanted to protect their learning.
- [00:42:02.520]They didn't want to get to a place where they were just
- [00:42:04.680]feeding it everything, and it knew everything.
- [00:42:06.900]And, two, for the students who tried
- [00:42:08.700]to be really ambitious with their code
- [00:42:10.290]before they were ready to understand how it was all working,
- [00:42:12.870]like they couldn't be the,
- [00:42:14.280]they couldn't be the conductor to watch the conductor
- [00:42:16.830]to know when the conductor was messing up.
- [00:42:18.900]You know,
- [00:42:21.030]they outran, they out drove their headlights pretty quickly,
- [00:42:23.940]and they're like, "Oh."
- [00:42:25.170]And then, a piece of software becomes some sort of
- [00:42:27.780]magical thing, where you don't want to touch anything.
- [00:42:30.630]And you're like, "No, exactly how this is,
- [00:42:32.880]I'm just gonna keep this here and never touch it,
- [00:42:34.770]because it works and I don't understand how it works."
- [00:42:37.216]Now, things like Copilot with GitHub and VS code,
- [00:42:41.790]you can actually highlight that bit of code.
- [00:42:44.010]And then, on the side say, how is this working?
- [00:42:47.130]And I'm excited because I'm teaching a course
- [00:42:50.160]in advanced web design and databases starting days from now.
- [00:42:55.275]And, in that course, they're going to be using GitHub.
- [00:42:58.560]They're gonna be doing computational literacies,
- [00:43:00.540]learning in an analog or non-digital way.
- [00:43:03.810]And then, we're going to, they're gonna be doing
- [00:43:05.880]coding and using a Copilot in that way.
- [00:43:09.870]So that, if they overdrive their headlights,
- [00:43:11.850]maybe this helps them sort of come up to speed.
- [00:43:14.280]So, they're not spend, you know,
- [00:43:15.300]they're not peeling carrots and potatoes the whole time.
- [00:43:17.820]They're not learning how to write a loop.
- [00:43:19.770]Once they understand how loops work and where they fit,
- [00:43:23.400]they can go from there.
- [00:43:25.950]Yeah, the one thing that I'm wondering about.
- [00:43:29.490]You're talking about performing a competence.
- [00:43:32.160]I'm a little bit concerned.
- [00:43:33.750]This is just a concern that I have in the back of my mind of
- [00:43:36.570]people performing that
- [00:43:41.550]sense of,
- [00:43:47.070]the way you described it, as that sense of concern.
- [00:43:50.460]So, are they really concerned
- [00:43:52.050]or are they performing concerns,
- [00:43:53.760]because they know they need to talk about it?
- [00:43:56.400]And they know that they should not admit
- [00:43:59.010]that they just kind of grabbed it and put it there.
- [00:44:01.800]So, that's something that is in the back of my mind.
- [00:44:05.430]Again, going back to that idea of expertise.
- [00:44:08.400]If you have enough expertise, then you can evaluate
- [00:44:11.550]and you feel more comfortable
- [00:44:12.810]doing the comparisons and looking at things.
- [00:44:16.912]But, if not, will that be the lesson?
- [00:44:20.190]And that is, I need to say I'm very careful,
- [00:44:22.680]and I'm checking, and I'm looking.
- [00:44:24.750]Or am I just saying that, and then continuing with my life?
- [00:44:28.320]Because, especially in higher education,
- [00:44:31.140]where they need to perform eventually,
- [00:44:33.690]because that's what they do in classes,
- [00:44:36.544]they send us something and they say the right things,
- [00:44:39.030]but did they do them?
- [00:44:40.590]Yes, and if they say it in writing,
- [00:44:43.632]there's a level of obfuscation, right?
- [00:44:46.750]In my J-term course, I've had my students shoot
- [00:44:50.340]one minute mini videos and put 'em on YouTube,
- [00:44:53.850]and embed 'em in the slides, our sort of shared slide deck.
- [00:44:57.210]And so, they sort of narrate their take.
- [00:45:00.450]So, they make a multimodal slide with their ideas,
- [00:45:03.450]based on the readings.
- [00:45:04.770]Images, some text, and then their video.
- [00:45:09.150]And to do that video, it's really clear
- [00:45:13.230]from one, you know, conductor to another,
- [00:45:15.900]or a, you know, wannabe conductor, a future conductor,
- [00:45:19.110]if they're not, like, if they didn't understand
- [00:45:20.820]what was going on, you know?
- [00:45:21.990]As long as they can't read
- [00:45:24.060]some sort of script they wrote for their video,
- [00:45:27.240]it sort of gives them, 'cause I used to feel like,
- [00:45:29.647]"Okay, well the multimodal slide is my insurance,
- [00:45:33.030]that this isn't just a copy paste."
- [00:45:35.010]Because, you know,
- [00:45:38.264]some terrible, I know this is a clean podcast.
- [00:45:40.560]So, I'll just say, you know-
- [00:45:41.904]Is it?
- [00:45:42.737]It's rated as clean.
- [00:45:44.490]When I looked on iTunes.
- [00:45:47.550]Anyways, or the Apple Podcast thing.
- [00:45:50.995]I wanna make sure that they're,
- [00:45:54.300]what they say is how they're thinking and feeling,
- [00:45:57.210]and that they can sort of do it like jazz
- [00:45:59.130]more than something that's totally scripted.
- [00:46:01.530]And so, I've gone from the multimodal slide
- [00:46:03.780]to a video that they shoot.
- [00:46:07.222]And, you know, I have a problem
- [00:46:09.270]not leaving YouTube after I watch that video.
- [00:46:11.940]But, if they embedded in the slide,
- [00:46:13.800]then I don't have to go to YouTube.
- [00:46:14.820]I can watch their video, and I can get a good sense,
- [00:46:17.970]and then move on.
- [00:46:20.580]All right.
- [00:46:21.420]I think
- [00:46:23.640]we will need a follow up for this episode
- [00:46:26.190]to talk about the way AI can
- [00:46:29.760]be used in research.
- [00:46:31.140]I think that's an interesting thing to explore,
- [00:46:33.210]but I think that we need to give it enough space,
- [00:46:36.300]considering our time and everything else.
- [00:46:38.880]Yeah, no, like,
- [00:46:40.500]we've talked about teaching a lot.
- [00:46:42.000]Yeah.
- [00:46:42.833]We haven't talked necessarily about the abilities of
- [00:46:46.380]AI to help with data analysis, and/or writing,
- [00:46:49.170]and/or those other kinds of pieces
- [00:46:53.310]that are connected to the conversation that we've had today.
- [00:46:56.640]But we'll pick that up at another time.
- [00:46:59.670]Yeah.
- [00:47:00.503]So Justin, thanks.
- [00:47:02.670]Thanks for inviting me.
- [00:47:03.821]This was-
- [00:47:05.460]Now I'm thinking about emotion,
- [00:47:07.440]and feedback, and the role, and technological tools.
- [00:47:12.840]A lot to think about. All kinds of fun things.
- [00:47:15.540]And so, this is an exciting time.
- [00:47:19.232]And I love that we have something new to think about.
- [00:47:23.220]And a new way to question, what does it mean to teach?
- [00:47:26.820]And what does it mean to learn?
- [00:47:28.650]In many ways.
- [00:47:30.232](upbeat music)