Digital Platforms, Algorithms and the Future of News Consumption
Kjerstin Thorson
11/04/2024
Description
AI Media Literacy Symposium
Searchable Transcript
- [00:00:00.000]Well, good morning, everyone. Welcome to the symposium on media literacy in the age of artificial intelligence. I know it's nice weather outside, but we're glad to have you here, either in person or over Zoom; we have a really great number of people attending remotely as well.
- [00:00:22.000]So I am Brian Wong. I am an associate professor in the College of Journalism and Mass Communications, and I work with a team of wonderful coworkers, including Erica DeFrain from University Libraries, who will join us a little bit later,
- [00:00:37.000]Robert Twomey from Emerging Media Arts, Heather Akin from the Agricultural Leadership, Education and Communication program, Maria Marion, also from Journalism and Mass Communications,
- [00:00:52.000]and a colleague from the School of Computing.
- [00:00:54.000]We have a really diverse team from a variety of disciplines working on a project to enhance public awareness and knowledge of artificial intelligence and how it affects our society.
- [00:01:09.000]And so this symposium and this project are supported by a UNL Grand Challenges planning grant on media and information literacy in the age of algorithms.
- [00:01:21.000]Our goal is to work with various stakeholders across the state and nationally to plan and implement an engaging innovative media literacy program to enhance public awareness and also to build resilience against the adverse impact of artificial intelligence on our society.
- [00:01:41.000]As all of you probably know, AI has greatly transformed our information environment, from how we view information on social media to the ads we get on digital media.
- [00:01:50.000]And also to the newsfeed we get from a variety of channels.
- [00:01:58.000]Our vision is to really foster a technologically savvy public that can competently navigate the changing information environment, whatever is coming up next.
- [00:02:11.000]And so to help us do that, we have invited three distinguished speakers, internationally known scholars,
- [00:02:19.000]in the area of media literacy, AI literacy, and misinformation, to really help us ignite the discussions about AI impact.
- [00:02:30.000]And we also, through this symposium, hope to stimulate more campus discussions on the social impact of AI,
- [00:02:39.000]and to stimulate more campus-wide research to explore the social and ethical impact of artificial intelligence on society.
- [00:02:48.000]So, we are really glad that all of you are here today.
- [00:02:52.000]Next, I would like to invite Dr. Shari Veil, who is the Janet Olson Endowed Director and Dean and Professor in the College of Journalism and Mass Communications,
- [00:03:08.000]to welcome you all and also to talk a little bit about how our college is committed to media and AI literacy.
- [00:03:15.000]Thank you.
- [00:03:17.000]Thank you, Brian, for the introduction and for your leadership on this important initiative.
- [00:03:27.000]I feel like Brian should have an agent to help him with scheduling all of the different talks that he's had over the last couple of years on media literacy and AI.
- [00:03:36.000]When ChatGPT became the 2022 Christmas gift we couldn't give back,
- [00:03:41.000]so many of us in academia were just scrambling to figure out, how do we manage this?
- [00:03:46.000]How do we keep students from using it to cheat on our assignments?
- [00:03:49.000]And Brian was steadfast in pushing us to see the bigger picture of what this really means.
- [00:03:55.000]He's been on the forefront of this research for years,
- [00:03:58.000]dedicating his career to investigating how digital algorithms manipulate the information that we see to affect our beliefs and behaviors.
- [00:04:06.000]He's also been incredibly generous with his time presenting in public forums and private meetings across the country to catch the rest of us up.
- [00:04:13.000]So thank you, Brian.
- [00:04:15.000]And thank you to your entire team for arranging this important conversation today.
- [00:04:20.000]I'm delighted to welcome you all to today's symposium on media literacy in the age of AI.
- [00:04:25.000]The event is sponsored by a Grand Challenges Planning Grant from UNL's Office of Research and Economic Development.
- [00:04:32.000]And we are certainly facing a grand challenge.
- [00:04:35.000]According to an October 2023 Gallup poll, only 32% of the population reports having a great deal or even a fair amount of confidence
- [00:04:44.000]that media reports the news in full, fair, accurate ways.
- [00:04:49.000]A record number of Americans, 39%, say they don't trust the media at all.
- [00:04:56.000]That number has steadily increased since 2018.
- [00:05:00.000]The exponential growth of misleading media content facilitated by artificial intelligence has exacerbated this challenge,
- [00:05:09.000]purposefully misleading our citizens and sowing distrust in our society.
- [00:05:13.000]However, amidst the sea of deception, media literacy emerges as a formidable shield.
- [00:05:20.000]It equips individuals with critical thinking skills needed to discern fact from fiction, identify quality news sources,
- [00:05:29.000]and enable us to safeguard our understanding, protect our society, and advance our democracy.
- [00:05:36.000]The mission of the College of Journalism and Mass Communications is grounded in the ethical pursuit of truth,
- [00:05:42.000]to uphold our democracy.
- [00:05:45.000]We are deeply committed to fostering media literacy because we recognize its transformative power.
- [00:05:52.000]By understanding the construction and manipulation of media messages,
- [00:05:57.000]individuals can engage more effectively in civil discourse, advocate for their interests, and hold policymakers accountable.
- [00:06:05.000]Today's symposium embraces the grand challenges and opportunities presented by AI,
- [00:06:11.000]and serves as a vital platform for advancing our understanding of media literacy in the age of artificial intelligence.
- [00:06:19.000]It brings together esteemed scholars, researchers, and educators from across campus and across the country
- [00:06:26.000]to explore the multifaceted dimensions of media literacy and its intersection with AI.
- [00:06:32.000]Today, through engaging discussions, insightful presentations, collaborative sessions, we can chart a path forward
- [00:06:40.000]toward a more informed, resilient, and empowered society.
- [00:06:44.000]I would like to express my sincere gratitude to our esteemed guests,
- [00:06:48.000]Drs. Kjerstin Thorson, Jevin West, and Cindy Chen, for sharing your insightful research with us today,
- [00:06:55.000]to Dr. Brian Wong and his interdisciplinary team from across campus for arranging this event,
- [00:07:02.000]and to the Carson Center for hosting us today.
- [00:07:05.000]I also want to thank the entire ORED team, whose support and collaboration
- [00:07:09.000]are essential to our collective efforts to advance research to address these grand challenges in our world.
- [00:07:17.000]Thank you for joining us today, and I wish you a fruitful and engaging symposium.
- [00:07:32.000]Next, I would like to welcome Megan Elliott, who is the Johnny Carson Endowed Director
- [00:07:38.000]of the Emerging Media Arts Center and who has graciously agreed to host this symposium for us.
- [00:07:44.000]Thank you.
- [00:07:48.000]Thank you, everyone, and welcome.
- [00:07:50.000]I'd like to begin, as we always begin when we hold public events here at the Carson Center,
- [00:07:55.000]with a land acknowledgement.
- [00:07:57.000]The University of Nebraska is a public land grant institution with campuses and programs
- [00:08:02.000]across the state that reside on the past, present, and future homelands of the Pawnee,
- [00:08:07.000]Ponca, Otoe-Missouria, Omaha, Dakota, Lakota, Arapaho, Cheyenne, and Kaw peoples,
- [00:08:15.000]as well as on the relocated Ho-Chunk, Iowa, Sac, and Fox peoples.
- [00:08:20.000]And we acknowledge their elders past, present, and emerging.
- [00:08:24.000]So welcome to the Johnny Carson Center for Emerging Media Arts, where we play with all things AI.
- [00:08:30.000]And today I'm really excited that over lunch you're going to see the fruits of some of our labors
- [00:08:36.000]from two weeks ago when we held our second AI film hackathon.
- [00:08:42.000]We had 34 students that divided into eight teams, students from the Emerging Media Arts program,
- [00:08:47.000]the College of Journalism and Mass Communications, and the School of Computing.
- [00:08:51.000]And so you'll see those films today at lunchtime.
- [00:08:55.000]Wish you a fabulous day. I can't wait to get into it. Thank you.
- [00:08:59.000]Thank you, Megan.
- [00:09:11.000]So without further ado, let me introduce our first speaker of the symposium.
- [00:09:16.000]So I have two speakers in the morning, and then we have one speaker in the afternoon.
- [00:09:20.000]We also have a panel with really distinguished UNL researchers working on AI,
- [00:09:26.000]as well as Nancy from Civic Nebraska,
- [00:09:28.000]to talk about AI.
- [00:09:29.000]And Nancy will talk about media and AI literacy on the policy front.
- [00:09:33.000]So we have a full day of really interesting talks and panels.
- [00:09:37.000]And we also have a breakout session, because we really want to have some small-group discussions with all of you as well.
- [00:09:45.000]So I hope you get to stay and enjoy the talks today.
- [00:09:48.000]So first let me introduce Dr. Kjerstin Thorson.
- [00:09:51.000]She is the Brandt Endowed Professor and Associate Dean for Strategic Initiatives
- [00:09:56.000]in the College of Communication Arts and Sciences
- [00:09:58.000]at Michigan State University.
- [00:10:00.000]Dr. Thorson is an internationally renowned political communication scholar.
- [00:10:05.000]Her research examines digital platforms and their impact on civic lives.
- [00:10:11.000]Her work has been published in leading political communication and media journals.
- [00:10:19.000]Her recent scholarship has been supported by the National Science Foundation
- [00:10:23.000]and the Social Science Research Council.
- [00:10:25.000]So welcome, Dr. Thorson.
- [00:10:27.000]Good morning, everyone.
- [00:10:43.000]Don't tell anyone my code.
- [00:11:12.000]I am very excited to be the first to kick off this conversation this morning,
- [00:11:16.000]and I want to thank Dr. Wong and all of you for coming
- [00:11:19.000]and for inviting all of us to be part of this conversation today.
- [00:11:23.000]I should probably preface this by saying I will in no way solve any of the grand challenges
- [00:11:27.000]that anyone just mentioned whatsoever.
- [00:11:29.000]In fact, I'm probably going to make it worse, so I'd like to apologize.
- [00:11:31.000]That's not very breakfast-oriented of me, but there you go.
- [00:11:37.000]So I want to start with a little bit of background.
- [00:11:38.000]I'm just going to talk to you sort of in general about how I situate my work
- [00:11:41.000]and the kinds of questions that I ask.
- [00:11:43.000]I'm sure you'll see a lot of similarities across the speakers today,
- [00:11:46.000]as well as what many of you may already be interested in.
- [00:11:49.000]This is a de-centered penny.
- [00:11:51.000]I like to think of it as sort of the icon of what I think one of the biggest challenges
- [00:11:55.000]that we have today actually is, which is thinking about the different ways
- [00:11:59.000]in which journalistic news, the things that some of you may have grown up thinking were news,
- [00:12:04.000]has been de-centered from the everyday lives of many people,
- [00:12:07.000]and especially from youth and young people, which tend to be the area that I focus in the most.
- [00:12:10.000]And I want to talk about two kinds of de-centering.
- [00:12:15.000]One, we already heard about from Dean Veil,
- [00:12:18.000]and that is what we might call the sort of epistemological de-centering of news.
- [00:12:22.000]It is no longer the case that by default most people trust
- [00:12:26.000]what we might call mainstream journalistic news sources just by happenstance.
- [00:12:31.000]My grandparents had to be home every day at 5:30 in order to watch the nightly news,
- [00:12:36.000]which in a small package produced the news that they were supposed to know,
- [00:12:39.000]they could go to bed, carry on.
- [00:12:41.000]That's not the world that we live in anymore.
- [00:12:44.000]Number two, and this one might be harder to persuade you of,
- [00:12:47.000]so I'm not even going to try, you just have to believe me,
- [00:12:49.000]that is the quantitative decentering of news in the everyday lives of people.
- [00:12:53.000]You may think, because if you are here this morning,
- [00:12:55.000]there is something weird about you on this beautiful day,
- [00:12:58.000]perhaps when you go online you see a lot of news.
- [00:13:00.000]I do, but that is not the case for most people.
- [00:13:03.000]The texture of the everyday lives of most people,
- [00:13:06.000]and certainly most young people,
- [00:13:08.000]is to encounter very little journalistic news
- [00:13:10.000]in the course of everyday life.
- [00:13:12.000]So what I'm going to talk about today
- [00:13:14.000]is the different ways in which I ask questions
- [00:13:16.000]about what the consequences are going to be of that
- [00:13:18.000]for the future of thinking about how
- [00:13:20.000]people may or may not become informed.
- [00:13:22.000]Or maybe the future of the ways in which
- [00:13:24.000]people learn about the world around them,
- [00:13:26.000]which we may or may not define as being informed enough
- [00:13:28.000]for the version of democracy you prefer.
- [00:13:31.000]The second thing that's happening,
- [00:13:35.000]this is a Google image search from just this morning
- [00:13:37.000]of what it looks like when you type in AI.
- [00:13:39.000]It's all robots and brains, right?
- [00:13:41.000]I'm sure you all know a lot about that.
- [00:13:43.000]The second issue that we face is constant,
- [00:13:45.000]rapid change in technology.
- [00:13:47.000]And so the other theme of what I'm going to talk about today
- [00:13:49.000]is that, at the texture of our everyday lives,
- [00:13:51.000]we're constantly challenged out of any routine, right?
- [00:13:54.000]The thing you used before,
- [00:13:56.000]the platform you were comfortable with yesterday,
- [00:13:58.000]simply doesn't work the same the next day.
- [00:14:00.000]And we've handed over control
- [00:14:02.000]of the various infrastructures
- [00:14:04.000]through which we encounter news or content
- [00:14:06.000]or information in our daily lives
- [00:14:08.000]to companies over which we have little,
- [00:14:10.000]if any, control at all.
- [00:14:12.000]I like to live in this little Venn diagram.
- [00:14:15.000]On the one hand, that sucks.
- [00:14:17.000]That's not good. That's why we're all here.
- [00:14:19.000]That's why there's grant money to do exciting things
- [00:14:21.000]and have big conversations.
- [00:14:23.000]But on the other hand, from a scholarly perspective,
- [00:14:25.000]this is an incredibly open time.
- [00:14:27.000]This is a really cool moment
- [00:14:29.000]to be studying these kinds of questions and topics
- [00:14:31.000]because there's a lot of room
- [00:14:33.000]to do new conceptual work,
- [00:14:35.000]to challenge the kinds of theoretical frameworks
- [00:14:37.000]that you may have grown up with,
- [00:14:39.000]or I may have grown up with,
- [00:14:41.000]or even if you're not a scholar,
- [00:14:43.000]challenging just the ways that we think
- [00:14:45.000]about how people live through their civic lives.
- [00:14:47.000]So I want to thread both of these today,
- [00:14:49.000]which is to say there's a lot of issues
- [00:14:51.000]and I'm concerned about them.
- [00:14:53.000]But on the other hand, this is a really exciting time
- [00:14:55.000]to be doing scholarship
- [00:14:57.000]because there's a lot of room for the new.
- [00:14:59.000]So this is how I think about my work.
- [00:15:01.000]I talk a lot about the platformization
- [00:15:03.000]of our civic lives.
- [00:15:04.000]And I'll talk in a second about what that means.
- [00:15:06.000]This is a picture of a Michigan winter.
- [00:15:08.000]I suspect winters look-ish like that here as well.
- [00:15:11.000]But I like to use this picture as a way
- [00:15:13.000]to remind me of the cloudiness of our future view,
- [00:15:16.000]that is, where we're headed.
- [00:15:17.000]There's a lot of sort of incidental effects
- [00:15:19.000]of living lives through platforms,
- [00:15:21.000]and that means there's a lot of work for us all to do
- [00:15:24.000]to think through what a sort of clear set of futures
- [00:15:27.000]might look like.
- [00:15:29.000]These are the characteristics of platforms
- [00:15:31.000]that sort of animate the work that I do.
- [00:15:33.000]Number one, platform companies themselves,
- [00:15:35.000]whether that's Meta or, I was just in China,
- [00:15:38.000]Tencent and WeChat,
- [00:15:40.000]platforms need to become the center
- [00:15:42.000]of any of the domains they go into
- [00:15:44.000]because that's how their business models work,
- [00:15:46.000]that's how they make money.
- [00:15:48.000]So they have this momentum to drill
- [00:15:50.000]like a mining shaft drilling tool
- [00:15:53.000]into the center of whatever domain they enter into.
- [00:15:56.000]And sometimes, accidentally, that's our civic lives.
- [00:16:00.000]Number two, platforms tend to rely
- [00:16:02.000]on processes of datafication.
- [00:16:04.000]So when we go online or walk through space with your phone,
- [00:16:07.000]as you all know very well,
- [00:16:09.000]you're producing data about yourself.
- [00:16:11.000]And those data are incredibly important
- [00:16:13.000]to the way platforms work in many different ways,
- [00:16:15.000]incredibly diverse ways and incredibly diversifying ways.
- [00:16:18.000]And then third, as I've already said,
- [00:16:20.000]this platformization results in constant change, right?
- [00:16:23.000]At the level we can't always see,
- [00:16:25.000]the way an algorithm works, for example,
- [00:16:27.000]or the sets of data that have informed a particular model.
- [00:16:30.000]But also in really big ways,
- [00:16:31.000]as I'll show you in a second,
- [00:16:32.000]in the sort of thinking about your life
- [00:16:33.000]in terms of repertoires of platforms.
- [00:16:35.000]I used to use Facebook, but now I'm just on Instagram.
- [00:16:37.000]But I got friends out of the country,
- [00:16:40.000]and so I'm also on WhatsApp.
- [00:16:41.000]So we also change our repertoires
- [00:16:43.000]at an individual level very often.
- [00:16:45.000]When we switch and think, as I often do,
- [00:16:50.000]what does this feel like from the inside out?
- [00:16:52.000]What does it feel like to grow up, for example,
- [00:16:54.000]in a platformized civic life?
- [00:16:56.000]One thing it feels like is to live your life
- [00:16:58.000]across repertoires, right?
- [00:17:00.000]We always had lots of channels that we use,
- [00:17:02.000]but now we have lots of platforms that we use.
- [00:17:04.000]And what that means is that
- [00:17:06.000]the way each one of us sees the world
- [00:17:08.000]is, to an extent that, unless you study it empirically,
- [00:17:10.000]you may not even believe,
- [00:17:12.000]incredibly fractured and individualized.
- [00:17:15.000]If you spend time on Instagram, for example,
- [00:17:18.000]or perhaps you're on Tic Tac.
- [00:17:20.000]That would be delicious, actually.
- [00:17:23.000]I would prefer to spend my life on Tic Tac,
- [00:17:25.000]although I really like TikTok as well.
- [00:17:27.000]If you spend your lives in these platforms,
- [00:17:29.000]what you see is that what you see
- [00:17:31.000]and what I see will be very different.
- [00:17:33.000]What my husband and I,
- [00:17:34.000]even though we've been married for 20 years,
- [00:17:36.000]see on these platforms is remarkably different.
- [00:17:39.000]So these windows are somewhat fractured and individualized.
- [00:17:43.000]The way I like to think about that is that
- [00:17:45.000]if you want to untangle whether you or I or somebody
- [00:17:49.000]is going to see a particular piece of news
- [00:17:51.000]or a piece of information that we might think is important
- [00:17:53.000]in their local community,
- [00:17:54.000]you have to do quite a bit of untangling,
- [00:17:57.000]like necklace untangling,
- [00:17:58.000]as you'll see in the picture.
- [00:18:00.000]And I've spent a lot of time sort of theorizing
- [00:18:02.000]what are the different sets of actors
- [00:18:04.000]who are involved in shaping whether a piece of content
- [00:18:06.000]gets in front of your face or mine, right?
- [00:18:08.000]And those actors, media users,
- [00:18:10.000]the people you're connected with
- [00:18:12.000]on these more social platforms,
- [00:18:14.000]strategic communicators who are advertising to you intentionally,
- [00:18:16.000]whether that's politicians
- [00:18:18.000]or news organizations themselves,
- [00:18:20.000]which like to advertise as well,
- [00:18:22.000]or the platforms themselves and the algorithms
- [00:18:24.000]that they design,
- [00:18:26.000]all of these actors are making decisions
- [00:18:27.000]involving the data
- [00:18:29.000]that you're producing,
- [00:18:31.000]each of which shapes what you end up being exposed to
- [00:18:33.000]or not.
- [00:18:35.000]What that means,
- [00:18:37.000]in short, in sum,
- [00:18:39.000]is that the concepts that we typically use
- [00:18:41.000]to think about news exposure,
- [00:18:43.000]so in the world that I come from,
- [00:18:45.000]we think about things like selective avoidance
- [00:18:47.000]or incidental exposure,
- [00:18:49.000]that is, seeing content by accident.
- [00:18:51.000]All of these kinds of concepts are really challenged.
- [00:18:53.000]I mean, some might say, like me, for example,
- [00:18:55.000]they're kind of blown up.
- [00:18:56.000]They just don't work the way they used to.
- [00:18:58.000]And so we need to develop all new concepts
- [00:19:00.000]to help us make sense of what's going to explain things like,
- [00:19:03.000]why is it that some people see a lot of news
- [00:19:05.000]but most people see hardly any?
- [00:19:09.000]When we turn to look particularly at young adults,
- [00:19:12.000]I think the most important thing to keep in mind
- [00:19:14.000]is that news use itself has been entirely de-ritualized, right?
- [00:19:19.000]Again, comparing what was sort of in the height
- [00:19:22.000]of the 20th century, a very routinized way
- [00:19:25.000]of seeing the world, that's entirely gone, right?
- [00:19:28.000]That exists really only for certain generations,
- [00:19:32.000]like my grandparents, for example,
- [00:19:34.000]perhaps some of your parents or grandparents
- [00:19:36.000]who do still have these sort of routines of news use.
- [00:19:39.000]So what's coming in place and what will it mean
- [00:19:41.000]for how people find out about the world around them?
- [00:19:45.000]So I'm going to talk just briefly about a couple
- [00:19:47.000]of empirical projects that I've done,
- [00:19:49.000]one from a few years ago and one a bit more recent,
- [00:19:52.000]to sort of show you two different ways
- [00:19:54.000]that I approach these kinds of questions in my own work
- [00:19:57.000]and how we might start to draw upon these kinds of approaches
- [00:20:00.000]to make sense of what's coming next
- [00:20:02.000]in terms of this question of news exposure.
- [00:20:04.000]So first of all, news exposure itself is shaped by datafication.
- [00:20:09.000]So I want you to imagine with me the datafied version of you.
- [00:20:13.000]This is a vast oversimplification,
- [00:20:15.000]because if you think of all the places that you go online,
- [00:20:18.000]of course there's a million versions of you out there in data,
- [00:20:21.000]but let's just imagine for a second that there's
- [00:20:23.000]just one of those versions of you walking around.
- [00:20:25.000]It has interests, it has preferences,
- [00:20:28.000]it has likes, it has dislikes, it has friends,
- [00:20:31.000]it has friends who themselves have interests.
- [00:20:34.000]So imagine sort of a datafied version of you
- [00:20:36.000]that's made out of your own data.
- [00:20:38.000]The question is, does that datafied version of you
- [00:20:41.000]help us understand what kinds of content
- [00:20:44.000]you might encounter across different platforms?
- [00:20:47.000]So when I try to ask these kinds of questions,
- [00:20:49.000]there's really three kinds of studies that I do.
- [00:20:51.000]The first one I'll talk about
- [00:20:52.000]is about sort of building in measures of algorithmic output
- [00:20:55.000]to kind of classic models of exposure.
- [00:20:57.000]So can we see that something about the data
- [00:21:00.000]that's been collected about you
- [00:21:01.000]affects what you see down the road?
- [00:21:04.000]The second approach,
- [00:21:05.000]which I'm not going to talk about today
- [00:21:06.000]but would love to talk about later,
- [00:21:07.000]how do we audit the algorithms themselves
- [00:21:09.000]that shape our news exposure?
- [00:21:11.000]How do we sort of unpack and untangle them
- [00:21:12.000]in ways that can help us learn about how algorithms
- [00:21:14.000]and increasingly AI-enabled algorithmic systems
- [00:21:17.000]can shape news exposure?
- [00:21:19.000]And then third, and this is what I'll talk about last,
- [00:21:21.000]how do we think about the sort of everyday feeling
- [00:21:24.000]of being inside these algorithmic types of platforms
- [00:21:28.000]and increasingly AI-enabled algorithmic types of platforms?
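To make that second approach concrete: below is a minimal, hypothetical sketch of a sock-puppet style audit. The `fetch_feed` callable, the profile format, and the toy news-domain list are all assumptions invented for illustration; they are not described in the talk, and a real audit would sit on top of browser automation or a research API.

```python
# Hypothetical sketch of a sock-puppet algorithm audit (illustrative only).
# Assumes a caller-supplied fetch_feed(profile) that returns the items a
# platform serves to an account seeded with the given interest profile.
from collections import defaultdict

NEWS_DOMAINS = {"nytimes.com", "apnews.com", "bbc.com"}  # toy label list

def is_news(item: dict) -> bool:
    """Toy labeler: count an item as news if it links to a known news domain."""
    return item.get("domain") in NEWS_DOMAINS

def audit_news_share(profiles, fetch_feed, rounds=10):
    """For each seeded profile (a dict with a 'name' key), average the
    fraction of served items that are news over repeated feed refreshes."""
    shares = defaultdict(list)
    for profile in profiles:
        for _ in range(rounds):  # repeated pulls smooth out feed randomness
            feed = fetch_feed(profile)
            if feed:
                shares[profile["name"]].append(
                    sum(is_news(item) for item in feed) / len(feed)
                )
    return {name: sum(vals) / len(vals) for name, vals in shares.items()}
```

The basic audit move is the comparison: if profiles that behave identically but carry different inferred interests receive very different news shares, the sorting is coming from the algorithm rather than the user.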
- [00:21:34.000]So let me start with this.
- [00:21:37.000]Is the datafied version of you interested in news and politics?
- [00:21:40.000]These are published in a couple of studies that came out
- [00:21:42.000]over the last couple of years.
- [00:21:44.000]And with my group of PhD students, we really wanted
- [00:21:47.000]to understand this question of can we see the agency
- [00:21:50.000]of the datafied version of you somewhere and help understand
- [00:21:54.000]whether it was helping to explain the kinds of news
- [00:21:56.000]that you saw on a platform.
- [00:21:59.000]So as you all may know, it's actually quite hard
- [00:22:01.000]to study different kinds of platforms because we don't tend
- [00:22:04.000]to know what platforms know about you.
- [00:22:06.000]They have no reason to tell us that information.
- [00:22:08.000]Sometimes they don't even know it that well themselves.
- [00:22:11.000]So a method that we often use that many people are now using
- [00:22:13.000]is the method of data donation.
- [00:22:15.000]So oftentimes a little bit in the US,
- [00:22:17.000]but increasingly in European contexts because
- [00:22:19.000]of some different policy regulations,
- [00:22:21.000]people are able to take some of their data out of platforms.
- [00:22:24.000]So in the case that we wanted to study,
- [00:22:26.000]which was on Facebook, it turns out that one thing
- [00:22:28.000]that people could take out of platforms
- [00:22:30.000]was data about how they were sold to advertisers.
- [00:22:34.000]So all of you know if you go onto platforms,
- [00:22:36.000]you see ads, and to some extent those ads
- [00:22:38.000]have been targeted to you.
- [00:22:40.000]And sometimes they have been very carefully
- [00:22:42.000]micro-targeted just to you.
- [00:22:44.000]And in fact, Facebook and other companies like Facebook
- [00:22:48.000]sell the data about you to say, oh, here's Kjerstin.
- [00:22:51.000]She likes long walks and sunny skies
- [00:22:54.000]and also specific sports teams,
- [00:22:56.000]and therefore you should sell X to her at Y time.
- [00:23:00.000]So we did a survey of young people,
- [00:23:02.000]first in our own university and then in a national sample.
- [00:23:05.000]And we asked them to fill out a lot of questions
- [00:23:07.000]about themselves.
- [00:23:08.000]How interested are you in politics?
- [00:23:09.000]How often do you see news online?
- [00:23:11.000]How engaged are you in news in various ways?
- [00:23:13.000]But we also asked them to do a really hard thing.
- [00:23:16.000]That is to download the data about themselves,
- [00:23:17.000]and then to upload and give it back to us
- [00:23:20.000]so that we can analyze it.
- [00:23:22.000]And so we are taking this idea
- [00:23:24.000]of all these words about you
- [00:23:26.000]that Facebook is selling about you
- [00:23:28.000]as one way to see this ghostly,
- [00:23:31.000]datafied version of you.
- [00:23:34.000]And so what we do is we analyze those data
- [00:23:37.000]in ways that I talk about later,
- [00:23:38.000]but we'll leave for now.
- [00:23:39.000]We analyze those data to try to understand
- [00:23:41.000]how interested does Facebook think you are
- [00:23:44.000]in news or politics
- [00:23:46.000]by understanding how many words
- [00:23:48.000]that it sells to advertisers
- [00:23:50.000]seem to be related to news or politics,
- [00:23:52.000]so about local politicians or news organizations
- [00:23:54.000]and so on and so forth.
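A minimal sketch of that counting step, assuming a toy seed lexicon and simple substring matching (the study's actual coding procedure is not detailed in the talk and would involve a validated dictionary or human coders):

```python
# Hypothetical scoring of donated ad-interest data for "newsness".
# The seed terms and matching rule below are illustrative assumptions.
NEWS_POLITICS_TERMS = {
    "news", "politics", "journalism", "election", "senator", "governor",
}

def datafied_news_interest(ad_interests: list[str]) -> float:
    """Fraction of a user's advertiser interest categories that look
    related to news or politics."""
    if not ad_interests:
        return 0.0
    hits = sum(
        any(term in interest.lower() for term in NEWS_POLITICS_TERMS)
        for interest in ad_interests
    )
    return hits / len(ad_interests)

# Example with a made-up donated interest list:
print(datafied_news_interest(
    ["Hiking", "Local news", "Pizza", "Elections", "Reality TV"]
))  # -> 0.4
```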
- [00:23:56.000]So to make a very long story short,
- [00:23:58.000]and don't worry about the numbers or the model,
- [00:24:00.000]it's just here to be something behind me as I talk,
- [00:24:02.000]what we find is that if the datafied version of you
- [00:24:07.000]appears to be interested in news or politics,
- [00:24:09.000]you attract more news and politics into your Facebook.
- [00:24:12.000]And that's even after we control
- [00:24:14.000]for how interested you say you are
- [00:24:15.000]in politics.
- [00:24:17.000]In fact, how interested you tell us you are in news and politics (like, on a scale of one to seven, how interested are you in news?) and how interested Facebook thinks you are in news are only lightly related to one another.
- [00:24:32.000]They're kind of different.
- [00:24:33.000]So there's lots of reasons for that, right?
- [00:24:35.000]Inferences about you make mistakes.
- [00:24:38.000]A lot of times the platforms interpret signals about you
- [00:24:40.000]that aren't intentionally what you sent.
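The shape of the model being described can be sketched as a regression of news exposure on both the self-reported and the platform-inferred interest measures. Everything below is simulated under assumed effect sizes, purely to show the model's form, not to reproduce the study's results:

```python
# Sketch of the kind of model described: does platform-inferred ("datafied")
# interest predict news exposure after controlling for self-reported
# interest? All data and coefficients here are simulated assumptions.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
self_report = rng.integers(1, 8, size=n).astype(float)  # 1-7 interest scale
datafied = 0.2 * self_report + rng.normal(0, 1, n)      # only weakly related
news_seen = 0.1 * self_report + 0.6 * datafied + rng.normal(0, 1, n)

X = sm.add_constant(np.column_stack([self_report, datafied]))
fit = sm.OLS(news_seen, X).fit()
print(fit.params)  # the datafied coefficient survives the control
```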
- [00:24:42.000]And so imagine now when we think about
- [00:24:44.000]news exposure, we have to understand you
- [00:24:46.000]and your preferences, and that's what
- [00:24:48.000]communication scholarship has gotten pretty good at.
- [00:24:50.000]Like, we like to think of people as active users
- [00:24:52.000]and choosers of media.
- [00:24:54.000]But now there's this ghost-like version of you.
- [00:24:56.000]This one.
- [00:24:58.000]This one. That one.
- [00:25:00.000]And that version of you is only a little bit like you,
- [00:25:02.000]or maybe it's only like you on a Monday morning
- [00:25:04.000]when you haven't had your coffee, or maybe it's only like you
- [00:25:06.000]after a couple glasses of wine on a Friday night.
- [00:25:08.000]But either way, it also is making choices for you.
- [00:25:13.000]So our algorithmic identity (or identities, I'll leave that there for a second), and increasingly our AI-enabled algorithmic identity, which will be much more complex, is capable of making choices for us about exposure
- [00:25:24.000]or attracting or repelling news content from within our orbit.
- [00:25:29.000]At the same time, even as we come to understand a phenomenon like this,
- [00:25:33.000]this is, you know, these are data from, what, two years ago?
- [00:25:35.000]They're already like, pfft, pfft, who knows?
- [00:25:38.000]Nothing works that way anymore, right?
- [00:25:40.000]The rapid change that we're experiencing
- [00:25:42.000]reminds us in this story that it's not about you or me,
- [00:25:45.000]it's about platforms, right?
- [00:25:46.000]It's about how they work.
- [00:25:49.000]You may have all heard the story
- [00:25:51.000]of the law that was passed in Canada last year,
- [00:25:54.000]in which they were going to charge platforms
- [00:25:56.000]for putting news out there to people,
- [00:25:59.000]make them pay back the news organizations.
- [00:26:02.000]And interestingly, you know,
- [00:26:04.000]platforms didn't want to pay that money.
- [00:26:06.000]Well, most of them didn't.
- [00:26:07.000]And the reason for that is because
- [00:26:09.000]people don't want news on their platforms,
- [00:26:11.000]and platforms know that very well, right?
- [00:26:13.000]So to get back to that de-centered penny,
- [00:26:15.000]I want to remind you of it for a second.
- [00:26:17.000]On the one hand, we're thinking to ourselves,
- [00:26:19.000]how do we get more news in front of people on these platforms?
- [00:26:22.000]But platform companies know very well, as I do,
- [00:26:24.000]and I'll show you the data in a second,
- [00:26:26.000]that's not how people want to use these platforms, right?
- [00:26:29.000]And so we have this real tension where we've handed over
- [00:26:31.000]the infrastructure for how people find out about stuff
- [00:26:34.000]to companies who know very well that the way they can make money
- [00:26:37.000]is not to show civic content.
- [00:26:40.000]We also increasingly see the design of platforms themselves
- [00:26:43.000]being what we might call unfriendly to news.
- [00:26:46.000]So initial studies of TikTok, or as you know, I like to call it, Tic Tac,
- [00:26:50.000]have shown, and we've also seen studies of YouTube that look the same,
- [00:26:54.000]which is that recommendations over time tend to lead away from news.
- [00:26:58.000]Because the platforms themselves have come to understand
- [00:27:00.000]that's not what people want to do and use on these platforms.
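A toy simulation shows how that drift can fall out of engagement optimization alone, under the invented assumption that news earns a much lower click-through rate than entertainment:

```python
# Toy simulation of recommendations drifting away from news under an
# epsilon-greedy, engagement-maximizing recommender. All rates invented.
import random

CTR = {"news": 0.05, "entertainment": 0.20}  # assumed true click rates
counts = {"news": 1, "entertainment": 1}     # smoothed serve counts
clicks = {"news": 1, "entertainment": 1}     # smoothed click counts
served = []

random.seed(42)
for _ in range(5000):
    if random.random() < 0.1:                # explore 10% of the time
        choice = random.choice(list(CTR))
    else:                                    # exploit best observed CTR
        choice = max(CTR, key=lambda c: clicks[c] / counts[c])
    counts[choice] += 1
    clicks[choice] += random.random() < CTR[choice]
    served.append(choice)

# The news share in the last 1,000 items collapses toward the explore floor.
print(served[-1000:].count("news") / 1000)
```

No one in this sketch decides to suppress news; the lower engagement rate does it on its own, which is the sense in which the design is "unfriendly to news."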
- [00:27:05.000]So how do we understand now this experience from the inside out
- [00:27:09.000]of how people experience life through and on these platforms?
- [00:27:13.000]The last study I want to talk to you about is a study based on interviews.
- [00:27:19.000]And what we're trying to understand in this project,
- [00:27:22.000]this is funded by the Social Science Research Council,
- [00:27:24.000]was this experience of datafied media use.
- [00:27:27.000]So we use a lot of different approaches,
- [00:27:29.000]but what we're trying to understand is this sort of question
- [00:27:32.000]of how have we positioned young people in terms of how responsible they are
- [00:27:36.000]for their own media environment.
- [00:27:38.000]So I want you to harken back to an earlier era
- [00:27:42.380]that era again of my grandparents. My grandparents have no idea how often I hold them up as the
- [00:27:47.280]icons of routine news use. That's fine. Sorry, sorry up there. So let's harken back to these
- [00:27:54.460]olden days, right? This imaginary time where the packages that news came in were pretty tight,
- [00:27:59.400]right? News hadn't been unbundled yet. News came in newspapers or perhaps in magazines or in
- [00:28:04.560]compact television programs. The job very clearly to produce the news at that time was held by
- [00:28:11.820]journalists, right? We all know that very well. And then over the last sort of 20, 25 years,
- [00:28:16.120]we started talking a lot about how, you know, citizens could also make news. We all have a
- [00:28:20.020]voice and people could make news and we're not being gatekept anymore. We thought that was kind
- [00:28:24.580]of a good thing for a while. My misinformation colleagues will remind us why perhaps it's not.
- [00:28:28.840]But what we wanted to understand is not just journalism's role of gatekeeping has now come
- [00:28:36.120]down to all of us, but actually, in fact, the construction of our media environments themselves
- [00:28:41.260]perhaps has come onto our shoulders as well. So what we've been doing, and I actually have been
- [00:28:48.660]doing this for many, many years in different forms, is we sit down with young adults and we read
- [00:28:53.760]platforms together. It's a co-browsing exercise. I've probably been doing it for 15 or 18 years.
- [00:28:58.980]And sometimes we do them online, sometimes we do them in coffee shops, and we ask people, what are
- [00:29:03.540]your top three platforms? Sometimes that's Reddit, YouTube, Instagram, sometimes it's Facebook.
- [00:29:10.700]It can be all sorts of different platforms. And whatever the interface is to that platform,
- [00:29:14.400]we sit down with people and we ask them to walk us through what it looks like.
- [00:29:17.940]So I want to talk to you about a couple of concepts that we've been developing
- [00:29:23.560]that emerge from these kinds of projects. The first one that we've been working on is this
- [00:29:29.660]experience of what we call personal platform architecture. And personal platform architecture
- [00:29:35.380]is not about saying that youth and young adults are now able to build platforms. We're not talking
- [00:29:40.140]about being technological architects, but rather we're talking about active participation
- [00:29:45.660]in the construction of your own media environment. And it's not as simple as just like,
- [00:29:49.420]I'm on Instagram and I chose to click on this source or I didn't.
- [00:29:53.260]In fact, it's about constructing the full repertoire of what platforms you use,
- [00:29:58.940]actively making choices about what accounts to connect to or even peers to be friends with.
- [00:30:04.780]But it's also about the passive ways that your data as you move through
- [00:30:09.580]these digital spaces gets then read back into the infrastructure of the platform itself,
- [00:30:14.780]leading to recommendations that you may get in the future, or feedback loops that help explain
- [00:30:20.220]what kind of content you're going to see in the future. But this is architecture under really strong
- [00:30:26.380]constraints: it's not up to us how platforms work. Those are designed and given to us. In fact,
- [00:30:33.180]sometimes the functionality is not really about us. Notifications, for example, to keep a streak,
- [00:30:39.020]that's not about your convenience, right? That's about the platform needing to keep you. That goes
- [00:30:43.240]back to platformization itself and needing for business reasons to keep users where you want
- [00:30:49.400]them to be. And this, what we call everyday labor of constructing your own media environment is
- [00:30:56.040]something that you have to do every day. So it's quite poignant in interviews when we talk to youth
- [00:31:00.560]and young people, as your identity grows up in partnership with these media worlds that you're
- [00:31:08.460]working with. It can be quite difficult to have to tweak and tone and change those platforms all the
- [00:31:15.620]time. This is how we define personal platform architecture: datafied actions that alter the
- [00:31:23.600]flows of communication received by a user, both in the short term and then on into the future.
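That definition can be made concrete with a small feedback-loop sketch; the topics, weights, and update rule below are illustrative assumptions, not a description of any real platform:

```python
# Feedback-loop sketch of personal platform architecture: each datafied
# action updates an inferred profile, which re-weights future flows.
import random

profile = {"news": 1.0, "sports": 1.0, "memes": 1.0}  # inferred weights

def serve() -> str:
    """Sample the next item's topic in proportion to inferred interest."""
    topics, weights = zip(*profile.items())
    return random.choices(topics, weights=weights)[0]

def register_action(topic: str, engaged: bool) -> None:
    """A like or a long dwell boosts a weight; a skip decays it, so
    today's scrolling reshapes tomorrow's feed."""
    profile[topic] *= 1.1 if engaged else 0.9

random.seed(7)
# A user who consistently skips news gradually stops receiving it.
for _ in range(200):
    topic = serve()
    register_action(topic, engaged=(topic != "news"))
print(profile)  # the "news" weight has collapsed relative to the others
```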
- [00:31:28.500]We talk from our data about three different kinds of labor that are involved in, I guess,
- [00:31:37.900]a lot of this kind of work.
- [00:31:39.400]The first is emotional labor, that is, managing what I'll call in a second cacophony, like the
- [00:31:43.960]yuckiness, the ishiness, or the toxicity feeling of what it feels like to be in some of these
- [00:31:49.180]platforms some of the time.
- [00:31:50.380]And that includes tuning or toning out news and politics; a concept that you may have heard
- [00:31:56.620]of in our field, called news avoidance, is very closely tied to this feeling
- [00:32:01.180]of not wanting to see certain kinds of things in your spaces.
- [00:32:04.600]The second is immaterial labor.
- [00:32:07.340]And I think we can make here some connections to this idea of literacy.
- [00:32:10.260]That is to say that people's work to evaluate and decide what to believe is not easy.
- [00:32:17.200]It's become very difficult to know what to believe.
- [00:32:21.660]And one of the things that we hear all the time is that the very cues about mainstream news
- [00:32:29.060]that I might use to let me know that this is credible, like high production values,
- [00:32:34.240]slickness, well put together.
- [00:32:36.780]For many youth and young people, that comes off as inauthentic, right?
- [00:32:41.260]In fact, that seems less credible because it's less raw.
- [00:32:44.120]It's too put together.
- [00:32:45.280]We hear all the time, all news is biased.
- [00:32:47.720]And not just politically biased, but profit biased.
- [00:32:50.520]And I don't know, are they wrong?
- [00:32:53.020]And then finally, visibility labor.
- [00:32:55.900]There's a lot of labor that goes into tuning and tweaking and architecting your own systems
- [00:32:59.800]that's related to positive presentation of self.
- [00:33:02.940]How do you make sure that something you share isn't going to be embarrassing?
- [00:33:06.720]How do you make sure that if your mom's looking over your shoulder, the platform looks fine?
- [00:33:10.260]Is it okay to post something on this platform because your Uncle Willie isn't on there,
- [00:33:14.940]but he is on this platform, and if you post something, he might say something underneath it?
- [00:33:18.560]So what I want to paint the picture of is sort of youth and young adults actually working
- [00:33:23.280]really hard to manage these sort of broader media systems around themselves.
- [00:33:27.380]They do so under the experience of what we've called information cacophony.
- [00:33:32.200]This is what it feels like.
- [00:33:34.120]This is what we hear all the time, especially in high
- [00:33:36.660]news times, talking about the war in Gaza, talking about election seasons,
- [00:33:41.880]times when, or COVID, certainly, times when there's a lot of information,
- [00:33:46.420]you know it should be there, and it feels very cacophonous.
- [00:33:49.660]And cacophony is the word that we use because it's not just loud.
- [00:33:54.420]It's not just loud, right?
- [00:33:56.100]It's messy.
- [00:33:57.600]It's voices speaking in very different tones, speaking in very different approaches.
- [00:34:02.020]And what brings it all together as awful is it's combined
- [00:34:06.600]with this low trust in news as well as low trust in other institutions.
- [00:34:10.940]So remember, this epistemological displacement of journalistic news
- [00:34:15.480]has meant that you don't automatically know what to trust.
- [00:34:19.080]I can manage cacophony because I trust the New York Times.
- [00:34:22.680]And therefore, I can still scan through the mess and say, but this I believe.
- [00:34:26.620]When we take that away, it becomes much more difficult.
- [00:34:28.740]And the last piece of cacophony that's emerged I think is really important for
- [00:34:32.940]thinking about interventions and how we connect in the future is a tonal
- [00:34:36.540]mismatch.
- [00:34:37.380]Most people use digital platforms for fun, for entertainment, or for escape,
- [00:34:44.860]or to be inspired or stimulated.
- [00:34:48.040]And oftentimes, when news or politics appears in those spaces,
- [00:34:52.200]it appears as a huge tonal mismatch, right?
- [00:34:55.160]Either it looks different, it feels different, or it feels very discordant
- [00:34:59.980]to the kinds of things that you were enjoying seeing, again, like this.
- [00:35:05.620]So one takeaway from
- [00:35:06.480]a lot of the work that we've been doing is just a constant reminder to people that
- [00:35:09.800]even though, sure, perhaps there's more information out there than ever before,
- [00:35:13.720]arguably, for young adults, managing the media environment and
- [00:35:17.040]deciding what's true and false is harder than it's ever been before.
- [00:35:21.600]And I want to note, and
- [00:35:22.700]this is maybe not as super positive about the possibilities of literacy,
- [00:35:26.700]managing the media environment is not a job people wanted to be hired for
- [00:35:30.780]in the first place, right?
- [00:35:32.500]It's not like people are out there looking for
- [00:35:34.100]ways to make sense of the world, rather, they're
- [00:35:36.420]struggling with the fact that there is no natural, default way to make sense
- [00:35:40.240]of the world.
- [00:35:43.860]One way to see this is that because we've handed over so much of our civic infrastructure
- [00:35:48.140]to platform companies themselves (and again, I don't mean us specifically; maybe we'll be the people who do something about it),
- [00:35:53.800]what we've done is craft environments
- [00:35:58.060]that are unfriendly to news, right?
- [00:36:01.440]We've crafted a social life and a civic life that is unfriendly to news.
- [00:36:06.360]And it's not one that has, I think, a particular easy solution.
- [00:36:12.500]Personal platform architecture, the different ways that each of us inhabit or curate or
- [00:36:16.220]craft our own media environments around ourselves, they shape whether we're seen as attractive
- [00:36:21.700]to news.
- [00:36:22.700]Am I the kind of person who a platform should deliver news to?
- [00:36:26.660]And if I'm not, then I'm crafting a media world around myself in partnership with platform
- [00:36:31.500]companies that will prevent me from being the kind of person who may see information
- [00:36:35.980]that I need to know.
- [00:36:36.300]That I need to get through my daily life.
- [00:36:37.680]And I'll leave you with this.
- [00:36:41.900]And these are the kinds of questions that our research team is thinking about now.
- [00:36:44.660]So how do we imagine the futures of news exposure under these conditions, right?
- [00:36:49.280]There are lots of people who are working on saving news.
- [00:36:51.720]I hope that works out.
- [00:36:52.980]There are sort of institutional challenges for news organizations themselves.
- [00:36:56.600]I'm very interested in this sort of individual level experience.
- [00:37:00.360]How do we craft civic culture that has a space for people who want to be informed to be informed?
- [00:37:06.280]Or who want to engage around certain kinds of content to be engaged?
- [00:37:09.420]And when we hand over agency of what we see to these datafied versions of ourselves, we lose some of that agency.
- [00:37:15.560]We hand over some of that agency to perhaps people, companies, systems that we don't ourselves understand.
- [00:37:22.700]So is there potential for generative AI?
- [00:37:25.720]Is there potential for technology to solve these problems?
- [00:37:29.100]I assume by my use of air quotes, you can understand that I don't think that will be particularly easy.
- [00:37:34.540]This is not a happy story as far as...
- [00:37:36.260]As far as I'm concerned.
- [00:37:36.760]But I think what we will see, and will be very interesting to see, is whether or not...
- [00:37:41.820]And when I give versions of this talk, people always ask me,
- [00:37:44.500]will AI perfect the datafied version of ourselves?
- [00:37:48.600]Maybe what we're waiting for is just technology to get so good
- [00:37:51.960]that we don't have a separation between who we really are and who platforms think we are, right?
- [00:37:57.800]I don't think that's where we're headed.
- [00:38:00.160]I don't think that's the case, and I don't think that should be the goal.
- [00:38:02.600]We are multiple selves.
- [00:38:04.180]We are full of multiplicities.
- [00:38:06.240]And no version of technology may ever be able to capture that particular complexity.
- [00:38:10.260]What I think we will see is sort of personalization and tailoring on steroids.
- [00:38:15.300]Okay, so if young adults like raw, authentic news, can we do something to New York Times
- [00:38:20.480]content to make that presentation match what they want to see, right?
- [00:38:23.640]Can we do that?
- [00:38:25.120]And I think there'll be a lot of interesting experiments to see whether that's effective
- [00:38:27.940]or not.
- [00:38:28.320]The question is, how focused will we be, to Brian's point, on some of the buts?
- [00:38:33.240]What may be the incidental negative effects of
- [00:38:36.220]trying to engage in those kinds of processes?
- [00:38:38.140]What may be the incidental effects of trying to design systems that will deliver news to
- [00:38:43.300]people who didn't want it in the first place?
- [00:38:44.760]So ultimately, I think this is a super exciting time to be having the conversations that you
- [00:38:49.780]all are having here, and that this team of folks has invited us all to have today.
- [00:38:53.920]I'm not super optimistic about us solving these problems in the next day, but I think
- [00:38:59.020]there's an incredible amount of energy going into rethinking and reinventing what a lot of the
- [00:39:06.200]future of learning about the world around you might look like.
- [00:39:08.780]Thank you.
- [00:39:09.660]Oh, sorry, I was about to run away.
- [00:39:36.180]If you don't want to get up for the microphone, I will also repeat your question.
- [00:39:43.160]I can hear you; I'll repeat it.
- [00:39:55.880]Will you introduce yourself, just for fun?
- [00:40:02.100]Sure.
- [00:40:06.160]So the question is, do we see platforms as making cacophony worse by trying to engage in tailoring?
- [00:40:32.040]That's a great question. So, um,
- [00:40:36.140]No, I think no, because platforms want you to have a great experience; they want you to stay, right? They need you to stay and look at the things or buy the things, or whatever the business model they have actually is.
- [00:40:49.140]So for me, cacophony is an individual-level experience, right? It's cacophony because, if I go on Instagram,
- [00:40:57.140]I will reveal to you that my Instagram, at its optimum point, is showing me beautiful houses and well-styled food.
- [00:41:06.120]I also follow some news organizations, so my cacophony ends up being when I'm looking at this smooth, well-designed universe, and then I see a photo of a war, and then I see a political post from a colleague or a friend.
- [00:41:20.120]Right, that's what cacophony can feel like to me.
- [00:41:23.120]At the sort of group level, I don't think that is cacophony; that's this fractured, individualized experience. That's the difference between what you and I see, you know, the platforms trying to optimize to you versus
- [00:41:36.100]trying to optimize to me.
- [00:41:38.100]That has the result that we have very different experiences of our digital worlds.
- [00:41:46.100]I saw it.
- [00:41:56.100]The question is, what can be done to make young people more aware of the fact that they control their algorithm?
- [00:42:04.100]So, there have been studies of algorithmic literacy.
- [00:42:06.080]And we've done some of that work and it's variable.
- [00:42:08.080]I mean, I think some people more or less understand how those platforms work.
- [00:42:12.080]One of the exercises we do in our co-browsing technique is we ask about those kinds of questions, and we sort of show people some of the capacities that they have that they may not have known about.
- [00:42:22.080]And then at the end we also show them some of their own data so they can sort of grow their agency through the research process itself.
- [00:42:30.080]The sad part is that typically showing people those things doesn't, at least in the short term, make them change
- [00:42:36.060]their behavior, right?
- [00:42:38.060]Because I think it's worth seeing us, individuals and platforms, as being in partnership.
- [00:42:46.060]We both want to feel good, right?
- [00:42:48.060]We both want to feel good in the moment.
- [00:42:50.060]And the platform is doing everything it can to give us the right combination of what we like and a little bit of novelty and blah, blah, blah to make us feel good.
- [00:42:57.060]And that's why I'm there as well.
- [00:42:59.060]So I don't want to overestimate that knowledge is power, as He-Man and She-Ra used to tell the youth of my generation.
- [00:43:06.040]Sometimes just telling people how something works doesn't change their actual behavior.
- [00:43:12.040]That said, a proposal that we've worked on a few times that no one wants to fund is what if we treat understanding of our civic lives online the way we treat equity issues around girls and STEM, right?
- [00:43:28.040]So girls and STEM is this fascinating question.
- [00:43:30.040]Young girls and young boys, we all love science, having a good time.
- [00:43:36.020]And then somewhere along the way, that changes for girls. News exposure works
- [00:43:54.020]very similarly, not based on gender but based on income and education. That
- [00:43:59.000]is, rich white people typically see a lot more news than everybody else. That's
- [00:44:04.100]because news has historically been made for those populations. But we don't do
- [00:44:08.240]interventions to address that, right? We haven't decided as a society that that
- [00:44:12.000]is sort of a socialization type of problem. So I think if we're gonna do
- [00:44:15.900]something, it's gonna be interventions that start with pretty young
- [00:44:19.760]kids, so they don't just understand but maybe have a sense of responsibility
- [00:44:23.960]to build a media world around themselves that contains, you know, some tough stuff
- [00:44:28.640]as well as fun stuff.
- [00:44:31.560]mm-hmm yeah for sure yeah yeah yeah
- [00:44:53.900]Yeah, so the question is about young people's interest in news: how do we know
- [00:45:16.340]if they're interested in news or not if we don't even agree on what news is?
- [00:45:18.740]Which I totally agree with. And then the other piece of it: how do we make
- [00:45:22.300]a comparison back to
- [00:45:23.840]young people? I don't know, like my grandparents, when they were young, were
- [00:45:26.760]they interested in news or not? These are both questions that I'm obsessed with, so
- [00:45:30.140]I didn't plant that question, but I'm glad you asked it. So on the first
- [00:45:35.060]question, I will shout out my colleagues: Emily Vraga and Stephanie Edgerly wrote
- [00:45:40.400]this great paper on newsness as a variable. So there's a whole paper out
- [00:45:44.480]there, and a series of studies afterwards, in which they write that we
- [00:45:49.100]shouldn't assume that news is like an on-or-off switch; it's on a continuum, and
- [00:45:53.060]different people from
- [00:45:53.780]different backgrounds and different contexts will define news differently.
- [00:45:56.780]Now, I don't want to, like, get all existential about research methods, but
- [00:45:59.780]essentially what that means is, when we ask people how much news do you see, we're
- [00:46:04.160]getting, you know, somewhere in the range from sort of right to garbage, right? If
- [00:46:08.180]we ask people, is this news or not, we may get an accurate sort of perceptual
- [00:46:12.260]answer, but it doesn't really tell us about how a platform may be interpreting the
- [00:46:16.760]kind of action that you take. I think that's a real issue. By some metrics, young-ish,
- [00:46:23.720]Gen Z ish generations are very engaged there's a lot of social issues that
- [00:46:29.600]people care deeply about but I am still pretty convinced that because so much
- [00:46:35.120]news has been defined as toxic that politics itself has been defined as
- [00:46:39.500]toxic I'm actually not gonna put that in square scare quotes because politics is
- [00:46:42.920]kind of toxic that there is more alienation from news content now than in
- [00:46:47.880]previous generations I spent quite a bit of time in my own dissertation a long
- [00:46:52.900]time ago
- [00:46:53.660]trying to look at every chunk of data from like the 50s 60s and 70s about
- [00:46:58.040]young people of that era and what it said about what we could learn about
- [00:47:02.720]their news conception or news use from that time and I think the best we can
- [00:47:06.380]tell is that news use isn't purely generational it's contextual right we
- [00:47:11.680]are motivated to be engaged with news when we absolutely have to so of course
- [00:47:15.000]we saw news you shoot up and then shoot right back down during kovat but it has
- [00:47:20.100]always been the case that most people are
- [00:47:23.600]uninformed and they're okay with that right what i think is different now is
- [00:47:28.500]the volatility between people who are super informed and people who are
- [00:47:32.440]completely alienated what we've lost is that you know what arguably is like the
- [00:47:37.400]greatness of american civic culture which is like the regular people in the
- [00:47:41.300]middle who knew a little bit and just enough right that that middle has kind
- [00:47:45.880]of fallen out and we now have many more people on that i don't see anything at
- [00:47:49.060]all and this really intense chunk of people who see too
- [00:47:53.540]much news contents
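Vraga and Edgerly's "newsness" point lends itself to a quick illustration. Below is a minimal sketch in Python of treating newsness as a graded variable rather than an on/off switch; the items and their ratings are hypothetical, invented for this example, and are not data from the studies mentioned above.

# A minimal sketch of "newsness as a variable" rather than an on/off switch.
# The items and their 0-to-1 newsness ratings are hypothetical survey data.
items_seen = {
    "city council vote recap": 0.9,   # most respondents would call this news
    "celebrity gossip thread": 0.3,   # news-ish to some respondents, not to others
    "friend's vacation photos": 0.0,
}

# Binary coding: each item is either news or not.
binary_count = sum(1 for rating in items_seen.values() if rating >= 0.5)

# Continuum coding: partial newsness still counts toward exposure.
graded_exposure = sum(items_seen.values())

print(binary_count)     # 1   -> the on/off coding sees one news item
print(graded_exposure)  # 1.2 -> the continuum keeps the gossip thread's partial newsness

The two codings can disagree sharply for a respondent whose feed is full of borderline items, which is exactly the measurement problem described in the answer above.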
- [00:48:23.480]So I was interested if you could actually talk a bit about that last slide, about the evidence, and how you see those manifestations of the other maybe having an impact on that fracturedness
- [00:48:43.480]or not. And maybe I have it, maybe I'm also, like...
- [00:48:47.480]Uh...
- [00:48:53.420]I actually think value judgments are awesome, and I wish we would bring them back, so that's okay.
- [00:48:59.420]Yeah, so, so, so the question is, how will some of these futures of news, how could they possibly
- [00:49:06.620]affect the level of individualization of what we see, perhaps. My first job was at the University
- [00:49:16.140]of Southern California, and there was this big session where they invited the lead engineer on
- [00:49:23.360]the Google News project. This is a long time ago, right? So you'll remember, we used to go to, like,
- [00:49:27.980]news.google.com, and there was, like, you know, it's like early personalized news. There wasn't, like, a
- [00:49:32.600]flag linking, but it wasn't that much after that, right? So it's just this kind of clean little page,
- [00:49:36.320]and it would give you, I don't know, I can't remember, 10 or 12 results. And so someone from
- [00:49:41.840]the university was interviewing him about, how did you decide how many results to put on the
- [00:49:46.000]page? And he had grown up in India, and the story that he told us is that he decided that on the
- [00:49:53.300]page, the first two or three links would be the same for everyone, right? They would not be
- [00:49:59.200]personalized, they would be unpersonalized, right? They would be for us all. And the reason he did
- [00:50:05.000]that is because his father was this diligent reader of newspapers, read multiple newspapers
- [00:50:10.560]every day, and he firmly believed, had this, like, intense value judgment, that there should be at
- [00:50:15.440]least a few things that we all know. I think the question, which is not the question you asked,
- [00:50:21.800]is: is that a value judgment
- [00:50:23.240]that we would or could make today, right? What does it mean to ethically design a system that's
- [00:50:28.520]intended to deliver news to you? Well, the reason journalistic news institutions got pretty-ish good
- [00:50:35.720]at that, although I will note that before we were worried about no one seeing news, we were worried
- [00:50:39.240]about how bad news was, so let's hold on to that for a second. Sorry, sorry, journalism,
- [00:50:44.360]it's also really good too. So, so we built this institution that had all of these sort of norms
- [00:50:52.360]and practices
- [00:50:53.180]that were designed to deliver us a sort of consistent set of pieces of news
- [00:50:56.760]that we all decided, through these norms and practices, were the things that we should know.
- [00:51:00.760]Those systems, however, also excluded voices. They kept people out. They made sure that we couldn't
- [00:51:06.240]hear about people who the news organizations did not have a natural affinity to or were interested
- [00:51:11.400]in. So how would we ethically build a system today, knowing what we know, that would deliver an
- [00:51:17.320]unindividualized set of news to people? I'm very skeptical that that's what we would choose. However,
- [00:51:23.120]I do still think that there's that sort of window of conversation that we can have with kids as they're
- [00:51:28.400]growing up, about what kind of civic life do you want to have, right? What are the fora that you want
- [00:51:33.220]to engage with other people in? I'm working now, we have an NSF grant, to study volunteer moderators
- [00:51:40.200]of local groups on social media platforms. These people are inundated with ick content and good
- [00:51:49.960]content, often in spaces, it's like grandma's
- [00:51:53.060]Facebook garden group or, like, a seedling trade group. And now these have become really important
- [00:51:59.680]fora where people connect with each other, right, on a very local sort of town square
- [00:52:03.320]type of basis. And even in those spaces, what we see is that platforms are doing everything
- [00:52:08.260]they can to keep civic out, to keep news out, to keep politics out, because it's become so
- [00:52:13.840]contentious, right? Because it's been invaded by our national political system, which has
- [00:52:18.900]never been how we understand the world at a local level. So I think that the kind of
- [00:52:23.000]question you're asking really, like, unzips a lot of really big issues, right? What, what,
- [00:52:27.640]where are we going to take a stand and say, this is what we need? So I have no, no idea.
- [00:52:32.800]No, we should talk about it.
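The design in the Google News story, a few shared slots at the top of the page with personalization only below them, can be sketched in a few lines of Python. This is an illustration of the idea, not Google's actual system; the Story fields, the editorial_priority signal, and the interest-overlap scoring are all assumptions made for the example.

# A minimal sketch of a hybrid feed: the top few slots are shared,
# unpersonalized picks, identical for every reader; only the remaining
# slots are ranked against the individual user's interests.
from dataclasses import dataclass

@dataclass(frozen=True)
class Story:
    headline: str
    topics: frozenset = frozenset()
    editorial_priority: float = 0.0  # hypothetical editor-assigned weight

def build_feed(stories, user_interests, page_size=10, shared_slots=3):
    # Shared slots: chosen by editorial priority alone, never by the user.
    by_priority = sorted(stories, key=lambda s: s.editorial_priority, reverse=True)
    shared = by_priority[:shared_slots]

    # Personalized tail: rank what's left by overlap with this user's interests.
    rest = sorted(by_priority[shared_slots:],
                  key=lambda s: len(s.topics & user_interests),
                  reverse=True)
    return shared + rest[: page_size - shared_slots]

# Example: everyone gets the same first two links; only the third adapts.
stories = [
    Story("City budget passes", frozenset({"local", "politics"}), 1.0),
    Story("Election results certified", frozenset({"politics"}), 0.9),
    Story("Seedling trade group launches", frozenset({"gardening"}), 0.1),
    Story("New stadium proposed", frozenset({"sports", "local"}), 0.2),
]
print(build_feed(stories, user_interests={"gardening"}, page_size=3, shared_slots=2))

Under this scheme, every reader sees the same first shared_slots headlines, so a common baseline of "things we all know" survives even as the rest of the page adapts to the individual.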
- [00:52:52.940]"Are we screwed?" is the question, so far, on the table.
- [00:53:00.560]Yeah, yes, we are screwed. But I think, on the flip side, this is, yeah, well, so not
- [00:53:22.880]that we always have been, but that, I mean, unless you truly believe in some sort of true progress
- [00:53:28.640]toward a glorious end, which I think we kind of debunked a century ago, we go through cycles,
- [00:53:33.680]right? Civic cultures are alive. They don't, they don't sit, and they don't stay the same.
- [00:53:40.220]And I, I think people are incredibly resilient to these kinds of things. I mean, how is it
- [00:53:44.700]possible that we could see so little news content and yet the world moves on, right? In
- [00:53:50.700]some sense, I, I'm a total
- [00:53:52.820]pragmatic optimist. I do not think that AI will solve the problem at all, and I'm not
- [00:53:58.400]sure that telling people about AI will solve the problem at all. But I am very sure that,
- [00:54:02.860]you know, we are good at figuring out new versions of infrastructure to make the world move forward.
- [00:54:09.800]But we are definitely, I think, giving up maybe the one that some of us were raised with, and
- [00:54:14.140]some of us might have been raised to idealize. So, screwed for now. That might be a good ending.
- [00:54:20.820]Maybe we'll just,
- [00:54:22.760]yeah, Brian's like, that's not a good ending, actually.
- [00:54:27.020]Sorry. Thank you. Thank you, Kjerstin.
- [00:54:33.360]Thank you. So we have a 15-minute break, till 10:15, when Dr. Jevin West will
- [00:54:45.960]talk about generative AI and misinformation. In the meantime, it's 15 minutes, there are
- [00:54:50.260]drinks and snacks over there.
- [00:54:52.700]Feel free to mingle and chat, and we'll come back at 10:15. Thank you.
- [00:54:56.760]What are you printing? The answer to that should be absolutely nothing.
- [00:55:22.640]- Shasta.