Five Things with Kyle Langvardt
Author: University Communication
Added: 09/16/2021
Description
Social media is a driving force in our world and a place to express our opinions. But does freedom of speech exist on social media platforms? Kyle Langvardt is a First Amendment scholar and a member of the Nebraska Governance and Technology Center at the University of Nebraska College of Law. In this episode, Langvardt talks about the complex issues related to free speech and governance of social media and why it’s important for students and universities to study them.
Show notes: Learn more about Kyle Langvardt at law.unl.edu/kyle-langvardt; learn more about the Nebraska Governance and Technology Center at ngtc.unl.edu; learn more about Tech Refactored at ngtc.unl.edu/tech-refactored.
Transcript
- [00:00:00.268](electronics sounding)
- [00:00:03.521](bird tweeting)
- [00:00:05.140]We are bombarded with content on social media.
- [00:00:08.461](bird tweeting)
- [00:00:09.392](electronic chiming)
- [00:00:10.225]Kyle Langvardt, an assistant professor
- [00:00:11.830]at the Nebraska College of Law, is not a fan.
- [00:00:15.296]No, I don't smoke either.
- [00:00:17.781]I mean, I don't want to do that to myself.
- [00:00:22.200]In fact, Langvardt put off getting a smartphone
- [00:00:24.880]as long as possible.
- [00:00:26.400]He just didn't want the added stress.
- [00:00:28.659](dramatic music)
- [00:00:31.280]But even just looking at regular news,
- [00:00:33.490]I think there's kind of an addictive quality,
- [00:00:35.880]and I notice in myself, if I'm reading news
- [00:00:41.450]as I brush my teeth, I think my mental state
- [00:00:46.330]gets a little more scattered and frenetic and paranoid.
- [00:00:51.650]So I can't even imagine what this kind of video game
- [00:00:57.380]atmosphere of social media might do.
- [00:01:00.012](electronics sounding)
- [00:01:05.330]But Langvardt does study social media platforms
- [00:01:07.810]in connection with his work
- [00:01:09.100]at the Nebraska Governance and Technology Center.
- [00:01:12.620](upbeat music)
- [00:01:15.680]This is Faculty 101, Five Things About Social Media
- [00:01:19.350]and the First Amendment.
- [00:01:20.991](upbeat music continues)
- [00:01:26.240]Kyle Langvardt started out looking at the expansion
- [00:01:29.030]of the First Amendment into areas where it doesn't belong.
- [00:01:32.760]That kind of work brought me to the cases
- [00:01:35.570]involving computer code and the application
- [00:01:39.810]of the First Amendment to, like, 3D-printed weaponry,
- [00:01:42.500]these arguments that I think are kind of crazy.
- [00:01:45.660]And then eventually, several years ago
- [00:01:48.990]I started thinking, well, maybe at some point
- [00:01:51.960]we'll see some kind of law that tries to constrain
- [00:01:55.180]these really large platforms' power to govern speech.
- [00:01:59.640]So I write about the First Amendment's application
- [00:02:02.270]to these platforms, but I also write a lot
- [00:02:04.120]about different types of policy interventions
- [00:02:08.760]that might be possible, either in the near
- [00:02:10.580]or in the long term, that kind of thing.
- [00:02:13.500]As part of his research, he addresses
- [00:02:15.420]the complex free speech and governance issues
- [00:02:18.150]surrounding social media platforms.
- [00:02:20.722](electronic chiming)
- [00:02:22.350]Number one, when it comes to social media,
- [00:02:25.440]the rights of users are, well, weak.
- [00:02:28.820]The short answer, at least if we're talking legally,
- [00:02:30.840]is they don't have any.
- [00:02:32.950]The platforms aren't state actors,
- [00:02:36.070]they're not the government, they're not in some kind
- [00:02:37.730]of joint enterprise with the government,
- [00:02:39.800]they don't perform a traditional exclusive
- [00:02:43.240]governmental function, so the First Amendment
- [00:02:45.730]just doesn't apply to them.
- [00:02:47.060]And we don't really have other types
- [00:02:51.220]of anti-discrimination principles
- [00:02:52.880]that apply to online platforms either.
- [00:02:56.040]So if we're talking about the rights of users,
- [00:03:00.550]their rights are really the rights
- [00:03:02.360]that the platforms grant them.
- [00:03:04.523](electronic shimmering)
- [00:03:05.356]Number two, social media companies
- [00:03:07.730]do make an effort at self-regulation,
- [00:03:10.320]setting up potential conflicts with freedom of speech.
- [00:03:13.910]I mean, the platforms have to regulate themselves.
- [00:03:16.470]If they didn't, the platforms would become unusable
- [00:03:19.980]and they'd become a pretty serious threat
- [00:03:21.680]to the public, by the way.
- [00:03:22.700]I mean, in other countries where, say,
- [00:03:27.540]Facebook has underinvested
- [00:03:29.470]in its content moderation operation, basically its censors,
- [00:03:33.170]there's been ethnic violence and that kind of thing
- [00:03:36.760]that's been linked pretty clearly
- [00:03:39.350]to activity on the platform.
- [00:03:40.900]So they have to do a lot of that kind of thing.
- [00:03:43.096](suspenseful music)
- [00:03:45.620]I guess the question is what kinds of self-regulation
- [00:03:51.320]do we like, and what kinds of self-regulation don't we like?
- [00:03:54.440]And normally when we talk about something
- [00:03:56.150]like content moderation, what we're talking about
- [00:04:00.080]is the platforms stepping in after the fact.
- [00:04:02.600]You are temporarily blocked from posting on Facebook.
- [00:04:06.000]And either removing certain posts, suspending accounts.
- [00:04:10.444](bird chirping)
- [00:04:11.277]This post has been removed.
- [00:04:13.840]Demoting speech in people's newsfeeds.
- [00:04:17.350]All of that is content-based, and traditionally,
- [00:04:21.640]at least if we were in a normal First Amendment context,
- [00:04:26.070]we'd have a very suspicious attitude toward any of that.
- [00:04:28.830]We apply just the closest possible scrutiny
- [00:04:31.610]to any kind of content-based regulation.
- [00:04:34.740]I think the way that a platform like Facebook is set up,
- [00:04:38.600]there's basically no way around doing that.
- [00:04:42.757](suspenseful music continues)
- [00:04:45.300]But what I would really like to see from the platforms
- [00:04:47.580]is changes in the way they're set up,
- [00:04:50.350]changes in the way they're designed.
- [00:04:52.500]One concern that I have about these social media platforms
- [00:04:55.850]that are ad-based and therefore dependent on virality,
- [00:04:59.484](electronic swooshing)
- [00:05:01.930]is that they have a tendency to make speech
- [00:05:06.740]a lot more dangerous than it's traditionally been.
- [00:05:11.100]The linkage between speech and harm
- [00:05:12.780]is just much tighter than it's been before.
- [00:05:16.120]And once you've kind of changed the background physics
- [00:05:19.850]of speech and communication and discourse,
- [00:05:23.610]I think at a certain point the freedom of speech
- [00:05:27.160]is just no longer a viable proposition,
- [00:05:29.300]and that's basically where we live now.
- [00:05:32.060]Number three, Twitter tested a feature
- [00:05:34.880]to encourage users to, as the platform says,
- [00:05:38.290]"read it before you retweet it."
- [00:05:41.179](bird chirping)
- [00:05:42.070]If a user tries to share an article on Twitter
- [00:05:44.640]based only on the headline, a prompt pops up
- [00:05:47.880]asking if the user wants to read the article first.
- [00:05:51.400]That's a content-neutral intervention.
- [00:05:53.971]It doesn't require anybody to come in
- [00:05:56.260]and make some kind of judgment
- [00:05:58.000]about what kind of speech is valuable or what's not.
- [00:06:01.900]But it has the effect of slowing speech down.
- [00:06:04.521](slow garbled speech)
- [00:06:08.650]Maybe making speech a little bit more deliberative,
- [00:06:13.360]and hopefully reducing the prevalence of harm
- [00:06:17.229](gentle electronic music)
- [00:06:19.300]without requiring these kinds
- [00:06:21.000]of more disturbing interventions.
- [00:06:22.830]I think the thing is platforms don't want to do
- [00:06:26.870]too much of that because once speech becomes
- [00:06:30.580]too deliberative, their business model is shot.
- [00:06:34.940]I mean, what they want to do is walk people through
- [00:06:40.570]an experience that leads them to purchase more products,
- [00:06:43.110]and so deliberation is basically the enemy there (laughs).
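To make the "read it before you retweet it" idea concrete, here is a minimal sketch of a content-neutral friction check of the kind described above. It is an illustration only, not Twitter's actual implementation; the names (ShareRequest, handleShare, and so on) are hypothetical. The point is that nothing in the logic looks at what the article says, only at whether the user opened it before trying to share it.

```typescript
// Hypothetical sketch of a content-neutral "read it before you retweet it" prompt.
// The only signal is whether the user opened the article in this session.

interface ShareRequest {
  articleUrl: string;
  openedByUser: boolean; // did this user open the article before sharing?
}

// Prompt only when the user is sharing something they haven't opened.
function shouldPromptBeforeShare(request: ShareRequest): boolean {
  return !request.openedByUser;
}

// In a real client, confirmUnreadShare would render a dialog asking
// "Want to read the article first?"; the user can still share either way.
function handleShare(request: ShareRequest, confirmUnreadShare: () => boolean): boolean {
  if (shouldPromptBeforeShare(request)) {
    return confirmUnreadShare();
  }
  return true; // user already read the article; the share goes through untouched
}

// Example: sharing straight from a headline triggers the prompt.
const shared = handleShare(
  { articleUrl: "https://example.com/story", openedByUser: false },
  () => false // user decides to read it first instead of sharing
);
console.log(shared); // false
```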
- [00:06:46.680]Langvardt would like to see more of that type of change
- [00:06:49.390]in the way platforms operate.
- [00:06:51.580]As social media attains more and more influence,
- [00:06:57.730]you begin to move to a point where you pretty much
- [00:07:00.050]just have to rely on censorship
- [00:07:02.730]to avoid harm that's related to speech.
- [00:07:07.040]So I think that's a pretty disturbing development,
- [00:07:10.490]and I'd like to see public policies
- [00:07:12.110]that try to head off that dynamic.
- [00:07:14.740]I think the freedom of speech requires
- [00:07:18.240]a kind of background social resiliency,
- [00:07:23.610]and if you're living in a social situation
- [00:07:26.730]that's characterized by low public trust,
- [00:07:32.290]hot emotions, a way of thinking about politics
- [00:07:36.550]that becomes almost sectarian,
- [00:07:41.070]then you begin to get to a place
- [00:07:43.550]where you can no longer really say
- [00:07:45.700]that the best way to address bad ideas
- [00:07:49.500]is by allowing good ideas to flow unrestricted.
- [00:07:55.600]I think what we really need to do as a society
- [00:07:58.215]is try to recreate that resilience wherever we can,
- [00:08:05.080]but that's a much harder thing to do
- [00:08:08.030]than just allowing a platform to step in after the fact
- [00:08:11.100]and take down speech that's bad.
- [00:08:14.683](alarm buzzing)
- [00:08:15.540]Your account has been suspended.
- [00:08:18.760]Number four, Langvardt is working on a project
- [00:08:21.510]that will examine the role of media in governance.
- [00:08:24.660]The idea here would be to look at
- [00:08:28.170]not just social platforms, but also more traditional
- [00:08:31.410]media outlets like newspapers, TV, radio,
- [00:08:35.120]as mechanisms for actually governing and constraining
- [00:08:39.390]social conduct, and talking about their interaction
- [00:08:42.920]with legal mechanisms and so on.
- [00:08:45.894](gentle piano music)
- [00:08:48.280]We're planning to draw together scholars
- [00:08:50.830]from all around the country, all around the world,
- [00:08:54.660]not just in law, but also in journalism,
- [00:09:00.060]computer science, outside the academy,
- [00:09:03.424]to talk about these things.
- [00:09:05.504](gentle piano music continues)
- [00:09:08.074](electronic shimmering)
- [00:09:09.170]And number five, it's important for students
- [00:09:12.050]to learn about these issues,
- [00:09:13.860]and for universities to study them.
- [00:09:16.870]Different classes can kind of touch
- [00:09:18.620]on technology in all sorts of ways,
- [00:09:23.950]but what gets left out, I think,
- [00:09:27.020]if you have just kind of scattered references to technology,
- [00:09:32.300]is the fact that technology itself
- [00:09:37.760]is a kind of governance institution,
- [00:09:41.300]that the design of networks and infrastructure and apps
- [00:09:49.120]has a role in structuring social behavior
- [00:09:53.160]that's really very similar to what law does.
- [00:09:57.760]And if you are not mindful of that,
- [00:10:03.800]then I think you wind up missing a lot of insights,
- [00:10:11.622]not just in practice, but as a citizen.
- [00:10:14.933](gentle piano music continues)
- [00:10:17.040]I think the danger is that you can begin
- [00:10:19.680]to look at these kinds of design choices
- [00:10:24.440]as something other than choices.
- [00:10:27.290]You can start to look at technology
- [00:10:29.550]as just nature or some kind of unstoppable force
- [00:10:34.340]rather than an actual mechanism for governance.
- [00:10:39.863]And so if we're not aware of that aspect
- [00:10:45.500]of what technology is, how it's constructed, what it does,
- [00:10:50.160]then we can wind up just kind of sleepwalking
- [00:10:52.020]toward a kind of technocracy where we've wound up
- [00:10:57.562]outsourcing all kinds of decisions to people
- [00:11:02.680]whose names we don't even know.
- [00:11:04.160]And by the way, I would say that's what we're doing
- [00:11:06.010]with the freedom of speech right now.
- [00:11:08.032](upbeat music)
- [00:11:11.300]That's Faculty 101, Five Things with Kyle Langvardt.
- [00:11:15.470](upbeat music continues)
- [00:11:17.840]Faculty 101 is produced
- [00:11:19.520]by the University of Nebraska-Lincoln.
- [00:11:21.970](upbeat music continues)