Is Regulation of Social Media Necessary to Protect Democracy?
Since 1922, the National Communication Association has sponsored international student exchange tours for the purpose of promoting debate, discussion, and intercultural communication. Renowned for their wit, humor, and eloquence, members of the United Kingdom’s English-Speaking Union tour the United States each year, debating the best and the brightest at our institutions of higher learning. The list of tour alumni includes a British Prime Minister, a Leader of the Opposition, an Archbishop of Canterbury, and many senior politicians, journalists, and businesspeople. The event promises to be educational and entertaining for those interested in communication, civic engagement, international relations, and global politics. The event will be moderated by Aaron Duncan, UNL Director of Speech & Debate.
Searchable Transcript
[00:00:07.607]ANNOUNCER: Today, you are part
[00:00:08.742]of an important conversation
[00:00:09.976]about our shared future.
[00:00:11.444]The E. N. Thompson Forum
[00:00:13.880]on world issues explores
a diversity of viewpoints
[00:00:18.351]and public policy issues
[00:00:19.619]to promote understanding
[00:00:20.620]and encourage debate
across the university
[00:00:23.089]and the state of Nebraska.
[00:00:25.725]Since its inception in 1988,
[00:00:30.296]speakers have challenged and inspired us,
[00:00:31.965]making this forum
[00:00:34.100]one of the preeminent
[00:00:37.103]in higher education.
[00:00:40.006]It all started when
E. N. "Jack" Thompson
[00:00:42.542]imagined a forum
on global issues
[00:00:45.345]that would increase Nebraskans'
understanding of cultures
[00:00:48.014]and events from
around the world.
[00:00:50.884]Jack's perspective was
influenced by his travels,
[00:00:53.920]his role in helping to
found the United Nations,
[00:00:56.956]and his work at the Carnegie
[00:01:01.861]As president of the Cooper
Foundation in Lincoln,
[00:01:04.364]Jack pledged substantial
funding to the forum
[00:01:07.967]and the University of Nebraska
[00:01:10.303]and Lied Center
for Performing Arts
[00:01:12.505]agreed to co-sponsor.
[00:01:15.175]Later, Jack and his wife Katie
[00:01:16.943]created the Thompson Family Fund
[00:01:19.546]to support the forum
and all their programs.
[00:01:23.616]Major support is provided
[00:01:25.418]by the Cooper Foundation,
[00:01:27.487]Lied Center for Performing Arts,
[00:01:30.123]and University of Nebraska.
[00:01:32.158]We hope these talks spark
an exciting conversation
[00:01:40.133]And now, on with the show.
[00:01:49.609]MIKE ZELENY: Good evening
ladies and gentlemen.
[00:01:51.377](cheers and applause)
[00:01:57.784]I'm Mike Zeleny
[00:01:58.618]with the university
[00:01:59.452]and it's my pleasure
to welcome you
[00:02:00.220]to this special Wilson Dialogue.
[00:02:02.288]Chuck and Linda
created the dialogue
[00:02:05.325]to explore both
sides of an issue
[00:02:06.426]important to Nebraska
[00:02:07.393]and the nation.
[00:02:08.394]Dr. Wilson, a
[00:02:10.062]served on the
University of Nebraska
[00:02:11.865]Board of Regents for many years
[00:02:12.932]and also served as
the medical officer
[00:02:13.933]in the US Army.
[00:02:16.002]Linda served on the
Lincoln City Council
[00:02:18.004]in the Public
Support for tonight's event
[00:02:21.307]is provided by the
E. N. Thompson Forum
[00:02:22.876]on world issues,
[00:02:24.110]Lied Center for Performing Arts,
[00:02:25.612]and the Cooper Foundation.
[00:02:27.113]We would also like
to acknowledge the
[00:02:29.315]and logistical support
[00:02:30.650]provided by the Department
of Communication Studies
[00:02:32.652]and the Center for
[00:02:34.554]Please join me in thanking them.
[00:02:43.429]This evening we would
also like to extend
[00:02:44.597]a very special warm welcome
[00:02:46.065]to Chancellor Green
and his wife Jane,
[00:02:47.734]to all of the Nebraska speech
and debate team alumni,
[00:02:49.669]and to all the high school
[00:02:51.638]speech and debate
students and teachers
[00:02:53.239]who have traveled statewide
to attend this event.
[00:02:55.275]We're thrilled that you're here.
[00:03:03.583]Known for their wit,
[00:03:04.617]humor and eloquence,
[00:03:06.185]members of the United Kingdom's
[00:03:07.554]English Speaking Union,
[00:03:08.988]the British National Debate Team
[00:03:11.991]tour the United
States each year,
[00:03:12.992]debating the best and brightest
[00:03:14.093]at our institutions
of higher learning.
[00:03:15.628]The list of tour
alumni is brilliant.
[00:03:18.031]It includes a British Prime Minister,
[00:03:19.566]the Leader of the Opposition,
[00:03:21.100]an Archbishop of Canterbury,
[00:03:22.402]and many senior politicians,
journalists, and businesspeople.
[00:03:25.939]The tour is coordinated
by the National Communication Association.
[00:03:28.908]The Nebraska Speech
and Debate Team
[00:03:31.477]consists of 46 students
[00:03:33.046]with majors ranging
[00:03:34.981]from political science
[00:03:36.049]to computer science.
[00:03:37.116]Ladies and gentlemen,
[00:03:38.117]this is the pride
of all Nebraska.
[00:03:39.619]In January, the team
earned its sixth straight
[00:03:41.287]Big 10 championship,
[00:03:42.655]capturing nine of 12
[00:03:44.624]Let's give them a
round of applause.
[00:03:46.359](cheers and applause)
[00:03:55.168]The deliberation this evening
[00:03:56.169]will be moderated by
Professor Aaron Duncan,
[00:03:58.037]director of the Nebraska
Speech and Debate Team.
[00:04:00.039]After the debate,
[00:04:01.541]you will have the opportunity
[00:04:03.710]to vote for a winner,
[00:04:05.244]and ask questions
of the participants
[00:04:08.615]using the hashtag E.
N. Thompson Forum.
[00:04:09.582]Also, ushers will be
available in the aisles
[00:04:11.351]to collect your questions
[00:04:13.019]and bring them to the stage.
[00:04:14.687]The title of this debate is:
[00:04:16.221]Is Regulation of
Social Media Necessary
[00:04:18.858]to Protect Democracy?
[00:04:20.760]Now please join me in welcoming
[00:04:21.995]the British National Debate Team
[00:04:23.196]and the Nebraska
Speech and Debate Team.
[00:04:41.614]MODERATOR: Thank you.
[00:04:42.582]It is my distinct pleasure
[00:04:43.583]to serve as tonight's moderator
[00:04:44.951]for the debate between
[00:04:46.319]the University of Nebraska
[00:04:47.420]and the British
National Debate Team.
[00:04:50.023]Please allow me to begin
[00:04:52.792]Tonight, representing the
British National Debate Team,
[00:04:55.962]here for Queen and country,
[00:04:57.964]are Richard Hunter
and Rebecca Haworth.
[00:05:09.075]Richard Hunter is a native
of Northern Ireland.
[00:05:10.677]He attended the
University of St. Andrews
[00:05:13.279]where he graduated
in June with a degree
[00:05:15.915]in modern history.
[00:05:16.849]As a young man,
[00:05:17.950]he twice reached the finals
of the Northern Ireland
[00:05:20.086]Schools Debating Competition.
[00:05:22.555]He was also a finalist
[00:05:24.157]at the John Smith
Memorial Mace tournament.
[00:05:26.626]And in 2017, he was
chosen as an adjudicator
[00:05:30.129]at the European Universities
Debating Championship.
[00:05:35.601]Rebecca Haworth studied economics and German
[00:05:37.370]at the University of Birmingham
[00:05:38.905]where she graduated in 2015.
[00:05:41.507]During her career she won
five major debate tournaments,
[00:05:44.177]made the final
round at 14 national tournaments,
[00:05:49.382]including the German National Championship,
[00:05:52.285]in which she was debating
[00:05:54.087]in a foreign language.
[00:05:57.790]Becky currently works
[00:05:58.825]in a management consulting role
[00:06:00.760]at PricewaterhouseCoopers.
[00:06:02.895]The British National Team.
[00:06:13.272]Their opponents for the evening,
[00:06:14.373]representing the good life,
[00:06:15.942]and the great state of Nebraska,
[00:06:19.479]are Erin Shehan
and Colton White.
[00:06:20.713]Erin Shehan is a native
of Omaha, Nebraska
[00:06:24.083]and is a senior at the
University of Nebraska-Lincoln
[00:06:26.819]where she majors in political
science and history,
[00:06:28.855]and has minors in
[00:06:30.890]global studies and
humanitarian affairs and human rights.
[00:06:34.393]She has won numerous state,
[00:06:35.895]national and Big 10 awards.
[00:06:37.530]Last year she ranked
[00:06:39.098]in the top 2% of the country
[00:06:41.000]in persuasive speaking
[00:06:42.368]and in the top 1%
[00:06:43.770]in communication analysis.
[00:06:46.005]Her partner for tonight's debate
[00:06:47.340]is Colton White,
[00:06:48.541]a native of Kearney, Nebraska
[00:06:50.109]and a junior at UNL
[00:06:52.111]majoring in English
[00:06:56.015]Colton is the president
of the UNL Debate Team.
[00:06:58.017]Last year he was the eighth
best speaker in the nation
[00:07:01.187]at our national tournament,
[00:07:02.789]and captained the team
[00:07:04.090]to a fifth overall finish.
[00:07:06.225]The topic for tonight's debate is:
[00:07:09.095]Is Regulation of
Social Media Necessary
[00:07:11.597]to Protect Democracy?
[00:07:14.367]The University of Nebraska
[00:07:15.668]will be affirming regulation.
[00:07:18.738]The British National Team
[00:07:20.606]will be opposing regulation.
[00:07:22.909]The format of tonight's
debate is as follows:
[00:07:25.144]we will begin with four
eight minute speeches
[00:07:28.047]alternating between the
University of Nebraska
[00:07:30.316]and the British National Team,
[00:07:32.251]followed by two four-minute rebuttal speeches.
[00:07:35.922]We ask that you
keep your phones on
[00:07:38.391]and keep tweeting at us
[00:07:39.959]at the E. N. Thompson hashtag,
[00:07:41.694]but please turn them
to silent if you do.
[00:07:43.930]Please submit your questions
[00:07:45.865]and please join me
[00:07:47.233]in welcoming our first speaker
[00:07:48.768]from the University of Nebraska, Erin Shehan.
[00:08:04.250]ERIN SHEHAN: Hi everyone.
[00:08:05.251]Colton and I just want
to start this debate
[00:08:06.819]by thanking the Wilson Dialogues
[00:08:08.754]and the E. N. Thompson Forum
[00:08:10.456]for sponsoring the debate
[00:08:11.624]as well as Becky and Richard
[00:08:13.092]for coming all the
way to Nebraska
[00:08:14.560]to join us.
[00:08:15.661]And then everyone
in the audience
[00:08:17.230]for coming tonight
[00:08:18.264]to watch the debate.
[00:08:19.932]We're here today to
discuss the resolution,
[00:08:21.133]regulation of social media
[00:08:23.302]is necessary to protect democracy.
[00:08:25.238]I'll start by establishing
the following definitions
[00:08:27.640]that we'll be using
in the debate.
[00:08:29.809]Regulation is a governmental
[00:08:32.611]rule or directive.
[00:08:34.280]This isn't just about the government, though.
[00:08:36.349]We argue that companies can
establish regulations too.
[00:08:39.719]Social media consists of
websites and applications
[00:08:43.222]that enable users
to create content
[00:08:45.191]and to participate
in social networking.
[00:08:47.960]So sites like
Facebook and Twitter.
[00:08:50.630]And protecting democracy entails
[00:08:52.632]ensuring the process of
free and fair elections.
[00:08:55.401]And the principles of
[00:08:58.204]and open deliberation.
[00:09:00.006]We contend that democracy
is not just a process.
[00:09:03.643]It's a principle
[00:09:05.711]and we support regulation
[00:09:07.380]at this time.
[00:09:08.781]We can't reject regulation
[00:09:11.484]that will stop misinformation
[00:09:13.519]just because it's regulation.
[00:09:15.421]Therefore, we argue
that we should regulate social media
[00:09:19.725]in order to make the
sharing of opinions transparent and open.
[00:09:23.562]We plan to achieve
this in two parts.
[00:09:24.797]First, government regulation
[00:09:26.732]should ensure openness and transparency.
[00:09:29.068]Second, some responsibility
for social media content
[00:09:32.471]should be placed at the feet
of social media companies
[00:09:36.642]First, government regulation
needs to guarantee
[00:09:38.945]openness and transparency
from social media companies.
[00:09:42.081]This means that these companies
[00:09:44.283]need to open up the
books to the government
about how their algorithms,
trending topics,
[00:09:48.321]and ad revenue work.
[00:09:49.455]Social media algorithms
are really confusing.
[00:09:52.425]They act like a magic formula
[00:09:54.360]designed to deliver
the best content first.
[00:09:56.462]We don't know exactly how
these algorithms work,
[00:09:59.966]but we know that they
prioritize revenue over truth.
[00:10:05.037]The algorithms social
media companies use
[00:10:06.973]are designed to prioritize
the user's attention
[00:10:09.742]for as long as possible.
[00:10:11.510]By delivering information
that the user likes.
[00:10:14.313]I like these algorithms
when they give me ads
[00:10:18.050]for garlic bread
recipes and cat videos.
[00:10:21.220]But these algorithms
aren't so great
[00:10:23.255]when they organize us
into reinforced echo chambers
[00:10:27.526]that are primed for fake news.
[00:10:30.229]In the past, fake news was tabloids at the
grocery store checkout line.
[00:10:33.366]Now, social media algorithms
deliver fake articles,
[00:10:37.436]like the Pope
endorsing Donald Trump
[00:10:40.072]directly to those who are prone
to believing the headline.
[00:10:44.710]Nearly 140 million
entirely false articles
[00:10:48.014]were shared on Facebook alone
[00:10:49.815]in the three months leading
up to the 2016 election.
[00:10:52.618]This isn't about partisanship.
[00:10:54.854]These fake articles contain misinformation,
[00:10:59.692]deceiving Americans on both
sides of the political spectrum.
[00:11:04.030]Opening these algorithms
to a third party
[00:11:05.498]that isn't interested in profit
[00:11:07.733]could alter the way that content
[00:11:09.568]and fake news is
delivered to readers.
[00:11:12.138]The same thing is happening
with trending topics
[00:11:14.340]on sites like
Facebook and Twitter.
[00:11:18.411]RICHARD HUNTER: What kind of mechanism
[00:11:20.146]is the government going to use
[00:11:21.680]to enforce these regulations?
[00:11:23.649]ERIN: So we hope that the FEC,
[00:11:25.084]the Federal Election Commission
[00:11:26.419]would have oversight
in the same way
[00:11:28.120]that they have oversight
over things like the
[00:11:29.722]Internet and television,
[00:11:31.757]which is something that I'll
get to later in the speech.
[00:11:34.060]So moving on,
[00:11:35.961]the same thing has happened
with trending topics
[00:11:38.097]on sites like
Facebook and Twitter.
[00:11:39.432]So following the 2016 election,
[00:11:41.100]Facebook admitted themselves
[00:11:42.835]that they had over
270 million accounts
[00:11:45.404]that were fake, more
than they thought.
[00:11:48.240]And on Twitter,
[00:11:49.442]over 48 million
accounts were also fake.
[00:11:51.577]These fake accounts
known as bots
[00:11:53.679]were used during
the 2016 election
[00:11:56.048]to manipulate trending topics.
[00:11:57.750]We need more transparency
[00:11:59.552]about why certain
things are popular.
[00:12:01.287]Trending topics are important.
[00:12:03.389]When information is trending,
[00:12:05.057]it reaches new and larger audiences,
[00:12:06.992]And Google searches for
the topic increase.
[00:12:10.096]If we want social media to
function as a transparent
[00:12:12.665]and open space for the
sharing of opinions,
[00:12:15.267]the topics must reflect
what real people
[00:12:18.070]are actually talking about.
[00:12:24.844]Finally, social media
revenue from advertisements
[00:12:27.079]needs to be more transparent.
[00:12:28.881]Since 2011, Facebook has asked
[00:12:31.450]the Federal Election Commission,
[00:12:32.785]the agency that regulates elections,
[00:12:34.353]for blanket exemptions
[00:12:35.888]for political advertising disclosure rules.
[00:12:37.923]These are the types
of formal disclosures
[00:12:39.859]you see on political ads on TV,
[00:12:41.527]radio or print.
[00:12:43.262]Informing you of
who purchased the ad
[00:12:45.498]and if it's affiliated
with a candidate.
[00:12:47.266]Due to Facebook lobbying,
[00:12:50.503]these rules don't apply online,
[00:12:51.570]allowing social media companies
[00:12:53.272]to bring in a lot of money.
[00:12:55.241]We've since learned that
in the 2016 election,
[00:12:58.110]accounts affiliated with Russia
[00:13:00.012]spent hundreds of
thousands of dollars
[00:13:02.014]on targeted ads
before the election.
[00:13:04.049]The funding sources of these ads
[00:13:06.418]should have been disclosed.
[00:13:08.087]The FEC needs to
accept that Facebook
[00:13:10.789]is more than a
social media site,
[00:13:12.791]it's a media company.
[00:13:15.528]And these ads need
to be transparent.
[00:13:17.530]Next, some responsibility
for social media content
[00:13:21.333]should be placed at the feet
[00:15:23.068]of social media companies.
[00:13:24.904]Both companies and us as users
[00:13:26.739]need to take an active role
[00:13:28.874]to fix the problems
like fake news,
[00:13:30.509]and deceptive ads.
[00:13:31.810]We all need to take action.
[00:13:33.746]If Ted Cruz's social media accounts
have taught us anything,
[00:13:38.551]it's that sometimes the Senate can't
manage their own accounts,
[00:13:40.686]so they probably shouldn't
have full control over ours.
[00:13:44.190]Social media companies have already proven
[00:13:46.926]they can control content.
[00:13:48.994]It's just a matter of
if they do it for good
[00:13:50.896]or for bad.
[00:13:52.364]And we hope they do it for good.
[00:13:54.567]First, social media companies
[00:13:56.435]should identify and
eliminate fake accounts.
[00:13:58.370]Bots on social media can
gain a big following.
[00:14:01.507]During the 2016 election,
[00:14:04.643]a Russian backed Twitter account
[00:14:05.678]claiming to be the Tennessee GOP
[00:14:06.612]gained over 150,000 followers.
[00:14:08.681]During its tenure,
[00:14:10.416]it was retweeted
by Kellyanne Conway
[00:14:12.518]and Donald Trump Jr.
[00:14:13.886]The real Tennessee GOP account
[00:14:16.255]only had 13,000 followers.
[00:14:18.624]Other fake accounts targeted
African American women
[00:14:21.393]telling them they
could vote from home
[00:14:23.662]on election day,
[00:14:25.264]and giving them a completely
fake number to use.
[00:14:27.433]Fake Twitter accounts tweeted
over 1.4 million times
[00:14:32.805]in the span of just
over two months.
[00:14:35.374]Despite these fake accounts
being reported many times,
[00:14:38.210]they weren't removed by Twitter.
[00:14:40.846]Social media can't
be an open space
[00:14:43.983]for the sharing of ideas
[00:14:45.150]if certain vulnerable groups
[00:14:46.685]are targeted with misinformation.
[00:14:49.588]Social media must
direct more resources
[00:14:51.090]to removing fake accounts,
[00:14:53.459]and tagging them as suspicious.
[00:14:59.365]Second, we hope for a transparent system
[00:15:00.466]of flagging news articles
[00:15:01.500]that involves user participation.
[00:15:03.235]We know that fake
news is widespread
[00:15:05.571]and it causes a
lot of confusion.
[00:15:07.606]64% of Americans
[00:15:10.209]say fake news causes
them a lot of confusion
[00:15:13.245]about basic facts about
politics and current events.
[00:15:16.315]That's a huge problem
[00:15:18.050]and social media companies
need to be accountable.
[00:15:20.452]We can all help with this.
[00:15:22.554]We hope for a system in which
[00:15:24.690]social media sites
[00:15:25.691]tag the location of
origin for articles
[00:15:27.192]and allow users
to tag information
[00:15:29.561]as reliable or not.
[00:15:31.363]From there, disputed information
[00:15:32.965]can be tagged as such
[00:15:34.066]until fact checkers
have the opportunity to review them.
[00:15:36.769]Giving users transparent
and clear information
[00:15:39.204]can eliminate some of the issues
relating to fake news.
[00:15:43.442]We are arguing in
favor of the resolution,
[00:15:46.078]regulation of social media
[00:15:47.746]is necessary to protect democracy,
[00:15:49.548]not to regulate free speech,
[00:15:51.550]but instead to
allow social media
[00:15:53.585]to become a more transparent and open space.
[00:15:55.454]Social media has given everyone
the power to act
[00:16:00.025]as a journalist,
[00:16:01.026]and an advocate,
[00:16:01.960]and with this power comes a
new set of responsibilities.
[00:16:05.497]Regulation allows just this,
[00:16:10.035]a free and equitable exchange of ideas.
[00:16:12.705]Therefore, I urge you all
to support the affirmative.
REBECCA HAWORTH: Before I start my speech,
[00:16:44.536]I'd just like to thank everybody
who's made this possible.
[00:16:46.972]We're thousands of
miles away from home
[00:16:48.107]and we feel like we have
[00:16:49.508]so many adopted friends
[00:16:50.542]from across the pond.
[00:16:52.177]So thank you
[00:16:53.178]honestly so much for
having us here tonight.
[00:16:55.280]So, social media provides
a really unique platform
[00:16:58.984]and there's a couple
of benefits of this.
[00:17:01.587]The first benefit of
this is entertainment.
[00:17:04.522]So if anybody's ever
seen the YouTube video
[00:17:06.258]"Charlie Bit My Finger"
[00:17:07.992]if you haven't seen it,
[00:17:09.228]I'd highly recommend.
[00:17:10.496]The second benefit is
in terms of news, right.
[00:17:11.829]So a lot of people
log on to Facebook
[00:17:14.800]and they get a load of diverse
[00:17:16.635]sources of information
[00:17:18.103]and different viewpoints.
[00:17:19.171]And we hear this
algorithm touched on
[00:17:20.571]by the other side.
[00:17:21.906]We think actually the way
that your news feed works
[00:17:24.076]is that you see
content from people
[00:17:25.944]that you're friends with,
[00:17:27.212]and note that typically we
can be friends with people
[00:17:30.082]who have different views
than we have, right.
[00:17:31.417]Or say if I really love
garlic bread recipes, right,
[00:17:34.520]I can up vote
garlic bread recipes
[00:17:36.755]and still see news articles
[00:17:38.690]from different perspectives,
[00:17:40.092]things that I never
thought about before.
[00:17:41.693]We think social
media has revolutionized
[00:17:43.462]the free and open
exchange of ideas.
[00:17:45.631]Think if you woke up in
Virginia this morning,
[00:17:48.333]you'd have a reminder
saying have you voted yet
[00:17:49.968]in the election?
[00:17:51.703]You know we hear these
arguments about how social media
[00:17:55.073]has been really disempowering
[00:17:56.074]for minority groups,
[00:17:57.309]but actually we think it's
a huge tool of empowerment
[00:17:58.911]to encourage people
to engage politically.
[00:18:00.779]We think that's really,
[00:18:03.182]COLTON WHITE: Question.
[00:18:04.216]REBECCA: Two points from us,
[00:18:05.184]I'll take you later.
[00:18:06.151]Two points from us
in side opposition today.
[00:18:07.152]Firstly, how social media,
[00:18:08.153]when it is unregulated,
[00:18:10.289]is a necessary tool
[00:18:11.623]to empower people
[00:18:12.758]in a democracy.
[00:18:13.759]And secondly, how it ensures
[00:18:16.094]a healthy debate in a democracy.
[00:18:18.263]But first of all,
[00:18:19.298]some responses to
what we've heard
[00:18:20.499]from the other side.
[00:18:23.202]note we had this idea that algorithms
[00:18:28.006]prioritize revenue over truth.
[00:18:29.708]So, point one,
[00:18:31.577]try and push past the rhetoric
behind this point, right,
[00:18:34.213]and see algorithms are complex.
[00:18:36.114]They work as explained,
[00:18:37.616]and if this team
is trying to solve
[00:18:39.685]echo chambers as they exist,
[00:18:41.153]they have a lot more work to do.
[00:18:42.521]Note social media doesn't create echo chambers,
[00:18:45.123]and insofar as we'd
like to speak to people
whose views agree with our own,
[00:18:48.560]we do that in real life anyway.
[00:18:50.529]But social media
allows us to connect,
[00:18:52.531]to have issues going viral that
we've never heard of before
[00:18:54.933]and actually encourages
[00:18:56.401]more exchange of ideas.
[00:18:57.569]Then they say that the companies
[00:18:59.271]should be establishing regulations,
[00:19:01.440]but we think we've seen
an organic development
[00:19:04.676]to push out some of the problems
that they
talk about, right,
[00:19:07.513]which reflects the
development of social media
[00:19:08.881]as a platform
[00:19:10.249]in its burgeoning maturity.
[00:19:11.950]So with the trending news icon on
the top right of your screen,
[00:19:14.419]they've now started only having
[00:19:16.221]articles trend that are
from reputable news sources
[00:19:19.258]moving away from this
idea of fake news.
[00:19:21.293]Note that fake news has basically
[00:19:23.896]become a catchphrase
[00:19:25.564]like a superhero catchphrase
[00:19:27.199]that we say, right.
[00:19:28.166]Actually, fake news
[00:19:29.601]can be used as a tool
[00:19:31.637]by people who are in power
[00:19:36.074]to dismiss stories that challenge issues
[00:19:37.976]that the administration
isn't comfortable with.
[00:19:40.579]We think it's important
that we're critical
[00:19:42.247]about this concept
[00:19:43.448]and that we challenge
[00:19:44.750]this where it appears.
[00:19:46.184]We get the example
of saying, you know,
[00:19:47.719]previously you would have
fake news in a grocery store,
[00:19:50.355]but now you have it online.
[00:19:51.857]We say actually
having it online is better
[00:19:53.425]because you can
instantaneously call it out.
[00:19:55.294]Whereas in conventional media,
[00:19:57.229]you'd have an apology
[00:19:58.297]on page 79 of next week's edition.
[00:20:01.934]Then we get this idea that
[00:20:03.702]transparency is super important,
[00:20:07.406]and this claim that social media
tipped the balance
[00:20:09.107]of the election recently.
[00:20:10.175]We say A, there is
no empirical evidence
[00:20:11.877]that it changed the
outcome of the election.
[00:20:13.745]There are a number of explanations
[00:20:15.547]as to why certain
people will have voted
[00:20:17.316]for different candidates.
[00:20:18.917]That issue hasn't been resolved.
[00:20:20.752]Note that also we have more
[00:20:22.521]optimism about people's
ability to critically discern,
[00:20:24.756]note there are
loads of programs
[00:20:26.858]in high schools and
colleges across the country
[00:20:28.994]teaching people how to see
[00:20:30.929]what's fake news and what's not.
[00:20:30.929]And you have more independent
fact checking agencies
[00:20:33.966]that are able to call that out
[00:20:35.968]on the system.
[00:20:37.069]We think that fake news
can sometimes be used
[00:20:38.937]as a tool for distraction
[00:20:40.238]and it's really important
that that doesn't drive
[00:20:41.840]these policy decisions.
[00:20:43.575]Point one, why
[00:20:45.277]unregulated social media
[00:20:47.079]is so important
to empower people.
[00:20:48.380]We think that democracy is about
[00:20:49.948]expressing your ideas.
[00:20:50.983]And groups expressing their interests
[00:20:53.051]and their priorities,
[00:20:54.019]but we can only
express our opinions
[00:20:55.687]on the infrastructure
[00:20:56.855]that we're provided.
[00:20:57.923]When you don't
have social media,
[00:20:59.691]when you don't have
a free social media,
[00:21:01.627]people have to take
their voices elsewhere.
[00:21:03.595]So when you have the
Black Lives Matter group
[00:21:06.231]who want to raise awareness
of police brutality,
[00:21:12.938]social media, when it's not regulated,
[00:21:14.072]is a powerful way
that they control
[00:21:15.841]their own story
[00:21:17.109]and tell of their own troubles
[00:21:18.577]with their own voices
[00:21:19.611]so they can share a video,
[00:21:20.946]so it can go viral,
[00:21:22.080]so they can galvanize support
[00:21:23.548]for their political issue.
[00:21:26.184]REBECCA: I'll take you at
the end of this point.
[00:21:27.252]And without social media,
[00:21:29.688]you get called out
for being unpatriotic,
[00:21:32.858]for, you know, challenging
[00:21:33.925]the protect and serve mentality
[00:21:35.227]of the police force.
[00:21:36.528]We think that
[00:21:37.562]you need a critical
mass for these groups
[00:21:40.198]to gain progress
[00:21:41.199]and it's uniquely
visceral on social media
[00:21:43.568]when you're able to
share these videos.
[00:21:45.437]We think that regulation is
[00:21:47.406]always going to be political,
[00:21:49.708]as I'll explain in
my second point.
[00:21:51.043]And when you don't have
regulation on social media,
[00:21:53.378]you give these people a voice.
[00:21:54.646]You let them communicate
directly to other citizens
[00:21:57.315]where conventional media
[00:21:58.850]will shut them out.
[00:22:01.086]It's really important
in a democracy
[00:22:03.355]that we have a balance
of different ideas.
[00:22:05.390]So even if it's not an
opinion that you agree with,
[00:22:07.859]it's important to have it heard.
[00:22:09.961]And social media
being free and open
[00:22:12.397]for everybody to use it,
[00:22:13.832]and for nobody to
have a rule book
[00:22:15.367]and say what's allowed
[00:22:16.468]and what's not allowed,
[00:22:17.469]is a really important way
for these people
[00:22:19.037]to get their points
of view across.
[00:22:20.138]That matters A, for the
people who are in these groups
[00:22:24.009]who want to get their
points of view across,
[00:22:25.644]but B, for the average citizen
[00:22:27.145]who needs to inform themselves
[00:22:29.047]about what matters in society,
[00:22:30.916]and about the dignity
[00:22:32.250]and the democracy
of their society.
[00:22:34.419]I'll take your question, yeah.
[00:22:35.787]COLTON: Yes, so
[00:22:36.822]how are groups like
Black Lives Matter
[00:22:38.757]supposed to benefit
from social media
[00:22:42.461]when fake news
articles about them
[00:22:43.462]are being spread
[00:22:44.830]and lies being perpetuated?
[00:22:45.831]REBECCA: So people are capable
of critical thought.
[00:22:49.101]As I said in my rebuttal,
[00:22:51.169]there are a number of programs
[00:22:52.437]helping people discern
what is fake news
[00:22:54.005]and what's not.
[00:22:55.407]There's also multiple fact checkers
that you have on social media
[00:22:59.311]that can call out
[00:23:01.012]when something is fake.
[00:23:03.181]Note the comparison
is having to go to conventional media,
[00:23:06.518]with its typically
[00:23:07.786]racist reporting of crimes,
[00:23:09.254]when you have a crime
[00:23:10.856]with somebody who's white,
[00:23:12.057]or somebody who's
a person of color, reported
on in different ways.
[00:23:14.860]And we think social media
[00:23:16.094]is really important for them
[00:23:17.429]to get their views across.
[00:23:18.430]Secondly, I'm gonna talk
to you about regulation.
[00:23:21.466]It's very, very difficult
[00:23:22.934]to pick any issue
which is not political
[00:23:27.038]i.e. it's impacting people's lives.
[00:23:28.774]And we think the criteria
[00:23:29.775]for what's described
as fake news
[00:23:32.711]is necessarily going to vary
[00:23:34.312]depending on the priorities
of the current administration.
[00:23:37.015]Note the tendency to
term something fake news
[00:23:39.951]if it's an inconvenience
[00:23:41.720]to the current administration.
[00:23:43.922]We think that
[00:23:44.890]not having regulation
is really important
[00:23:48.260]to ensure this free
[00:23:49.494]exchange of ideas.
[00:23:51.730]It's an important
mechanism of accountability
[00:23:54.432]to make sure that the
government is listening
[00:23:57.736]even when those views are inconvenient.
[00:24:00.505]So the example
that we would draw to
[00:24:02.274]is recently Google
[00:24:03.475]tried out some changes
[00:24:05.777]as to how their
search engines work.
[00:24:07.245]And they introduced a
change in the algorithm
[00:24:10.215]that was due to remove
[00:24:11.316]anything that was offensive
[00:24:14.452]or blatantly misleading.
[00:24:15.954]And studies have shown that
[00:24:17.189]the conversion rate to websites
[00:24:18.824]that contained critique
[00:24:20.292]of wars that America
was engaged in abroad
[00:24:23.528]had fallen by 70%.
[00:24:26.131]We think that these opinions,
[00:24:28.200]whether you agree
with them or not,
[00:24:30.035]are a fundamental part
[00:24:31.603]of a healthy democracy
[00:24:33.238]and that governments will
always have an incentive
[00:24:35.407]to set up a system that suppresses
[00:24:38.143]elements of policy that
are inconvenient to them
[00:24:40.579]and their appalling wars,
[00:24:42.280]and they're less likely to
get their senior politicians held to account.
[00:24:45.517]And we think that these dissenting voices
[00:24:47.619]uniquely come out
in social media.
[00:24:49.888]And so regulation
would stifle them.
[00:24:52.324]This is even more important
[00:24:53.859]when you think about countries
that claim to be democracies
[00:24:55.794]but actually actively infringe
[00:24:58.296]on press freedoms.
[00:24:59.631]So places like Turkey and Russia
[00:25:01.266]that claim to be democracies
love to use this as a tool
[00:25:04.636]to push down dissenting voices
[00:25:06.304]and stop people
[00:25:08.573]claiming what they need
[00:25:11.042]in terms of political change.
[00:25:12.344]Look, this is really
important for groups
[00:25:13.912]who experience structural
violence from the government
[00:25:15.380]and this is so important
for the average voter.
[00:25:17.249]We think regulation cannot work
[00:25:18.950]in the democracy
that we have today.
[00:25:20.785]Very proud to oppose.
So, before I begin,
[00:25:47.612]I would like to
extend the thanks
[00:25:49.948]that we've seen so far.
[00:25:51.182]First to the institutions
[00:25:52.484]that made this possible,
[00:25:53.752]such as the E. N.
[00:25:55.787]the Wilson Dialogue,
[00:25:57.255]the UNL Department of
[00:25:59.758]and the Lied Center
in which we are now.
[00:26:02.594]I would also like to
extend the thanks again
[00:26:05.063]to Becky and Richard for
coming all this way
[00:26:07.399]to debate us,
[00:26:08.767]and again to all of you
[00:26:10.402]for being here
[00:26:11.736]to hear this debate.
[00:26:13.104]So, to begin,
[00:26:15.340]the American founding father,
[00:26:17.509]and more recently, Broadway star,
Alexander Hamilton once said,
[00:26:22.681]"Unless your government is respectable,
"foreigners will invade your rights;
[00:26:27.752]"even to observe neutrality,
[00:26:29.921]"you must have a strong government."
[00:26:32.157]Just as Hamilton recognized
the role of government
in preserving democracy,
[00:26:36.261]we support regulating
social media
[00:26:39.497]to preserve democracy
[00:26:40.765]and preserve truth.
[00:26:43.001]I would like to highlight
three main points
[00:26:45.704]that have emerged from
the debate so far.
[00:26:48.340]The first of which:
[00:26:50.308]there is a clear difference
between censorship and the regulations
that Erin and I propose,
[00:26:58.550]which are designed to
increase transparency.
[00:27:00.852]It's not about regulating
what people can say.
[00:27:05.123]It's about regulating
what companies can hide
[00:27:08.426]and what different actors,
[00:27:10.528]like the Kremlin,
[00:27:12.030]are able to hide from
the American people.
[00:27:15.300]We can see this in how American
broadcasting laws work.
[00:27:19.337]So for example,
[00:27:21.039]if there are two
[00:27:22.941]going against each other in
the American political system,
[00:27:25.477]and they take out
advertisements on radio,
[00:27:29.180]or they go on to
debate on radio,
[00:27:31.750]under US broadcasting law,
[00:27:34.119]they need to have
equal access to time.
[00:27:37.055]We can see how this
is really important
[00:27:39.190]because if one side
gets 10 minutes,
[00:27:41.192]and the other side
only gets one,
[00:27:43.161]that's not a fair dialogue.
[00:27:44.496]Erin and I would like
to take this opportunity
[00:27:47.265]to buy up all of our
opponents' speech time.
[00:27:54.906]Under current law,
[00:27:56.408]we need to disclose
[00:27:58.710]who paid for certain advertisements.
[00:28:00.678]So for example, if someone
pays for a TV ad,
[00:28:07.519]we should be able to know.
[00:28:09.554]That doesn't mean that
that ad shouldn't run,
[00:28:12.257]but just that we
should have an open
[00:28:14.459]deliberation and discussion
[00:28:15.994]about whether or
not that's credible,
[00:28:18.129]or whether or not
we can trust it.
[00:28:20.365]For example, I hear the Brits
are being funded by Facebook
[00:28:23.935]for this event.
[00:28:26.738]Basically, we just
need to update our laws
[00:28:29.874]for democracy in
the 21st century.
[00:28:31.376]Our opponents may start
to frame this debate as if
[00:28:34.112]tyranny is just
around the corner
[00:28:36.681]if we let
[00:28:38.049]administrations get control
[00:28:39.784]of how we
[00:28:41.252]handle our democracy.
[00:28:42.620]But that's not
what we're saying.
[00:28:44.656]We're saying that
tyranny is more likely
[00:28:47.125]around the corner
[00:28:48.326]when we don't have the
information that's necessary.
[00:28:52.597]COLTON: I will take your question.
So can you respond to the fact
[00:28:54.599]that you claim
[00:28:55.700]that we should
regulate social media
[00:28:57.368]like conventional media,
[00:28:58.670]but in conventional media,
[00:29:00.004]the broadcaster would choose
which programs are run,
[00:29:02.107]whereas social media
has to be different
[00:29:04.809]because users from the ground up
[00:29:07.112]will determine what
content is projected.
[00:29:09.380]COLTON: You seem to
[00:29:10.548]have an idea that
social media companies
are completely neutral platforms.
[00:29:14.986]But what we're saying is that
[00:29:16.654]they aren't necessarily so.
[00:29:18.123]For example, we don't know
how the algorithms work
[00:29:21.559]in so far as
[00:29:23.128]what material we see.
[00:29:25.296]These are supposedly neutral platforms.
[00:29:27.031]That's what Facebook, Twitter
[00:29:28.633]and other social media
companies are saying,
[00:29:30.034]but we don't know if
that's entirely true.
[00:29:34.439]Our opponents are saying that
[00:29:36.374]there can be political bias
in the criteria for fake news,
[00:29:39.277]and that these
criteria can be bad.
[00:29:40.745]We agree, which is
why we're saying that
[00:29:43.114]we should make things
more transparent instead.
[00:29:45.750]So moving on to
our second point,
[00:29:48.786]regulating social media
[00:29:51.189]can help move us toward
[00:29:53.291]a more truthful society.
[00:29:55.693]The lifeblood of democracy
[00:29:58.263]is free and open information.
[00:30:00.765]In today's social media,
[00:30:03.635]truth is being
replaced with lies.
[00:30:06.070]This isn't a left
or a right issue
[00:30:09.774]in terms of our politics.
[00:30:11.676]It's a truth or falsity issue.
[00:30:14.412]And that is how you should
think about the debate today.
[00:30:16.681]According to
[00:30:19.817]multiple media outlets,
[00:30:21.853]over 150 million Americans
[00:30:24.956]were reached by Russian-linked content
[00:30:27.292]during this past election.
[00:30:29.661]That is more
[00:30:31.062]than the total number of people
[00:30:33.364]who voted, period.
[00:30:36.701]Now we don't know for sure
[00:30:37.902]how many people were persuaded
[00:30:39.571]by one message or another,
[00:30:41.139]but we know that this is
influencing our elections.
[00:30:45.143]So, our opponents so
far in this debate
[00:30:47.946]have come up with
the notion that social media,
[00:30:51.316]like Facebook and Twitter,
[00:30:52.850]is a great
place to voice political
discussions and debates.
[00:30:56.421]What is your secret?
[00:30:59.357]I have not had a single
good political discussion on Facebook.
[00:31:06.497]In fact, most of the time,
[00:31:07.599]it's just someone saying
[00:31:08.900]what is this curly headed
dude yelling about?
[00:31:10.969]What is his problem?
[00:31:12.770]That's the kind of discussion
[00:31:14.739]that happens on Facebook.
They may say that,
[00:31:19.043]sure, we may associate
[00:31:20.445]with people from our side
of the political spectrum,
but the way social media
currently operates is:
[00:31:25.450]if you like a page
[00:31:26.951]that spreads fake news
[00:31:28.419]or supports one side
of a political issue,
[00:31:30.622]you're more likely to see
pages and posts
[00:31:33.491]that support that
viewpoint as well.
[00:31:36.027]That's the key difference between
[00:31:37.929]debating within social media
[00:31:40.398]and debating within real life,
[00:31:42.400]where you have to engage people.
[00:31:44.035]And it's not like suddenly
you talk to someone
[00:31:46.437]with a political viewpoint,
[00:31:47.505]and 10 more of those people
[00:31:49.173]suddenly exist out of thin air.
[00:31:51.242]But that's what
social media does.
[00:31:52.810]And they bring up the example
of Black Lives Matter.
[00:31:55.013]However, we can see
how these organizations
[00:31:57.081]can be just as harmed
by social media,
[00:32:00.618]or any left or right
organization,
[00:32:03.288]when we can see fake
news articles
[00:32:05.990]about those organizations.
[00:32:08.059]How is someone supposed to be
open to a new political idea
[00:32:11.596]when there are
fake news articles
[00:32:13.364]being shared about them
[00:32:14.499]that say that
[00:32:15.533]this group promotes violence
[00:32:16.567]when it did not?
[00:32:17.835]Or, any other number of
fabrications that can occur.
[00:32:21.072]We would like to show how
regulating social media
[00:32:23.975]is the only way to move toward
a more truthful society,
[00:32:26.878]and therefore a more
democratic society.
[00:32:29.847]Lastly, we would like
to point out that
[00:32:33.184]in the status quo,
[00:32:34.619]social media companies can
willfully control our democracy.
[00:32:38.356]As Erin explained
in the first speech,
[00:32:41.025]regulation should open the books
[00:32:43.461]of social media companies.
[00:32:45.229]Social media companies
are highly concentrated.
[00:32:47.865]Only a few key players,
[00:32:49.801]such as Google and Facebook,
[00:32:52.770]hold all of the cards.
[00:32:54.472]They are not the only
stakeholders
[00:32:56.607]in our society.
[00:32:58.810]Our opponents talk about sources,
[00:33:03.014]saying we need to be able to
[00:33:04.615]discuss whether or not
they are credible,
[00:33:06.584]or that people are
always able to determine
[00:33:08.753]whether or not a source is
credible.
[00:33:11.556]While we can certainly
say that people have
[00:33:15.660]critical thinking skills,
[00:33:16.961]and that we should
encourage them,
[00:33:18.429]this is not what
social media does.
[00:33:21.065]As we pointed out
in the last speech,
[00:33:22.633]we are more likely
[00:33:24.035]to be
[00:33:25.136]confused about what
sources are saying.
[00:33:28.239]That's what Erin proposed
in our first speech.
[00:33:30.475]Social media actually increases
the confusion over sources.
[00:33:33.644]It muddies the
waters of the debate.
[00:33:35.847]It doesn't clarify them,
[00:33:37.181]which is why we need to have
[00:33:39.617]social media companies
[00:33:40.952]take their role seriously
[00:33:43.554]in our society,
[00:33:45.223]and take down fake
accounts that are saying
[00:33:49.427]Donald Trump was
endorsed by the Pope,
[00:33:53.331]Hillary Clinton has a kill list,
[00:33:56.367]or any other number of
fabrications
[00:33:59.704]that we have from this last
election.
[00:34:03.541]So to close,
[00:34:05.109]as Thomas Jefferson,
[00:34:06.878]another Broadway star, said,
[00:34:10.014]"The end of democracy,
[00:34:11.516]"and the defeat of the
"American Revolution will occur
"when government
falls into the hands
[00:34:17.755]"of moneyed corporations."
[00:34:20.257]While Jefferson probably
didn't say that on Twitter,
[00:34:23.194]we can see that
[00:34:25.362]unchecked social media
[00:34:27.165]can be a threat to democracy
[00:34:29.434]in the United States
[00:34:33.271]In order to move towards
a more truthful society,
[00:34:35.739]we urge you to support
regulating social media.
[00:35:00.264]RICHARD HUNTER: So finally,
as the fourth speaker,
[00:35:02.033]I would like to echo
all of the thanks
[00:35:03.901]that have been given so far.
[00:35:05.136]I know it's been a long way for
[00:35:06.337]me and Becky as we've come here,
[00:35:08.372]but we have definitely felt like
welcome guests.
[00:35:10.374]Thank you very much.
[00:35:13.144]I will not as a British person
[00:35:14.378]with a funny accent,
[00:35:15.680]be quoting any founding
fathers in this speech.
[00:35:21.118]Sorry about that,
[00:35:22.720]but as a history student,
[00:35:24.055]I am aware of certain
anniversaries.
[00:35:26.290]And I can't
escape the knowledge
[00:35:28.926]that five hundred years
[00:35:30.761]and one week ago,
Martin Luther
nailed his 95 theses
[00:35:34.499]to a church door in Wittenberg,
[00:35:35.967]starting what was known as
the European Reformation,
[00:35:39.270]which changed the face
of Europe forever.
[00:35:41.873]He did that,
[00:35:42.907]and his teachings
reached a wide audience
[00:35:45.076]because he used
the printing press.
[00:35:47.512]He was able to create
pamphlets and documents
and people read them
[00:35:52.817]who hadn't been exposed
to this material before.
[00:35:55.219]And it all happened because
[00:35:56.888]of this new and
revolutionary technology.
[00:35:59.290]We've seen this recurring
pattern throughout history.
[00:36:01.926]Any time there is an
innovation in the way
[00:36:06.364]that people consume information,
[00:36:07.798]people use it
[00:36:09.066]for political purposes.
[00:36:11.068]It happened during
the 1970's in Iran
[00:36:13.371]where people were
smuggling in cassette tapes
[00:36:15.406]of addresses by
the Ayatollah Khomeini,
[00:36:17.742]which led to the
overthrow of the Shah.
[00:36:19.777]It happened in the late
1980's in Eastern Europe
[00:36:22.847]when people were
smuggling fax machines
into the country
[00:36:26.951]so that they could
spread uncensored information.
[00:36:30.955]And it happened after the
revolution of social media
[00:36:34.392]when we saw during
the Arab spring,
[00:36:36.093]Twitter and Facebook and YouTube
[00:36:38.429]used to mobilize hundreds
of thousands of people
[00:36:41.699]onto the streets.
[00:36:43.401]We think the
newest communication technologies
[00:36:45.202]are always used by those
who have subversive ideas
[00:36:48.973]that are not recognized
[00:36:50.174]as important or
valuable by the state
[00:36:52.577]for social and political change.
[00:36:55.279]Think about
[00:36:56.380]the ability to limit and
control those technologies,
[00:36:59.450]about how people use them,
the way in which they use them,
[00:37:02.787]and the things that they
say when using them.
[00:37:05.423]The ability to do that
[00:37:06.924]has always been used
[00:37:08.259]by those who hold power
[00:37:10.494]to limit free speech
[00:37:11.762]and to maintain their hold
[00:37:13.998]on political and social power.
[00:37:15.900]We believe on side opposition
[00:37:17.568]that the state should never
have the ability to do that,
[00:37:20.171]that regulation of social media
[00:37:22.306]should be something that is left
[00:37:24.175]to the consumers of social media
[00:37:25.209]and the companies that run them.
[00:37:27.211]And we think that will always be
[00:37:28.713]far more healthy and
far more protective
[00:37:31.282]of the democracy that we
hold to be so valuable.
[00:37:34.452]So some points of rebuilding
of what Becky said,
[00:37:36.621]some rebuttal to their speech,
[00:37:38.422]and then a little constructive
material of my own.
[00:37:40.157]So firstly note
[00:37:41.425]that there was very
little actual rebuttal
[00:37:43.594]given to Becky's
[00:37:44.795]material about how social media
facilitates political engagement.
[00:37:49.400]All we got out of the last speech
[00:37:50.501]was simply the assertion
[00:37:52.203]that there are never any good
political discussions on Facebook.
[00:37:55.740]I might suggest to
the previous speaker
[00:37:57.241]that maybe he needs to
talk with better people
[00:37:59.577]on social media.
[00:38:01.646]He maybe needs to seek out spaces
[00:38:03.180]in which those
[00:38:04.715]conversations happen.
[00:38:06.751]But I think also importantly,
[00:38:08.419]and this came up
[00:38:09.520]in Becky's speech as well,
[00:38:10.788]but we see how autocracies
[00:38:12.523]have this tendency to
limit freedom of speech,
[00:38:15.159]countries like Turkey,
[00:38:16.527]countries like Russia.
[00:38:17.795]They claim to be democracies
[00:38:19.163]and to have free
and fair elections,
[00:38:20.564]but they use their
control of information
[00:38:23.034]to say that
[00:38:24.902]certain things are fake,
[00:38:26.771]certain things didn't happen,
[00:38:28.606]or are not real
[00:38:30.074]and control the narrative.
[00:38:31.409]We see that in China
[00:38:32.410]where simply posting
on social media
[00:38:34.011]anything to do with
certain forbidden topics
[00:38:36.814]will get you a visit
from the police
for some kind of crime
[00:38:40.985]against the state.
[00:38:41.986]Now,
[00:38:43.087]we don't think that this will
happen tomorrow in America
[00:38:45.556]if this regulation passes.
[00:38:47.558]But we think that
there is a significant harm
[00:38:49.293]and that the state is always
likely to take more control
[00:38:52.263]than is necessary--
[00:38:54.699]RICHARD: No thank you.
[00:38:55.700]They also say that regulation
[00:38:56.701]is not censorship,
[00:38:58.436]that they are not
[00:38:59.770]stopping people
from saying things,
[00:39:01.772]just preventing people
from hiding things away, right.
[00:39:04.408]We think that
[00:39:06.010]when you do that,
[00:39:06.977]you are regulating
what people say.
[00:39:08.512]As soon as these
regulations exist,
[00:39:10.948]someone in the government
[00:39:13.250]is going to be trusted
to draw the line
[00:39:14.418]as to where fake news ends
[00:39:16.287]and where political
speech begins, right.
[00:39:18.656]What we are saying is
that we do not trust
the government,
[00:39:22.059]not just to draw that line,
[00:39:23.260]but we don't trust them not
[00:39:24.562]to move that line
[00:39:25.529]in a way that will suit them.
[00:39:27.531]We also hear this
comparison to television
[00:39:30.267]and radio, right.
[00:39:31.602]We importantly bring
you the rebuttal
[00:39:33.571]that social media
is not like broadcast media.
[00:39:35.606]It is not just
the New York Times
[00:39:38.008]that defines what is on
your Facebook newsfeed.
[00:39:39.643]It is what you say
[00:39:41.479]and your friends say.
[00:39:42.446]It's the comments on articles.
[00:39:43.781]It's the debate that goes on
within those comment sections.
[00:39:46.650]We think
[00:39:48.219]it is a
[00:39:49.353]well known principle
[00:39:50.354]that you cannot regulate thought.
[00:39:52.990]You cannot regulate
the conclusions
[00:39:54.592]that people come to
[00:39:55.593]based on information.
[00:39:56.594]We don't think it
would be legitimate
[00:39:57.795]for the state to do that.
[00:39:59.029]And we think that when the state
[00:40:00.131]is going to find
it very difficult
[00:40:01.398]to distinguish between
[00:40:02.733]what is media and what
is people's thoughts,
[00:40:04.902]they shouldn't regulate.
[00:40:06.303]They then say that we're going
to have user participation
[00:40:09.707]and that's going to make
this system very effective.
[00:40:11.542]We think that
[00:40:12.877]either you will have a system
[00:40:15.679]where very few users flag things
[00:40:17.181]as fake news,
[00:40:18.516]in which case
[00:40:19.550]it won't be widespread
enough to be effective,
[00:40:21.452]because not enough articles
will be flagged by users,
[00:40:23.687]and you won't catch everything.
[00:40:26.757]Or you'll have
[00:40:28.092]everyone acting as
their own censor,
[00:40:29.493]and we think that that would be
[00:40:30.928]terrible because users are
unreliable.
[00:40:32.496]We think that there are people
[00:40:34.965]who will hear Alex Jones
[00:40:36.634]telling them that they're
poisoning the water,
[00:40:38.536]that the globalist agenda
is working
[00:40:40.871]to further something
that is a threat to them,
[00:40:42.640]and then conveniently sell
them some water filters
[00:40:43.974]and brain pills on the side.
[00:40:45.609]We think that some
people will think
[00:40:47.211]that he is reliable
[00:40:48.479]and will flag anything
that contradicts him
[00:40:51.315]even though it is true.
[00:40:52.783]We think you will get a mess
[00:40:54.084]and you will not
solve the problem.
[00:40:57.321]We think that
[00:40:58.656]the solution to this problem
[00:41:00.191]is found in the actions of
technology companies themselves,
[00:41:05.062]to coin a phrase,
[00:41:06.297]what is Facebook but its people?
And without users,
[00:41:10.968]Facebook is nothing.
[00:41:12.670]We think that Facebook then
[00:41:13.971]finds that it has an
incentive to respond rapidly
[00:41:16.507]because any bad PR,
[00:41:18.042]any accusation that it is peddling
fake news
[00:41:21.245]could impact its stock price.
[00:41:23.380]And that's all Facebook
really cares about.
[00:41:25.716]We think that when companies
get these bad reactions,
[00:41:27.284]they have to react
[00:41:30.254]See how quickly Reddit ended up
removing certain communities
[00:41:35.326]when it became clear
[00:41:36.727]that there was a
massive media storm
[00:41:38.295]about them doing so.
[00:41:39.563]We also think companies
[00:41:41.065]can respond flexibly
[00:41:42.333]and the companies
have the ability
[00:41:43.901]to rapidly hire people
[00:41:45.236]of a wide variety of backgrounds
[00:41:46.804]to help them do this.
[00:41:47.938]Note how Facebook
partners with already existing political
[00:41:50.908]fact checking organizations.
Governments take longer to do things
[00:41:55.446]and have more layers
[00:41:56.947]and more strange incentives
[00:41:58.916]to respond to this.
[00:42:00.484]We also think that
[00:42:01.785]the moment
[00:42:02.786]the government gets involved,
[00:42:03.821]the state has asymmetric power.
[00:42:05.122]The state has the
ability to change
[00:42:06.724]the laws as
quickly as it wants,
[00:42:09.026]to reinterpret laws
[00:42:10.294]because it has the power of
the law and order system.
[00:42:12.630]It has the power even if
it doesn't get involved
[00:42:14.765]through the justice system,
[00:42:16.200]it has the power
[00:42:17.234]to change how much tax
these companies pay,
[00:42:20.037]to threaten them with
all sorts of things.
[00:42:23.274]We think that once
they have that power,
[00:42:25.175]and once they get involved--
[00:42:28.479]they will abuse that power
[00:42:29.446]and do it.
[00:42:30.447]Yes, I will take your question.
[00:42:31.448]COLTON: So, why aren't we
living in a dictatorship
[00:42:33.784]after we passed
broadcast regulations?
[00:42:36.453]RICHARD: So as we've said,
[00:42:40.658]television and radio
are not similar to social media
[00:42:42.660]in that setting up a massive
television broadcasting aerial
[00:42:46.730]is very difficult
[00:42:48.832]and opening a Facebook
account is very easy.
[00:42:50.167]We think that
[00:42:52.069]the government therefore
regulates the electromagnetic
[00:42:53.938]spectrum on what
you can broadcast,
[00:42:55.739]but it cannot regulate
the Internet in the same way,
[00:42:58.342]because there is no
difference
[00:42:59.743]between what the Washington
Post posts on the Internet,
[00:43:01.245]and what you post
on the Internet.
[00:43:04.415]We think
[00:43:06.083]that's the difference there
[00:43:07.551]and that's why
[00:43:08.485]television and radio weren't
necessarily as toxic.
[00:43:10.521]But as we also said,
[00:43:11.922]the stuff about echo chambers
[00:43:13.324]is hardly new.
[00:43:14.425]We think that
[00:43:16.460]you have always been likely
to have the political opinions
[00:43:19.096]of the people in your community,
[00:43:20.931]of the people of your age group,
[00:43:22.599]of the people of your job.
[00:43:24.001]We think that
[00:43:25.469]this problem did not
come out of thin air.
[00:43:27.805]In closing, I would
simply say that
[00:43:30.507]in the United Kingdom,
[00:43:32.009]we do not have a
First Amendment.
[00:43:34.144]We do not have a constitution
[00:43:36.080]that we can rely on
[00:43:37.147]that says we are guaranteed
the freedom of speech,
[00:43:39.016]the freedom of religion,
[00:43:40.050]the freedom of assembly,
[00:43:41.452]and the freedom to petition.
[00:43:44.154]And that means that we
cannot challenge things
[00:43:46.256]that we believe to
be wrong
[00:43:47.958]as easily as people
in the United States.
[00:43:50.527]We do not.
[00:43:51.562]The government has
far more power over us
citizens than we have over it.
[00:43:54.732]And this means that
the government
[00:43:58.102]has the ability to surveil us
[00:43:59.970]and has much more surveillance
power, under legislation
[00:44:01.405]that was passed
[00:44:02.940]and was completely legal
[00:44:04.541]and was waved through
without much protest.
[00:44:06.910]You in the United
States have that.
[00:44:08.679]I believe that you
should treasure it.
[00:44:11.515]I believe that
you should not let
[00:44:12.916]some heavy-handed regulation,
passed as an
[00:44:16.820]immediate response to
one very small problem,
[00:44:18.655]take that freedom
of speech away.
[00:44:20.924]I would urge you to
be on side negative.
This debate today has been
full of a lot of great ideas
[00:44:44.615]and interesting examples.
[00:44:46.917]But, we believe on
the affirmative side
[00:44:49.119]that it comes down to
three important questions.
[00:44:52.122]How does democracy thrive
without an informed populace?
[00:44:56.894]Are you truly free
[00:44:59.163]if you're kept in the dark?
[00:45:00.631]And what is the best way to
promote equitable discourse?
[00:45:03.934]We are standing in
favor of equitability,
[00:45:07.504]transparency, and responsibility.
[00:45:10.307]First, we think that all
ideas should be equal.
[00:45:14.411]Ideas that aren't fake news.
[00:45:16.814]And fake news is a real thing.
[00:45:19.016]Ideas that are simply false.
[00:45:21.618]Things like Donald Trump
was endorsed by the Pope.
[00:45:24.455]This isn't partisan,
it's not true.
[00:45:28.826]The Brits think that
[00:45:30.461]everything is equal on our
social media platforms.
[00:45:32.329]I value the idea of equality
on social media platforms.
[00:45:36.467]But things aren't equal on
our social media platforms
[00:45:40.304]when we know nothing
about our algorithms
[00:45:43.006]or why certain
topics are trending.
[00:45:45.008]This is why regulation
is necessary.
[00:45:47.444]We aren't regulating content
[00:45:50.681]like the British
would like to suggest,
[00:45:52.616]or shutting users
[00:45:53.984]who are actually
spreading real information
[00:45:57.087]off of Internet sites.
[00:45:58.989]We're shutting out fake accounts
[00:46:01.325]that are using the Internet
[00:46:03.393]to manipulate American citizens.
[00:46:06.663]Next, we believe that
the Internet should be
[00:46:09.066]a space with transparency.
[00:46:10.701]We should know who is
paying for political advertisements
[00:46:13.770]and fake news articles.
[00:46:15.506]Right now, we don't on
sites like Facebook.
[00:46:18.509]This is what we're talking about
[00:46:20.077]when we discuss things
like the FEC regulations.
[00:46:22.079]They don't apply to
online advertising,
[00:46:25.115]and they should.
[00:46:26.617]This has nothing to do with
[00:46:28.185]whether you're choosing
to participate in Facebook
[00:46:29.987]or you're not.
[00:46:31.355]An advertisement is
content you're exposed to
[00:46:33.924]regardless of the platform.
[00:46:35.526]And you should know
who paid for it
[00:46:37.961]and where it's coming from.
[00:46:40.063]How can you have optimism
about people's ability
[00:46:42.366]to tell if something is fake
[00:46:44.067]when people were targeted
by fake Twitter accounts,
women who were told
[00:46:49.940]they could vote by
text from home on election day
[00:46:52.209]and they believed it.
[00:46:53.644]I'm optimistic about
people's ability
[00:46:55.746]to discuss information
[00:46:57.347]in a civic engagement forum
like this one.
[00:47:00.017]But I'm not optimistic
about people's ability
[00:47:01.952]to discern what's
fake and what's not
[00:47:04.421]when it's already
so difficult
[00:47:06.557]for people to do so.
[00:47:08.258]This is again why we think
regulation is important.
[00:47:10.694]And finally, we think that
[00:47:13.597]there needs to be responsibility
[00:47:15.265]on both behalf of the government
[00:47:17.467]and social media companies.
[00:47:19.570]At this point, it seems
like both us and the British
[00:47:23.273]are arguing the same thing.
[00:47:24.374]We both want social media
[00:47:26.043]to do a better job
[00:47:27.945]of self regulating.
[00:47:29.246]We don't want regulation
[00:47:30.948]that limits speech.
[00:47:32.382]We don't want censorship,
[00:47:34.718]and we don't want to eliminate
anyone's voice.
[00:47:39.356]This is about targeting
fake news and fake accounts.
[00:47:41.425]We're not trying to push
down the disenfranchised.
[00:47:45.062]We already regulate
speech in this country.
[00:47:48.632]It's about drawing the line
[00:47:50.834]of where we regulate speech,
[00:47:52.302]where it's appropriate to do so.
[00:47:54.538]They want to draw the line
in favor of absolute speech,
[00:47:58.508]a way that allows
us to be misinformed
[00:48:01.178]by Twitter accounts,
[00:48:02.980]by Russians and
foreign actors
[00:48:05.182]that want to manipulate
us.
[00:48:07.618]I want to draw the line
[00:48:09.319]in a way that allows us
[00:48:10.654]to actually have equitable
[00:48:12.389]and transparent social media.
[00:48:15.525]This debate is easy.
[00:48:17.427]The British have oversimplified
how social media works
[00:48:20.130]and how regulation works.
[00:48:21.932]If you support ideas
of equitability, transparency,
and responsibility,
[00:48:26.637]you will support
regulating social media.
In a number of ways,
[00:48:41.718]this is a really nice debate
[00:48:42.719]because both sides want
to achieve the same thing.
[00:48:44.588]We want a great democracy,
[00:48:45.589]we want transparency,
we want good discourse,
[00:48:48.859]and we want the
truth to get out.
[00:48:50.861]But we think we achieve
it in different ways.
[00:48:52.462]So I'm going to
explain why I think
[00:48:53.930]that our side has
won this debate.
[00:48:55.565]First idea that I
really want to challenge
[00:48:57.768]is that we get the
concept of equitability
[00:48:59.436]from the opposite side.
[00:49:00.937]And we think this is
unworkable
[00:49:02.439]in a social media context.
[00:49:03.740]Because to get equality
[00:49:06.610]of ideas on social media,
[00:49:08.211]you would literally
have to regulate and say
[00:49:10.347]10 people have to
post about cat videos,
[00:49:12.649]10 people have to
post about dog videos.
[00:49:15.152]We think that ignores the
way that social media works,
[00:49:17.554]in that the content is created
by individual users
[00:49:20.624]and there's an up-voting system
[00:49:23.527]of people who enjoy content
[00:49:26.229]moving it upwards
in the news feeds.
[00:49:28.899]And we get the idea that
[00:49:30.300]the advertising in particular,
[00:49:32.769]is harmful because
it targets people
[00:49:34.571]and means that they
make "bad decisions,"
[00:49:37.274]as they're termed.
[00:49:38.475]But, I really think
that misunderstands
[00:49:40.677]how people engage with content,
[00:49:42.579]because I get Facebook ads
[00:49:44.348]for like fancy
cheese toasty makers,
[00:49:46.350]and yoga retreats in Bali.
[00:49:48.018]But I don't go and spend
my hard-earned money on it, right.
We are all
exposed to huge amounts
of commercial advertising
[00:49:53.557]but we're not
standing here saying
[00:49:56.159]that we have a problem
[00:49:57.394]about how heavily
influenced by this we are
[00:49:59.596]because we believe
that we have an ability
[00:50:01.264]to think for ourselves.
[00:50:02.899]And it's a real leap
of logic, right.
[00:50:05.302]It's true that Russia
created a lot of content.
[00:50:07.738]It's true that Donald
Trump won the election
[00:50:10.140]but there's a huge leap of logic
[00:50:11.475]to say that is the only reason
[00:50:13.677]or that was a significant
driver in why that happened.
[00:50:15.946]We don't think they fulfilled
their burden on that side
[00:50:18.315]of proving it.
[00:50:19.449]We don't think the
proposed regulations
[00:50:21.718]would achieve their goals
[00:50:23.086]and in fact, we think
they make democracy worse.
[00:50:24.821]So the idea of transparency,
[00:50:25.922]so they say that
[00:50:27.057]companies use these algorithms,
[00:50:29.393]which are really harmful
[00:50:30.594]because they allow an
unequal platform of ideas.
[00:50:32.829]But we say when you give
it to the government
[00:50:35.232]to write the rule book
[00:50:36.400]of what ideas
are allowed,
that leads to censorship.
[00:50:40.103]So what happens
[00:50:42.072]if the government decides
[00:50:43.907]that climate change is not
scientifically robust enough
[00:50:47.310]and that's considered fake news?
[00:50:49.079]Or if Black Lives Matter is
considered to be an extremist
organization
[00:50:54.618]whose ideas shouldn't
be spread
[00:50:56.253]across the platform?
[00:50:57.487]They don't achieve transparency.
[00:50:59.656]They just shift power to a body
[00:51:01.858]which won't be open,
[00:51:03.360]which doesn't give
an account
[00:51:06.096]of their reasoning,
[00:51:07.097]and in fact, concentrates power
[00:51:08.665]in a small number of people.
[00:51:10.000]They talk about the
value of discussion,
[00:51:12.068]and they say, you know,
[00:51:14.171]what's the secret to having
better discussions on Facebook?
[00:51:16.506]Sure, there is a
part of Facebook
[00:51:18.942]which is purely like,
[00:51:20.143]videos of dogs being
forced into baths,
[00:51:22.078]which is really
funny by the way,
[00:51:23.780]you should check it out.
[00:51:25.949]But the thing is, right,
[00:51:26.983]sure every now and then
[00:51:27.984]we'll unfollow someone
[00:51:29.252]who has an opinion
that we disagree with.
[00:51:30.420]But every now and then,
[00:51:31.521]we will pick our battles too.
[00:51:32.489]Or we will start
having a conversation
[00:51:33.857]and challenge those ideas.
[00:51:35.325]With it, you get so
much more interaction
[00:51:37.127]on social media.
[00:51:38.462]Then this idea of truth, right,
[00:51:39.863]and fake news,
[00:51:41.565]which I honestly think
[00:51:42.566]is a phrase that people
are obsessed about
[00:51:44.067]in this country.
[00:51:47.003]And they got this concept
of the Pope, right,
[00:51:48.271]and that the Pope
[00:51:49.940]gave him massive support.
[00:51:51.408]Well, I really don't understand
[00:51:54.277]what the marginal impact
of that article was
[00:51:57.047]compared to a photo
of Donald Trump
[00:52:00.016]stood next to the Pope.
[00:52:01.418]Like the implicit
impact of that,
[00:52:03.487]the marginal impact of that
[00:52:05.989]is really, really small.
[00:52:07.057]But also, as I was saying
about commercial advertising,
[00:52:09.125]it assumes that we
are brainless voters
[00:52:12.796]who can't discern fake news.
[00:52:15.165]We've been trained to
identify this stuff.
[00:52:17.067]We have independent fact checkers.
[00:52:19.402]And we think it's really harmful
[00:52:21.304]when you let the definition
[00:52:22.339]be driven by the government.
[00:52:23.540]Look, Richard gave a lot of
really important material
[00:52:26.276]on countries that
would like to use this
[00:52:28.879]to impose that asymmetric power
[00:52:31.181]onto the population.
[00:52:32.816]We don't think that
was responded to.
[00:52:34.184]This team tried to claim
[00:52:36.419]that minorities are
let down by fake news,
[00:52:39.289]but we don't actually think
that's as big a problem
[00:52:42.025]as they claim it is.
[00:52:43.660]We think that social
media is so unique
[00:52:46.129]in that you can co-create content.
[00:52:48.131]So you're not just
watching a TV show
[00:52:49.132]and then thinking to yourself,
[00:52:50.467]oh what do I think in
reflection to this piece.
[00:52:51.968]You co-create the material,
[00:52:53.670]and you share that
with other people.
[00:52:55.672]And any regulation
[00:52:56.773]of social media is
going to shut people out
[00:52:58.742]from that contribution.
[00:53:00.243]And for some people,
[00:53:01.378]it's the only voice
that they can have.
[00:53:03.346]That's why we are so
opposed to this regulation.
[00:53:05.248]Thank you for the debate today.
[00:53:07.350]It's been fantastic.
[00:53:08.418]Proud to oppose.
Thank you to both teams
[00:53:32.075]for an excellent debate.
[00:53:43.086]MODERATOR: It has become commonplace to say
[00:53:44.588]that debate, discussion,
[00:53:46.256]and dialogue are dying in the
modern American democracy.
[00:53:49.326]But to that I would say,
[00:53:51.294]tonight, here in Lincoln,
[00:53:53.196]on the campus of the
University of Nebraska,
[00:53:55.532]those values are alive and well.
[00:54:05.909]And they are alive
and well precisely
[00:54:07.043]because we have such a
wonderful audience here
[00:54:11.414]because at a certain point,
[00:54:12.682]it's not public speaking
[00:54:14.451]without the public.
[00:54:16.386]And now we need your help.
[00:54:18.221]We need you to help us decide
[00:54:20.023]who won tonight's war of words,
[00:54:21.558]wit, and wisdom.
[00:54:23.460]So I'm gonna ask you,
[00:54:25.362]for those of you who are
here live in the audience,
[00:54:27.998]when I announce each team,
[00:54:30.533]if you think they did the
best job of debating tonight,
[00:54:33.236]cause we ask you to decide not
[00:54:35.238]based on what you came in with
as your political ideas,
[00:54:37.307]but who you thought best
represented their positions
[00:54:39.609]in the debate,
[00:54:40.744]to clap, to cheer,
[00:54:42.145]to stomp your feet,
[00:54:43.380]to loudly shout Nebraska,
[00:54:45.548]or whichever team,
[00:54:47.517]you think won tonight's debate,
[00:54:50.387]because both teams
did a great job.
[00:54:52.722]So if you thought the
best debating was done
[00:54:56.059]by the affirmative
University of Nebraska team
[00:54:58.728]please clap and cheer.
[00:55:00.397](cheers and applause)
[00:55:12.942]And if you thought
the best debating
[00:55:14.077]was done by noted
Iowa Hawkeye fans,
[00:55:20.517](cheers and applause)
[00:55:22.152]The British national team.
[00:55:25.622](cheers and applause)
[00:55:40.236]MODERATOR: I don't know if I
should answer that,
[00:55:41.738]that might be too close to call.
[00:55:45.742]Great job by everybody.
[00:55:47.410]So I think at this point,
[00:55:49.045]we'll declare the winner
[00:55:50.780]to be everyone who came tonight,
[00:55:52.248]and to open it up
[00:55:53.583]for some wonderful
questions and answers.
Well, diplomatically
stated, Dr. Duncan.
[00:55:59.055]At this time, the
debaters and moderators
[00:56:01.591]will take questions
from the audience.
[00:56:03.059]You may submit questions again
[00:56:04.494]using the hashtag,
[00:56:06.429]Twitter hashtag E.
N. Thompson Forum,
[00:56:07.831]or write your
questions on note cards
[00:56:09.599]provided by the ushers.
[00:56:10.800]Our first question,
[00:56:11.968]as tradition has it,
[00:56:13.036]comes from the E. N. Thompson
[00:56:15.505]We'll start with
the British team.
[00:56:16.806]What are your thoughts
on how world leaders
[00:56:18.108]should or should not
use social media?
[00:56:26.616]RICHARD HUNTER: So, I was actually remarking
[00:56:27.650]as we were walking
over to this event,
[00:56:29.219]that a seismic change
[00:56:31.488]has taken place,
[00:56:32.689]that has the potential
to forever alter
[00:56:36.426]over the next few months,
[00:56:38.094]world and international relations,
[00:56:40.296]and that is that
Twitter has actually
[00:56:42.699]doubled its character limit
[00:56:43.867]from 140 to 280 characters.
[00:56:48.571]that gives world leaders
twice as much space.
[00:56:50.774]I would say that
[00:56:52.976]I'm not sure there's one
way to use social media.
[00:56:59.849]I think it's going to be,
[00:57:00.817]it's always going to be about
[00:57:06.256]the politician and
what their message is
[00:57:08.458]and the way they want to connect
[00:57:10.326]with their supporters.
[00:57:11.961]What I would say is that
[00:57:14.364]I'm trying to be
as diplomatic as possible,
[00:57:20.370]there are ways in
which you communicate,
[00:57:27.811]and make you unite
all of your country,
[00:57:31.514]and make you act like a
representative of your country,
[00:57:34.184]and be something that your
country can be proud of.
[00:57:41.825]I would prefer a world in
which every world leader
[00:57:44.160]represented that at all times.
[00:57:52.669]MIKE: Alright, for
the Nebraska team.
[00:57:53.837]What is the primary motivation
[00:57:55.238]behind the spread of fake news?
[00:57:56.873]Is it economic?
[00:58:02.378]ERIN: So, I would say that
[00:58:03.947]ideological is a big thing,
[00:58:06.182]so especially when we
look to our past election,
[00:58:09.652]a lot of that fake news
was targeted towards
[00:58:13.289]specific groups and,
[00:58:15.291]there were very
[00:58:17.060]So, the example that,
[00:58:19.329]I cited a lot
during the debate was
[00:58:22.565]telling them that they
could vote from home
[00:58:24.434]on election day,
[00:58:25.401]and giving them a phone number
[00:58:26.636]that they could
text a vote from.
[00:58:28.371]We know that African American women
[00:58:30.974]are often a reliable voting base
[00:58:33.142]for the Democratic party.
[00:58:34.544]So, telling these women that
[00:58:37.380]they can vote from home,
[00:58:39.082]aka, they're not going
to be able to vote,
[00:58:41.017]that clearly to
me has ideological
[00:58:43.286]drive behind it.
[00:58:46.122]COLTON WHITE: Yeah, and I
think it sometimes depends upon
[00:58:48.892]where the source of it was from
[00:58:50.760]because certainly there are
[00:58:53.162]a lot of ideological reasons,
[00:58:54.530]like Erin just pointed out,
[00:58:55.665]but there are also the examples,
[00:58:56.900]like the Hungarian teens
[00:58:59.035]who made fake news accounts
[00:59:01.537]so they could get the ad
revenue from their website.
[00:59:04.974]I mean, sure they might
[00:59:07.777]but maybe try to find
a better way, kids,
[00:59:10.046]of going about that.
[00:59:12.081]So it just really depends on,
[00:59:14.183]on the context.
[00:59:15.752]But either way,
[00:59:18.254]like we should be looking
at what the effects of it are,
[00:59:20.456]no matter what the
intentions behind it were.
[00:59:24.961]MIKE: Alright, thank you.
[00:59:25.929]From our Twitter
feed this evening,
[00:59:27.764]Richard and Becky,
[00:59:28.932]in what ways would
basic freedoms be hurt
[00:59:31.067]if social media were regulated?
[00:59:33.569]REBECCA: I think it's,
[00:59:35.905]I think it speaks to
the kind of content
[00:59:37.607]that we were just talking about
[00:59:38.808]in the context of the debate.
[00:59:39.943]It's about expressing
[00:59:41.077]your ideas and your opinions.
[00:59:42.745]And I think
[00:59:43.846]social media is just another way
[00:59:45.915]for people to get
their voices heard.
[00:59:48.651]And I think it is a
really important way
[00:59:50.253]for people who feel
[00:59:51.454]shut out of conventional media,
[00:59:53.423]whether that's young people
[00:59:56.259]And so I think it
is quite a dangerous route
[00:59:58.161]to go down,
[01:00:00.096]when you start
developing criteria
[01:00:01.864]for what's allowed
and what's disallowed.
[01:00:04.233]Cause typically, you don't
have a lot of transparency
[01:00:06.869]with how those,
[01:00:08.037]those criteria are developed.
[01:00:09.672]So it's about people
having a voice
[01:00:11.841]and having an impact on
[01:00:15.278]MIKE: Alright, to
the Nebraska team.
[01:00:16.279]How do you protect the intellectual property
[01:00:18.414]of social media companies
[01:00:24.420]COLTON: So I think that
[01:00:26.122]maybe just patents,
[01:00:27.123]or other traditional means.
[01:00:28.891]I mean when Apple
releases an iPhone,
[01:00:31.894]it's not all of a
sudden the worry of
[01:00:33.963]Apple's not gonna have any money
[01:00:35.665]because everyone else
creates an iPhone now.
[01:00:39.335]I think the algorithms go
[01:00:42.405]at least from my understanding.
[01:00:44.273]We can have
[01:00:45.875]ways of preserving that
[01:00:48.578]in the same way
we've preserved other
[01:00:51.381]types of intellectual property.
to the British team,
[01:00:56.285]our Twitter followers
want to know
[01:00:57.353]what is a toasty?
[01:01:03.092]As the grilled cheese
expert on this team,
[01:01:04.127]as a vegetarian traveling
around the states,
[01:01:05.695]I've had many grilled cheeses.
[01:01:07.697]It's basically just like
[01:01:09.465]a toasted sandwich.
[01:01:11.234]So like a toasty is
a toasted sandwich.
[01:01:14.003]I think you call
it grilled cheese.
[01:01:15.371]I don't know what
you would call it
[01:01:16.973]if it didn't have cheese in it.
[01:01:18.041]Like a grilled tomato.
[01:01:19.742]RICHARD: No, I've seen,
the controversy with this
[01:01:23.813]in certain sectors
of the Internet
[01:01:25.348]is that if it's,
[01:01:26.616]if it's just cheese,
[01:01:27.850]then it's a grilled cheese.
[01:01:28.885]But if it has other ingredients,
[01:01:30.086]then it's a melt.
[01:01:34.424]so the toasty covers both ways.
[01:01:42.298]REBECCA: I feel reassured,
[01:01:43.266]cause that's obviously why no
one laughed at my toasty joke.
[01:01:47.370]MIKE: Seriously, there
were a number of questions.
[01:01:48.871]Once again, to the British team,
[01:01:50.206]tell us a little bit more
about the British debate team.
[01:01:52.742]RICHARD: Yeah, so this program
[01:01:55.445]has been running since 1922.
[01:01:59.148]And it's run by a charity
[01:02:02.085]called the English-Speaking Union,
[01:02:03.686]based at Dartmouth
House in London.
[01:02:04.921]They run a variety of programs
[01:02:06.489]where they organize
[01:02:08.524]and they go into high schools
[01:02:10.560]and run debate programs,
[01:02:11.627]and help people get excited
about debate and discourse.
[01:02:14.697]The first tour in 1922,
[01:02:17.400]as we were informed,
[01:02:18.534]the people going were just given
[01:02:21.571]a boat ticket,
[01:02:22.572]a letter guaranteeing them
[01:02:25.041]and the name of the
person they had to meet.
[01:02:26.943]And they were just told
good luck and have fun.
[01:02:29.278]We get a lot more
support than that.
[01:02:32.448]So the English Speaking Union
[01:02:34.784]sorted out all our flights
[01:02:36.352]and they partner with
[01:02:37.386]the National Communication Association's
[01:02:39.088]Committee for International
Discussion and Debate,
[01:02:41.224]which is the American end
[01:02:42.925]which coordinates with all
the various universities
[01:02:44.994]that host us.
[01:02:46.329]And this is stop number 18
[01:02:48.598]out of 21.
[01:02:50.199]We've been going for
[01:02:52.034]about six and a half weeks.
[01:02:55.805]that's sort of the potted
history of the tour.
[01:02:58.141]MIKE: Alright, thank
you and congratulations.
[01:02:59.542]To the Nebraska team,
[01:03:00.710]what are your underlying
goals and purpose
[01:03:02.044]for becoming involved in debate?
[01:03:04.180]ERIN: My what?
[01:03:05.781]MIKE: Your underlying
goals or purpose.
[01:03:07.049]ERIN: Oh, underlying goals.
[01:03:08.050]I thought you said
my other life goals.
[01:03:10.920]I was like,
[01:03:11.921]let's not go there right now.
[01:03:15.558]so I talked about
this in the pre-event.
[01:03:18.127]I actually completely
joined high school speech
[01:03:20.897]completely by accident.
[01:03:22.231]I didn't know what,
[01:03:25.001]it was called forensics
at my high school.
[01:03:26.235]I didn't really
know what that was
[01:03:27.837]and I just signed
up for the class.
[01:03:29.605]But, after I joined,
[01:03:31.908]it turned out being
[01:03:33.376]really, really interesting,
[01:03:34.977]and I really enjoyed it.
[01:03:36.712]And then when I came to college,
[01:03:41.017]I knew I was coming
to the university so,
[01:03:42.151]the thought of being
on the speech team
[01:03:44.720]and having something to do
[01:03:46.422]that was academically
[01:03:51.160]And I had a lot of friends
that I knew on the team
[01:03:53.329]so that was another
push towards it.
[01:03:55.431]COLTON: Yeah, mostly it's
because I'm really nerdy.
[01:03:59.068]ERIN: That too.
[01:04:00.403]COLTON: There's also,
[01:04:01.671]it's a creative outlet
[01:04:02.905]in terms of why I
[01:04:05.775]as the years progress
from here on out,
[01:04:07.710]mostly it's going
to shift towards
[01:04:09.946]educating other people
[01:04:11.013]in speech and debate.
[01:04:12.381]And not only just about that,
[01:04:14.483]about the world,
[01:04:15.451]and making sure that
it gets passed on
[01:04:17.720]to other people who
[01:04:18.988]can maybe get those
opportunities as well.
[01:04:20.990]MIKE: Alright, well
congratulations to each of you.
[01:04:23.526]Now back to our topic at hand.
[01:04:24.860]To the British,
[01:04:26.362]if we don't regulate
social media in any way,
[01:04:28.464]where do we draw the line?
[01:04:29.865]Would we allow Twitter
to promote a tweet
[01:04:31.634]that encourages violence
[01:04:32.935]as long as it was paid
for and disclosed?
[01:04:37.306]you want to?
[01:04:38.841]REBECCA: I mean this is probably
the point of the discussion
[01:04:41.777]where I would take off my veil
of this side of the debate
[01:04:44.247]that I'm on
[01:04:46.048]and answer it in a genuine way.
[01:04:47.483]So in this format,
[01:04:49.485]you don't pick which point
of view you argue,
[01:04:50.486]and so actually my genuine answer
[01:04:54.056]is I think some regulation
is probably quite important.
[01:04:56.259]But you already voted, so.
[01:05:05.501]I mean obviously you don't,
[01:05:07.470]you know, we accept
that the state
[01:05:10.840]has the obligation to regulate
[01:05:13.676]what TV stations can broadcast,
[01:05:15.778]and what they broadcast.
[01:05:17.280]But we think that the
state shouldn't have
[01:05:19.081]the ability to
[01:05:20.116]regulate what you say
[01:05:21.917]if you're just shouting it
[01:05:23.219]on a street corner necessarily.
[01:05:24.387]I think you have to draw
the line somewhere in that.
[01:05:28.324]I'm not exactly sure
where you draw that line,
[01:05:31.861]and I'm not exactly sure
how you draw that line.
[01:05:34.330]We were assigned this side
[01:05:36.599]of this resolution.
[01:05:40.870]That's what I argued,
[01:05:42.038]but in real life,
[01:05:43.039]I'm not sure.
[01:05:44.340]REBECCA: I guess just to
underline that point,
[01:05:45.441]it's an interesting
position to be in
[01:05:46.842]where we concede in principle
[01:05:48.711]that regulation to some
extent is important.
[01:05:50.946]But it's the practicalities
[01:05:52.982]perspective of how
exactly you hash that out,
[01:05:56.285]and how you create it in a way
[01:05:57.286]that is fair
[01:05:58.254]that doesn't create the problems
[01:06:00.089]that we talked about
in our speeches,
[01:06:02.191]whilst protecting people
that need protecting,
[01:06:03.793]as in the examples so,
[01:06:05.961]you know, I'm sure
there's a panel of experts
[01:06:08.064]somewhere that could
hash out the detail
[01:06:09.665]a bit better than we could.
[01:06:11.267]MIKE: Alright, thank you,
[01:06:12.268]and for the Nebraska team,
[01:06:13.269]perhaps you also
shared a disclosure
[01:06:14.370]of personal views,
[01:06:15.471]but won't a regulatory body
[01:06:17.440]be just another algorithm
[01:06:19.075]that the general
public has no access to
[01:06:20.509]or understanding of?
[01:06:21.577]Who watches the watchmen?
[01:06:24.313]ERIN: So that is definitely
a real concern,
[01:06:26.015]and I think one of
the big issues with
[01:06:28.684]like fully disclosing
those types of algorithms
[01:06:31.354]to the public is that
[01:06:32.722]those algorithms do contain
[01:06:35.324]like personal information,
[01:06:36.992]because the reason that
you get certain content
[01:06:39.862]on your Facebook feed
[01:06:40.996]is because of what you like,
[01:06:42.565]what you're interested in,
where you're from,
[01:06:44.900]who you're Facebook
friends with, et cetera.
[01:06:46.302]So you don't want that
type of information
[01:06:48.204]just out in the open,
[01:06:50.005]released to everyone.
[01:06:51.407]But I think going off of that,
[01:06:53.175]we hope that
regulation that exists
[01:07:00.015]isn't run by some type of partisan committee,
[01:07:02.084]and there's like
[01:07:04.286]cooperation between the social media companies
and the government,
[01:07:05.955]so a type of joint oversight
[01:07:08.090]between each other.
[01:07:09.625]MIKE: Thank you,
to the British team,
[01:07:12.461]which do you feel
is more effective
[01:07:14.029]at reaching a large audience,
[01:07:15.297]a larger audience,
[01:07:16.732]real news or fake news?
[01:07:20.069]RICHARD: Well, it's the Mark
Twain line, right,
[01:07:22.838]that a lie can get
halfway around the world
[01:07:26.208]before the truth
has its socks on.
[01:07:27.643]I'm not sure, in that I think,
[01:07:30.679]I think there's a tension, in
[01:07:33.449]that people don't
like to believe
[01:07:35.084]that the news they
are reading is fake.
[01:07:38.954]People don't want
their news sources
[01:07:41.357]to be seen as untrustworthy,
[01:07:42.792]or people to think
[01:07:44.193]that what they're
reading is untrustworthy.
[01:07:46.395]But, people simultaneously
want to read things
[01:07:51.300]that they are
predisposed to believe.
[01:07:55.004]they want to believe that
[01:07:57.072]climate change is made up,
[01:07:59.041]and it's not going
to affect them,
[01:08:00.743]even if the truth is that
[01:08:02.311]it is very real
[01:08:03.779]and it definitely is
going to affect them.
[01:08:07.917]my long-winded answer
[01:08:11.320]is that I don't know.
[01:08:15.257]MIKE: To the Nebraska team.
[01:08:16.258]Was the recent tension
with North Korea
[01:08:17.893]a foregone conclusion,
[01:08:18.961]or was it instigated by our,
social media posts?
[01:08:24.767]ERIN: Do you have any thoughts?
[01:08:26.535]COLTON: Yeah, I would say that
[01:08:27.536]what's really important
[01:08:29.471]about this question
[01:08:31.974]is that it's not a
foregone conclusion yet
[01:08:35.044]whether or not we
[01:08:37.179]So I don't know for sure,
[01:08:40.381]counterfactuals as
to whether or not
[01:08:41.584]there would be
[01:08:42.718]aggression or non-aggression
[01:08:44.420]if we hadn't had Twitter.
[01:08:46.255]But what's really important
for us as a public right now
[01:08:49.425]is to, like, really convey
[01:08:51.660]how important it is
that our government
[01:08:54.029]doesn't act in aggressive ways
[01:08:55.564]against North Korea right now,
[01:08:56.631]because one of the
things that I'm seeing
[01:08:58.567]that's really worrying
[01:08:59.635]is people just feeling
that it's an inevitability
[01:09:02.337]and not really thinking
about the consequences
[01:09:04.540]that such a conflict could have.
[01:09:06.274]Like I don't want to see
Seoul and Tokyo leveled
[01:09:11.479]because of complacency in terms of well,
[01:09:13.649]it's, you know, a foregone conclusion.
[01:09:15.283]So I think that,
[01:09:16.752]I don't know the answer
to the counterfactual,
[01:09:18.453]but I think that we
need to take a stance
[01:09:20.956]against further aggression,
[01:09:22.825]like right now,
[01:09:23.859]to make sure it doesn't happen.
[01:09:33.569]MIKE: For the British team,
[01:09:34.537]how does protection of
children and minors work
[01:09:36.538]regarding social media
[01:09:38.107]as this audience grows?
I mean I think that's a really
[01:09:44.580]really difficult question
[01:09:47.182]I think there certainly
is regulation needed
[01:09:49.852]for minors and children
on social media.
[01:09:52.988]What exactly that looks like,
[01:09:56.492]I don't know right now.
[01:09:57.660]But that seems a
pretty important group
[01:10:00.429]that need regulation.
[01:10:02.398]RICHARD: I definitely think
[01:10:03.499]it's hard to know
[01:10:05.100]because there is
[01:10:06.135]a generational difference.
[01:10:07.770]And I've talked to,
[01:10:09.238]I've talked to one
or two of my old
[01:10:11.707]teachers in school
[01:10:13.642]and they've definitely noted
[01:10:15.878]that it's a
[01:10:18.514]you know, people reaching,
[01:10:22.518]growing up as a teenager
and as a young person,
[01:10:25.754]have had social media
[01:10:28.123]as a constant background
in their life,
[01:10:31.527]and had it expected of them
that they would take part
[01:10:32.995]in social media.
[01:10:34.029]Whereas we didn't.
[01:10:35.631]I didn't get a Facebook account
[01:10:37.066]till I was 16.
[01:10:40.369]it's hard to know
the effects of that
[01:10:43.606]as someone who hasn't
had that experience.
[01:10:45.541]MIKE: Alright, thank you.
[01:10:47.309]To the Nebraska team.
[01:10:48.310]Isn't all news,
or any story, effectively biased toward the
person writing the news
[01:10:52.548]or telling the story?
[01:10:53.849]ERIN: Yeah, absolutely.
[01:10:55.451]I think like what we tried
to convey in our speeches
[01:11:00.723]is that, yes, news will inherently
be biased in many cases.
[01:11:04.293]It involves who the
reporter is talking to,
[01:11:08.530]what story they want to cover,
[01:11:12.001]the angle they're
trying to take.
[01:11:17.139]this, like the news
that we read is biased.
[01:11:18.674]Like what populations get covered.
[01:11:22.144]That's a problem,
[01:11:24.380]there's a difference between
[01:11:25.648]a story that takes
one specific angle,
[01:11:28.851]and something that is completely fabricated.
[01:11:32.454]So, that was what
we tried to convey.
[01:11:36.191]MIKE: Thank you.
[01:11:37.192]To the British team.
[01:11:38.160]Assuming an objective
truth is possible,
[01:11:40.129]why should we feel
comfortable with government
[01:11:42.564]or big business
determining it for us
[01:11:44.433]as proposed in the debate?
[01:11:48.804]COLTON: Was that at us?
[01:11:49.838]MIKE: The British team.
[01:11:51.340]How they really feel.
[01:11:53.108]RICHARD: I think
[01:11:56.278]assuming that an
objective truth exists,
[01:11:58.647]I think it's too big a burden
[01:12:01.250]to place on any individual
[01:12:04.053]to say that every
one individual person must
[01:12:07.156]do all the research and reading
[01:12:09.525]and compile all the information
[01:12:13.929]to be aware of that.
[01:12:15.731]I think at some point,
[01:12:16.899]some external person,
[01:12:19.768]or external body,
[01:12:21.003]or something is going to
[01:12:23.172]impart knowledge onto you,
[01:12:24.907]or is going to
[01:12:26.141]inform you of
things you would not
[01:12:28.944]otherwise be informed of.
[01:12:33.215]what that external body is,
[01:12:34.750]is a matter for debate,
[01:12:36.018]but I don't think,
[01:12:37.119]I don't think it's realistic
[01:12:39.321]to say that people
[01:12:40.355]can do all the work themselves.
[01:12:42.091]REBECCA: I also think I probably
[01:12:43.125]agree with what
was just said about
[01:12:45.094]truth not being objective,
[01:12:48.363]and then given that,
[01:12:49.498]the different governments
might have different views
[01:12:50.566]on quite controversial issues.
[01:12:52.267]That's why it is something
[01:12:54.136]that is really important to
have proper accountability
[01:12:56.171]and challenge for
in terms of how
[01:12:57.639]we let governments decide what's
right and what's wrong
[01:13:01.343]and what should be allowed,
[01:13:02.611]and what we can't discuss.
[01:13:05.180]MIKE: Alright, thank you.
[01:13:06.181]To the Nebraska team.
[01:13:07.182]Do you think creating a regulatory body,
[01:13:09.752]perhaps like the FCC,
[01:13:10.853]would help with creating regulation,
[01:13:12.921]and if so,
[01:13:14.256]how would you go about that?
[01:13:18.460]COLTON: In relation to that,
[01:13:21.964]I think it would
be a good precedent
[01:13:25.033]to fit it within current structures,
[01:13:27.669]because if we
hastily put together a new body
[01:13:32.341]to do what
[01:13:34.009]is essentially the job of,
[01:13:36.278]like the FCC,
[01:13:37.346]then it's just going to create
a more complicated process.
[01:13:39.715]And I think that, you know,
[01:13:41.650]regulations can be good
[01:13:43.185]but they're only good in so
far as they're effective.
[01:13:45.888]And they're effective if they're run by organizations
[01:13:50.359]that know what they're doing.
[01:13:51.360]And in so far as
[01:13:52.995]the FCC and other federal
organizations have shown
[01:13:55.497]that they know
what they're doing
[01:13:57.099]in relation to regulating
other media companies,
[01:13:59.501]I think that we should just
add to the list of duties
[01:14:06.708]of those agencies things like social media.
[01:14:08.110]ERIN: Going off of that,
[01:14:09.144]I think that perception is
really important here too.
[01:14:12.014]And like the FCC isn't seen,
[01:14:14.750]in my opinion,
[01:14:16.518]as particularly partisan.
[01:14:18.086]I think creating a new body
[01:14:20.989]in this political climate,
[01:14:22.658]specifically to regulate
social media sites,
[01:14:26.929]given that this is a really divisive issue,
[01:14:29.531]I think that would be seen as partisan
[01:14:33.335]and would cause
[01:14:34.303]a lot of, like,
[01:14:35.704]lack of trust in the body.
[01:14:38.140]And that could be really
problematic as well.
[01:14:41.109]MIKE: Alright, thank you.
Before we take the final question,
[01:14:44.112]I'd like to remind you each
[01:14:45.214]to mark your calendars
[01:14:46.548]for the next E.
N. Thompson forum
[01:14:47.783]on world issues.
[01:14:48.951]It will feature a conversation
with Misty Copeland.
[01:14:50.786]Please join us on February 13th
[01:14:52.554]at 7:00 p.m.
[01:14:54.156]Here in Lincoln
[01:14:55.691]at Kimball Recital Hall.
[01:14:56.692]And our last question,
[01:14:58.026]after I thank you
again for coming
[01:15:00.195]all the way across the pond
[01:15:01.730]and from the UNL team as well,
[01:15:03.131]I would like you to
give us your perspective
[01:15:05.167]and/or predictions on the
future of social media.
[01:15:10.239]REBECCA: What was that, sorry.
[01:15:11.240]Our perspectives on the--
and/or predictions
[01:15:13.242]on the future of social media.
[01:15:14.243]REBECCA: Oh, okay.
[01:15:18.480]I think that there will be regulation.
[01:15:22.150]I think that there's
sufficient political capital
[01:15:25.721]around the influence of Russia
[01:15:27.389]on the elections in the States,
[01:15:29.224]that there is going
to be a demand from the
[01:15:32.361]electorate to see
[01:15:34.229]some tangible change
[01:15:36.298]in terms of how
[01:15:38.934]social media is regulated.
[01:15:39.968]What I think is interesting
[01:15:41.203]is there has been a
simultaneous push towards
[01:15:43.005]some kind of regulation,
[01:15:45.440]as there has been a push from
[01:15:48.844]the companies themselves to put forward a list of ideas.
[01:15:50.579]Like Mark Zuckerberg
[01:15:51.680]did a list of things
that he could see
[01:15:53.715]as being tangible changes.
[01:15:55.284]So like including
the ad revenue,
[01:15:56.852]including the source
of the advertising
[01:15:58.887]and stuff like that.
[01:16:00.022]I expect what will happen
[01:16:02.090]is that there will be a
[01:16:03.625]conversation between government
[01:16:05.294]and social media
[01:16:06.295]and an agreement between them
[01:16:07.296]about measures that are imposed,
[01:16:08.730]a number of statutory measures.
[01:16:10.299]But I think the process
is necessarily going to be slow,
[01:16:14.836]because there's just
so many complexities
[01:16:17.139]in terms of
[01:16:18.073]how conversations happen,
[01:16:19.675]how do you define
fake news, et cetera,
[01:16:21.877]because of all the really difficult conversations
[01:16:23.378]we've had about
like objective truth
[01:16:24.513]and how minorities get
represented and stuff.
[01:16:26.581]So I think we're
gonna get regulation
[01:16:29.151]and I think social media companies are
[01:16:31.186]gonna support it
[01:16:32.187]because they have to,
[01:16:33.221]cause they have to show that
they're really embracing it.
[01:16:36.258]But I think there will
be a backlash to it,
[01:16:37.559]because I think it won't be
perfect the first time round.
[01:16:38.660]And we'll maybe get
to where we need to be
[01:16:41.630]within a few years.
[01:16:43.131]RICHARD: Yeah, I agree
with all of that.
[01:16:45.934]I think definitely in
terms of the timeframe,
[01:16:48.236]I would expect,
[01:16:49.871]because all these
companies are US based,
[01:16:51.807]and although this conversation
[01:16:53.608]needs to be US driven,
[01:16:54.609]I would expect a lot of
that regulation to be
[01:16:56.144]finalized by 2020.
[01:16:59.748]And whatever that election
ends up looking like,
[01:17:03.185]we're gonna have conversations
about those issues again.
[01:17:05.721]I would say the other issue
[01:17:07.889]that is completely
different from this debate,
[01:17:09.925]but I think is going
to become a big one,
[01:17:12.160]and there's going to be more
news stories involving it,
[01:17:15.998]is the connection between
people's online identity
[01:17:18.834]and people's IRL identity.
[01:17:21.570]Because I notice a
lot of my peer group
[01:17:24.206]and a lot of my friends,
working as lawyers,
working as teachers,
[01:17:30.712]or whatever field they're in,
[01:17:31.980]they have changed
their names on Facebook
[01:17:34.416]to be not their real name.
[01:17:36.051]And they're using
[01:17:37.019]usernames or aliases
[01:17:38.387]on Twitter and
Facebook and things
[01:17:40.589]because they don't want to be
[01:17:43.191]they don't want to be found.
[01:17:44.192]I think that's a really interesting question
[01:17:46.461]as to how that relates to
[01:17:47.462]your ability to get jobs,
[01:17:50.665]and your ability to
be a public person
[01:17:52.601]in the public sphere.
[01:17:54.102]We will eventually
get to a point where
[01:17:55.871]the person who
becomes the president
[01:17:59.074]will have put badly worded tweets
[01:18:00.509]at the age of 16.
[01:18:03.378]I think that's a world,
[01:18:04.379]well it maybe the world
we're already living in,
[01:18:07.716]but it will eventually be
the world that we live in,
[01:18:09.217]and I think the transition
[01:18:11.653]into that being the
world we live in
[01:18:13.121]is going to be very interesting.
[01:18:15.390]MIKE: Thank you.
[01:18:16.391]From team Nebraska.
[01:18:17.392]ERIN: Just on a more positive note,
[01:18:19.127]I'm also really excited
[01:18:21.063]about the use of social media,
[01:18:22.397]specifically in politics.
[01:18:23.932]While it seems for
my political life,
[01:18:27.202]that social media has been there
[01:18:28.937]for all of it,
[01:18:30.539]it's really, really new
[01:18:32.240]and there's a lot of
campaigns here in Nebraska
[01:18:34.276]and across the country,
[01:18:36.545]that are using social media in
[01:18:38.180]really exciting and innovative ways
[01:18:40.749]to engage with
[01:18:43.418]and just their voter base
[01:18:46.788]in ways that I think even
after this past election,
[01:18:50.392]are really, really unique.
[01:18:51.993]And I think that will
continue to change.
[01:18:53.762]And we may see more
and more engagement
[01:18:55.530]between elected officials
[01:18:58.600]and their constituents
[01:19:01.002]on social media,
[01:19:03.605]politicians and candidates
using social media
[01:19:05.273]in an appropriate way
[01:19:07.175]is very exciting.
[01:19:08.443]COLTON: To quote our brilliant
[01:19:13.682]I don't know.
[01:19:16.318]Honestly, I'll agree with
[01:19:19.054]everything else that was said.
[01:19:21.156]The only thing I'll add is that
[01:19:23.391]our answers aren't fixed
[01:19:25.494]because the technology isn't,
[01:19:26.728]and neither is our society.
[01:19:29.197]So I think we probably
[01:19:30.398]need to keep
[01:19:31.533]checking in on this issue
[01:19:32.934]as often as we need to
[01:19:34.136]to make sure that we're being
responsive to the times.
[01:19:37.339]MIKE: Alright, thank
you ladies and gentlemen.
[01:19:38.473]Thank you for attending
the Wilson Dialogue.