How Evolutionary Biology Helps Us Understand Human Conflict and Cooperation | CAS Inquire
Author: CAS MarComm
11/15/2024
Description
Clay Cressler, biological sciences, gave the CAS Inquire talk on Nov. 12.
Searchable Transcript
- [00:00:00.000]Thank you all for being here. It's great to see such a wonderful turnout tonight for our
- [00:00:04.740]CAS Inquire lecture series. Welcome everyone. We're thrilled to be doing this for the fifth
- [00:00:10.620]straight year. As you know, this year's theme is War, Peace, and Reconciliation. This theme
- [00:00:17.660]is incredibly well chosen given the persistence and devastating realities of war across the world
- [00:00:25.140]today and the no less persistent and urgent calls for peace and reconciliation. Here in the first
- [00:00:34.260]Inquire since the close of local, state, and national elections just last week, I'm especially
- [00:00:40.400]grateful for the opportunity to bring our community together to grapple with these difficult questions
- [00:00:46.140]and to do so in a manner that embraces complexity, nuance, and contending intellectual and moral
- [00:00:53.980]views.
- [00:00:54.460]We embrace those particular challenges in the College of Arts and Sciences because we
- [00:00:59.840]are committed and dedicated to the clear-eyed pursuit of truth, and I'm especially grateful
- [00:01:06.540]that our CAS Inquire students who are here this evening get to deal with these questions
- [00:01:11.880]in a deep way over the course of the entire year.
- [00:01:15.060]The questions that are at issue in this lecture series raise fundamental questions about the
- [00:01:20.300]very nature of humanity.
- [00:01:22.700]And, as our guest speaker today might argue, the very nature of existence, and not just humanity. But we'll hear what he has to say about that.
- [00:01:24.280]In keeping with the culture of the college, as exemplified in this lecture series, we pursue these really difficult questions through a diverse range of intellectual orientations and disciplinary perspectives, which so far have included political science and classics, with history upcoming. And tonight, we're really thrilled to have the unique intellectual pleasure of hearing a biological perspective.
- [00:01:53.980]We're very pleased to host Dr. Clay Cressler, Associate Professor in the School of Biological
- [00:02:04.560]Sciences, to lead us through that. Dr. Cressler's research focuses on understanding how ecological
- [00:02:11.000]and evolutionary dynamics are shaped by the cross-scale interaction between individual
- [00:02:17.540]level and population level processes. He's a widely published scholar in the field of
- [00:02:23.740]ecology and evolution, with a focus on infectious disease processes. And Dr. Cressler's lab
- [00:02:30.680]and his research have been well funded by both NSF and the NIH. Dr. Cressler earned his PhD
- [00:02:39.000]from the University of Michigan, and after completing several postdoctoral fellowships,
- [00:02:43.520]he joined the School of Biological Sciences in the college in 2015. We're very thrilled
- [00:02:48.260]to have him in our community, so please join me in welcoming him here to the podium. Thank
- [00:02:53.500]you very much for coming. I've been telling everybody that I'm both very excited and very
- [00:03:09.680]nervous about this talk. I'm excited because I hope that by the end of the talk, I will
- [00:03:16.500]have at least convinced you a little bit that understanding the evolution of social behavior
- [00:03:23.260]is an interesting area of study, if not one that actually helps us to understand why we as humans behave
- [00:03:28.880]the way that we do. At the same time, I'm very nervous about this talk. I'm nervous
- [00:03:34.880]for a very relevant reason, which is that history shows us that when evolutionary thinking
- [00:03:43.300]is applied to humans, the results can sometimes be not just flawed from an evolutionary perspective
- [00:03:53.020]but genuinely harmful. I'm thinking specifically here of things like eugenics, but also the fallacious idea
- [00:03:58.260]that there's a biological basis to race. What I'm trying to do in this talk, by talking
- [00:04:04.600]about human behavior and applying this kind of an evolutionary lens to it, is both exciting
- [00:04:11.180]but also a little nerve-wracking, because I find myself, as I was putting this talk
- [00:04:15.580]together, trying to walk a very fine tightrope. I hope that, as my audience, you would be
- [00:04:22.780]forgiving of me if I don't quite stay on it. Nevertheless, I forge ahead. This is also
- [00:04:32.860]a little scary to me because I don't know anything about war or peace, so I find myself
- [00:04:42.060]giving a talk in this lecture series, but hopefully we'll be okay. I'm going to attempt to convey three big ideas
- [00:04:52.540]and one big caveat in my talk today. And my first big idea is that to the extent that social
- [00:04:59.900]behaviors affect the fitness of organisms and are encoded at least partially in the genes that those
- [00:05:07.580]organisms carry, both of which are true, that evolutionary biology provides a kind of a distinctive
- [00:05:13.260]lens for thinking about social behaviors in terms of their fitness costs and benefits. My second big idea
- [00:05:22.300]is that terms that we use really commonly in everyday language, terms like cooperation and
- [00:05:28.140]selfishness, altruism and spite, have very specific meanings within evolutionary biology.
- [00:05:35.340]These are not just nice or mean, but rather strategies that can be adaptive or not, depending on the local
- [00:05:43.820]circumstances. Which leads me to my third big idea: that social structures, environmental structures, who we
- [00:05:52.060]interact with, how we interact, can really dramatically affect the costs and benefits
- [00:05:57.500]of engaging in any kind of social behavior, and we do well to keep that in mind. My big caveat
- [00:06:05.020]flows right out of why I am nervous to give this talk. Be so, so careful in applying
- [00:06:11.820]any of these ideas that I'm going to talk about. I'm going to say true things today.
- [00:06:16.220]Nothing I'm going to say will not be true, but everything that I will say will be only very
- [00:06:21.820]carefully applied to an understanding of the complexity and richness of human behavior.
- [00:06:26.860]And I'll try very hard to point out those pitfalls as we approach them. But I challenge each of you
- [00:06:33.260]before you go sort of trying to do this on your own, be careful. Does this situation actually fit
- [00:06:40.620]in the rather narrow box that I'm going to open today?
- [00:06:45.660]Okay. All that said, let's dive in. So if we're going to attempt to
- [00:06:51.580]apply an evolutionary lens to social behavior, we need to settle on some terminology.
- [00:06:56.860]In particular, if social behaviors are the product of natural selection, that is,
- [00:07:01.100]they've evolved, then two things must be true. First, they have to affect fitness, right? Those
- [00:07:06.860]behaviors must have some effect on fitness. That is the number of copies of your genes
- [00:07:11.740]that you will leave in the next generation. And they must be determined by your genes.
- [00:07:17.260]And by your genes, let me be really clear here. We all have the same genes. None of
- [00:07:21.340]us have different genes, like different actual genes than anybody else. Where we differ is in
- [00:07:27.660]the particular version of each gene that each of us carries, right? All of us have, hopefully,
- [00:07:32.700]the genes to have built irises. But the color of your iris compared to the person sitting next to
- [00:07:39.020]you has everything to do with which version of the genes that you happen to be carrying.
- [00:07:44.460]So for social behaviors, what we're suggesting is that
- [00:07:47.180]the sets of genes that influence behavior, there's different versions of those genes.
- [00:07:51.100]And those different versions of genes interacting with each other and with their environment will
- [00:07:54.940]produce different sets of behaviors. And those behaviors will have fitness consequences.
- [00:07:59.420]If all of that is true, then our behavior evolves. It evolves over time. In particular,
- [00:08:06.140]so if June has a set of gene variants that cause her to engage in behaviors that increase
- [00:08:11.500]her fitness, whereas I have versions of the genes that cause me to engage in behaviors
- [00:08:15.820]that decrease my fitness relative to June, then June will leave more copies of her genes in the
- [00:08:20.860]next generation than I will, and the population over time will evolve to be more and more
- [00:08:26.380]behaviorally like June, and less and less behaviorally like me.
- [00:08:31.980]And I'm being very intentional in my choice of words here. Notice what I've said is that
- [00:08:36.300]copies of my genes in the next generation, not the number of descendants that I will leave.
- [00:08:41.820]I'm taking a very gene-centric view of evolution here, and there's a reason for
- [00:08:47.020]that which will become clear in a moment, because it turns out that this gene-centric view of kind
- [00:08:51.740]of seeing our bodies really as just vehicles that get our genes into the next generation,
- [00:08:56.940]a view that was kind of most famously advocated for by Richard Dawkins in his 1976 book The
- [00:09:02.940]Selfish Gene, that that view can actually help us to understand things that are otherwise quite
- [00:09:07.420]paradoxical in the world of behavior. Okay, with that foundation laid, what we can do is we can
- [00:09:16.620]begin to approach social interactions with this evolutionary lens. We can think about
- [00:09:21.960]the costs and benefits of engaging in any kind of social interaction. So, for example, if we think
- [00:09:27.980]about donors and recipients of behavior, a person doing that or an organism doing the behavior,
- [00:09:33.020]and then the recipient of that behavior, if the behavior has a fitness benefit to the donor,
- [00:09:39.380]but a fitness cost to the recipient, then we could call that behavior maybe selfishness,
- [00:09:44.100]right? This is, I will gain a benefit from this social interaction,
- [00:09:46.980]interaction that I'm having with you, but it will come at a cost to you. Conversely,
- [00:09:51.220]we might think about behaviors that carry a benefit for both the donor and the recipient.
- [00:09:55.600]We might call those behaviors cooperation, or mutual benefit if you prefer, if the term
- [00:10:00.740]cooperation carries too much extra baggage. And from an evolutionary perspective,
- [00:10:05.460]these behaviors are actually quite easy to understand because for the donor, the person
- [00:10:10.400]or organism engaging in the behavior, they're fitness beneficial. We would expect those behaviors to become more
- [00:10:16.960]common over time. But what's interesting when we look at the natural world is that
- [00:10:21.620]we need to hold in our heads that cooperation and selfishness are not fixed things over
- [00:10:27.620]time. They can change. So to give a really good kind of fun biological example of this,
- [00:10:31.700]I'm going to pull from a system that's probably very familiar to any of you who grew up on
- [00:10:35.860]a Nebraska farm. This is the interaction, the cooperative interaction that occurs between
- [00:10:41.200]nitrogen fixing bacteria and legumes like soybeans.
- [00:10:46.940]So, plants have an acute need for nitrogen. It's a really critical nutrient that fuels
- [00:10:52.220]how they grow. Unfortunately, the most common form of nitrogen on the planet is found in
- [00:10:56.760]the atmosphere, and it's useless to a plant. Plants cannot do anything with atmospheric
- [00:11:01.500]nitrogen. They need it to be transformed into a bioavailable form, but they can't do that
- [00:11:06.540]themselves. Bacteria, though, can do it. And so, these
- [00:11:10.960]bacteria, there has been an evolved mutualism, a cooperative relationship that persists over
- [00:11:16.920]time between legumes and these nitrogen-fixing bacteria. In exchange for fixing nitrogen
- [00:11:24.800]for the plant, the bacteria get to live inside of these nodules in the roots of the plant,
- [00:11:30.420]and the plant makes glucose through photosynthesis and gives some of that glucose to the bacteria.
- [00:11:36.680]Both pay a bit of a cost, right, for partnering in this way, but the benefit outweighs the
- [00:11:41.960]cost. But what we know is that that cooperative interaction is not fixed. It can
- [00:11:46.900]change. In particular, there are strains of those bacteria that cheat. They don't fix
- [00:11:52.800]nitrogen. They don't turn it into a bioavailable form, but they colonize those root
- [00:11:56.560]nodules anyway, shifting the interaction from cooperation to selfishness.
- [00:12:03.460]The plant responds in kind. It can do things like cut off oxygen to the root nodules to
- [00:12:08.400]literally choke out the bacteria that are living there and cause them to leave the nodules.
- [00:12:13.860]You can also shift the interaction by changing the environment.
- [00:12:16.880]If you take that soybean plant and you plant it in soil that's been heavily fertilized
- [00:12:21.360]with nitrogenous fertilizers, for example, they don't need the rhizobium bacteria that
- [00:12:25.820]do the nitrogen fixing anymore. They'll kick them out anyway because the benefit that they
- [00:12:30.520]receive from the bacteria has now been reduced, and so the cost that they pay isn't worth
- [00:12:35.760]it.
- [00:12:36.760]Although we can understand these behaviors pretty easily because they are beneficial
- [00:12:41.820]to the donor, it's really important to keep in mind that selfishness and cooperation
- [00:12:46.860]are not static. They can change over time.
- [00:12:51.880]More interesting, though, and paradoxical from an evolutionary perspective, are the
- [00:12:56.840]behaviors that exist along the top row of our interaction matrix because these are behaviors
- [00:13:01.580]that involve a fitness cost to the donor. In particular, if the donor pays a fitness
- [00:13:08.040]cost to provide a fitness benefit to the recipient of the behavior, we call that altruism. Altruism
- [00:13:16.840]in the evolutionary sense, not in the sense of just being helpful, but an actual, I am paying a
- [00:13:22.760]significant fitness cost in order to help another individual to aid their fullness. This is a very
- [00:13:30.760]weird thing to exist in nature, and yet it does. There are many examples of altruistic behavior
- [00:13:37.960]in nature. For example, in the top left there, that's a Belding's ground squirrel engaging in alarm
- [00:13:44.860]calling. This is pretty common.
- [00:13:46.820]Across lots of different animals, when one individual in a community sees a predator,
- [00:13:51.940]it often stands up and starts making a call that alerts all of his fellows, "There's a
- [00:13:56.900]predator nearby." Even though we know empirically that attracts the attention of the predator,
- [00:14:02.160]makes that individual much more likely to be killed by the predator, than it would have
- [00:14:06.700]been if it had just gone, "Oh, a predator," and ran away, going like, "Well, nobody else
- [00:14:12.060]saw it." A second example on
- [00:14:16.800]the bottom left is an example of cooperative breeding. Over 220 species of birds and 120
- [00:14:23.460]species of mammals engage in cooperative breeding. Cooperative breeding very specifically is
- [00:14:28.460]when an individual who is totally capable of having babies of its own chooses not to
- [00:14:34.800]have babies in order to help some other individual in the community reproduce. It's foregoing
- [00:14:41.060]its own fitness to improve the fitness of another. The most extreme examples of that
- [00:14:46.780]altruism come from eusocial insects like ants and bees. These are organisms where there
- [00:14:53.280]are whole groups of individuals that will never reproduce. They're actually sterile.
- [00:14:58.020]They'll never have any reproduction of their own. Their fitness is zero from an evolutionary
- [00:15:02.800]perspective. They spend their whole lives engaging in work or protection of the nest,
- [00:15:07.960]bringing resources to a queen who does all the reproduction for the entire colony. We
- [00:15:14.280]even have even weirder examples than that.
- [00:15:16.760]Like the slime mold on the bottom right. So slime molds typically live as individual independent
- [00:15:23.160]single cells. But when the environment that they are inhabiting deteriorates, they form
- [00:15:29.220]what's called a fruiting body. And you can actually see them. I know I'm not supposed
- [00:15:33.360]to, but I'm gonna walk over here for a second. You can actually see the individual cells
- [00:15:37.260]making up that fruiting body. All of these cells are dead.
- [00:15:41.140]They have sacrificed their own ability to reproduce, to form a stalk that lifts the
- [00:15:46.740]other cells up so that they can produce spores that will then be dispersed away, hopefully
- [00:15:52.060]to better environments by the wind. They have given up all of their fitness in order to
- [00:15:57.180]help some other individual reproduce. That's so strange. All of those things are so strange
- [00:16:03.920]from a simple evolutionary perspective that says that natural selection acts to maximize
- [00:16:09.300]fitness. How does this, any of this, maximize fitness? It doesn't make sense.
- [00:16:16.720]The intuition for why this happens, why this might evolve, comes from a tongue-in-cheek quote
- [00:16:24.540]often attributed to J.B.S. Haldane, although nobody's actually sure if he really said it,
- [00:16:28.760]which is this: "I am prepared to lay down my life for eight cousins or two brothers."
- [00:16:35.960]Why does that make some sense? If we think about ourselves, each of us in our cells,
- [00:16:44.040]we carry 23 pairs of chromosomes.
- [00:16:46.700]One set of chromosomes you inherit from your mom, one set of chromosomes you inherit from
- [00:16:52.640]your dad.
- [00:16:53.640]On average, a full sibling will carry about 50% of the gene variants that you carry.
- [00:17:00.520]If you could sacrifice your life to save two siblings, well, potentially, there's still
- [00:17:09.020]the same opportunity for all of the gene variants that are in you to get passed on.
- [00:17:16.680]You are slightly less related to your first cousins, about an eighth, given the vagaries
- [00:17:21.580]of genetic inheritance.
- [00:17:24.420]You'd have to sacrifice your life for a lot more cousins to have the same opportunity
- [00:17:29.220]for the particular gene variants that you carry to get passed on.
- [00:17:35.440]What this quote does for us is it enlarges our perspective on what is fitness.
- [00:17:40.880]If you were coming into this, you might think, well, fitness is about my reproduction.
- [00:17:45.060]What this says is, no, no, no, if evolution is about getting copies of your genes into
- [00:17:51.900]the next generation, well, you can do that by reproducing yourself, or you can do it
- [00:17:57.720]by helping genetic relatives to reproduce.
- [00:18:01.180]Not just your direct fitness, but what we call your inclusive fitness.
- [00:18:04.440]Your direct fitness plus that indirect fitness that you get from helping a genetic relative
- [00:18:10.460]to reproduce, kind of weighted by how closely related they are to you.
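To make that bookkeeping concrete, here is a minimal Python sketch of inclusive fitness as just described: direct fitness plus indirect fitness from helping relatives, weighted by relatedness. The function name and numbers are illustrative, not anything from the talk.

```python
def inclusive_fitness(direct_offspring, helped_relatives):
    """Direct fitness plus relatedness-weighted indirect fitness.

    helped_relatives: list of (relatedness, extra offspring they gain from your help).
    """
    indirect = sum(r * extra for r, extra in helped_relatives)
    return direct_offspring + indirect

# Haldane's quip, roughly: dying childless (0 direct offspring) while enabling
# two full siblings (r = 0.5 each) to raise two extra offspring apiece leaves
# the same number of "offspring equivalents" as having two children yourself.
print(inclusive_fitness(0, [(0.5, 2), (0.5, 2)]))  # 2.0
print(inclusive_fitness(2, []))                    # 2
```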
- [00:18:14.760]This idea was expanded and elaborated on, and really applied to the concept of altruism
- [00:18:21.920]by W.D. Hamilton in the 1960s, who came up with a really actually wonderfully simple
- [00:18:29.720]expression to describe when we would expect a gene variant that causes an organism to
- [00:18:36.660]behave altruistically to evolve and spread through a population.
- [00:18:42.620]This insight is one of the most important insights
- [00:18:44.740]in evolutionary biology since Darwin.
- [00:18:48.040]And it's very simple.
- [00:18:50.040]It just says this: a gene that causes altruism will spread if RB > C.
- [00:18:55.540]What are these?
- [00:18:56.540]B, that's the fitness benefit to the recipient of the altruistic act.
- [00:19:02.320]C is the fitness cost to the donor of the altruistic act.
- [00:19:07.860]And R is the relatedness of the donor and the recipient, so if the benefit is big and
- [00:19:14.720]the cost is small or if the relatedness is high, then a gene for altruism can spread,
- [00:19:21.060]suggesting that altruism is more likely to evolve and be maintained in populations comprised
- [00:19:26.180]of relatives.
- [00:19:27.180]We know that we can show this from this very simple math, but actually empirical experiments
- [00:19:31.380]and empirical observations back it up.
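As a quick worked check of that inequality, here is a tiny, hypothetical Python helper; the numbers are made up for illustration and are not from the talk.

```python
def altruism_spreads(r, b, c):
    """Hamilton's rule: a gene for altruism can spread when r * b > c."""
    return r * b > c

# Helping a full sibling (r = 0.5) pays off only if the recipient's benefit is
# more than twice the donor's cost; for a first cousin (r = 0.125) the benefit
# would have to be more than eight times the cost.
print(altruism_spreads(r=0.5, b=3.0, c=1.0))    # True
print(altruism_spreads(r=0.125, b=3.0, c=1.0))  # False
```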
- [00:19:34.640]Conversely, as populations become more genetically diverse, relatedness goes down, so the benefit
- [00:19:41.660]is more likely to be felt by an unrelated individual.
- [00:19:44.700]That opens the space for cheaters to invade and displace all of
- [00:19:50.920]those altruists and move back to a more selfish society where fitness is really just about
- [00:19:56.380]your direct fitness.
- [00:19:59.560]Interestingly, in all of those examples that I showed, what we know from empirical observation
- [00:20:07.480]is that in all of those cases, relatedness is very high.
- [00:20:12.400]In the cases of eusocial insects, it's a single
- [00:20:14.680]reproducing queen, reproducing with a single male.
- [00:20:18.360]Everyone in that colony is a sibling.
- [00:20:21.120]And by helping mom reproduce, they are literally as genetically related to all of their siblings
- [00:20:25.820]as they would be to their own children.
- [00:20:28.120]And so forsaking their own reproduction doesn't matter, because they're all getting their
- [00:20:33.600]genes into the next generation anyway.
- [00:20:36.080]Cooperative breeding in birds typically only forms, there's always exceptions, among monogamously
- [00:20:41.840]pair-bonded birds.
- [00:20:44.660]The helping bird is typically a younger child, a younger chick, helping mom and dad to have
- [00:20:52.860]more siblings.
- [00:20:54.120]She's equally genetically related to all of those siblings as she would be to her own
- [00:20:58.320]offspring.
- [00:20:59.320]So forsaking her own reproduction to help a more mature, experienced breeder to reproduce
- [00:21:03.640]can actually be fitness beneficial.
- [00:21:07.440]Okay, what about our final square?
- [00:21:12.480]This one describes a situation
- [00:21:14.640]where donors pay a fitness cost in order to inflict harm on the recipient.
- [00:21:21.300]So they pay a fitness cost to exact a fitness cost on their interactor.
- [00:21:27.300]Evolutionarily, we can call this spite, because you're actually spiting yourself,
- [00:21:33.140]spiting your own fitness in order to inflict pain.
- [00:21:37.300]When would this evolve?
- [00:21:38.300]We can't, surely we can't use Hamilton's rule in this circumstance to explain
- [00:21:44.620]these kinds of behaviors, but it turns out that actually we can, with a slight reimagining.
- [00:21:52.300]When I described R before, I used it in a normal sense of relatedness, as in how closely
- [00:22:00.060]related are you to a sibling.
- [00:22:02.240]Technically, what it is, is how closely related are donor and recipient compared to the average
- [00:22:07.980]relatedness of anybody in the population.
- [00:22:10.420]You can be both positively related if you're more related.
- [00:22:14.600]On average.
- [00:22:15.600]Or you could actually be negatively related to other individuals in your population if
- [00:22:21.140]you're less genetically related to them than on average.
- [00:22:24.520]In that case, where you have a negative R and a negative B, because I'm not providing
- [00:22:30.060]a benefit if I'm engaging in spiteful behavior, I'm providing a harm.
- [00:22:36.360]You can still satisfy Hamilton's rule, and spite can evolve in a certain sense.
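Running the same inequality with the signs the talk describes shows how the arithmetic works out for spite. This is only an illustrative sketch, with relatedness measured relative to the population average so that it can go negative; the numbers are mine.

```python
def hamilton_satisfied(r, b, c):
    """Generalized Hamilton's rule: the behavior can spread when r * b > c."""
    return r * b > c

# Spite: the recipient receives a harm (b < 0) and the donor is *less* related
# to the recipient than to the population at large (r < 0), so r * b is
# positive and can exceed the fitness cost c the donor pays to inflict it.
print(hamilton_satisfied(r=-0.5, b=-4.0, c=1.0))  # True: spite can spread
print(hamilton_satisfied(r=0.25, b=-4.0, c=1.0))  # False: harming relatives doesn't
```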
- [00:22:41.320]There actually are cool examples of spite.
- [00:22:44.580]In nature, this one is one that I will-- oh, I want to pause for one second and say, spite
- [00:22:53.180]in nature is often actually a form of secondary altruism.
- [00:22:57.780]Because spite is often directed at unrelated individuals in order to provide a secondary
- [00:23:03.140]benefit to related individuals, for example, by killing off competitors so that your relatives
- [00:23:09.120]can do better because they have less competition.
- [00:23:11.900]Spite is quite often secondary altruism.
- [00:23:14.560]Here's a really cool example.
- [00:23:16.100]This is work done by a postdoc in my lab, Dr. Dan Metz.
- [00:23:21.420]So what you see here is actually two parasites interacting with each other.
- [00:23:26.800]The bigger one is Philophthalmus gralli, and the smaller ones are another parasite
- [00:23:32.620]called Haplorchis pumilio.
- [00:23:34.560]And what's happening there is that Haplorchis is literally biting into the side of the Philophthalmus
- [00:23:42.980]and sucking its guts out.
- [00:23:44.540]Why?
- [00:23:45.540]It is attacking that other parasite.
- [00:23:48.940]Why?
- [00:23:49.940]Because these two parasites very often share hosts.
- [00:23:54.040]They're found co-infecting the same host, often freshwater snails, and the Haplorchis
- [00:24:01.220]is attacking the other one to protect its relatives.
- [00:24:04.600]Why is this spite?
- [00:24:05.980]Because those soldiers are sterile.
- [00:24:08.720]They have no gonads, they can never reproduce, they will spend their whole life, their whole
- [00:24:14.520]life, attacking and sucking the guts out of other parasites so that their siblings that
- [00:24:20.420]are reproductively capable can reproduce.
- [00:24:23.640]It's spite as a form of secondary altruism.
- [00:24:27.840]And lots of other examples of spite in nature follow a similar pattern from soldier casts
- [00:24:32.620]of our eusocial insects that just patrol the area attacking anything else that gets close
- [00:24:37.640]to an ant nest, for example, or even more crazy things like the bacterium Pseudomonas
- [00:24:44.500]aeruginosa, which can produce a toxin that can kill other bacteria, but the only way to release the
- [00:24:50.220]toxin is by exploding itself.
- [00:24:52.500]So individual bacterial cells will explode themselves, flood the environment with a toxin
- [00:24:56.940]that kills competitors, but not closely related individuals that have the gene variants necessary
- [00:25:01.720]to make the antitoxin.
- [00:25:03.940]Okay, what have we learned so far?
- [00:25:09.620]First thing we've learned: cooperation and altruism,
- [00:25:14.480]behaviors that we might like to encourage, right, if we're thinking about trying to apply
- [00:25:18.500]this to humans, are not fixed over ecological or evolutionary timescales.
- [00:25:24.160]Change the environment, change the genetic structure of the population.
- [00:25:28.040]You can shift the costs and benefits of any behavior, causing cooperation to become selfishness,
- [00:25:34.720]or spite to be favored over altruism.
- [00:25:40.580]Second, really important to keep in mind that, from
- [00:25:44.460]an evolutionary perspective, cooperation is not altruism, and selfishness is not spite.
- [00:25:50.180]We might use those terms interchangeably in our everyday speech, but they are really different
- [00:25:55.940]from an evolutionary perspective, because cooperation and selfishness are behaviors
- [00:26:00.360]that incur a fitness benefit to the actor, the donor of those behaviors.
- [00:26:04.520]They're easy to understand.
- [00:26:06.340]Spite and altruism are much harder to understand, because you have to pay a fitness cost to
- [00:26:10.700]engage in those behaviors.
- [00:26:12.440]Thirdly, altruism is more easily maintained when donors and recipients are closely related,
- [00:26:20.980]whereas spite can emerge in populations where genetic relatedness is highly variable, especially
- [00:26:26.180]when spite against non-relatives benefits relatives.
- [00:26:29.820]Okay, this is the first screaming, flaming, bright red stoplight.
- [00:26:36.100]Be so careful.
- [00:26:42.420]It does not suggest that the best way to increase cooperation or altruism in human society is
- [00:26:44.480]by somehow trying to artificially make our communities much more genetically homogeneous.
- [00:26:50.680]That is not what emerges from this, right?
- [00:26:53.640]First off, we should know from hundreds of years of royal birth defects that increasing
- [00:27:01.240]genetic relatedness among human populations is a ticket to inbreeding and very bad genetic
- [00:27:07.240]outcomes.
- [00:27:08.240]But even more importantly, well, equally importantly, what matters here is actual
- [00:27:12.400]genetic relatedness, which is not the same as, like, how similar do I look
- [00:27:22.160]to the person who's standing next to me?
- [00:27:23.980]We are very bad as humans at judging genetic relatedness actually, we're uniquely bad.
- [00:27:30.460]One very weird example relating to our immune systems, we cannot tell whether someone is
- [00:27:36.880]our sibling or not.
- [00:27:40.780]Mice can.
- [00:27:42.380]Mice at two ends of the cage, they can't see each other, they can smell whether that
- [00:27:45.820]other mouse is a genetic sibling.
- [00:27:47.620]We can't do that.
- [00:27:48.620]And if you think about human history, you might say, "Well, I know what I look like,
- [00:27:52.400]and I know what my sister looks like, so I can tell."
- [00:27:55.440]Think back 100,000 years when our behaviors would have been shaped by evolution, you don't
- [00:28:00.160]know what you look like 100,000 years ago.
- [00:28:03.760]If you didn't watch your sibling be born from a woman that you identify as your mother,
- [00:28:09.200]you don't know if you're related to her or not.
- [00:28:12.360]You don't know because you don't know if her dad and your dad were the same dad.
- [00:28:17.740]So we have to be very careful because Hamilton's rule only applies if genetic relatedness is
- [00:28:23.660]high, not our perception of whether genetic relatedness is high.
- [00:28:27.180]So that's a second really important way that this doesn't work.
- [00:28:33.040]More plausibly, if we're going to try to think about how or if these ideas apply, maybe they
- [00:28:41.340]helped shape evolution.
- [00:28:42.340]Maybe they helped shape our behavior in our earliest human history, when we would have lived in
- [00:28:46.700]small bands of relatively closely related individuals interacting with other small bands
- [00:28:53.460]of closely related individuals.
- [00:28:55.740]Maybe, maybe, and only very carefully, would we try applying some of these insights, which
- [00:29:01.400]are all true, to understanding human evolution.
- [00:29:06.100]Maybe those kinds of interactions set us up to be a species that can engage in altruism.
- [00:29:12.320]To understand why we behave the way we do now, I think we need to turn to a slightly different branch
- [00:29:17.880]of evolutionary biology, which will be familiar to at least some of you who have seen
- [00:29:22.640]this, which is game theory.
- [00:29:25.100]This is the most famous game in all of evolutionary game theory.
- [00:29:28.760]This is the prisoner's dilemma.
- [00:29:30.500]Probably many of you have seen or heard the prisoner's dilemma described, even if you've
- [00:29:34.780]never played.
- [00:29:35.780]The setup is this, right?
- [00:29:37.700]Two criminals are caught, A and B. Prosecutors don't have enough evidence.
- [00:29:42.300]They don't have evidence to convict both of them on a big crime, but they would if one
- [00:29:48.660]of them, if they can get one of them to testify.
- [00:29:50.560]They take them and put them in separate rooms, and they say to both of them the same thing.
- [00:29:56.280]If you both stay quiet, you're both going to jail for a year. But if you, A,
- [00:30:04.680]testify, and B stays quiet, well, you go free,
- [00:30:08.800]and B will go to jail for three years. Of course, B is hearing
- [00:30:13.300]the same thing, right? If I testify and he stays
- [00:30:17.260]silent, I walk free, and he'll go to jail for three years.
- [00:30:20.420]But if we both squeal, we both go to jail
- [00:30:25.400]for two years. This is the prisoner's dilemma. What do you do if you were faced
- [00:30:29.360]with this situation?
- [00:30:33.740]What I want you to do is this, and we're
- [00:30:36.180]really going to do it. So turn to somebody who's sitting near you, ideally somebody you don't know. Don't introduce
- [00:30:40.360]yourself. Say nothing to them. You have been offered this deal,
- [00:30:44.240]the two of you. What I want you to do is think about what's the strategy that you
- [00:30:48.420]should employ that maximizes your benefit.
- [00:30:51.100]All right? So you're ready? On three,
- [00:30:56.260]on the count of three, and then you're going to reveal out loud whether
- [00:31:00.380]you defected, you testified, you dirty rat,
- [00:31:03.620]or whether you stayed silent.
- [00:31:06.300]Kept the code. All right? One,
- [00:31:11.040]two, three, go.
- [00:31:14.160]How many of you cheated? How many of you said, I'm testifying?
- [00:31:31.240]How many of you cheated?
- [00:31:33.240]How many of you stayed quiet?
- [00:31:35.000]You're so nice.
- [00:31:37.240]It turns out that actually from an evolutionary perspective,
- [00:31:41.660]the best strategy is defect. Testify.
- [00:31:45.320]Because your expected payout by squealing is better than your expected payout
- [00:31:51.880]if you stay quiet, right?
- [00:31:52.940]If we kind of go across the top row and look at A's payout, well,
- [00:31:56.440]what do they get if they stay quiet? They get a negative one.
- [00:31:59.000]If they stay quiet and B testifies, though, they get a negative three.
- [00:32:02.680]So,
- [00:32:03.280]on average, it's like a negative two. Whereas if they testify, well,
- [00:32:07.280]in the case that B stays quiet, rock on, you pay nothing.
- [00:32:11.100]Whereas if they both squeal, you get a negative two.
- [00:32:13.960]So, on average, it's a negative one.
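For anyone who wants to check that arithmetic, here is a minimal Python sketch of the payoff table described above, with years in jail written as negatives; the dictionary layout and names are mine, not from the talk.

```python
# A's payoff (negative years in jail) for each pair of choices (A's, B's).
payoff_A = {
    ("quiet", "quiet"): -1, ("quiet", "testify"): -3,
    ("testify", "quiet"): 0, ("testify", "testify"): -2,
}

# Averaging over what B might do, testifying always comes out ahead.
for my_choice in ("quiet", "testify"):
    expected = sum(payoff_A[(my_choice, b)] for b in ("quiet", "testify")) / 2
    print(my_choice, expected)  # quiet -2.0, testify -1.0
```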
- [00:32:15.100]So, your best strategy is actually to defect. You all are terrible at maximizing your
- [00:32:20.880]expected evolutionary payout, except for those of you who defected.
- [00:32:23.520]You guys are smart. Okay. Let's play again.
- [00:32:27.840]But this time, I want you to turn to somebody different,
- [00:32:30.080]and we're going to play over and over and over.
- [00:32:32.340]Okay. So, turn to somebody new.
- [00:32:37.780]You can actually introduce yourself this time,
- [00:32:39.580]because you're going to keep playing.
- [00:32:40.520]And then on three, I'm going to count to three,
- [00:32:44.580]and you're going to declare what you're doing.
- [00:32:46.600]Okay. Ready? One, two, three.
- [00:32:51.280]Reveal. All right.
- [00:32:53.200]All right. Ready?
- [00:33:01.660]We're going to play again.
- [00:33:02.340]Okay. One, two, three.
- [00:33:04.720]Reveal.
- [00:33:09.000]One, two, three.
- [00:33:11.420]Reveal.
- [00:33:12.020]One, two, three.
- [00:33:17.940]Reveal.
- [00:33:18.460]One, two, three.
- [00:33:25.260]Reveal.
- [00:33:25.800]Okay. All right.
- [00:33:28.400]We'll stop.
- [00:33:29.720]A couple of things here.
- [00:33:31.480]Okay.
- [00:33:31.680]So how many people?
- [00:33:32.320]How many people ended up cooperating by the end?
- [00:33:35.280]We weren't keeping track of points here.
- [00:33:42.140]Perhaps I should have asked you to actually track your score.
- [00:33:45.340]It might have altered your perspective on what to do.
- [00:33:48.720]So there's a couple of key things to this.
- [00:33:50.040]What we just did is called the iterated prisoner's dilemma,
- [00:33:53.880]because you're going to play over and over and over again.
- [00:33:56.640]The same part.
- [00:33:57.660]And you can remember what they did the last time.
- [00:34:01.000]And importantly, it's
- [00:34:02.300]iterated and unceasing, right?
- [00:34:04.220]I didn't tell you how many times you were going to play.
- [00:34:06.620]I just said you're going to keep playing.
- [00:34:07.920]You're going to keep playing until I tell you to stop.
- [00:34:11.920]And there's been a lot of work on iterated prisoner's dilemma.
- [00:34:15.220]It's one of the best-studied models in evolutionary game theory.
- [00:34:18.800]And we know from many, many decades of work on it
- [00:34:21.680]that actually the best strategy when engaging in this
- [00:34:25.160]is a really simple one.
- [00:34:26.340]It's called tit-for-tat.
- [00:34:28.360]In tit-for-tat, I offer to cooperate initially, and then I just
- [00:34:32.280]mirror whatever my partner does.
- [00:34:34.780]If they cheat, then I cheat.
- [00:34:37.320]If they cooperate, then I cooperate.
- [00:34:39.160]And we just keep going back and forth.
- [00:34:40.900]And it can be shown and has been shown many, many times.
- [00:34:43.260]There's a very famous computer tournament in the '80s
- [00:34:48.900]at the University of Michigan.
- [00:34:50.160]I actually took a class with Bob Holland, or Bob Axelrod,
- [00:34:54.720]who ran the tournament that showed this. Everybody submitted
- [00:34:58.200]these elaborate, often really elaborate programs of what
- [00:35:02.260]their strategy was going to be
- [00:35:03.700]to win this evolutionary game.
- [00:35:05.720]And it turned out the winning strategy was a one-liner.
- [00:35:08.020]It was literally just cooperate, then mirror.
- [00:35:11.080]That's it.
- [00:35:11.740]And that strategy beat all of the others.
- [00:35:13.700]And every time they do these tournaments,
- [00:35:16.580]even when they make the situation more complex,
- [00:35:19.620]at the core of the winning strategy is tit for tat.
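Here is a small, self-contained Python sketch of tit-for-tat in an iterated prisoner's dilemma. The payoff values (3, 0, 5, 1) are the conventional ones from this literature rather than numbers from the talk, and the strategy functions are illustrative.

```python
def tit_for_tat(my_history, their_history):
    """Cooperate on the first move, then copy the partner's last move."""
    return "cooperate" if not their_history else their_history[-1]

def always_defect(my_history, their_history):
    return "defect"

# Payoff to the row player: mutual cooperation 3, sucker 0, temptation 5, mutual defection 1.
PAYOFF = {("cooperate", "cooperate"): 3, ("cooperate", "defect"): 0,
          ("defect", "cooperate"): 5, ("defect", "defect"): 1}

def play(strategy_a, strategy_b, rounds=20):
    hist_a, hist_b, score_a, score_b = [], [], 0, 0
    for _ in range(rounds):
        move_a = strategy_a(hist_a, hist_b)
        move_b = strategy_b(hist_b, hist_a)
        score_a += PAYOFF[(move_a, move_b)]
        score_b += PAYOFF[(move_b, move_a)]
        hist_a.append(move_a)
        hist_b.append(move_b)
    return score_a, score_b

print(play(tit_for_tat, tit_for_tat))    # (60, 60): sustained mutual cooperation
print(play(tit_for_tat, always_defect))  # (19, 24): it retaliates after round one
```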
- [00:35:25.000]And what is tit for tat from our perspective?
- [00:35:28.500]It's not altruism.
- [00:35:30.700]It's really more reciprocity.
- [00:35:32.240]It's, I scratch your back or pick parasites out of your fur.
- [00:35:39.520]You pick parasites out of my fur.
- [00:35:42.800]I'm not sacrificing fitness forever.
- [00:35:45.500]I'll pay a little fitness cost to help you now in exchange for you paying a little fitness
- [00:35:49.200]cost later to help me.
- [00:35:50.900]And social grooming is a really classic example of reciprocity.
- [00:35:56.240]Actually also things like vampire bats that will share blood meals with unrelated babies
- [00:36:02.220]with the knowledge, the sure knowledge, that tomorrow my neighbor may share, my unrelated
- [00:36:10.600]neighbor may share some blood meals with my baby.
- [00:36:19.540]Other games, though, can reveal how the best strategy can depend on who you're playing
- [00:36:25.780]with.
- [00:36:26.780]So we're going to play it.
- [00:36:27.780]We're going to consider it.
- [00:36:28.780]We're not going to play this one, but we're going to consider it.
- [00:36:30.480]This is called the ultimatum game.
- [00:36:32.200]So in the ultimatum game, you have two roles, a proposer and a responder.
- [00:36:37.660]And the proposal makes an offer of how to divide a shared resource.
- [00:36:41.420]And they can make a fair offer, 50/50, or they can make an unfair offer where they decide
- [00:36:46.520]to take it.
- [00:36:47.520]I will take 90% of this, thank you very much, and you can have the other 10.
- [00:36:51.620]And then the responder can accept or reject that offer.
- [00:36:54.440]If they accept, then everybody gets what they get.
- [00:36:56.820]And if they reject, then both players get nothing.
- [00:36:58.820]So there's a cost associated with rejecting.
- [00:37:02.180]And so you can say, well, given that, everybody's strategy space can be divided up by what kind
- [00:37:09.420]of offers do they make and what kind of offers will they accept.
- [00:37:12.680]And so S1 is they only make unfair offers, but they'll accept any offer that's made to
- [00:37:19.460]them.
- [00:37:20.600]So it's a selfish, but a pragmatic selfish strategy.
- [00:37:26.640]S2 is a spiteful strategy.
- [00:37:32.160]Makes unfair offers, but rejects unfair offers that are made to them.
- [00:37:36.660]It's a strategy that is seeking to minimize the payoff for their opponent.
- [00:37:42.320]Whatever they do, they want to hurt their opponent, even if it comes at a cost to them,
- [00:37:45.940]because they reject the offer, which means they get nothing, anytime somebody reciprocates
- [00:37:50.720]and makes an unfair offer to them.
- [00:37:52.820]S3, we might consider like the altruistic strategy.
- [00:37:56.280]It's the one that says, I'll give you half of everything and I'll take whatever you'll
- [00:38:00.140]give me.
- [00:38:01.140]So they pay a cost.
- [00:38:02.140]They pay a cost because they're willing to accept a crappy offer and they're always willing
- [00:38:08.040]to make a fair offer, even in a situation where they might be playing against somebody
- [00:38:12.580]who would accept any offer that they would make.
- [00:38:15.000]They could offer 1% and that person might take it, but they don't do it.
- [00:38:18.900]They only make fair offers.
- [00:38:21.020]And strategy four is, of course, then to be fair, to make fair offers, but to reject when
- [00:38:26.540]somebody makes an unfair offer to you, which might be the one where you sort of feel the
- [00:38:30.120]most like, yeah.
- [00:38:32.120]That seems right.
- [00:38:33.120]I'll be fair, but if you're going to be a jerk to me, I'm going to be a jerk right now.
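To make the four strategies concrete, here is a small, illustrative Python sketch; the 90/10 split comes from the description above, while the labels and the threshold encoding are my own.

```python
# Each strategy: (share it offers as proposer, minimum share it accepts as responder).
# "Unfair" here means keeping 90% and offering 10%.
STRATEGIES = {
    "S1 selfish":    (0.1, 0.0),   # unfair offers, accepts anything
    "S2 spiteful":   (0.1, 0.5),   # unfair offers, rejects unfair offers
    "S3 altruistic": (0.5, 0.0),   # fair offers, accepts anything
    "S4 fair":       (0.5, 0.5),   # fair offers, rejects unfair offers
}

def payoff(proposer, responder):
    """Return (proposer_share, responder_share) for one round of the ultimatum game."""
    offer, _ = STRATEGIES[proposer]
    _, min_accept = STRATEGIES[responder]
    if offer >= min_accept:
        return 1 - offer, offer
    return 0.0, 0.0  # rejection: both players get nothing

print(payoff("S1 selfish", "S3 altruistic"))  # (0.9, 0.1): unfairness pays off
print(payoff("S1 selfish", "S2 spiteful"))    # (0.0, 0.0): spite punishes it at a cost
```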
- [00:38:38.820]So people have explored this model in spaces where you sort of start a whole population
- [00:38:45.220]of individuals, each of which have a unique set of strategies, and you can watch as a
- [00:38:50.680]strategy like sort of evolve over time, and we can look for the set, the strategy or set
- [00:38:56.320]of strategies that does the best, that maximizes the payoff, the sort of evolutionarily stable
- [00:39:02.100]strategy to use a little bit of jargon. And what we know is that in the case where who you
- [00:39:10.260]interact with is totally at random, right? So there's no structure or assortment to who you
- [00:39:17.020]interact with. Whatever my strategy is, I interact with somebody from the population at random.
- [00:39:21.360]What wins in that case is S1: make unfair offers, accept any. That's the strategy that
- [00:39:27.640]has the highest expected payoff over everything else.
- [00:39:32.080]It beats every other strategy, and when everybody is playing that strategy, you can't do better
- [00:39:37.520]than just doing that. That's sort of sad because that's a world where unfairness reigns. You only
- [00:39:45.580]make unfair offers and you always get stuck with unfair offers. Lots of people have been really
- [00:39:50.800]interested in, well, what are the conditions that might promote behaviors that aren't,
- [00:39:55.080]but actually lead to a little bit more fairness? One way that we can get that,
- [00:40:02.060]is by leaning back on Hamilton's theory and saying, well, what if you're more likely to
- [00:40:08.660]interact with individuals that have a similar strategy to you, positive assortment? That's
- [00:40:14.200]like saying I interact with close genetic relatives to the extent that these strategies
- [00:40:18.940]might be genetic. They are in these simulations. Then what happens is, well, you can still get in
- [00:40:25.860]the unfair, sad world, but there also exists a possibility of going to kumbaya,
- [00:40:32.040]where everybody is altruistic and nice and they only make fair offers. When the population is a
- [00:40:37.700]mix of S3 and S4, individuals that only make fair offers, it doesn't really matter what their
- [00:40:44.300]strategy is for declining offers because they never decline. They're always just
- [00:40:48.260]happy because everybody's making fair offers and accepting them. That's great.
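One common way these simulations implement positive assortment is simply to bias who you get paired with. Here is a hypothetical sketch, building on the strategy labels above; the function and parameter names are mine, not from any particular paper.

```python
import random

def draw_partner(my_strategy, population, assortment):
    """Positive assortment: with probability `assortment` you meet someone playing
    your own strategy; otherwise you meet a random member of the population."""
    if random.random() < assortment:
        return my_strategy
    return random.choice(population)

# With assortment = 0, partners are drawn at random (the unfair world tends to win);
# as assortment rises, fair strategies increasingly meet each other and can persist.
population = ["S1 selfish"] * 50 + ["S4 fair"] * 50
print(draw_partner("S4 fair", population, assortment=0.8))
```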
- [00:40:51.940]What happens in the opposite case, where Hamilton's rule says spite could evolve,
- [00:40:57.380]where you are actually more likely to interact with an individual whose
- [00:41:02.020]strategy is different than yours, where relatedness in that sense is negative.
- [00:41:06.740]Interestingly, what happens in that case is actually there's no possibility of getting
- [00:41:13.700]S1 evolving. What happens is a stable mix of S2, the spiteful strategy, and S3,
- [00:41:22.320]which is very strange and interesting. So on some level, we're in a slightly fairer world
- [00:41:28.520]because now there are actually individuals out there that make
- [00:41:32.000]fair offers and will accept anything. But it's a weird world where we're sort of a mix of people
- [00:41:39.000]who behave very spitefully and people who behave very altruistically. And the reason that this
- [00:41:44.540]world exists is because we expect that spite can do well in this situation, but actually it only
- [00:41:51.120]works if there's somebody to be spiteful towards. So spite can never completely exclude those
- [00:41:56.900]altruists. They need them. The only way for spite to work is if there's somebody to be spiteful
- [00:42:01.980]towards. And they can't be spiteful towards anybody else because anybody else does better
- [00:42:06.120]than they do in that role. Okay, I want to use the last couple of minutes that I have
- [00:42:15.100]to talk about one last study. And it's the one that gets us the closest but also the
- [00:42:19.740]most dangerously close to the theme of this lecture today. So this is a very famous paper
- [00:42:26.660]that I'm going to talk about. It's been cited over two thousand times since it was
- [00:42:31.960]published in the mid-2000s. And they're really setting up this model for understanding the
- [00:42:38.120]evolution of behavior, really thinking about earliest human populations or small groups
- [00:42:43.080]of individuals that interact with each other. And so what they set up is this situation
- [00:42:48.160]where the behavior of individuals is controlled by two genes. One gene controls
- [00:42:52.400]how you interact with others outside your group. And you can be either tolerant towards
- [00:42:57.400]them or you can be parochial towards them. If you're tolerant, then when you're tolerant
- [00:43:01.940]and your group interacts with another group, there's some possibility that you'll exchange
- [00:43:05.260]research. You'll get some benefit out of that interaction. If you're parochial, you don't
- [00:43:08.800]want to interact with another group. And there's a second gene that controls the behavior within
- [00:43:14.260]the group, the altruistic. And in this case, altruism is modeled as a public good. So altruists
- [00:43:20.440]contribute to a public good that's distributed evenly to everyone in the population, regardless
- [00:43:25.220]of whether they contributed to the public good or not.
- [00:43:28.560]Or you can be non-altruist. You take advantage of the altruists in the population, which
- [00:43:31.920]costs nothing to you. So every individual in each group can be either a TN, where they
- [00:43:37.160]cooperate with other groups, but they don't share resources within the group, or a PN.
- [00:43:42.680]Or you can be a TA, which might be the one we hope for, right? The hopeful outcome
- [00:43:48.000]would be that we see the evolution of only TA individuals who are tolerant towards outsiders
- [00:43:53.540]and very altruistic within the group, or PA.
- [00:43:57.040]And this is where their focus is really going to be, because these PA individuals are a
- [00:44:01.900]lot more than just intolerant, they're assumed in the model to actually be spiteful
- [00:44:09.620]towards outgroup members. And so this is where the idea of war comes in, because what they
- [00:44:16.400]do in the model is they say, okay, when two groups interact, that interaction can be hostile
- [00:44:20.140]or not. And whether it's hostile or not depends on how many tolerant individuals there are
- [00:44:25.220]in each group. If there's lots of tolerant individuals in each group, then the outcome
- [00:44:29.140]is it's not hostile, and all of those tolerant individuals
- [00:44:31.880]get a payout for being tolerant. We wanted to cooperate, and we did. If there are not
- [00:44:38.880]very many tolerant individuals, then there's a possibility that the two populations will
- [00:44:44.220]go to war. The two groups will go to war with each other. Whether they go to war or not
- [00:44:48.020]depends on the difference in the number of parochial altruists in each group. If one
- [00:44:54.340]group has a lot of these spiteful altruists and one group has few, then war is very likely
- [00:45:01.860]If they have similar numbers, then war is unlikely. And if war doesn't happen, then
- [00:45:08.500]the tolerant individuals get their payout again. If they go to war, though, then there's three
- [00:45:13.200]possible outcomes. You could win, you could lose the war, or the war could be a draw. Whether
- [00:45:21.040]you win or lose depends on that relative advantage, sort of probabilistically, relative difference
- [00:45:26.100]in the number of kind of fighters you have in your population. But there's always some
- [00:45:31.840]possibility of a draw even when those numbers are different. And if there's a-- regardless,
- [00:45:37.980]win, lose, or draw, some fraction of those PA individuals are assumed to die because
- [00:45:43.360]of the fight. The potential benefit to the group, which is why they call this interaction
- [00:45:49.500]kind of spiteful altruists, is because there is a benefit to the group of engaging in fighting
- [00:45:55.220]because if you lose, some fraction of non-fighters in your population also die, and
- [00:46:01.820]they're replaced by offspring of individuals from the winning group. So this
- [00:46:09.600]is the setup. What they do is they initialize the population. Everybody is a TN, so tolerant
- [00:46:15.340]towards other groups, non-altruistic within a group, and then they let the thing evolve
- [00:46:20.540]over thousands of generations. And what they were interested in is, well, where do we end
- [00:46:25.100]up in the space of altruism and parochialism? Do we end up up here where we have lots of
- [00:46:31.800]individuals that are not very altruistic? That's kind of the worst possible scenario.
- [00:46:37.880]Well, actually, not in the model. You might think of that as being the worst, but it's
- [00:46:42.020]not. Actually, this is the worst scenario to end up in, because this is where you fight
- [00:46:45.660]outsiders. But you're very gentle and kind towards other individuals within your own
- [00:46:49.980]group, contributing to the public good and sharing it. Or you can end up down here where
- [00:46:54.340]you're tolerant towards others, but you're not very altruistic within your group. Or
- [00:46:58.080]you can end up over there, which is kind of where we would like to end up, where you're
- [00:47:01.780]sort of cooperative, but also very altruistic within your group. What they actually find
- [00:47:07.000]is that there's a strong tendency for the models to end up in one state or the other
- [00:47:11.920]and actually to go back and forth between them.
- [00:47:14.380]The population is either comprised almost entirely of tolerant non-altruists, that's
- [00:47:20.780]that lower left corner, or parochial altruists, these warlike individuals that want to fight,
- [00:47:28.340]which is hence the title of the paper, "The Coevolution of Parochial Altruism and War."
- [00:47:31.760]This suggests that somehow war might promote the evolution of altruism, with altruism being a very
- [00:47:37.560]positive outcome.
- [00:47:38.560]Now, I think there's really important caveats with it.
- [00:47:40.440]I'm going to tell you what they said, and then I'm going to tell you where I disagree
- [00:47:43.260]with them.
- [00:47:44.260]At the within-group level, they say, well, selection favors these TN individuals because
- [00:47:49.320]they receive resources from other groups when groups interact, but they don't pay the cost
- [00:47:55.480]of being altruistic, and they kind of expressly forbid TA from ever evolving, because the
- [00:48:01.740]way they distribute resources within the group kind of violates Hamilton's rule: they
- [00:48:07.180]are not able to allocate resources based on relatedness,
- [00:48:11.960]which is kind of a critical feature for Hamilton's rule.
- [00:48:14.840]So to some extent, they kind of made that not even the possible solution to the model.
- [00:48:19.700]The second thing they say is that at the between-group level, and this is true for sure, selection
- [00:48:24.400]favors these PA individuals that want to fight because groups that don't have those individuals
- [00:48:29.980]go to war all the time and always lose.
- [00:48:31.720]You have to have some of them if everybody surrounding has them, but again, they've assumed
- [00:48:39.520]that PNs, those parochial non-altruists, don't fight.
- [00:48:43.880]They just don't like anybody, basically, in the model.
- [00:48:48.560]It's unclear what would happen if those individuals also contributed in some sense to fighting.
- [00:48:53.600]I think that this is an interesting model, which is why I bring it up.
- [00:48:57.200]It's probably the thing I'm the most skeptical about of everything in my talk.
- [00:49:01.700]But it's interesting enough that I thought it was worth it.
- [00:49:04.860]What have we learned from here?
- [00:49:08.560]Reciprocity can allow for cooperation, even between non-relatives, especially if you force
- [00:49:15.240]interactions repeatedly, you force two parties to interact over and over and over again,
- [00:49:19.760]then you can end up in a situation where reciprocity is the best strategy.
- [00:49:23.260]It's not quite altruism, but at least it's cooperative on some level.
- [00:49:28.780]We also see that,
- [00:49:31.620]depending on the way that we interact, who we interact with, we can get these non-intuitive
- [00:49:35.860]situations where a little bit of spite might be the only thing that allows a little bit
- [00:49:40.540]of altruism to persist in the population.
- [00:49:44.260]And lastly, we saw this maybe problematic example where inter-group conflict might actually
- [00:49:50.000]be a force that helped to select for within-group altruism.
- [00:49:53.260]But again, my big caveat, folks, be careful.
- [00:49:56.780]I gave this talk to my son who's just super interested in history of war, and he just
- [00:50:01.540]spent the whole time trying to shoehorn every war that he could think of into that model,
- [00:50:07.300]and it became very clear how problematic that would seem to be.
- [00:50:11.480]Okay.
- [00:50:12.480]So I hope that in today's talk, at the very least, you learned some interesting biology,
- [00:50:17.900]even if it has nothing to do with human behavior or peace or reconciliation.
- [00:50:23.400]But I hope that at least you got the sense that evolutionary biology is an interesting
- [00:50:28.000]lens through which to think about behavior:
- [00:50:31.460]to think about behavior in a different way than the way we usually do, by actually applying
- [00:50:35.460]the ideas of evolutionary biology, thinking about fitness costs and benefits, not just
- [00:50:39.700]"is this nice or is this mean," and also revealing the way that social structures and environmental
- [00:50:45.060]structures may promote or inhibit the spread of behaviors that we might want to see in
- [00:50:51.720]human societies.
- [00:50:52.720]With that, I will close by saying, if you're interested in these ideas, I strongly recommend
- [00:50:57.900]this book.
- [00:50:58.900]A lot of what I talked about is discussed in that book,
- [00:51:01.380]and it's a really interesting and much broader treatise on this
- [00:51:05.900]whole subject.
- [00:51:06.900]I'm going to do that a little nervously.
- [00:51:31.300]Most of what you talked about was altruism at an intra-specific level, but for humans,
- [00:51:37.240]we see altruism towards the environment and other species
- [00:51:42.660]like dogs and cats.
- [00:51:43.660]Yeah.
- [00:51:44.660]Sure.
- [00:51:45.660]How would you explain that?
- [00:51:46.660]Yeah.
- [00:51:47.660]Super interesting question.
- [00:51:48.660]For those on Zoom, just in case you couldn't hear, the question is why we are altruistic towards
- [00:51:54.100]non-relatives, and even toward other species.
- [00:51:55.100]In particular, there's a really famous example of this.
- [00:51:57.860]Has anybody ever heard of the trolley car problem?
- [00:52:01.220]Most of you are nodding your heads, right?
- [00:52:02.440]It's like you were standing next to a trolley car, you can throw a lever and divert a trolley
- [00:52:06.860]car.
- [00:52:07.860]And people have done versions where there's a person on one track and a dog on the other, asking: do you
- [00:52:14.420]divert the trolley car so that it hits the person or the dog?
- [00:52:18.880]And depending on who the person is on the other side, you might divert the trolley car
- [00:52:24.420]to hit the person and not the dog, which is so weird from an evolutionary perspective.
- [00:52:29.140]Why would you do that?
- [00:52:31.140]People divert.
- [00:52:32.140]It's like, it's your aunt.
- [00:52:33.140]And they're like, hmm, we're saving the dog.
- [00:52:38.520]So that maybe just reinforces the idea that humans are really bad at distinguishing kin from
- [00:52:43.260]non-kin.
- [00:52:44.260]We sort of misapply altruism in places where it doesn't apply.
- [00:52:50.580]Or, more generously, maybe it points at something like: altruism is so deeply ingrained in us
- [00:52:56.040]through whatever we went through as we were evolving, that we now recognize something
- [00:53:01.060]or some benefit in being altruistic towards others, even though, again, to be very
- [00:53:06.620]clear, right?
- [00:53:07.620]When I'm being nice to my dog, I'm not paying a fitness cost.
- [00:53:10.340]So it's not truly altruistic in that sense.
- [00:53:13.840]So, just to be careful: with diverting the trolley car to hit my brother
- [00:53:19.200]or not,
- [00:53:20.200]I am actually paying a cost there, an inclusive fitness cost, because killing my brother before
- [00:53:26.680]he can have children decreases my inclusive fitness.
- [00:53:30.980]Some of my genes are in him.
- [00:53:32.760]So there is an element of it, but it is very weird.
- [00:53:35.680]It's very weird.
- [00:53:38.740]Are there examples of organisms that are as bad at recognizing relatedness as humans are,
- [00:54:00.900]and how do their social structures work?
- [00:54:02.980]Oh, that's ... Somebody pay him more.
- [00:54:11.000]I will say, just as an example, the one way that we are good at distinguishing relatives
- [00:54:16.320]from non-relatives is that you can actually smell how genetically distinct someone is
- [00:54:24.100]from you at really important immune genes.
- [00:54:27.160]It's really, really important to have as much variability in your immune genes
- [00:54:30.820]as possible, because the more variability you have, the better you are at recognizing
- [00:54:35.160]diverse sets of pathogens and parasites.
- [00:54:38.140]And so, to try to maximize the amount of genetic variability
- [00:54:44.080]in your offspring, you can actually smell how genetically distinct a prospective partner
- [00:54:50.120]is at those immune genes. That's the one example where we are actually a little bit able to
- [00:54:55.940]distinguish kin from non-kin, but yeah, I'll throw that out there.
- [00:55:00.740]So my question is, why are you talking about this if you don't want us to link
- [00:55:07.840]it to human behavior?
- [00:55:08.840]Because you showed us really good examples in nature and then said, well, if you think
- [00:55:17.340]about it, most human conflict looks like it maps directly onto that model, but don't
- [00:55:24.060]link it, otherwise you're a eugenicist.
- [00:55:30.660]That's a bit stronger than what I said.
- [00:55:32.300]No, no, it's more that... it's not that I don't think any of these things apply.
- [00:55:41.040]It's that I think the ways that they apply are likely nuanced and may apply more
- [00:55:46.460]to our long-ago history than they do to where we are now as a society.
- [00:55:51.860]So they may tell us something about why we are the way that we are on some deep level,
- [00:55:57.560]but not necessarily why we, specifically, in this particular context
- [00:56:00.580]or this particular interaction, behave in any particular way.
- [00:56:05.960]Actually, this is the source of my nervousness: I had this realization
- [00:56:11.360]last night of, oh, what am I talking about?
- [00:56:15.480]Do I actually believe that what I'm talking about has a huge amount of scope to explain
- [00:56:21.020]human behavior?
- [00:56:22.020]And the answer is, frankly, I'm not sure.
- [00:56:25.420]The true answer to that question is: we know that all of these
- [00:56:30.500]things do work in nature. Kin selection, that's the underlying thing behind Hamilton's
- [00:56:34.940]rule.
- [00:56:35.940]Hamilton's rule is a real thing.
- [00:56:36.940]It definitely applies.
- [00:56:38.460]And so if there were ever situations where human populations conformed to the conditions
- [00:56:44.580]where Hamilton's rule applied, then absolutely, our behaviors would be subject to evolution
- [00:56:49.300]according to those rules.
- [00:56:51.120]Do they apply now?
- [00:56:52.120]I don't think they do.
- [00:56:53.120]We're far too genetically diverse, right?
- [00:56:54.960]If I sacrificed my life to save everyone in this room, would it increase my inclusive
- [00:56:59.420]fitness?
- [00:57:00.420]Probably not.
- [00:57:01.860]Probably not, because we're not closely genetically related.
- [00:57:03.820]We're not nearly related enough.
- [00:57:07.060]But 20,000 years ago, if I sacrificed my life for the good of the band of hunter-gatherers
- [00:57:13.580]that I lived in, would it have increased my inclusive fitness?
- [00:57:16.520]I think maybe it might have.
- [00:57:18.620]And that might explain something deep about the way we interact with one another.
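To make that contrast concrete, here is a back-of-the-envelope check of Hamilton's rule for a self-sacrifice, in Python. The numbers (group sizes, relatedness values, and fitness units) are purely hypothetical assumptions for illustration, not figures from the talk.

```python
# Back-of-the-envelope Hamilton's rule check for a self-sacrifice.
# All numbers are hypothetical and for illustration only.

def sacrifice_favored(n_saved, avg_relatedness, benefit_per_person=1.0, cost=1.0):
    """Return True if r*b > c for saving n_saved people at the cost of one's own
    reproductive future (benefit and cost in the same arbitrary fitness units)."""
    total_weighted_benefit = n_saved * avg_relatedness * benefit_per_person
    return total_weighted_benefit > cost

# A lecture hall of ~200 essentially unrelated people (r on the order of 0.001):
print(sacrifice_favored(n_saved=200, avg_relatedness=0.001))   # False

# A small band of ~25 close kin (average r of, say, 0.25):
print(sacrifice_favored(n_saved=25, avg_relatedness=0.25))     # True
```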
- [00:57:24.740]So I don't know if that totally answered your question or if I just artfully dodged it.
- [00:57:30.340]It's a really good question, it's really important.
- [00:57:37.800]And that's why, I mean, I'm glad that you asked, because I hope that everybody in here
- [00:57:41.020]is kind of asking that question as they walk out.
- [00:57:43.720]Did I just learn some cool biology?
- [00:57:45.580]There's value in that.
- [00:57:47.280]Did I learn something deep and fundamental about humans?
- [00:57:49.580]There would be value in that too, if you felt like that.
- [00:57:52.400]Or did I land somewhere in the muddy middle between those two poles?
- [00:57:57.400]I hope you're somewhere in the muddy middle.
- [00:58:00.260]So we are out of time, but I'm going to take the privilege of holding the microphone and
- [00:58:04.520]ask one more.
- [00:58:05.520]Sure.
- [00:58:06.520]And I wonder if this might help, or get us to another answer to Jack's last question,
- [00:58:13.180]which is: I wonder how much of the inapplicability to humans has to do with the
- [00:58:20.080]way we're defining altruism and spite.
- [00:58:24.260]In most regular uses of those terms, they're really intentional.
- [00:58:30.180]Yeah.
- [00:58:31.180]And it sounded like you were really clear at the beginning to say this is not your generic
- [00:58:39.700]use of altruism and spite, where there's some being saying, "I need my
- [00:58:45.960]genes to reproduce, and what's my best strategy for doing that?" That's not how humans
- [00:58:50.860]perceive it.
- [00:58:51.860]That's not how anything evolves, right?
- [00:58:55.040]We do things; they may or may not help our genes get into the next generation.
- [00:59:00.100]If they do, those behaviors, those traits, will increase in frequency in the population.
- [00:59:05.020]If they don't, they won't.
- [00:59:06.020]You used a word there that I wanted to follow up on: "intention."
- [00:59:10.540]So what do you mean by intention?
- [00:59:12.820]I mean that in an interaction, I'm sitting there thinking about what is the
- [00:59:17.600]best way for my gene pool to move on.
- [00:59:19.900]Ah, I see.
- [00:59:20.900]Right, right, right.
- [00:59:21.900]Yeah, we don't think about that, right?
- [00:59:22.900]I thought you meant intention in the sense of, when species engage in altruistic acts,
- [00:59:27.100]they are intentionally doing that, right?
- [00:59:30.020]It's not an intentional decision to do that, it's just that it's
- [00:59:35.540]encoded.
- [00:59:36.540]Wait, my decision to be spiteful in any given moment is about "I'm mad at you, so I'm going
- [00:59:43.540]to get you," not "I want my gene pool to carry on."
- [00:59:48.100]Which distinguishes, yes, our human definition of spite from an evolutionary definition of
- [00:59:52.540]spite.
- [00:59:53.540]Thank you all for staying.
- [00:59:54.540]Thank you so much, Dr. Cressler.
- [00:59:59.940]And please join us in January for our next webinar.
- [01:00:08.120]Thank you.