Verbing Science! with Ritu Raman
Curt Bright, Author
Added 02/09/2023 · 19 plays
Description
7. Ritu Raman, “Bio and the Bot”
The nature of scientific research is that you don’t know for sure whether something will work when you first try it – if you did know, it wouldn’t really be science, now would it? The most successful scientists are comfortable taking intellectual risks – trying out an idea, seeing what happens, saying, “Hmmm, that’s strange,” and trying to figure out what went wrong, often many times over before something finally goes right – maybe something they couldn’t even have anticipated at the beginning of this process. Meet Dr. Ritu Raman: engineer, innovator, risk-taker! In this episode, Ritu shares how wondering whether living tissue could be used to develop a new kind of robot—a “bio-bot”— started her on a journey that, like so many scientists’, has been marked by lots of failures, some successes, and a perpetual willingness to be surprised.
Transcript
- [00:00:00.889](upbeat music)
- [00:00:07.680]I wanted to be a priest.
- [00:00:09.510]They said, "Girls don't do that."
- [00:00:13.290]I wanted to be captain of the national cricket team.
- [00:00:16.320]They said, "Girls don't do that."
- [00:00:20.400]I wanted to be a scientist.
- [00:00:22.980]They said, "Girls don't do that."
- [00:00:24.990]And I said, "I am sick and tired of your negativity."
- [00:00:29.730]This is the story of Dr. Ritu Raman,
- [00:00:32.190]engineer, innovator, and risk taker.
- [00:00:36.270]The nature of scientific research
- [00:00:37.920]is that you don't know for sure
- [00:00:39.480]whether something will work when you first try it.
- [00:00:42.090]If you did know, it wouldn't really be science, now would it?
- [00:00:45.690]The most successful scientists are comfortable
- [00:00:47.730]taking intellectual risks, trying out an idea,
- [00:00:50.970]seeing what happens, saying, "Hmm, that's strange,"
- [00:00:53.973]and trying to figure out what went wrong,
- [00:00:56.160]often many times over before something finally goes right,
- [00:00:59.520]maybe something they couldn't even have anticipated
- [00:01:01.800]at the beginning of this process.
- [00:01:04.398]In this episode, Ritu shares how her experiences growing up
- [00:01:07.770]as a child of engineers in India, Kenya and the US
- [00:01:10.938]shaped her desire to use science to solve problems,
- [00:01:14.670]and how she started wondering whether living tissue
- [00:01:17.040]could be used to develop a new kind of robot, a biobot,
- [00:01:20.634]that can adapt and change in response to its environment.
- [00:01:24.750]Her journey, like so many scientists',
- [00:01:27.060]has been marked by lots of failures, some successes,
- [00:01:30.056]and a perpetual willingness to be surprised,
- [00:01:33.180]and perhaps most importantly, to surprise herself.
- [00:01:36.900]I'm Jocelyn, and let's get Verbing.
- [00:01:38.755](upbeat music)
- [00:01:49.378](indistinct), right,
- [00:01:50.211]of making robots out of living materials.
- [00:01:52.590]And you know, on some level, I'm kind of tempted
- [00:01:56.040]to be like, "Well, yes, these biobots work in a Petri dish,
- [00:02:00.060]but if I drop 'em on the ground, they're gonna die.
- [00:02:02.220]We have nothing to worry about.
- [00:02:03.450]This isn't an issue."
- [00:02:05.100]But I also don't wanna be dismissive,
- [00:02:06.630]because I think with this science,
- [00:02:08.430]like with all other science,
- [00:02:10.860]anytime you build something,
- [00:02:12.510]even if you intend to use it for good,
- [00:02:15.450]or to help people live happier or healthier lives,
- [00:02:19.830]it's possible that somebody else could try
- [00:02:21.840]to use that technology for a harmful application.
- [00:02:26.550]So I actually think a lot about the ethics
- [00:02:28.680]of building with living materials.
- [00:02:30.570]And I host a lot of workshops around the country.
- [00:02:33.660]We're actually doing one at Harvard Medical School in March.
- [00:02:37.260]I'm speaking at their Medical School Bioethics Conference,
- [00:02:40.530]where I kind of pitch a couple different scenarios
- [00:02:42.690]of what if a biobot started being able to replicate itself
- [00:02:48.120]or what if a biobot, we were able to engineer ones
- [00:02:50.970]that we could release into the wild and then we lost one?
- [00:02:55.320]How would you deal with it?
- [00:02:56.400]And we don't just ask these questions to scientists.
- [00:02:58.590]We ask this to everyone,
- [00:03:01.140]regulatory people, people who do policy,
- [00:03:03.810]people who are just interested in science,
- [00:03:05.250]people who hate science, kids,
- [00:03:07.833]people in their 80s, everyone,
- [00:03:10.200]'cause I really am not interested in doing science
- [00:03:14.610]if it's not something that's interesting or impactful
- [00:03:16.828]to other people in a way that they want.
- [00:03:19.440]So I pitched these ideas.
- [00:03:20.640]I'm like, "Hey, I made this new technology.
- [00:03:22.500]I think it could be really helpful for all of us,
- [00:03:25.393]but maybe it has some potential risks.
- [00:03:28.290]What do you think?
- [00:03:29.123]How do you think that we could mitigate that?"
- [00:03:31.710]And the scenarios I kind of pose
- [00:03:33.540]are nothing that we can do right now, right?
- [00:03:35.520]There's no way that I could get a biobot
- [00:03:37.980]to replicate itself right now
- [00:03:39.750]or maybe even in the next 50 years,
- [00:03:41.730]but maybe somebody would be able to do that in 100 years,
- [00:03:44.490]and if so, then I wanna be asking those questions now
- [00:03:47.760]to show that I care about our technology
- [00:03:51.330]having a positive impact on our world.
- [00:03:53.820]Right, because I mean, one part of science communication
- [00:03:56.760]that I've argued too is just we kind of have to look ahead
- [00:03:59.220]and be able to...
- [00:04:00.556]Really, the experts need to be able to look ahead
- [00:04:02.790]and kind of control that narrative in a way
- [00:04:04.440]and not allow non-experts to control it
- [00:04:06.870]and use it for fear or something worse,
- [00:04:08.730]but be like, "Hey, yeah, there's a lot of possibilities,
- [00:04:11.040]and let's all work together and talk together
- [00:04:12.684]about all of these things."
- [00:04:13.922]And plus just you stating three of those,
- [00:04:17.190]I could write short stories about it
- [00:04:18.630]as science fiction or something like that, right, because-
- [00:04:20.505]Yeah.
- [00:04:21.540]We're actually playing out the scenario
- [00:04:23.370]and kind of seeing ahead of all these things
- [00:04:25.230]that we can potentially use it for,
- [00:04:27.514]both the good, the bad, and everything in between.
- [00:04:31.080]And so, yeah, I think that is excellent
- [00:04:33.779]and I would really like to do that workshop.
- [00:04:38.400]And so important that you're having this conversation,
- [00:04:40.353]not just with scientists, and that, as you said,
- [00:04:43.050]you're not being dismissive of these concerns,
- [00:04:44.880]even if they're not coming to the fore right now
- [00:04:47.050]in the work that you're doing,
- [00:04:48.300]but that you're taking them seriously
- [00:04:50.340]and including everybody in the conversation.
- [00:04:52.050]And it's reminding me of a conversation
- [00:04:53.820]Brad and I had a couple of weeks ago
- [00:04:55.344]sort of kicking off our If Then Ambassador series
- [00:04:58.186]where we talked about just the whole notion
- [00:05:00.570]of objectivity in science
- [00:05:02.610]and why we need multiple perspectives
- [00:05:04.426]because there is no such thing as true objectivity
- [00:05:08.640]in the way we might think of it,
- [00:05:09.720]and even if we could achieve objectivity
- [00:05:12.720]in the sense of being totally value neutral,
- [00:05:14.730]is that really what we wanna do,
- [00:05:15.635]or do we wanna have a science that is responsive
- [00:05:18.660]to our value system and to what people need and want?
- [00:05:22.650]Yeah, I mean, I couldn't have said it
- [00:05:24.960]more beautifully myself.
- [00:05:26.250]I think it's so important to talk to people outside
- [00:05:29.940]of your field and to not be dismissive of them.
- [00:05:31.950]And actually, the idea of engaging people
- [00:05:34.290]who weren't scientists wasn't even mine.
- [00:05:36.150]It was something that I came up with after talking
- [00:05:38.790]to a person who's a philosopher.
- [00:05:41.100]That's what he does.
- [00:05:42.090]He philosophizes all day.
- [00:05:44.370]And he was talking about, there's been different sorts
- [00:05:47.159]of conventions over the time talking about AI or stem cells,
- [00:05:52.200]and a lot of the times it's just scientists talking
- [00:05:54.390]to each other and then people get really angry or upset
- [00:05:57.540]about deploying these technologies in the world,
- [00:05:59.700]and I think rightly so.
- [00:06:00.791]So it's something that I really wanted
- [00:06:02.670]to do right, right from the very beginning,
- [00:06:05.616]and ensuring that we were talking to everyone
- [00:06:08.420]who will be impacted by this technology, which is everyone.
- [00:06:12.263]Yeah.
- [00:06:13.096]It sounds just like you described,
- [00:06:14.250]going to the gastroenterologist weekly to say,
- [00:06:17.977]"Hey, here's what I'm thinking. Here's my device,"
- [00:06:20.106]and getting his feedback as you went along
- [00:06:22.590]so you didn't invest in this whole project
- [00:06:24.720]and then find out it was useless to him.
- [00:06:26.190]You wanna do the same thing with people
- [00:06:28.140]who are gonna be affected by this technology in general,
- [00:06:30.720]so that's everyone.
- [00:06:33.320]Yeah.
- [00:06:34.153]I want people to know that scientists are nice.
- [00:06:35.840]We're only doing this 'cause we wanna help,
- [00:06:38.155]and most of us aren't sitting in a lab somewhere
- [00:06:41.058]by ourselves trying to plot evil plots.
- [00:06:44.013]We just wanna help, so let us know how we can help you.
- [00:06:48.300]Brilliant. I love it.
- [00:06:49.890]My cat Tesla might be an exception,
- [00:06:51.960]but we are not plotting any evil plots, no.
- [00:06:55.290]Actually, you're right.
- [00:06:56.610]We want to help the world.
- [00:06:57.719]And a big thing is with people,
- [00:06:59.808]especially with fear,
- [00:07:02.190]a lot of times that's spawned just by the unknown, right,
- [00:07:05.550]or the inability to understand how something works.
- [00:07:08.700]And so the more we talk with other people,
- [00:07:11.010]the more they'll learn how it works
- [00:07:12.600]and the less afraid they'll be.
- [00:07:13.770]And then the less afraid you'll be,
- [00:07:15.600]the more you'll utilize that technology for something good
- [00:07:18.240]and helpful and progressive for the world.
- [00:07:20.111](upbeat music)