Using robotics to phenotype crops in greenhouses and in the field.
- [00:00:00.320]♪ (music) ♪
- [00:00:04.879]Hello, I'm Santosh Pitla and I'm an
- [00:00:06.827]Associate Professor
at the University of Nebraska-Lincoln.
- [00:00:10.687]My research focus is on agricultural robotics.
- [00:00:13.929]Here is a phenotyping robot
that is designed
- [00:00:17.703]to phenotype different crops
in a greenhouse.
- [00:00:22.361]The lead PI of this project is Dr. Yufeng Ge,
- [00:00:27.800]and here you see Abbas Atefi.
- [00:00:30.074]He's the Ph.D. student who
worked on this project.
- [00:00:34.552]So this robotic arm can, using
- [00:00:39.304]machine vision techniques,
- [00:00:41.043]approach a leaf
- [00:00:42.217]and take sensor readings.
- [00:00:45.146]So, for example, you can get
- [00:00:47.087]leaf temperature and chlorophyll content,
- [00:00:50.906]all without destroying the leaf.
- [00:00:55.800]So this is in-vivo robotic sensing.
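The vision-guided leaf-sensing loop described above could be sketched roughly as follows. This is an illustrative sketch only: the "greenest pixel" heuristic, the arm interface, and all names and readings here are assumptions, not the actual UNL system.

```python
def find_leaf_target(pixels):
    """Pick the 'greenest' pixel as a stand-in for machine-vision leaf detection."""
    best = max(pixels, key=lambda p: p["g"] - max(p["r"], p["b"]))
    return best["x"], best["y"]

def sense_leaf(arm, pixels):
    """Approach the detected leaf and take non-destructive sensor readings."""
    x, y = find_leaf_target(pixels)
    arm.move_to(x, y)  # approach the leaf without damaging it
    return {
        "temperature_c": arm.read_ir_temperature(),
        "chlorophyll": arm.read_chlorophyll(),
    }

class StubArm:
    """Stand-in for a real arm so the sketch runs end to end."""
    def move_to(self, x, y):
        self.at = (x, y)
    def read_ir_temperature(self):
        return 24.6  # placeholder reading
    def read_chlorophyll(self):
        return 31.2  # placeholder reading

pixels = [
    {"x": 0, "y": 0, "r": 120, "g": 80, "b": 60},  # soil-like pixel
    {"x": 4, "y": 2, "r": 40, "g": 180, "b": 50},  # leaf-like pixel
]
readings = sense_leaf(StubArm(), pixels)
```

The key point the sketch mirrors is that sensing is non-contact and non-destructive: the arm positions the sensors at the leaf and reads them in place.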
- [00:00:59.140]So, our ultimate goal is to take these
- [00:01:03.939]phenotyping robotic arms to the field.
- [00:01:08.197]And to be able to do that we need
mobile robotic platforms
- [00:01:13.263]and that is where my primary focus is;
- [00:01:17.297]developing control architectures
- [00:01:20.145]for field mobile robotic platforms.
- [00:01:24.970]So, what you're seeing here are
our test platforms
- [00:01:29.218]that we use in our parking lot to
- [00:01:33.520]develop our control algorithms.
- [00:01:36.556]So there are three robots here
- [00:01:38.187]and they are operating in
- [00:01:40.133]leader/follower behavior.
- [00:01:43.180]So they share the tasks
- [00:01:45.387]and work together in the field.
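The leader/follower behavior mentioned above can be sketched minimally: a follower steers toward a point on the leader's recorded path while keeping a fixed gap. This is an illustrative sketch under simple assumptions (2-D positions, a proportional-style rule), not the actual UNL control architecture.

```python
import math

def follower_command(follower_xy, follower_heading, leader_trail, gap=2.0):
    """Return (steer_angle, drive) toward the trail point about `gap` meters away."""
    # Pick the oldest leader trail point still at least `gap` meters away.
    target = leader_trail[0]
    for point in leader_trail:
        if math.hypot(point[0] - follower_xy[0], point[1] - follower_xy[1]) >= gap:
            target = point
            break
    dx = target[0] - follower_xy[0]
    dy = target[1] - follower_xy[1]
    distance = math.hypot(dx, dy)
    # Steering error: bearing to target relative to current heading.
    steer = math.atan2(dy, dx) - follower_heading
    steer = math.atan2(math.sin(steer), math.cos(steer))  # wrap to [-pi, pi]
    drive = max(0.0, distance - gap)  # slow down as the gap closes
    return steer, drive

# Leader drove straight along the x-axis; follower starts slightly offset.
trail = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0), (3.0, 0.0)]
steer, drive = follower_command((0.0, -0.5), 0.0, trail, gap=2.0)
```

Following a recorded trail rather than the leader's current position keeps the follower on the leader's actual path, which matters when robots must stay within lanes in a field.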
- [00:01:47.199]So these platforms are great to evaluate
- [00:01:50.202]our control architectures and
- [00:01:52.086]test sensing technologies.
- [00:01:53.795]So we have more robotic platforms that
can actually go in between the crop rows;
- [00:02:00.429]these are very small robotic platforms
- [00:02:03.942]that actually sense the micro-climate.
- [00:02:08.308]And when you are in the crop rows
you don't have a GPS signal,
- [00:02:11.071]so we are using laser sensors
- [00:02:13.618]so that the robot can navigate
even without GPS.
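One common way to navigate a row without GPS is to stay centered between the two crop rows using left and right laser range readings. The proportional rule and gain below are assumptions for illustration, not the robot's actual navigation algorithm.

```python
def centering_steer(left_range_m, right_range_m, gain=0.8):
    """Proportional steering to stay centered between crop rows.

    Sign convention (assumed): positive output steers left, negative steers
    right. If the left row is farther away than the right row, the robot
    has drifted right, so it steers left toward the row middle.
    """
    error = left_range_m - right_range_m  # meters off-center (doubled)
    return gain * error

# Robot has drifted toward the right-hand row: command steers it back left.
steer_cmd = centering_steer(1.2, 0.8)
```

A real system would fuse many laser returns per scan and handle gaps in the canopy, but the centering idea is the same.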
- [00:02:20.488]Ultimately we want to sense
- [00:02:24.911]the temperature, humidity, and solar radiation,
- [00:02:28.167]and that is what this robot is doing.
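A micro-climate sample like the one described could be logged as a simple record. The field names and units below are assumptions for illustration, not the robot's actual data format.

```python
from dataclasses import dataclass, asdict

@dataclass
class MicroClimateSample:
    """One in-row micro-climate reading (field names and units assumed)."""
    timestamp_s: float
    temperature_c: float
    humidity_pct: float
    solar_radiation_w_m2: float

sample = MicroClimateSample(
    timestamp_s=0.0,
    temperature_c=27.5,
    humidity_pct=61.0,
    solar_radiation_w_m2=850.0,
)
row = asdict(sample)  # a plain dict, ready to write to CSV/JSON
```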
- [00:02:31.696]We have a big robotic platform,
- [00:02:34.999]which is sixty horsepower.
- [00:02:37.562]So this robotic platform,
- [00:02:41.060]we call it "Flex-Ro"
- [00:02:42.330]and currently it is used for phenotyping.
- [00:02:48.455]So, this is a multi-purpose robotic platform
- [00:02:51.665]and it can do planting,
- [00:02:57.018]cover crop planting, and spraying.
- [00:03:01.831]But for this application we're using it
as a phenotyping platform.
- [00:03:06.406]So as you can see, as the machine is moving
- [00:03:09.384]it is collecting images of this soybean crop.
- [00:03:13.187]We also have spectrometers and an
ultrasonic height sensor on this machine,
- [00:03:21.287]so you can really phenotype
the crop in the field.
- [00:03:26.069]So in the background
you see our Spider-cam
- [00:03:30.430]which is suspended on cables
and can be moved anywhere in the field.
- [00:03:35.418]The main difference between the Spider-cam
and the Flex-Ro
- [00:03:38.995]is that you can take Flex-Ro to any field
- [00:03:41.999]and phenotype large areas.