Preliminary Examination of Light and Color Based UAV Communication
Malka Lazerson
Author
07/28/2021
Description
Willingness is an important part of communication and cooperation, something needed when interacting with drones. How do light and color affect human-robot communication? And how do they affect comfort in these interactions?
Transcript
- [00:00:03.000]Hello, my name is Malka Lazerson, and I'm with the Applied Unmanned Systems Research Experience for Undergraduates (REU).
- [00:00:13.000]I major in Computer Science at California State University, Fullerton. The research project I worked on this summer is the Preliminary Examination of Light and Color Based Unmanned Aerial Vehicle Communication.
- [00:00:30.000]I worked on this project with my mentor Paul Fletcher, and fellow REU student Christopher Jemin Oh. Bioluminescent lighting has worked as a communication method since prehistoric times.
- [00:00:46.000]Light color can also communicate intentions, and relatively recently, we have assigned meanings to those colors. The color green means go, or something positive. The color red might mean stop, danger, or something negative.
- [00:01:05.000]Many aerial robots do not have faces. Thus, humans cannot pick up on social cues we have evolved to look for.
- [00:01:14.000]This means there is a need for better design constraints and signaling mechanisms on drones.
- [00:01:22.000]What we studied this summer was how light and color can make humans more comfortable with the flying robots, in this case drones.
- [00:01:32.000]We also wanted to see if these would help humans better understand the robots' intentions. For our experiment, we used the DJI Flamewheel 450 drone fitted with a programmable LED strip.
- [00:01:52.000]The programmable LED strip was individually addressable, which meant you could control each LED on the strip.
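As a rough illustration, in plain C++ rather than the project's actual Arduino code, "individually addressable" means each pixel on the strip stores its own color and can be set without affecting its neighbors. The type and member names here are illustrative assumptions:

```cpp
#include <array>
#include <cassert>
#include <cstdint>

// One pixel's color on the strip.
struct Rgb { uint8_t r, g, b; };

// Minimal model of an individually addressable LED strip: a fixed-size
// array of per-pixel colors, where any single pixel can be changed
// independently of the others.
template <int N>
struct Strip {
    std::array<Rgb, N> leds{};
    void set(int i, Rgb c) { leds[i] = c; }  // address one LED
    void fill(Rgb c) { leds.fill(c); }       // address every LED at once
};
```

For example, `Strip<8> s; s.set(0, {0, 255, 0});` lights only the first pixel green and leaves the other seven untouched, which is what makes per-direction signaling patterns possible.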
- [00:02:00.000]We had five flight plans.
- [00:02:03.000]The flight plans included forward, backward, left, right, and diagonal motions. The diagonal motions also included the left and right directions.
- [00:02:20.000]The lights would be signaling these directions for participants to view in a video later.
- [00:02:28.000]To program the lights, we used an Arduino Uno, which was controlled by the Robot Operating System, also known as ROS.
- [00:02:40.000]We used the FastLED library to write these programs. The FastLED library is a C++-based Arduino library.
- [00:02:52.000]This was used to control various factors of the LED strip, such as the color, the brightness, and the speed at which the lights moved. In our first survey,
- [00:03:06.000]we did not give any context to the participants. In the second survey, we did give context on the lights' purpose to the participants.
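The light parameters mentioned earlier (color, brightness, and the speed at which the lights move) can be sketched as a plain C++ simulation of a FastLED-style moving-light animation. The function names and the single-pixel sweep are illustrative assumptions, not the project's actual patterns:

```cpp
#include <cassert>
#include <cstdint>
#include <vector>

struct Color { uint8_t r, g, b; };

// Index of the lit pixel at animation frame `t` on a strip of `n` LEDs.
// `dir = +1` sweeps toward the strip's end, `dir = -1` toward its start;
// `speed` is the number of pixels advanced per frame.
int litIndex(int n, int t, int dir, int speed) {
    int pos = (t * speed) % n;            // wrap around the strip
    return dir > 0 ? pos : (n - 1 - pos);
}

// Renders one frame: every pixel off except the moving one, whose color
// is scaled by `brightness` (0-255), mimicking a global brightness knob
// like the one FastLED provides.
std::vector<Color> renderFrame(int n, int t, int dir, int speed,
                               Color color, uint8_t brightness) {
    std::vector<Color> frame(n, Color{0, 0, 0});
    auto scale = [&](uint8_t v) {
        return static_cast<uint8_t>(v * brightness / 255);
    };
    frame[litIndex(n, t, dir, speed)] =
        Color{scale(color.r), scale(color.g), scale(color.b)};
    return frame;
}
```

Raising `speed` makes the lit pixel jump farther per frame, so the same frame rate reads as a faster sweep, which is one way a strip can suggest a direction of motion.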
- [00:03:19.000]The videos would be posted on Amazon's Mechanical Turk, and the surveys would be made with Qualtrics.
- [00:03:33.000]We noticed in our results that, overall, the participants were comfortable with the experience.
- [00:03:41.000]You'll notice in the first bar chart on the top left, there is some major discomfort.
- [00:03:51.000]In the second bar chart on the top right, this discomfort disappears entirely.
- [00:03:58.000]Despite this, the accuracy was not all that great.
- [00:04:04.000]On the bottom left, you will notice the pie chart.
- [00:04:08.000]It tells us that most of the participants in the first survey picked the downward motion
- [00:04:17.000]more than half the time. We're not really sure why that is.
- [00:04:22.000]The good news is, in the second survey, their accuracy did improve and the downward bias disappeared.
- [00:04:33.000]Something we would like to see tested in the future is how different light and color patterns on more complex drone movements would affect a human's comfort and accuracy in guessing the motions.
- [00:04:48.000]The downward choice really did not seem random. So that would be something interesting to look into.
- [00:04:57.000]We also found that participants sometimes thought a red light on the right side meant the drone was intending to move rightward.
- [00:05:07.000]This was not what we intended.
- [00:05:10.000]Let's say the drone has a green light on the left side, and red on the right side. We intended this to mean the drone is going to move left.
- [00:05:21.000]But the participants saw red and right, and probably made the assumption that this meant a rightward motion. Why they didn't focus on the green here would be interesting to explore.
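The convention described here, green on the side of the intended motion and red on the opposite side, can be written as a tiny lookup. Only the leftward case is stated in the talk, so the mirrored rightward case (and all the names below) are illustrative assumptions:

```cpp
#include <cassert>
#include <utility>

enum class Side { Left, Right };
enum class Hue  { Green, Red };

// Returns {left-side hue, right-side hue} for an intended motion:
// the side the drone intends to move toward shows green, the other red.
std::pair<Hue, Hue> signalFor(Side intendedMotion) {
    if (intendedMotion == Side::Left)
        return {Hue::Green, Hue::Red};  // green left, red right -> move left
    return {Hue::Red, Hue::Green};      // mirrored for rightward motion
}
```

The survey result suggests participants read the red half as the salient cue, the opposite of this mapping's intent, which is exactly the ambiguity worth probing in follow-up work.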
- [00:05:35.000]I would like to thank Alisha Bevins, Dr. Brittany Duncan, the NIMBUS Lab, and the NSF for making this project possible. I would also like to thank you for taking the time to watch my research project video today.