Animals in the wild must make crucial decisions based on the moving world around them. Their survival can depend on finding prey or evaluating a potential mate, and every species' eyes perceive the colors of nature a little differently. Getting an accurate view of what animals are seeing has been a challenge, but a camera system developed by scientists at the University of Sussex in the United Kingdom and George Mason University in Virginia could help ecologists and filmmakers create videos that closely replicate the colors that different animals see in their natural environments. The system is described in a study published January 23 in the open-access journal PLOS Biology.
Differences in the photoreceptors of the eye affect how a species perceives the world. Animals including bees, reindeer, and some birds can see ultraviolet (UV) light that human eyes cannot. By reconstructing the colors that we know animals can see, scientists can learn more about how they communicate and navigate their environments.
“As sensory ecologists, we are interested in how animals perceive colors in nature. Traditional techniques for measuring these colors often told only part of the story,” Daniel Hanley, a study co-author and sensory ecologist at George Mason University, tells PopSci. “The scientific community lacked adequate tools for studying colors in motion. We designed our camera system to provide a solution to this problem. Now, we can record color signals as they would appear to wild animals in the wild.”
The new camera system builds on existing techniques such as spectrophotometry, in which measurements are taken at specific wavelength ranges, often beyond what humans can see. However, these methods can be time-consuming, produce false colors, require specific lighting conditions, and cannot always capture moving subjects.
To overcome some of these limitations, the team developed a camera and software system that captures animal-view videos of moving objects under natural lighting conditions.
“The system is built around two separate cameras, where light is split and simultaneously directed to a camera sensitive to ultraviolet light and to a standard camera sensitive to human visible light,” says Hanley.
Together, the two cameras simultaneously record video in four color channels: blue, green, red, and UV. That data is then processed into perceptual units with software written in the popular programming language Python. This generates a more accurate video of how animals perceive those colors, based on what biologists know about the photoreceptors in their eyes. The team tested the new camera system against traditional spectrophotometry, and it predicted the perceived colors with an accuracy of over 92 percent.
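The core processing step, mapping the camera's four channel responses onto the stimulation of an animal's photoreceptors, can be sketched as a linear transform. The matrix below is purely illustrative: the real system derives its mapping from measured spectral sensitivities, and the numbers here are made up for a hypothetical tetrachromatic bird.

```python
import numpy as np

# Illustrative only: a hypothetical 4x4 transform from camera channels
# (R, G, B, UV) to the "quantum catches" of a tetrachromatic bird's four
# cone types. The published system calibrates this mapping from real
# spectral sensitivity data; these coefficients are invented.
CAMERA_TO_BIRD = np.array([
    [0.02, 0.01, 0.05, 0.92],  # UV-sensitive cone: driven mostly by the UV channel
    [0.05, 0.15, 0.75, 0.05],  # short-wave cone: mostly blue
    [0.10, 0.80, 0.10, 0.00],  # medium-wave cone: mostly green
    [0.85, 0.12, 0.03, 0.00],  # long-wave cone: mostly red
])

def camera_to_perceptual(frame):
    """Map a frame of (..., 4) R, G, B, UV camera responses to
    per-pixel cone quantum catches for the modeled viewer."""
    return frame @ CAMERA_TO_BIRD.T

# A tiny mock video frame: 4x4 pixels, four channels each.
frame = np.random.rand(4, 4, 4)
catches = camera_to_perceptual(frame)
print(catches.shape)  # one quantum-catch value per cone type, per pixel
```

Applied frame by frame, a transform like this is what turns the raw dual-camera footage into "animal-view" video.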
“Our project was quite involved, and we had many surprises along the way,” says Hanley. “The most surprising thing that we discovered was how much clouds can impact a perceived color. We didn’t tend to notice these shifts, but they were notable.”
The system is built from commercially available cameras housed in a modular 3D-printed casing. The software the team developed is also open source, so other researchers can build on the technology in the future.
“We plan to apply the camera system as broadly as possible. Currently, we are exploring a range of applications from natural history through conservation,” says Hanley. “Our hope is that through community engagement our designs can improve and we will gather many novel observations about colors in nature.”
The post How animals see the world, according to a new camera system appeared first on Popular Science.