2024-01-25 University of Sussex
◆The new system predicts perceived colors with over 92% accuracy, outperforming existing methods; it captures animals' dynamic color vision faithfully and opens new possibilities for scientists and filmmakers in research and production.
<Related information>
- https://www.sussex.ac.uk/research/full-news-list?id=63262
- https://journals.plos.org/plosbiology/article?id=10.1371/journal.pbio.3002444
Recording animal-view videos of the natural world using a novel camera system and software package
Vera Vasas, Mark C. Lowell, Juliana Villa, Quentin D. Jamison, Anna G. Siegle, Pavan Kumar Reddy Katta, Pushyami Bhagavathula, Peter G. Kevan, Drew Fulton, Neil Losin, David Kepplinger, Michael K. Yetzbacher, Shakiba Salehian, Rebecca E. Forkner, Daniel Hanley
PLOS Biology Published: January 23, 2024
DOI: https://doi.org/10.1371/journal.pbio.3002444
Abstract
Plants, animals, and fungi display a rich tapestry of colors. Animals, in particular, use colors in dynamic displays performed in spatially complex environments. Although current approaches for studying colors are objective and repeatable, they miss the temporal variation of color signals entirely. Here, we introduce hardware and software that provide ecologists and filmmakers the ability to accurately record animal-perceived colors in motion. Specifically, our Python codes transform photos or videos into perceivable units (quantum catches) for animals of known photoreceptor sensitivity. The plans and codes necessary for end-users to capture animal-view videos are all open source and publicly available to encourage continual community development. The camera system and the associated software package will allow ecologists to investigate how animals use colors in dynamic behavioral displays, the ways natural illumination alters perceived colors, and other questions that remained unaddressed until now due to a lack of suitable tools. Finally, it provides scientists and filmmakers with a new, empirically grounded approach for depicting the perceptual worlds of nonhuman animals.
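The abstract describes transforming photos or videos into quantum catches for animals of known photoreceptor sensitivity. The following sketch illustrates the general idea behind a quantum-catch calculation (integrating a stimulus spectrum against each receptor's sensitivity curve); it is not the authors' published code, and all receptor peaks, curve shapes, and function names here are illustrative assumptions.

```python
import numpy as np

# Wavelength sampling grid (nm), spanning UV through the visible range.
wavelengths = np.arange(300.0, 701.0, 10.0)
dlam = 10.0  # step size of the grid, for the Riemann-sum integration

def gaussian_sensitivity(peak, width=40.0):
    """Toy photoreceptor sensitivity curve.
    Real pipelines use measured or template-based curves; this Gaussian
    is a placeholder assumption for illustration."""
    return np.exp(-0.5 * ((wavelengths - peak) / width) ** 2)

# Illustrative receptor set loosely resembling a UV-sensitive tetrachromat
# (peak wavelengths are made-up values, not from the paper).
receptor_peaks = {"uv": 370.0, "short": 445.0, "medium": 508.0, "long": 565.0}
sensitivities = {name: gaussian_sensitivity(p) for name, p in receptor_peaks.items()}

def quantum_catch(radiance, sensitivity):
    """Quantum catch Q_i = sum over wavelengths of I(lambda) * S_i(lambda) * dlambda."""
    return float(np.sum(radiance * sensitivity) * dlam)

# A flat spectrum stands in for a measured per-pixel radiance spectrum.
radiance = np.ones_like(wavelengths)

catches = {name: quantum_catch(radiance, s) for name, s in sensitivities.items()}

# Normalizing the catches gives each receptor's relative stimulation,
# a common way to express animal-perceived color per pixel.
total = sum(catches.values())
relative = {name: q / total for name, q in catches.items()}
```

In an image-processing pipeline, a calculation like this would run per pixel, converting a multispectral recording into one channel per receptor class for the target animal.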