Panatrap – Open Source 360 Camera Traps

Panatraps are an open-source collection of design files and code for turning commercially available 360 “VR” cameras into panoramic camera traps for studying wildlife in natural environments. They are created in Gamboa, Panama, at Digital Naturalism Laboratories by Andrew Quitmeyer and Danielle Hoogendijk. Work on this project has been supported by:

The project files are all open-source and available on these project pages and the project’s github repo: https://github.com/Digital-Naturalism-Laboratories/Panatrap

Motivation

Camera traps are a super useful tool for field biologists and conservationists! They let us record animals without the presence of humans getting in the way! You have probably seen them used on shows like Planet Earth to capture gorgeous footage of super rare leopards and such! They are also invaluable for monitoring less rare species in many other climates too! (And also as secret traps to catch poachers!) Camera traps have lots of limitations, though:

  • Narrow fields of view cause “placement bias” (e.g. https://www.wildlabs.net/community/thread/231 )
  • Commercially available cameras can be pretty pricey (we can make them cheaper!)
  • Commercially available ones are generally kinda bulky (and apparently elephants hate them and try to break them!) – we can make them smaller ourselves!
  • The few commercially available 360 traps that do exist use old tech that only scans horizontally, and they are still pricey! (e.g. https://www.amazon.com/Wildgame-Innovations-360Deg… )

As camera trapping expert Roland Kays told us about this project, “one 360 camera trap does the work of 10 conventional camera traps!”

The problem is, 360 camera traps aren’t really a thing that exists much yet. There is a ton of work to be done just testing and evaluating them before scientists can really start relying on them for research. So first we need to just start making them ourselves and getting them out in nature!

Design Criteria

Loosely defined, camera traps are simply photographic instruments that can be remotely triggered by wildlife. Typically this is done with a PIR motion sensor triggering a camera to snap a photo. Thus, when transforming commercially available cameras into camera traps, the key abilities we need are:

  • Remote Triggering
    • The ability to tell the camera to snap a photo or start recording a video
  • Low Power Consumption
    • Typically this is the ability to remotely turn the camera ON/OFF, or at least bring the camera in and out of a low-power “sleep mode,” and to remotely charge it
  • Speedy Capture
    • We were told that a typical camera trap aims to go from sensing an animal’s presence to capturing the creature in under 1/2 second
    • This ability is often at odds with the “Low Power Consumption” goal, since many cameras have a not-insignificant wake-up time when starting
  • Image Quality
    • An ideal camera trap will provide a high-resolution image of the captured animal
  • Outdoor Readiness
    • The camera trap needs to be able to withstand being left alone for several days or weeks in its target environment
    • Camera traps face challenges from the elements (such as rain, heat, wind, and cold)
    • They also face challenges from local creatures (such as destructive insects or smashing elephants)
    • For most mammal surveys, the cameras are recommended to be placed at “knee height,” or about 50 cm above the ground

Initial Challenges

  • How to control the camera
  • How to weatherproof it
  • How to let the sensors work through the weatherproofing (we learned that acrylic blocks PIRs from working!)

Supplemental challenges

Once these things are up and running, even if they function great, there are a ton more bonus challenges that will come up, such as:

  • Processing and Identification workflow for 360 photo and video
  • Durability testing
  • Transportability testing
  • Night vision hacking – either adding a flash or hacking the lenses to make them IR sensitive

Cameras

The collection currently consists of two camera trap setups, based on the:

  • Xiaomi Misphere (MADV in the USA)
    • Full electronic control over:
      • Turning the camera ON/OFF
      • Switching the camera from video to photo mode
      • Triggering a photo or video recording
  • Ricoh Theta V
    • Full electronic control over:
      • Turning the camera on (waking it from sleep mode)
      • Triggering a photo or video recording

As we hack more cameras we will add them and their files to this list and repository. We hacked the Ricoh Theta S and posted detailed instructions here, but that was quite a destructive hack (we literally ripped the camera apart and soldered things to its button). The Panatrap project focuses on non-destructive, reversible hacks to turn your 360 camera into a cool trap.

The full list and detailed comparisons of the cameras we are looking at are here:

Xiaomi Misphere

This was the first camera we hacked, because it has decent image quality and, importantly, two exposed electrodes that make hacking very easy!

Xiaomi built an interface into the bottom of the camera so it can be easily controlled by the included “selfie stick.” It’s just two metal electrodes; when they are connected, a message is sent to the camera (the original selfie stick has a 220 ohm resistor connected to a button). From our examinations, the three messages you can send to the camera with this interface are:

  • Long press (5 seconds)
    • Camera ON/OFF
  • Medium press (2 seconds)
    • Toggle recording mode (video to photo, or photo to video)
  • Short press (0.5–1 seconds)
    • Shutter button (take a photo, or start or stop a video)
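In a microcontroller build, these presses amount to bridging the two electrodes (through a resistor, via a transistor or optocoupler) for the right duration. Here is a minimal sketch of that timing logic in Python, for illustration only: the actual controller is the Arduino code in the Panatrap repo, and the names `PRESS_SECONDS` and `send_command` are our own illustrative inventions, not the firmware's API.

```python
import time

# Press timings for the Misphere's two-electrode "selfie stick" interface,
# taken from the durations described above.
PRESS_SECONDS = {
    "power_toggle": 5.0,   # long press: camera ON/OFF
    "mode_toggle": 2.0,    # medium press: switch video <-> photo
    "shutter": 0.75,       # short press: take photo / start or stop video
}

def send_command(command, close_contacts, open_contacts, sleep=time.sleep):
    """Bridge the two electrodes for the duration that encodes `command`.

    close_contacts / open_contacts would drive a transistor or optocoupler
    connecting the electrodes (through a 220 ohm resistor, like the stick).
    """
    seconds = PRESS_SECONDS[command]  # raises KeyError for unknown commands
    close_contacts()
    sleep(seconds)
    open_contacts()
    return seconds
```

Passing the contact-control functions in as parameters keeps the timing logic testable without hardware; on a real board they would simply toggle the GPIO pin driving the transistor.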

Code

Code is available in this repo https://github.com/Digital-Naturalism-Laboratories/Panatrap/tree/master/MADV

Hardware

Physical Designs are available here for free download: https://a360.co/2nrPLd8

and included in this repo

Field Tests and Current Battery Life

Current Battery Life: At least 17 hours

Our current setup, with a single 3.7V LiPo powering the Arduino and the MADV running on its own battery, has been able to work continuously for at least 17 hours (it ran overnight and triggered on some things, but it was very dark and we don’t have lights yet).

We ran several field tests around DINALAB in Gamboa, Panama, in the nearby Soberanía National Park, and up in colder cloud forest areas near El Valle, Panama.

How-To Build

This separate post will guide you through how to build your own!

Ricoh Theta V

This camera was substantially more locked up in our attempts to remotely operate it (especially in a way that allows for long-term use and a decently quick trigger).

The advantage of including this camera in our hacks is that it is one of the most popular 360 cameras with lots of infrastructure surrounding it.

The camera runs Android and has an accessible USB API (to receive PTP commands), so in theory it has all the abilities to remotely A) turn ON/OFF and B) trigger a photo capture. But the firmware blocks some of these features (even if you switch into developer mode).

Luckily, we discovered that sending an emulated keyboard command (through something like a Teensy 3.2 Arduino-compatible board) can remotely wake up the camera (even though this feature is not documented anywhere!), and then we just need to trigger the photo. The photo triggering is done by a small servo connected to the Arduino, which physically taps the shutter button of the camera. The servo is shut down in between photos to reduce power consumption.
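The wake-then-tap cycle can be summarized as one short procedure. This is a hedged Python model of the control flow only (the real firmware is the Teensy code in the repo); the callback names and the 0.5-second settle delay are assumptions for illustration.

```python
# A model of one Theta V capture cycle: wake the sleeping camera with an
# emulated USB keyboard keypress, then have a servo physically tap the
# shutter button. All callback names and the 0.5 s settle delay are
# illustrative assumptions, not values from the actual firmware.
def capture_photo(send_hid_key, attach_servo, tap_shutter, detach_servo,
                  wait, wake_delay=0.5):
    actions = []
    send_hid_key()          # emulated keypress wakes the camera from sleep
    actions.append("wake")
    wait(wake_delay)        # give the camera a moment to become responsive
    actions.append("settle")
    attach_servo()          # power the servo only while it is needed
    actions.append("servo_on")
    tap_shutter()           # swing the servo horn against the shutter button
    actions.append("tap")
    detach_servo()          # cut servo power again to save battery
    actions.append("servo_off")
    return actions
```

Injecting the hardware actions as callbacks makes the sequence easy to dry-run on a desktop before flashing anything to the microcontroller.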

Code

Code is available in this repo https://github.com/Digital-Naturalism-Laboratories/Panatrap/tree/master/Theta_V

Hardware

Physical Designs are available here for free download: https://a360.co/2m22suQ and included in this repo

Field Tests and Current Battery Life

Current Battery Life: At least 52 hours

Our Panatrap build for the Theta V includes an additional 3000mAh battery pack which powers the Arduino but also charges the attached Theta V. We haven’t had time yet to actually let the whole thing drain completely, but it should power the camera for quite some time (of course, all of this depends on how much activity there is).

Unlike the MADV, the Theta V does not shut down completely, but instead goes into “Sleep Mode.” The advantage is that it can wake up very quickly (less than 1/2 a second) and be triggered to snap a photo. The disadvantage is that it uses more power in sleep mode. BUT we did some testing and found the sleep mode to be quite energy efficient! The camera lasted over 4 days in sleep mode!

Theta V Sleep Mode Tests

(Monday) 4:00 PM – 99%
(Monday) 9:00 PM (5 hours later) – 94%
(Tuesday) 8:22 AM (16 hours in) – 88%
(Tuesday) 1:12 PM – 84%
(Tuesday) 6:24 PM – 79%
(Wednesday) 12:35 AM – 75%
(Wednesday) 8:50 AM – 71%
(Wednesday) 12:44 PM – 68%
(Wednesday) 1:57 PM – 68% (brief charge to upload plugin code)
(Thursday) 7:49 AM – 43%
(Thursday) 1:12 PM – 20% (after two hours of playing around debugging and hacking the CA camera release: https://nfgworld.com/cactus-radio-trigger-for-ricoh-theta-s/ )
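From the first and last clean readings above (Monday 4:00 PM at 99% down to Thursday 7:49 AM at 43%, ignoring the brief Wednesday top-up and the later debugging session), you can estimate the average sleep-mode drain. A rough worked calculation in Python (the calendar dates are placeholders; only the Monday-to-Thursday spacing matters):

```python
from datetime import datetime

# First and last clean sleep-mode readings.
start = datetime(2019, 9, 2, 16, 0)   # Monday 4:00 PM  – 99%
end = datetime(2019, 9, 5, 7, 49)     # Thursday 7:49 AM – 43%

hours = (end - start).total_seconds() / 3600   # elapsed time in hours
drain_per_hour = (99 - 43) / hours             # percentage points per hour
full_drain_days = (100 / drain_per_hour) / 24  # projected 100% -> 0% lifetime

# Roughly 63.8 h elapsed, ~0.88 %/h, ~4.7 days projected – consistent with
# the camera lasting "over 4 days" in sleep mode.
print(round(hours, 1), round(drain_per_hour, 2), round(full_drain_days, 1))
```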

We ran several field tests around DINALAB in Gamboa, Panama, and in the nearby Soberanía National Park.

How-To Build

This separate post will guide you through how to build your own!

Background

Initial Prototype

This project started in 2017 with a quick proof-of-concept for a scientist at Yale-NUS, to see what could be done with relatively new, commercially available 360-degree cameras working as camera traps. A full rundown of this first approach, with examples from the field in Singapore, is available here: https://www.instructables.com/id/360-Camera-Trap/

Picture of 360 Cameras

The first prototype was a destructive hack, meaning that we caused irreparable alterations to the camera (we literally brute forced this thing by cracking open the camera and soldering to its button ports).

Since then, this prototype was further hacked on (and mostly constantly re-fixed) by Danielle Hoogendijk and other participants at the first Digital Naturalism Conference in Thailand, such as Stig Anton Nielson.

Academic Research 2018-2019

Dani continued working with the 360 camera for her Master’s degree in the Netherlands. Her thesis, “Camera Trap Efficacy: a Wider View,” evaluates the effectiveness of a 360 camera trap by comparing its rate of animal detection to that of traditional camera traps.

The results of this study show that the modified, 360 degree, camera trap was superior in detection, capturing and overall successful functioning (with a mean increment of 89.2% by using the modified camera trap), in relation to all target animal species.

Danielle Hoogendijk, “Camera Trap Efficacy: a Wider View”

Her conclusion was that the 360 camera trap was able to catch far more animals passing by the trap (simply due to its much wider field of view and sensor range). However, technical difficulties with this early prototype led to a bunch of false triggers too, so we are making much more reliable prototypes for long-term testing. (She completed this research project and got her diploma in September 2019!)

Conservation X Labs Grant 2019

Digital Naturalism Laboratories received a grant to refine these prototypes. Dani and Andrew spent summer 2019 exploring the current state of commercial camera traps and 360 cameras, and talking with experts like Roland Kays. During this time we developed over a dozen prototypes and ended up with two functional designs ready for further deployment. These have been iteratively tested in the jungles and cloud forests of regions around Panama.

Testing and Evaluation

We performed many real world tests with these cameras in Gamboa, Barro Colorado Island, and the cloud forest near El Valle.

Animals

We had limited time and were unable to leave the traps out for extended periods, so we have mostly just captured agoutis (an adorable, ubiquitous jungle rodent) and some stray dogs. We can’t wait to see what other animals we capture soon!

Next Steps

Just from our first month of testing, we have lots of exciting tweaks to make and features to add. In the immediate future, here are the very next steps we are working on:

Form Fitting Weather Shield (in progress)

The “big clear box” approach to the weather shields has been working fine, except that it does mask a bit of the footage. Granted, we are still getting visual coverage far greater than any traditional camera, but there are still seams that block a bit. Perhaps most importantly, since the weather shield is offset a bit from the camera, there is room for internal glare (for instance, you can see we used a marker to color the MADV black because it was reflecting off its own internal weather shield).

If we do some basic acrylic molding (ideally vacuum forming), we could make a nice, non-distorting, optically clear housing for any camera. We did some test prints of a Theta V that we molded with cheap silicone + acrylic paint, and we tried melting some thin clear plastic over it, to some initial success!

Angled Weather Roof

Our current, boxy weather shield is effective and keeps the horizontal imaging area terrifically clear, but rain never seems to leave its flat top surface. We need to take some lessons from the tropical architecture around here and angle the roof at least a little bit to let the drops slide off. This will make sure the area above the 360 camera is also as clear as possible.


We have tested some versions with a curved-over lid that seemed quite promising.

More Camera Hacking – Samsung Gear

With its combination of a cheap price tag and high-quality imagery, the Samsung Gear is next on our list of cameras to hack. Most likely it will involve the servo hack currently used with the Theta V, since its interface and documentation are quite poor.

Yarncraft and Cognition – Creativity and Cognition 2017 by Andrew Quitmeyer

A paper by Kitty Kelly about using yarn to explore the mind, both biologically and mentally.

Abstract
The popularity of knitting and crochet, or yarncraft, is on the ascent. As more people discover its pleasures, enthusiasts and neuroscientists are also realizing that crafting with yarn elicits soothing and therapeutic effects. The meditative aspects of knitting and crochet are already familiar to the legions of yarncrafters, but recognition of the neuroscience of yarncraft is a relatively recent phenomenon. This work proposes to embody the relationship between yarncraft and its neurological benefits with a physical art project. This project will take the form of a large crocheted e-textile brain sculpture with embedded LEDs whose illumination is controlled live by a brain-computer interface worn by a yarncrafting practitioner. This sculpture visualizes the changes in the neurology of the yarncrafter.

Yarncraft and Cognition – Creativity and Cognition 2017 by Andrew Quitmeyer on Scribd

Digital Naturalist Design Guidelines: Theory, Investigation, Development, and Evaluation of a Computational Media Framework to Support Ethological Exploration

This paper outlines Andrew Quitmeyer’s PhD work developing a design framework for interacting with wild creatures and biological field work.

Digital Naturalist Design Guidelines: Theory, Investigation, Development, and Evaluation of a Computational… by Andrew Quitmeyer on Scribd

Abstract
This research aims to develop and evaluate a design framework for creating digital devices that support the exploration of animal behaviors in the wild. This paper quickly shares the main concepts and theories from the fields forming Digital Naturalism’s foundation while presenting the key challenges emerging from these critical intersections between field biology and computational media. It then reviews the development of this research’s hybrid methodology designed specifically for its multi-year series of “Qualitative Action Research” fieldwork carried out at a rainforest field station.

This paper analyzes the resulting on-site ethnographies, workshops, design projects, and interactive performances, whose take-aways are synthesized into design guidelines for digital-natural media. This framework, itself, is then evaluated via an extra iteration of fieldwork and the results discussed. Finally, the paper identifies targets for continued research development. Further areas of interest are presented which will promote Digital Naturalism’s progression into its own topic of study.

Hiking Hacks: Workshop Model for Exploring Wilderness Interaction Design (DIS 2018)

During our Digital Naturalism Conference, I will actually have to go full-on meta-conference and present my research about the workshop model for Hiking Hacks at DIS 2018.

http://dis2018.org/111-sessions-tuesday.html

Here is a full “pre-print” downloadable copy of the paper I will present:

DIS_Hikinghacks_Revised_Final_PREPRINT

 

Hiking Hacks: Workshop Model for Exploring Wilderness Interaction Design (Preprint) – Andrew Quitmeyer by Andrew Quitmeyer on Scribd

First rainy day!

Today Gamboa had its first serious rain in months! It’s been an unusually long, hot dry season this year, apparently one of the most intense in the past century. In some ways, the dry season is terrific: it’s not that humid, and the roads are much more passable when they’re dry, making conditions safer for most humans. However, it gets REALLY hot without rain, and the plants and animals seem a little wilted.

It was wonderful to feel the fall of a real, prolonged rain today! The skies opened up, the temperature dropped, and the air was flooded with the smell of petrichor. The ground is greedily soaking up each drop of water, and I expect that the local fauna will be especially invigorated. Whenever we get even a mild drizzle during a dry spell, you can notice some increase in animal activity – extra squawking from the birds, a livelier spring in the agoutis’ step.

The storms are bound to get longer and more dramatic, and someday soon I’ll probably get sick of the resulting humidity, but for now, Gamboa is very grateful for the rain. – Kitty


Live Streaming 360 degree Jungle Audio

During his residency, Marc Juul set up the first steps toward live, virtual reality, 360-audio in the forest. The goal is to install these in fascinating natural places around the world so that people can subscribe to a network of high-quality live audio for things like waking up to natural alarm clocks, creating luxurious ambient atmospheres anywhere, or helping detect human presence (i.e. poachers). The funds generated can then go to help the natural places hosting the audio streams!

Listen to a live stream from Forests in Gamboa!
(Works in Firefox and Chrome, for Safari you may need this link http://juul.io:8000/gamboa1.ogg or in VLC you can open this link http://juul.io:8000/gamboa1.ogg.m3u )

Marc Juul

Marc Juul is a super nice and super brilliant hacker coming from the meta-hackerspace Omnicommons in Oakland, California to help set up fun projects at the new DINALAB!

juul.io

Open House Birthday Party Gallery Opening!

We invite you to come to Dinalab – our house, maker space, and art gallery! We invite everyone to our home!

123b Humberto Zarate, Gamboa

Friday, April 5, 6pm

Kitty and Andy’s birthday is coming up in April! This is a combined housewarming and birthday party, but instead of gifts, we’d like you to bring something you’ve made that we can exhibit in our little gallery space for the party.

😊

It can be anything – drawing, photograph, origami, pottery, a scientific tool – as long as you made it! We can display the creations during the party, and you can take yours home at the end of the night. Message us if you have any questions about the thing you want to exhibit!

Also, if you want to send a digital thing for us to print, please email andrew.quitmeyer@gmail.com with your artwork’s:

- Name
- Description
- Actual artwork file

Also please bring a snack or a drink.

Kitty Kelly

Kitty Kelly (Quitmeyer) (wellreadpanda.com) is a librarian turned professional yarn-crafter. Her interests lie in sustainability, knitting + crochet, books, and red pandas. She is the co-developer of Dinalab, and she fixes/develops lab infrastructure while running workshops, events, and logistics. Perhaps you will be able to become a mobile knitter / hiker like her!

She develops yarncrafted artwork to bring attention to scientific practices and discoveries. You can see more of her work at www.wellreadpanda.com.