Mobile Modular Field Stations

The future of the biological field station is not a large, fancy lab dropped in the middle of a jungle, but rather a network of mobile laboratories distributed throughout an ecosystem.

Lab assistant, Alister, helps test out an early iteration of the mobile lab deep in the jungle

We are combining Digital Naturalism Laboratories’ previous research in mobile makerspaces, like the “BOAT Lab” in the Philippines (https://www.youtube.com/watch?v=n0L-SNO4A5w), with the decades of experience that the sustainable architecture firm Cresolus (https://www.cresolus.com/) has in building tropical architecture such as homes and field stations in national parks. Together we are creating modular, mobile laboratories that scientists can bring directly to their field sites around the world.

Video overview of the BOAT Lab – floating biological makerspace

What is the challenge?

Field biologists and conservationists are faced with a paradox: The goal of their work is to protect and understand natural ecosystems, but the mere act of accessing their field sites generally requires some amount of environmental destruction. Almost all travel for field biology relies on the burning of fossil fuels, which are destructive not only in the original drilling, but also in the pollution they give off into the very environments being studied. Moving vehicles back and forth between field sites also introduces physical destruction and noise pollution to natural areas which may negatively impact the work being done to begin with.

Additionally, most conservationists and field biologists are constrained in time and space by the samples they collect. Researchers need to get to the sites, collect their samples, and get them back to labs for processing within a limited time window. This means that these trips to field sites often become a constant, daily commute, which takes a toll on the environment (as well as on the researchers and their projects!).

Like a surgeon cutting through healthy flesh to find a disease, these researchers will never be able to completely stop causing some damage to get to the places they study, but we can work to greatly reduce the damage caused.

Instead of having thousands of researchers around the world commuting between field sites just to bring samples back to the laboratory, we think that the whole laboratory should be brought to the field.

One Solution

These laboratory “Pods” can be towed or floated to field sites deep in forests or up rivers while withstanding inclement weather or hazardous terrain.  Advances in sustainable energy harvesting coupled with the miniaturization of technology means that researchers can process their data and samples on-site (even genetic sampling labs can be miniaturized!). Thus field biologists and conservationists can make fewer unnecessary trips back and forth between the field and lab and increase their productivity while minimizing their own footprint in these areas.

We have already tested functional prototypes of these modular labs with many researchers and conservationists around the world including those from the Smithsonian, National Geographic, and many major universities.

Helping field biologists and conservationists destroy less of the environments they are trying to save and understand.

Key Targets

1. Reducing the energy spent moving scientists back and forth between field sites and labs.

2. A modular lab trailer that can be reused in different configurations, reducing the cost of dedicated lab space in a building.

3. The trailer is made from reused car parts (the differential, axle, and wheels come from a jeep; the chassis from a Toyota pickup).

4. Because it is a trailer, it requires very little additional energy to move (vehicles would be traveling to the site anyway).

5. It allows studies to continue for greater lengths of time (scientists often have to travel to a country multiple times to build their data sets; this could help reduce that).

6. Allows scientists to work in tropical conditions more efficiently (i.e., it can be made bug-proof and weatherproof so they don’t have to leave the site as often).

7. Allows scientists to sleep in the field (it can be adapted to provide accommodation so they don’t have to travel back to sleep elsewhere).

Collaboration Inspiration

Both Digital Naturalism Laboratories (Dinalab) and Cresolus are concerned with getting researchers access to incredible ecosystems in sustainable ways. We met while repairing bridges in the “Pipeline Road” nature park in Gamboa, Panama. Pipeline is one of the most heavily researched areas of the past century, but unfortunately, due to bureaucratic disagreements with the local field station and government entities, many parts of this historic field site have fallen into ruin, preventing most visiting scientists and conservationists from conducting their work there. We set up our own volunteer initiative to restore access to these field sites using sustainably sourced and upcycled materials (here’s a time-lapse of one bridge we completely rebuilt).

While spending days toiling in the hot jungle on a volunteer initiative to rebuild bridge and trail infrastructure for a historic research and conservation site (Pipeline Road in Panama), Dinalab and Cresolus got to learn about each other’s work in mobile labs and sustainable design. We also heard field biologists lament the paradox of their work: they do field research because they love nature, but doing it currently means causing lots of pollution by constantly driving back and forth to bring samples from the field to the lab.

This led us to the idea that, instead of constantly bringing field samples to the lab, maybe we should bring the lab to the field!

Now, we are taking the mobile architecture studios Cresolus brings into jungles when building parks and outfitting them with scientific tools. These initial tests proved not only functional, but showed that the pods can help increase productivity! Now we just need to design and test more!

Design Challenges

In our early research (https://dl.acm.org/doi/abs/10.1145/3196709.3196748), we established a hierarchy of needs for labs that starts with:

1) Protection
2) Organization
3) Work Surfaces
4) Light
5) Information Access
6) Power

Most people we discuss this project with automatically assume that getting power to a mini-modular field station deep in a remote site would be the biggest challenge, but with solar power, pre-charged battery banks, inverters, and sustainable design (e.g. passive cooling), powering the research equipment will not be our main hurdle.
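As a rough back-of-envelope check (a sketch with illustrative numbers chosen for this example, not measurements from our pods), even a modest solar array covers a typical day of lab electronics with margin to spare:

// power_budget.cpp - illustrative solar power budget (all numbers are assumptions)
#include <cstdio>

int main() {
    // Assumed daily loads in watt-hours (illustrative values)
    double laptop     = 60.0 * 8;   // 60 W laptop for 8 h of analysis
    double lighting   = 10.0 * 6;   // LED work lights for 6 h
    double sensors    = 5.0 * 24;   // dataloggers / camera traps running all day
    double misc       = 100.0;      // phone charging, small instruments
    double daily_load = laptop + lighting + sensors + misc;  // ~760 Wh

    // Assumed harvest: 400 W of panels, ~4.5 peak-sun hours, 75% system efficiency
    double daily_harvest = 400.0 * 4.5 * 0.75;  // ~1350 Wh

    printf("Daily load:    %.0f Wh\n", daily_load);
    printf("Daily harvest: %.0f Wh\n", daily_harvest);
    printf("Margin:        %.0f Wh\n", daily_harvest - daily_load);
    return 0;
}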

Instead, the key aspect of the design is creating a modular system that can facilitate the research of many different types of scientists and conservationists while keeping their sensitive tools protected, organized, and easy to work with.

We already have many tested and experimental designs with workstations and lab equipment, for example, that expand from a trailer or pack into modular pelican cases.

Users

Scientists and conservationists doing remote fieldwork. Tens of thousands of researchers visit established field stations each year, like the Smithsonian Tropical Research Institute in Panama, but generally still need to travel long distances from the field stations to their field sites. Most of these researchers are forced to spend long days commuting back and forth between the labs at the field station and their sites in the field, and because of the time-sensitive nature of the data they collect and the samples to be processed, they cannot simply stay out for longer durations. Due to the generally treacherous nature of back-country travel, each additional trip also increases the risk of physical harm or mechanical failure for the researchers and their vehicles.

Instead, visiting researchers will be able to rent our labs and bring them directly to their field sites. There they can stay and conduct their work with a modular selection of the laboratory tools they need, making one final trip once their research is finished.

Scientists will be able to rent our mobile laboratories and bring them directly to their field sites to conduct and process their work in nature. The pods are designed and tested for rough terrain to reach off-road sites, and can be loaded onto pontoons to function as floating laboratories in aquatic environments. We also offer services to deliver the pods to research sites for the scientists, and the pods are made to fit perfectly in shipping containers so they can be used anywhere around the world.

Costs

Currently, the environment bears most of the cost we hope to address. At a time when fuel is absurdly cheap and biology and conservation budgets are small, many researchers feel forced to carry out their work in the traditional way, meaning lots of travel back and forth to field sites. The damage comes not only from the drilling to extract the fossil fuels and the pollution spread across the target environment; the constant back-and-forth travel also disrupts the ecosystem and introduces noise pollution, which may affect the very studies the researchers hope to conduct in the first place.

Additionally, many existing laboratories and field stations are still heavily dependent on fossil-fuels. Our pods, on the other hand, will be equipped with renewable power sources or pre-charged from our solar arrays.

Similar Projects

Mobile labs are not a completely new concept. Many research ships already function like this (famously, Jacques Cousteau’s floating labs), and other designers and researchers have launched similar projects, such as Marko Peljhan’s “Makrolab,” a modular lab that could fit into shipping containers and be shipped around the world.

Other projects we know of include Steven Roberts’s “Nomadic Research Labs,” Marko Peljhan’s Makrolab prototype for a mobile art and science workstation, the Hackteria Network’s outdoor DIY art-science workshops, American Arts Incubator’s “Waterspace” project building a floating art-science makerspace, Jacobs and Zoran’s work with mobile digital craft labs and hunter-gatherer tribes in the Kalahari, and the Signal Fire Arts and Activism Residency that doubles as a backpacking trip.

Unfortunately, many scientists, especially small research groups or grad students, lack the funding needed to invest in such larger infrastructure. Our pods are customized with equipment for the individual researcher or small group and delivered to the field at minimal cost. Cresolus, as an established sustainable architecture firm working in national parks around the world, already has offices and the ability to build and deliver these pods to the field sites used by many researchers in Central America and Africa.

Moreover, all our designs will be open-source, so researchers can further add on to the designs we have and contribute to better mobile labs for everyone.

What are your Team’s primary work tasks and activities over the next 3-6 months? (optional)

We already have functioning prototypes tested with scientists in key research areas (Pipeline Road). Our goals for the very next stage of the project are to do another round of more formalized testing and evaluation and to develop further features to add to the pods’ designs.

We aim to enroll existing scientist clients such as Dr. Rachel Page’s Bat Lab at the Smithsonian Tropical Research Institute and Corey Tarwater’s Avian Ecology lab at the University of Wyoming, both of which do extensive fieldwork on Pipeline Road. We will provide their researchers with reduced-rate use of the pods.

Modular features that we intend to design into the projects that we will be working on over the next several months include:

-Mesh networking

-Safety equipment/signaling

-Sterile lab

-Expandable Flight Cages

-Dry ice storage / Peltier Coolers

-Expandable work stations

-Built in 360-camera traps / Acoustic monitoring

What are your project needs over the next 3-6 months in terms of resources, skills and knowledge? (optional)

To complete our next steps, we primarily need a small amount of funding to carve out dedicated design time between our two organizations to further develop the prototypes we already have.

We already have most of the materials, electronics, field sites, and evaluators available; we just need time to put these together and continue testing and sharing these designs.

What are your project goals?

For the very next stage of our project our main goal is to get one of these pods functional and consistently rented out to different researchers visiting our field sites over the next year.

Our longer-term goal is that within two years, we will have three of these pods available at the different field sites Cresolus works in, such as Gabon or Belize.

Finally, we hope that within 5 years, we will have documented enough uses of these mobile laboratories that renting them out has become commonplace within research communities. We will have many pods available for conservationists and biologists around the world, and other organizations will replicate many of the ideas we have shared and tested.

Fast Jungle Face Shield

Our little jungle lab is trying to help manufacture face shields for medical workers, and we have tested out many designs from the internet. The fastest design we have used so far is our own modification

DOWNLOAD ALL FILES HERE, NO REGISTRATION

https://drive.google.com/open?id=1R7S2vxSGS3DLOInkfokaIpI04cGfVeJM

based off the Georgia Tech Medical Innovation design

https://gcmiatl.com/COVID-19-personal-protective-equipment

The headband just needs:
-Acrylic (and a laser cutter)
-Rubber band
-(optional) EVA foam or self-adhesive weather stripping (for forehead comfort)

For the face shield, ideally you have thin sheets of PET that you can laser cut as well, but if you don’t you can use A4 transparency sheets (like we will be) or a sliced up 3 Liter soda bottle (like we also use).
The key advantages of this design are

SPEED- Each takes only about 5 minutes to cut, and maybe 8 minutes total to make (compared to 1.5 hours for a 3D print), plus you can nest them to use less material!


and the


USE of LIMITED MATERIALS
-The headbands can be cut out of Acrylic, PET, or most other plastic sheets you might have (could possibly use wood and MDF, but might be harder to sanitize)

-Can attach different types of simple or disposable face shields like A4 transparency sheets or 3 Liter soda bottles

There are plenty of other designs out there that may be nicer or fancier, or that might make sense if you have a fleet of 3D printers instead of a laser cutter. Figure out what works best with the materials you have. Here in Panama most stores except grocery stores are shut down, so you can find most of these materials at the Super 99 grocery store (e.g. EVA foam, plastic sheets, or soda bottles).


Túngara model – now with sound!

Sound on!!

A while ago, I crocheted a túngara, a frog I hear a lot during the wet season in Panamá. I wanted to have my model make the distinctive túngara call, which sounds like a video game sound effect, but I didn’t know how. For Christmas, Andrew gave me a bunch of cool electronics that I can record on and embed in soft toys. He even loaded one with a recording of a túngara for me!

We opened the frog up and inserted the device.

Here’s a picture of a real túngara with its characteristic inflated dewlap.

I’m looking forward to making more noisy toys like this! Someone suggested a toucan, which should be fun.

Note: This post is by Kitty and is cross-posted over at my personal blog, wellreadpanda.com 🙂

Bat Night Signal

Rachel Page’s Bat Lab at the Smithsonian Tropical Research Institute has been hosting a monthly outreach “Bat Night” in Gamboa, Panama for quite some time. Dinalab figured it was about time they had their own bat signal!

Gamboa Games

We are making a suite of open-source DIY board games. The goal is to find things that

  • are fun to play
  • can be sold to support the lab
  • make use of the scrap material from science project prototyping
Some of the character figures cut from scrap materials

Panatrap: 360 Camera Comparisons (2019)

For our open-source 360 camera trap project, we wanted to evaluate the field and figure out which cameras would be the most available and useful to hack. We collected six commercially available 360 cameras and evaluated them on their

  • Hackability
  • Image Quality

Here is our qualitative ranking of the cameras we tested, in the order we want to hack them:

Camera | Object Pixel Density (front) | Object Pixel Density (side) | Distortion | Sensitivity to infrared | Price paid (USD)
MadV | 127*127 | 133*133 | None | Lowest | 300
Ricoh Theta V | 100*100 | 100*100 | None | Highest | 430
Samsung Gear | 95*95 | 143*133 | Some | Low | 83
Ricoh Theta S | 100*100 | 100*100 | None | High | 270
Zision | 166*166 | 90*110 | Most | Low | 60
Maginon View | 50*50 | 45*50 | Some | Low | 55

Basic Testing Procedure

Object Pixel Density

We took all of the cameras in a room with controlled lighting, and placed a colorful, standard-sized basketball exactly 2 meters away from each camera.

Set-up to test the cameras’ pixel resolution and level of distortion (after stitching the images).
Setup of the Theta V with the Ball 90 degrees to the side (the Theta V had the least amount of distortion after stitching)

Two pictures were taken in two camera positions: one with a lens pointed straight at the subject, and one with the camera turned sideways (or upwards for the Zision). The object was positioned 2 meters away.
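As a rough sanity check on these numbers (a sketch only; the ~24 cm ball diameter and the equirectangular image width are assumptions for illustration, not part of our measurement procedure), you can estimate how many pixels the ball should span in a stitched image:

// expected_ball_pixels.cpp - back-of-envelope estimate (illustrative assumptions)
#include <cmath>
#include <cstdio>

int main() {
    const double PI = 3.14159265358979;
    const double ball_diameter_m = 0.24;   // assumed standard basketball
    const double distance_m      = 2.0;    // same distance used in the test
    const double image_width_px  = 5376.0; // e.g. the Theta V's 5376 x 2688 stills

    // Angle the ball subtends as seen from the camera
    double angle_deg = 2.0 * atan((ball_diameter_m / 2.0) / distance_m) * 180.0 / PI;

    // An equirectangular image spreads 360 degrees across its full width
    double expected_px = angle_deg * (image_width_px / 360.0);

    printf("Ball subtends ~%.1f degrees -> ~%.0f px across\n", angle_deg, expected_px);
    // ~6.9 degrees -> ~103 px, in line with the ~100*100 px measured for the Theta V
    return 0;
}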

IR Sensitivity

To test the cameras’ sensitivity to infrared light, we placed the same calibration object (the colorful basketball) at a distance of 2 meters, with the camera positioned between the subject and the infrared light source.

Xiaomi MADV (Mijia Sphere)

General Camera Details

Price
$349 ($299 on sale)

Object Pixel Density + IR Sensitivity

The MadV has a resolution of 127*127 px when the ball is in front of the camera and 133*133 px from the side view. Its sensitivity to infrared light is quite poor: with the target at a distance of 2 meters, it was not able to show anything.

Picture with infrared light

Front                          

Side

Ricoh Theta V

Video Stitching Resolution: 4K
Internal/External Stitching: Internal stitching
360 Stitched Video Format: Internal: 3840 x 1920 at 29.97 fps (56 Mb/s MP4 via H.264); 1920 x 960 at 29.97 fps (16 Mb/s MP4 via H.264)
Still Image Resolution: JPEG: 14 megapixel, 5376 x 2688 (2:1)
Number of Lenses: 2

Camera per Lens

Sensor: 1-chip 1/2.3″ CMOS

Optics per Lens

Maximum Aperture: f/2
Lens Elements: 7
Minimum Focusing Distance: 4.0″ / 10.2 cm

Recording

Recording Media: Internal flash memory
Built-In Mic: Yes
Channels: 4.0-channel surround
Audio Format: AAC-LC

Exposure Control

Shutter Speed: 1/25000 – 1/8 second (photo); 1/25000 – 60 seconds (photo); 1/25000 – 1/30 second (video); 1/25000 – 1/30 second (streaming)
Photo ISO Range: 100 – 1600 (auto); 64 – 3200 (manual)
Video ISO Range: 64 – 6400 (auto)

The Ricoh Theta V showed a resolution of 100*100 px, no matter which position the camera was in.

Picture with infrared light.

Front                         

  Side

Its sensitivity to infrared light is the highest of all the cameras tested.

Price
$429 ($379 on sale)

Samsung Gear 360 4K Spherical VR Camera

Image Sensor: 2 x 8.4 MP CMOS
Lenses: 2 x f/2.2 ultra-wide lenses
Max Video Resolution: 360° dual lens: 4096 x 2048 at 24 fps; single lens: 1920 x 1080 at 60 fps
Video Format: MP4 (H.265)
Photo Capture Resolution: 360° dual lens: up to 15 MP (5472 x 2736); single lens: up to 3 MP (2304 x 1296)
Photo Format: JPEG
ISO: Up to 1600
Mic: Built-in stereo microphone
Recording Time: Up to 130 minutes in 2560 x 1280 resolution at 30 fps
Battery: 1160 mAh
Card Slot: 1 x microSDXC card slot (supports up to 256 GB cards)
Supported Operating Systems: Android, iOS, Mac, Windows (360 Video Editor is not available for macOS computers)
Wi-Fi: 802.11 a/b/g/n/ac (2.4/5 GHz)
Bluetooth: 4.1
Interface: 1 x USB 2.0 Type-C
Sensors: Gyro, accelerometer

Picture with infrared light

Front                       

  Side

The Samsung Gear has a resolution of 95*95 px when positioned straight in front of the ball, and 143*133 px sideways. This camera shows a high level of distortion in the latter position and has a slightly lower resolution than both Thetas.

It proved somewhat sensitive to infrared light, though noticeably less so than both Thetas.

Price
$82.99 (though as of September 2019 price went up to $200)

Ricoh Theta S


Picture with infrared light

Front                       

    Side

The Ricoh Theta S, just like the Theta V, has a resolution of 100*100 px in both positions. Its sensitivity to infrared light was a bit lower than that of the Theta V, but higher than that of the Samsung Gear.

Price
$269.32

Zision 360°Panoramic VR Full View Action Camera

Picture with infrared light

Side                            

Up

The Zision has a resolution of 166*166 px when positioned with its only lens facing the subject directly, but the picture is highly distorted when the lens is pointed upwards.

Its sensitivity to infrared light was poor, only a bit higher than that of the MadV.


Price
$59.99

Maginon View 360

50 euros (discount price)

http://www.maginon.com/website/uk/actioncams/view360/

Type of camera     Full-spectrum camera for 360° spherical panoramas

Image sensor     2x 2MP CMOS sensor

Photo resolution     8 MP (4,000 x 2,000 | interpolated), 5 MP (3,200 x 1,600 | interpolated), 3 MP (2,592 x 1,296)

Video resolution     2,048 x 1,024 (30 fps)

Lens     2x 210° super wide-angle lens

Aperture     F = 2.0 | Focal length f: 0.88 mm

Recording time     Up to 120 minutes with fully charged battery (without WiFi, 2048 x 1024 / 30fps)

Memory     MicroSD card up to 32 GB (min. class 10 or faster)

Connections     Micro USB connection

Power supply     1,300 mAh lithium-ion battery

Dimensions     137 x 45 x 14 mm

Weight     83 g

Picture with infrared light

Front         

Side

The Maginon has a resolution of 50*50 px and showed a bit of distortion when positioned sideways. Its sensitivity to infrared light was similar to that of the Zision.

Price
$110.00 ($55.00 on sale)

Panatrap-MADV: How-to Build

This guide will show you the steps necessary to turn a commercially available Xiaomi MADV (Mijia) into a fully functioning, animal-sensing, weatherproof camera trap!


Code

Code is available in this repo https://github.com/Digital-Naturalism-Laboratories/Panatrap/tree/master/MADV

Hardware

Physical Designs are available here for free download: https://a360.co/2nrPLd8

and included in the Github https://github.com/Digital-Naturalism-Laboratories/Panatrap/tree/master/MADV

Hacking Theory

The MADV designers made an interface on the bottom of the camera so it can be easily controlled with their included “selfie stick.” It’s just two metal electrode contacts that, when connected, send a message to the camera (the original selfie stick has a 220 ohm resistor connected to a button).

This is the key interaction for taking control of the MADV. We need two electrodes (conveniently spaced 2 header pins apart) to connect to ground and a control pin on our Arduino.

From our examinations, the 3 messages you can send to the camera with this interface are:

  • Long Press (5 seconds) – Camera ON/OFF
  • Medium Press (2 seconds) – Toggle Recording Mode (Video to Photo, or Photo to Video)
  • Short press (0.5-1 seconds) – Shutter Button (Take a photo, or start or stop a video)
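As a minimal illustration (a sketch only; the pin numbers assume the wiring described below, with the inner contact running to GND and the outer contact to pin 12), emulating one of these presses is just a matter of holding the control pin LOW for the right duration:

// Minimal MADV press-emulation sketch (Arduino; assumes inner contact -> GND, outer contact -> pin 12)
const int TRIGGER_PIN = 12;

// Hold the contacts "connected" (pin pulled LOW) for the given number of milliseconds
void pressContacts(unsigned long ms) {
  digitalWrite(TRIGGER_PIN, LOW);   // LOW = button pressed
  delay(ms);
  digitalWrite(TRIGGER_PIN, HIGH);  // HIGH = released
}

void setup() {
  pinMode(TRIGGER_PIN, OUTPUT);
  digitalWrite(TRIGGER_PIN, HIGH);  // idle: not pressed

  pressContacts(5000);  // long press: turn the camera on
  delay(2000);          // give it a moment to boot
  pressContacts(750);   // short press: shutter
}

void loop() {}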

Cut out or Print the Design

You will need to cut out two main parts of the design:

The Housing – These are the parts that will go around the camera and Arduino, and hold all the pieces together to make the project function

The Weather Shield – This is a 360-degree transparent case that lets the camera stand up against the weather

In the link and the repo, these are 2-dimensional drawing files meant to be laser cut from 3mm acrylic. The full 3D model is there, though, so you could 3D print the housing instead if you don’t have a laser cutter.

I kept the weather shield quite simple in design as well, so if you do not have a laser cutter, you can pretty much just make a box from clear acrylic that goes around the camera.

DIY Pogo Pins / Prepare the Contact Connector

Ideally you would have these things called “pogo pins,” which are little spring-mounted pins that press against electrical contacts such as those on the bottom of your MADV. If you have those, great! Solder wires to two of them with one header-pin space in between.

If you are out in the jungle and don’t have pogo pins, though, you can make your own. You just need:

  • Standard Male Header pins (3 linked together)
  • Chunk of hard rubber (we used a piece of our silicone soldering mat)
  • 2 wires

Take your set of 3 header pins, chop off the middle pin, and stab them into the chunk of rubber cut to the size of the electrode area in the camera housing. Solder two wires to the outside pins. They should line up directly with the electrical contacts of the camera. The rubber gives the mechanism some squishiness, so you can ensure a good contact with the camera.

See the two pins have a missing pin in between, and are squished up with some silicone rubber.

Connect all the Electronics

Now you just need to connect the components together!

The camera’s two electrical contacts mate with the connector you just made. The wire running to the contact closest to the center of the camera should go to the GND pin on your Arduino.

The wire for the outside contact should go to pin 12 on your Arduino.

Now you just need to connect the PIR motion sensors.

PIR

If you are using a 5V power source,

simply connect the Vin lines on the PIRs to the 5V pin on the Arduino,

the GND to the GND on the Arduino,

and the OUT pins on the PIRs to pins A4 and A3 on your Arduino.

Program

The code is all up at: https://github.com/Digital-Naturalism-Laboratories/Panatrap/blob/master/MADV/Code

This first, simple sketch should let you debug easily and make sure your PIRs are working and your camera is triggering and shutting down correctly:

/*
  PanaTrap - MADV
  Debug Code for the turning the Xiaomi MiSphere (MADV) camera
  into a remotely controlled,
  PIR triggered, Camera Trap
*/

/* In order to trigger the MADV camera to change modes, in theory you should connect a relay to your Arduino and trigger that to connect the nodes of the MADV.
  But we are trying to use minimal hardware, and found that if you connect the node closest to the center of the camera to ground,
  connect another digital pin to the other node,
  initially set the digital pin to HIGH, and then pull it LOW,
  it will trigger the camera.
  - Long press (5 seconds): Camera ON/OFF
  - Medium press (2 seconds): Toggle recording mode (video to photo, or photo to video)
  - Short press (0.5-1 seconds): Shutter button (take a photo, or start or stop a video)
*/

//Front PIR motion sensors
int fPIR = A4;
int fPIRval = -1;

//Back PIR motion sensors
int bPIR = A3;
int bPIRval = -1;

//camera trigger pins

int trigger = 12;
int gndtrigger = 9;


//LED for debugging display
int led = 13;


void setup() {
  //Serial for debugging PIRs
  // initialize serial communication at 9600 bits per second:
  Serial.begin(9600);

  //Turning some of the pins on the Arduino into virtual power sources and grounds
  pinMode(A1, OUTPUT);
  digitalWrite(A1, LOW);

  pinMode(A2, OUTPUT);
  digitalWrite(A2, HIGH);

  pinMode(A5, OUTPUT);
  digitalWrite(A5, LOW);


  //Camera Trigger stuff
  pinMode(led, OUTPUT);
  pinMode(trigger, OUTPUT);
  pinMode(gndtrigger, OUTPUT);

  digitalWrite(led, HIGH);

  digitalWrite(trigger, HIGH);
  digitalWrite(gndtrigger, LOW); // this pin just stays LOW for the whole sketch

}

// the loop routine runs over and over again forever:
void loop() {
  // read the input on analog pin 0:
  fPIRval = analogRead(fPIR);
  bPIRval = analogRead(bPIR);
  // print out the value you read:
  Serial.print(fPIRval);
  Serial.print("    rear:  ");
  Serial.println(bPIRval);


  // The PIR outputs are essentially binary (HIGH when triggered), so any analogRead value well above ~600 counts as motion
  if (fPIRval > 600 || bPIRval > 600) {
    critterDetected();
  }
  delay(1);        // delay in between reads for stability
}

void critterDetected() {

  Serial.println("Critter detected");
  //Turn camera on
  onOffCamera();

  //Take a photo
  takePhoto();
  takePhoto();

  //TODO: Wait to see if other critters still around
  //before we shut off camera

  //TODO: the camera will toggle between photo and video each time it shuts off, add in a toggle here

  onOffCamera();

}

void onOffCamera() {
  Serial.println("cameraONOFF");

  //5 second turn on
  digitalWrite(led, LOW);
  digitalWrite(trigger, LOW);
  digitalWrite(gndtrigger, LOW);
  delay(5000);

  ///Chill
  digitalWrite(led, HIGH);
  digitalWrite(trigger, HIGH);
  digitalWrite(gndtrigger, LOW);
  delay(2000);
}

void takePhoto() {
  Serial.println("Take photo");

  //Take a photo
  digitalWrite(led, LOW);      // turn the debug LED off during the press
  digitalWrite(trigger, LOW);  // pull the contact LOW to simulate a short button press (shutter)
  digitalWrite(gndtrigger, LOW);
  delay(1000);               // wait for a second

  ///Chill 4 secs
  digitalWrite(led, HIGH);
  digitalWrite(trigger, HIGH);
  digitalWrite(gndtrigger, LOW);
  delay(4000);

}

void togglePhotoVideo() {
  Serial.println("cameraTogglePhotoVideo");

  //2 second press
  digitalWrite(led, LOW);
  digitalWrite(trigger, LOW);
  digitalWrite(gndtrigger, LOW);
  delay(2000);

  ///Chill
  digitalWrite(led, HIGH);
  digitalWrite(trigger, HIGH);
  digitalWrite(gndtrigger, LOW);
  delay(2000);

}

Charge all your devices

In this build, the MADV has its own internal battery, and the Arduino is connected to its own rechargeable LiPo.

Assemble and Deploy

Now that it is ready, give the PIRs a quick wave and test that everything is triggering. Then set it in the target environment and test it again by jumping in front of it to see if it detects you. (You will end up with lots of 360 selfies while testing this camera.)

If it all seems good, leave it there and come check on it in a couple hours or days and see what you caught!

Panatrap- Servo (Theta V): How-to Build

Code

Code is available in this repo https://github.com/Digital-Naturalism-Laboratories/Panatrap/tree/master/Theta_V

Hardware

Physical Designs are available here for free download: https://a360.co/2m22suQ and included in the repo

Materials

  • Theta V camera (~$350 USD)
  • Teensy 3.2 Arduino Microcontroller (~$15)
  • USB Micro breakout board https://www.sparkfun.com/products/12035
  • PIR motion detector (5V) x2
  • Small Servo motor
  • Proto board
  • Male Header pins
  • USB battery pack with On-Off Switch (~ $24)
    (Specifically this one: Talent Cell 3000 mAh – https://www.amazon.com/gp/product/B01M7Z9Z1N/ref=ppx_yo_dt_b_asin_title_o01_s01?ie=UTF8&psc=1 )
  • 3mm bamboo skewers (you can find these at a grocery store)
  • 3mm Acrylic Sheets and Laser Cutter (preferred for this build)
    OR
  • 3D Printer and Filament

Tools

  • Acrylic Welding Glue (or Super glue)
  • Soldering Iron
  • Wires

Hacking Theory

The Theta cameras (and many other 360 cameras) are substantially more locked down when it comes to remote operation (especially in a way that allows for long-term use and a decently quick trigger).

You can follow our detailed journey hunting for ways to control this camera here:

https://community.theta360.guide/t/usb-api-and-arduino-for-camera-trap/4758/20

Andy put in several weeks of work exploring every possible method of hacking this camera.

The camera runs Android and has an accessible USB API (to receive PTP commands), and in theory has the ability to remotely A) turn ON/OFF and B) trigger a photo capture. But the firmware blocks some of these features (even if you switch into developer mode).

Turning On/OFF

It’s tough to turn this camera on. Two ways we have found to hack it on are:

-When you send a fake keyboard command through an OTG cable

or

-When a connected Raspberry Pi boots up (unfortunately this is too slow to work as a camera trap)

Luckily we discovered that sending an emulated keyboard command (through something like a Teensy 3.2) can remotely wake up the camera (even though this feature is not documented anywhere!), and then we just need to trigger the photo.
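Boiled down (a sketch only, assuming a Teensy 3.2 with its USB Type set to Serial + Keyboard + Mouse + Joystick, as in the full code below), the wake-up trick is just a brief emulated HID click sent over the OTG cable:

// Theta V wake-up pulse (Teensy 3.2; USB Type must include Mouse emulation)
void wakeCamera() {
  Mouse.set_buttons(1, 0, 0);  // emulated mouse click over USB/OTG wakes the camera
  delay(50);                   // a quick click is enough
  Mouse.set_buttons(0, 0, 0);  // release
  delay(900);                  // give the camera time to wake up
}

void setup() {
  wakeCamera();  // camera wakes; the shutter itself still gets pressed by the servo
}

void loop() {}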

Triggering – Servo Style

The photo triggering is then done by a small servo connected to the Arduino which manually taps the shutter button of the camera. The servo is shut down in between photos to reduce power consumption.

We tried a zillion other ways to remotely trigger this camera, but none will work unless the camera is already turned on manually and set into “Plugin Mode”

Build Process

Create Circuit Board

The circuitry is decently simple for this build. You have two inputs (the 2 PIRs), 1 servo, and a USB breakout that you need to connect to the Teensy 3.2.

Solder some headers down the right side of your Teensy.

Create extra V+ and GND

Solder a wire to Vin (second pin from the top right of the Teensy) and run it to a space on the proto board where you can have at least 4 more voltage-out pins.

Solder a wire to GND (third pin from the top right of the Teensy) and run it to a separate space where you can have at least 4 more GND pins to plug in the servo, PIRs, and USB breakout.

Connect peripherals

Servo: Solder 3 male header pins and attach one side to Vin, one side to GND, and the center pin to the servo pin in your code (pin 23 in the sketch below)

PIR x2: Solder 3 male header pins and attach one side to Vin, one side to GND, and the center pin to the PIR pins in your code (A5 and A2)

USB breakout: Solder the VCC to the Vin, and the GND to the GND on the Teensy

Program the Board

Code is available in this repo https://github.com/Digital-Naturalism-Laboratories/Panatrap/tree/master/Theta_V

You can upload this code for a quick easy way to help debug all the parts

/*
  PanaTrap - Theta V
  Debug Code for the turning the Ricoh Theta V camera
  into a remotely controlled,
  PIR triggered, Camera Trap
  Using the Teensy 3.2
  This camera is unfortunately blocked in lots of parts of its API from being able to just control with digital signals
    so we control it two ways
    Wake up: Sending a Fake Keyboard Command
    Take Photo:
    Trigger a servo to physically press the button
    Note: if the button is held down WHILE the camera is waking up, it will NOT take a photo, thus there is some timing trickiness here
    Note: make sure to set the Teensy on Serial+Keyboard+Mouse+Joystick
    Note: we are using the Teensy 3.2 because it can send virtual HID signals, and it is the only one I had that was 5V tolerant (and I only had 5V PIRs)
*/

#include <Servo.h>
Servo myservo;
// create servo object to control a servo
// twelve servo objects can be created on most boards

int offpos = 160;
int onpos = 180;


//Front PIR motion sensors
int fPIR = A5;
int fPIRval = -1;
int fPIRpower = 22;
int fPIRgnd = 20;

//Back PIR motion sensors
int bPIR = A2;
int bPIRval = -1;
int bPIRpower = 19;
int bPIRgnd = 17;

//LED for debugging display
int led = 13;

int detectionThreshold = 600; // Our PIRs are actually just binary, but setting up infrastructure in case other sensors are used
void setup() {
  //Serial for debugging PIRs
  // initialize serial communication at 9600 bits per second:
  Serial.begin(9600);
  digitalWrite(led, HIGH);

  //Configure the mouse (I'm not sure we actually need to do this)
  Mouse.screenSize(1920, 1080);  // configure screen size


  //Setup Servo and put it into waiting position
  myservo.attach(23);
  myservo.write(offpos);
  delay(1000); // You have to wait before detaching the servo for it to actually move into position
  myservo.detach();



}


void loop() {

  fPIRval = analogRead(fPIR);
  bPIRval = analogRead(bPIR);
  // print out the value you read
  Serial.print("   Front PIR:  ");
  Serial.print( fPIRval);
  Serial.print("    rear:  ");
  Serial.println(bPIRval);
  Serial.println( analogRead(A3));


  //For the debug code we are just gonna run a simple routine, Field code should have timeouts, and all kinds of other stuff
  if (fPIRval > 600 || bPIRval > 600) {
    critterDetected();
  }

  delay(1);        // delay in between reads for stability
}

//For our Debug code, we are going to just Turn the camera on, and take two photos, or
//conveniently, if we are in Video mode, we will start and stop a video

void critterDetected() {
  Serial.println("Critter detected");
  //Turn camera on
  onOffCamera();

  //Take a photo
  takePhoto();
  takePhoto();

  //TODO: Wait to see if other critters still around
  //before we shut off camera

  //The theta cannot turn off, just turns on
  //It is set to go to sleep after 3 mins, so we just leave it on in case we catch another critter
  //onOffCamera();

}

void onOffCamera() {
  Serial.println("cameraONOFF");
  digitalWrite(led, LOW);

  //Wake the Camera with a Pulse
  Mouse.set_buttons(1, 0, 0);
  myservo.attach(23);
  myservo.write(offpos); // This keeps the servo from just swinging to a random position

  delay(50); // Wait for a quick click
  Mouse.set_buttons(0, 0, 0);


  //Let camera wake up
  delay(900);

  ///Chill
  digitalWrite(led, HIGH);
}



void takePhoto() {
  Serial.println("Take photo");
  myservo.attach(23);
  myservo.write(onpos);
  delay(1200); // hold the button down for about 1.2 seconds

  //Take a photo
  digitalWrite(led, LOW);    // turn the LED off by making the voltage LOW

//Set back to resting mode
  myservo.write(offpos);
  delay(200); // Need a delay or else it detaches before it finishes moving
  myservo.detach();


  ///Chill 4 secs // Camera needs 4 seconds to process between photos
  delay(4000);
 digitalWrite(led, HIGH);
}

void togglePhotoVideo() {
  Serial.println("cameraTogglePhotoVideo");

  //2 second press
  digitalWrite(led, LOW);

  delay(2000);

  ///Chill
  digitalWrite(led, HIGH);

  delay(2000);

}

Create Housing

If using a 3D printer, use the model provided above, and print it out!

If using a laser cutter, cut out the housing pieces (auto-generated from the model by Autodesk’s Slicer program).

Assemble

Stack all the slices together, and use the bamboo skewers to hold them together temporarily.

Make sure all your components fit! (Or else you might need to Dremel some modifications like I did here.)

Next you might want to hot glue the circuit board in place.

Now use Female-female jumper wires to connect your PIR sensors to the header pins you installed.

Connect the Servo wires directly to the 3 pin header you installed

If everything looks good, connect the battery pack power to the USB breakout, and the OTG cable to the Theta Camera. Now connect the final USB from the teensy to the OTG cable. Test and see that it all works!

If everything works, you can superglue the stack together. I leave the bamboo skewers in to make sure the slices stay aligned perfectly. I use acrylic weld, which works better than superglue, but superglue is fine too; it will just leave some marks.

Weather Shield

The final part of the housing will be to glue together the weather shield. This is just a big clear plastic box with a roof and some spaces to let the PIR motion detectors look through.

For the weather shield you want everything to be as clear as possible, so I leave the protective film on the acrylic as long as possible and just apply the acrylic weld with a Q-tip. You want to clamp and hold everything in place; I used some Velcro straps and some clamps. Let it sit for at least an hour.

Deploy

Now your device should be ready to be set out in the forest, waiting for interesting creatures to come by!

Future Development

This design is durable, but a bit bulky. Future designs can slim it down and allow for more mobility.

Our DIY weather shielding works great, but it does leave a large gap between the camera and the shield, which can cause glare and reflections and cut down on some of your field of view.