Camera-traps
The Camera Traps application is both a simulator and IoT device software for running machine learning at the edge in field research. The first implementation specializes in applying computer vision (detection and classification) to wildlife images for animal ecology studies. Two operational modes are supported: "simulation" mode and "demo" mode. When executed in simulation mode, the software serves as a test bed for studying ML models, protocols, and techniques that optimize storage, execution time, power, and accuracy. It requires an input dataset of images that stand in for the images an IoT camera device would generate; these images drive the simulation.
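As a rough sketch of what driving the simulation from a pre-collected dataset might look like (the directory layout, replay interval, and `run_detector` stub below are illustrative assumptions, not the application's actual interface):

```python
from pathlib import Path
import time

def run_detector(image_path: Path) -> list[dict]:
    """Placeholder for a wildlife detection/classification model.

    In a real deployment this would run an ML model on the image and
    return labelled detections; here it returns a dummy result.
    """
    return [{"label": "animal", "confidence": 0.0}]

def simulate(image_dir: str, interval_s: float = 1.0) -> None:
    """Replay a directory of images as if an IoT camera were producing them."""
    for image_path in sorted(Path(image_dir).glob("*.jpg")):
        detections = run_detector(image_path)   # scoring step
        print(image_path.name, detections)      # stand-in for downstream storage/reporting
        time.sleep(interval_s)                  # mimic the camera's capture cadence

if __name__ == "__main__":
    simulate("sample_images", interval_s=0.5)
```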
Characterizing and Modeling AI-Driven Animal Ecology Studies at the Edge
This repo provides instructions for extracting workload information from AI-Driven Animal Ecology (ADAE) studies.
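One typical piece of workload information is the image arrival rate. A minimal sketch of computing per-hour arrival rates from capture timestamps might look like the following (the CSV file name and `capture_time` column are assumptions for illustration, not this repository's actual data format):

```python
from collections import Counter
from datetime import datetime
import csv

def hourly_arrival_rates(metadata_csv: str, timestamp_column: str = "capture_time") -> dict[int, int]:
    """Count images per hour of day from a CSV of image capture timestamps."""
    counts = Counter()
    with open(metadata_csv, newline="") as f:
        for row in csv.DictReader(f):
            ts = datetime.fromisoformat(row[timestamp_column])
            counts[ts.hour] += 1
    return {hour: counts.get(hour, 0) for hour in range(24)}

if __name__ == "__main__":
    rates = hourly_arrival_rates("camera_metadata.csv")
    for hour, n in rates.items():
        print(f"{hour:02d}:00  {n} images")
```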
CT-Controller
The ctcontroller tool manages provisioning and releasing edge hardware, as well as starting and shutting down the camera-traps application.
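The general lifecycle it manages can be pictured as provision, run, shut down, release. The sketch below illustrates that ordering only; the function names are placeholders and do not reflect ctcontroller's actual interface:

```python
def provision_node(site: str) -> str:
    """Acquire an edge node (placeholder; the real tool talks to the provider's API)."""
    print(f"provisioning a node at {site}")
    return "node-0"

def start_application(node: str, config_path: str) -> None:
    print(f"starting camera-traps on {node} with {config_path}")

def stop_application(node: str) -> None:
    print(f"stopping camera-traps on {node}")

def release_node(node: str) -> None:
    print(f"releasing {node}")

def run_experiment(site: str, config_path: str) -> None:
    """Ordering matters: never leave hardware provisioned after a failed run."""
    node = provision_node(site)
    try:
        start_application(node, config_path)
        # ... wait for the experiment to finish ...
    finally:
        stop_application(node)
        release_node(node)

run_experiment("example-site", "experiment.yml")
```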
Explanation
Figure: Workflow of an AI-Driven Animal Ecology Study.
Architectural Overview
Architecture Overview
TapisUI provides a research-oriented frontend for interacting with Tapis and tenant components. In this case, the ICICLE extension adds custom branding to TapisUI.
Definitions of Key Terms and Concepts
System Architecture and Design Philosophy
How-To Guides
Step 1: Extract arrival rates from real ADAE Studies
Quick Start
Installation
Quick Guide
View the main TapisUI wiki to learn how to deploy and test TapisUI extensions locally.
This section walks through how to use this repository and its features. It is split into sections based on the types of tasks you're looking to accomplish.
Continuing Multi-Session Missions
OpenPass
The Decentralized Microservice Drone System for Digital Agriculture is a distributed, scalable platform designed to orchestrate autonomous drone operations for agricultural field missions. The system captures, processes, and analyzes aerial imagery and video data to support precision agriculture, crop monitoring, and field management operations.
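As a loose illustration of the microservice idea, each stage can be thought of as a service exchanging small task messages. The `ImageryTask` fields, service names, and in-process queue below are hypothetical stand-ins, not the system's actual components:

```python
from dataclasses import dataclass
from queue import Queue

@dataclass
class ImageryTask:
    """A unit of work passed between services: one captured frame plus its context."""
    mission_id: str
    field_plot: str
    image_path: str

def capture_service(out_queue: Queue) -> None:
    """Stand-in for the drone-facing service that publishes captured frames."""
    out_queue.put(ImageryTask("mission-001", "plot-A", "frames/plot-A-0001.jpg"))

def analysis_service(in_queue: Queue) -> None:
    """Stand-in for the service that analyses frames for crop monitoring."""
    while not in_queue.empty():
        task = in_queue.get()
        print(f"analysing {task.image_path} from {task.field_plot}")

tasks: Queue = Queue()
capture_service(tasks)
analysis_service(tasks)
```

In a distributed deployment the in-process queue would be replaced by a message broker, allowing capture and analysis services to scale independently.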
Tapis UI Extension
This Tapis UI extension adds ICICLE-specific branding and tabs to TapisUI.
Tutorials
Prerequisites
Getting Started with WildWing
Video-Based Animal Re-Identification (VARe-ID) from Multiview Spatio-Temporal Track Clustering
This work presents a modular software pipeline and end-to-end workflow for video-based animal re-identification, which assigns consistent individual IDs by clustering multiview spatio-temporal tracks with minimal human intervention. Starting from raw video, the system detects and tracks animals, scores and selects informative left/right views, computes embeddings, clusters annotations by viewpoint, and then links clusters across time and varying perspectives using spatio-temporal continuity. Automated consistency checks resolve remaining ambiguities. Preliminary experiments demonstrate near-perfect identification accuracy with very limited manual verification. The workflow is designed to be generalizable across species. Currently, trained models support Grevy’s and Plains zebras, with plans to expand to a broader range of species.
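The sketch below mirrors the stages described above as composable steps; all function bodies are placeholders and the names are illustrative, not the pipeline's actual API:

```python
from dataclasses import dataclass, field

@dataclass
class Track:
    """A tracked animal in one video: an ID plus attributes filled in by later stages."""
    track_id: int
    viewpoint: str = "unknown"                 # e.g. "left" or "right" flank
    embedding: list[float] = field(default_factory=list)
    cluster_id: int | None = None
    individual_id: int | None = None

def detect_and_track(video_path: str) -> list[Track]:
    """Stage 1: detect animals and link detections into tracks (placeholder)."""
    return [Track(track_id=i) for i in range(3)]

def select_informative_views(tracks: list[Track]) -> list[Track]:
    """Stage 2: score frames and keep informative left/right views (placeholder)."""
    for t in tracks:
        t.viewpoint = "left"
    return tracks

def embed(tracks: list[Track]) -> list[Track]:
    """Stage 3: compute a re-identification embedding per track (placeholder)."""
    for t in tracks:
        t.embedding = [0.0, 0.0, 0.0]
    return tracks

def cluster_by_viewpoint(tracks: list[Track]) -> list[Track]:
    """Stage 4: cluster annotations separately per viewpoint (placeholder)."""
    for t in tracks:
        t.cluster_id = 0
    return tracks

def link_clusters(tracks: list[Track]) -> list[Track]:
    """Stage 5: link clusters across time and perspectives via spatio-temporal
    continuity, then assign one consistent individual ID per cluster (placeholder)."""
    for t in tracks:
        t.individual_id = t.cluster_id
    return tracks

tracks = link_clusters(cluster_by_viewpoint(embed(select_informative_views(detect_and_track("herd.mp4")))))
print([(t.track_id, t.individual_id) for t in tracks])
```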
WildWing
An open-source, autonomous, and affordable UAS for video monitoring of animal behaviour, using Parrot Anafi drones to track group-living animals.
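As a rough illustration of the kind of programmatic control involved (using Parrot's Olympe SDK directly, not WildWing's own code; the IP address below is the Sphinx simulator default and is an assumption, as is the 5 m move):

```python
# Requires the olympe package and a connected Anafi drone or Sphinx simulator.
import olympe
from olympe.messages.ardrone3.Piloting import TakeOff, moveBy, Landing

drone = olympe.Drone("10.202.0.1")          # simulator default; a physical Anafi uses its Wi-Fi address
drone.connect()
drone(TakeOff()).wait()                     # climb to hover
drone(moveBy(5.0, 0.0, 0.0, 0.0)).wait()    # fly 5 m forward toward the focal group
drone(Landing()).wait()
drone.disconnect()
```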