Camera Traps
The Camera Traps application is both a simulator and IoT device software for utilizing machine learning on the edge in field research. The first implementation specializes in applying computer vision (detection and classification) to wildlife images for animal ecology studies. Two operational modes are supported: "simulation" mode and "demo" mode. When executed in simulation mode, the software serves as a test bed for studying ML models, protocols, and techniques that optimize storage, execution time, power, and accuracy. It requires an input dataset of images to stand in for the images that an IoT camera device would generate; these images drive the simulation.
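For illustration, here is a minimal sketch of the replay loop that simulation mode implies, where a directory of images stands in for a live camera feed. The directory layout and the `detect` function are hypothetical placeholders, not the Camera Traps API:

```python
# Hypothetical sketch of simulation mode's replay loop: a directory of images
# stands in for a live camera feed. `detect` and the paths are placeholders,
# not the Camera Traps API.
from pathlib import Path

def detect(image_bytes: bytes) -> float:
    """Placeholder for an ML detector; returns a confidence score."""
    return 0.0  # a real model would score the image here

def run_simulation(dataset_dir: str, threshold: float = 0.5) -> None:
    for image_path in sorted(Path(dataset_dir).glob("*.jpg")):
        score = detect(image_path.read_bytes())
        keep = score >= threshold  # storage/power policy decision point
        print(f"{image_path.name}: score={score:.2f} keep={keep}")

if __name__ == "__main__":
    run_simulation("sample_images/")
```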
Cyberinfrastructure Knowledge Network
The Cyberinfrastructure Knowledge Network (CKN) is an extensible and portable distributed framework designed to optimize AI at the edge—particularly in dynamic environments where workloads may change suddenly (for example, in response to motion detection). CKN enhances edge–cloud collaboration by using historical data, graph representations, and adaptable deployment of AI models to satisfy changing accuracy‑and‑latency demands on edge devices.
Explanation
Architectural Overview
Explanation
CKN facilitates seamless connectivity between edge devices and the cloud through event streaming, enabling real‑time data capture and processing. By leveraging event‑stream processing, it captures, aggregates, and stores historical system‑performance data in a knowledge graph that models application behaviour and guides model selection and deployment at the edge.
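As a hedged sketch of the event-streaming side, the snippet below publishes a system-performance event from an edge device, assuming a Kafka broker at localhost:9092 and a topic named "ckn-events". The broker address, topic name, and event fields are placeholders; consult the CKN documentation for the actual topics and schemas:

```python
# Hedged sketch: publish a system-performance event to an event stream,
# assuming a Kafka broker at localhost:9092 and a topic named "ckn-events"
# (both placeholders, as are the event fields).
import json
import time

from kafka import KafkaProducer  # pip install kafka-python

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda event: json.dumps(event).encode("utf-8"),
)

event = {
    "device_id": "edge-camera-01",  # illustrative field names
    "model": "mobilenet_v2",
    "accuracy": 0.91,
    "latency_ms": 42.0,
    "timestamp": time.time(),
}
producer.send("ckn-events", value=event)
producer.flush()
```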
Explanation
This dataset captures both tabular metadata and graph representations from deep learning training workflows, extracted via TensorFlow's XLA compiler.
Explanation
Features
Explanation
TapisUI provides a research-oriented frontend for interacting with Tapis and tenant components. In this case, the ICICLE extension adds custom branding to TapisUI.
Explanation
Definitions of Key Terms and Concepts
Explanation
System Architecture and Design Philosophy
HLO Feature Dataset for AI Resource Estimation
A dataset designed to support AI-driven resource estimation, such as runtime prediction for HPC scheduling optimization, by leveraging compiler-level High-Level Optimizer (HLO) graph features and deep learning workload metadata.
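To make the compiler-level angle concrete, here is one way to obtain HLO text from a TensorFlow computation via XLA. The toy function is a stand-in for a real training step, and `experimental_get_compiler_ir` is an experimental API whose availability varies across TensorFlow versions:

```python
# Sketch: dump HLO text from a TensorFlow function compiled with XLA, the
# kind of compiler-level artifact this dataset's graph features derive from.
# `dense_step` is a toy stand-in for a real training step.
import tensorflow as tf

@tf.function(jit_compile=True)
def dense_step(x, w):
    return tf.nn.relu(tf.matmul(x, w))

x = tf.random.normal([8, 16])
w = tf.random.normal([16, 4])

# experimental_get_compiler_ir traces the function for these inputs and
# returns a callable that emits the requested IR stage as text.
hlo_text = dense_step.experimental_get_compiler_ir(x, w)(stage="hlo")
print(hlo_text[:500])
```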
How-To Guides
Getting Started
How-To Guides
See the full documentation for detailed instructions on creating custom plug‑ins and streaming events to the knowledge graph.
How-To Guides
Prerequisites
How-To Guides
Quick Start
How-To Guides
How to Predict Training Time Using Metadata
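As an illustration of the kind of workflow this guide covers (not the dataset's actual schema), the sketch below fits a regressor that predicts runtime from synthetic workload metadata; the feature names and data are made up for the example:

```python
# Illustrative only: predict training runtime from workload metadata with a
# gradient-boosted regressor. Features and data are synthetic placeholders.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500
X = np.column_stack([
    rng.uniform(1e6, 1e9, n),           # parameter count (hypothetical)
    rng.choice([32, 64, 128, 256], n),  # batch size
    rng.uniform(1e9, 1e12, n),          # FLOPs estimate
])
y = 1e-10 * X[:, 2] + 0.01 * X[:, 1] + rng.normal(0.0, 5.0, n)  # runtime (s)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = GradientBoostingRegressor().fit(X_train, y_train)
print(f"R^2 on held-out workloads: {model.score(X_test, y_test):.3f}")
```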
How-To Guides
Quick Guide
How-To Guides
View the main TapisUI wiki to learn how to deploy and test TapisUI extensions locally.
How-To Guides
This section walks through how to use this repository and its features. It is organized by the types of tasks you're looking to accomplish.
How-To Guides
Continuing Multi-Session Missions
OpenPass
The Decentralized Microservice Drone System for Digital Agriculture is a distributed, scalable platform designed to orchestrate autonomous drone operations for agricultural field missions. The system captures, processes, and analyzes aerial imagery and video data to support precision agriculture, crop monitoring, and field management operations.
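As an entirely hypothetical sketch of the messaging involved, the snippet below builds a mission-request payload that one microservice might hand to an orchestration service; the field names and values are illustrative, not the system's actual API:

```python
# Entirely hypothetical mission-request payload that one microservice might
# hand to an orchestration service; field names are illustrative only.
import json
from dataclasses import dataclass, asdict

@dataclass
class FieldMission:
    mission_id: str
    field_boundary: list[tuple[float, float]]  # polygon vertices (lat, lon)
    altitude_m: float
    sensor: str  # e.g. "rgb" or "multispectral"

mission = FieldMission(
    mission_id="corn-plot-7",
    field_boundary=[(40.001, -83.030), (40.002, -83.030), (40.002, -83.028)],
    altitude_m=30.0,
    sensor="rgb",
)
print(json.dumps(asdict(mission)))  # POSTed to the orchestrator in practice
```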
Tapis UI Extension
This TapisUI extension adds ICICLE-specific branding and tabs to TapisUI.
Tutorials
1. Create a CKN Topic
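Assuming CKN topics are backed by Kafka (an assumption; see the CKN docs for the real topic names), a topic can be created with the kafka-python admin client as sketched below. The broker address and topic name are placeholders:

```python
# Sketch, assuming CKN topics are Kafka topics: create one with the
# kafka-python admin client. Broker address and topic name are placeholders.
from kafka.admin import KafkaAdminClient, NewTopic

admin = KafkaAdminClient(bootstrap_servers="localhost:9092")
admin.create_topics([
    NewTopic(name="ckn-example-topic", num_partitions=1, replication_factor=1)
])
admin.close()
```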
Tutorials
Getting Started with the HLO Feature Dataset
Tutorials
Prerequisites
Tutorials
Getting Started with WildWing
Video-Based Animal Re-Identification (VARe-ID) from Multiview Spatio-Temporal Track Clustering
This work presents a modular software pipeline and end-to-end workflow for video-based animal re-identification, which assigns consistent individual IDs by clustering multiview spatio-temporal tracks with minimal human intervention. Starting from raw video, the system detects and tracks animals, scores and selects informative left/right views, computes embeddings, clusters annotations by viewpoint, and then links clusters across time and varying perspectives using spatio-temporal continuity. Automated consistency checks resolve remaining ambiguities. Preliminary experiments demonstrate near-perfect identification accuracy with very limited manual verification. The workflow is designed to be generalizable across species. Currently, trained models support Grevy’s and Plains zebras, with plans to expand to a broader range of species.
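The viewpoint-clustering stage can be illustrated with generic agglomerative clustering over appearance embeddings. The embeddings below are random stand-ins and the distance threshold is arbitrary, not the pipeline's trained models or tuned parameters (requires scikit-learn >= 1.2 for the `metric` argument):

```python
# Illustrative stand-in for the clustering stage: group appearance embeddings
# by cosine distance. Real VARe-ID embeddings come from trained models; these
# are random vectors and the threshold is a placeholder.
import numpy as np
from sklearn.cluster import AgglomerativeClustering

rng = np.random.default_rng(0)
embeddings = rng.normal(size=(40, 128))  # 40 track snippets, 128-D each
embeddings /= np.linalg.norm(embeddings, axis=1, keepdims=True)

clusterer = AgglomerativeClustering(
    n_clusters=None,
    distance_threshold=0.9,  # cosine-distance cutoff (placeholder)
    metric="cosine",
    linkage="average",
)
labels = clusterer.fit_predict(embeddings)
print(f"{labels.max() + 1} candidate individuals from {len(labels)} tracks")
```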
WildWing
An open-source, autonomous, and affordable UAS for animal behaviour video monitoring, using Parrot Anafi drones to track group-living animals.
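Parrot Anafi drones are scriptable through Parrot's Olympe Python SDK; the snippet below is a generic Olympe connect/take-off/land sketch, not WildWing's own mission code:

```python
# Generic Parrot Olympe sketch (not WildWing's actual mission code): connect
# to an Anafi over its default Wi-Fi address, take off, hover, and land.
import time

import olympe
from olympe.messages.ardrone3.Piloting import TakeOff, Landing

drone = olympe.Drone("192.168.42.1")  # default IP on the drone's own Wi-Fi
drone.connect()
assert drone(TakeOff()).wait().success()
time.sleep(5)  # a real mission would run its tracking loop here
assert drone(Landing()).wait().success()
drone.disconnect()
```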