This file is a shared place for all participants to leave short notes, updates, and reminders. Please add your message under Today’s notes or create a dated section.
The repository includes a .gitignore rule for:
/data/
Reason: CSV files in ./data can be very large (some exceed GitHub’s size limits).
Please keep large datasets local (or use an external data store such as Zenodo, S3, Drive, or Git LFS if the project decides to adopt it).
I added a Jupyter Notebook called “Trawling4PACE Explorer – v2.1” to explore the CSV files located in ./data.
Trawling4PACE Explorer is an interactive dashboard inside a Jupyter Notebook for exploring bottom trawl survey data and environmental variables:
- Loads CSV files and auto-detects latitude/longitude columns
- Provides filters (species, year, month, depth)
- Supports up to 4 map layers simultaneously (scatter or density), each with:
  - Independent opacity
  - Plotly and cmocean colorscales
  - Linear or log-scale color mapping
- Reactive updates (changes apply automatically)
- Includes a loading indicator and a Cancel button for long operations
- Disables controls during rendering to prevent race conditions
- Computes an auto bounding box (with a 2° margin) for quick framing
- Compatible with 2i2c JupyterHub and the Plotly MapLibre API (no deprecation warnings)
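Two of the features above (lat/lon auto-detection and the 2° auto bounding box) can be sketched in a few lines. This is an illustrative sketch, not the notebook's actual code; the name-hint lists and function names are my own assumptions:

```python
# Sketch: guess latitude/longitude columns by name, then compute a
# bounding box expanded by a 2-degree margin for quick map framing.

LAT_HINTS = ("lat", "latitude", "y")
LON_HINTS = ("lon", "lng", "long", "longitude", "x")

def detect_coord_columns(columns):
    """Return (lat_col, lon_col), or None entries if no match is found."""
    lat = next((c for c in columns if c.lower().strip() in LAT_HINTS), None)
    lon = next((c for c in columns if c.lower().strip() in LON_HINTS), None)
    return lat, lon

def bounding_box(lats, lons, margin=2.0):
    """(min_lat, min_lon, max_lat, max_lon) expanded by `margin` degrees."""
    return (min(lats) - margin, min(lons) - margin,
            max(lats) + margin, max(lons) + margin)
```

For example, `detect_coord_columns(["HAUL", "LATITUDE", "LONGITUDE"])` picks out the two coordinate columns, and `bounding_box` of their values gives the framing extent with the 2° margin on every side.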
On load, the notebook tries to pick sensible defaults, e.g.:
- SURFTEMP as an environmental/density layer (thermal colormap)
- EXPCATCHWT as a catch/scatter layer (haline colormap, log scale)
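Log-scale color mapping for a catch layer can be done by transforming the values before they reach the colorscale. A minimal sketch, assuming EXPCATCHWT holds positive catch weights (the column name comes from the notes above; the masking of zero/negative values is my own choice):

```python
import math

def log_color_values(values):
    """Return log10 of each positive value; None masks values that
    cannot be log-transformed (zero, negative, or missing)."""
    return [math.log10(v) if v is not None and v > 0 else None
            for v in values]
```

With values masked to `None`, a plotting library can skip those points rather than produce invalid colors; a catch weight of 0 simply renders as missing instead of negative infinity.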
Author: Leandro (USP/IEAPM)
Project: 2026-proj-Trawling4PACE / NASA PACE Hackweek 2026
- YYYY-MM-DD — Add your message here