DroneWQ is a Python package that can be used to analyze multispectral data collected from a drone to derive ocean color radiometry and water quality properties. These scripts are specific to the MicaSense RedEdge and Altum cameras.
For details on the processing and theory of DroneWQ, please visit our readthedocs: https://dronewq.readthedocs.io/
Additional information on the methods can be found in:
Román, A., Heredia, S., Windle, A. E., Tovar-Sánchez, A., & Navarro, G., 2024. Enhancing Georeferencing and Mosaicking Techniques over Water Surfaces with High-Resolution Unmanned Aerial Vehicle (UAV) Imagery. Remote Sensing, 16(2), 290. https://doi.org/10.3390/rs16020290
Gray, P.C., Windle, A.E., Dale, J., Savelyev, I.B., Johnson, Z.I., Silsbe, G.M., Larsen, G.D. and Johnston, D.W., 2022. Robust ocean color from drones: Viewing geometry, sky reflection removal, uncertainty analysis, and a survey of the Gulf Stream front. Limnology and Oceanography: Methods. https://doi.org/10.1002/lom3.10511
Windle, A.E. and Silsbe, G.M., 2021. Evaluation of unoccupied aircraft system (UAS) remote sensing reflectance retrievals for water quality monitoring in coastal waters. Frontiers in Environmental Science, p.182. https://doi.org/10.3389/fenvs.2021.674247
The easiest way to install DroneWQ is using pip:
```
pip install dronewq
```

If you want to install from source:

```
git clone https://github.com/aewindle110/DroneWQ.git
cd DroneWQ
pip install -e .
```

DroneWQ requires Python 3.8-3.12. Some dependencies may require additional system libraries:
- GDAL: Required for geospatial operations
- ExifTool: Required for reading MicaSense image metadata
- ZBar: Required for QR code reading from calibration panels
On Ubuntu/Debian:
```
sudo apt-get update
sudo apt-get install gdal-bin libgdal-dev exiftool zbar-tools python3-gdal python3-cartopy
```

On macOS (using Homebrew):

```
brew install gdal exiftool zbar
```

We recommend running this package in a Docker container for consistency. The Docker image includes all dependencies pre-configured. See https://docs.docker.com/ for installation instructions.
With Docker installed and running, launch the container:
```
docker run -it -v <local directory>:/home/jovyan --rm -p 8888:8888 clifgray/dronewq:v3
```

where `<local directory>` is where you want data to be saved.
Then launch Jupyter Lab:
```
jupyter lab --allow-root --ip 0.0.0.0 /home/jovyan
```

Copy the generated URL from the terminal into your web browser.
Alternatively, you can create a conda environment:
```
conda env create -f environment.yml
conda activate dronewq
```

Note: We have included a modified version of the MicaSense imageprocessing scripts in this repo. Our modifications include:
- Radiance data type expressed as Float32 instead of Uint16
- Image radiance output in milliwatts (mW) instead of watts (W)
- Modified `capture.save_capture_as_stack()` to not scale and filter data
These modifications impact the panel_ed calculation. When MicaSense releases a package with user-specified radiance data types, we will revert to their official package.
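The effect of the first two modifications can be illustrated with a short, hypothetical conversion (not the MicaSense code itself; the values and scaling here are made up for illustration):

```python
import numpy as np

# Hypothetical per-pixel radiance values in W (sub-unit magnitudes are typical)
raw_radiance_w = np.array([[0.0123, 0.0456], [0.0789, 0.1011]], dtype=np.float64)

# Modified behavior: Float32 values in milliwatts, so precision is preserved
radiance_mw = (raw_radiance_w * 1000.0).astype(np.float32)

# Stock-style behavior: casting sub-unit radiance in W to Uint16 truncates to 0
radiance_uint16 = raw_radiance_w.astype(np.uint16)

print(radiance_mw)       # values preserved, now in mW
print(radiance_uint16)   # precision lost by the integer cast
```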
DroneWQ requires MicaSense images organized in a specific folder structure:
```
<main_directory>/
├── panel/            # Calibrated reflectance panel images (before/after flight)
├── raw_sky_imgs/     # Sky images (40° from zenith, ~135° azimuth)
├── raw_water_imgs/   # Water images from flight
└── align_img/        # One image capture (5 .tif files) for alignment
```
Directory descriptions:
- panel/: Contains image captures of the MicaSense calibrated reflectance panel
- raw_sky_imgs/: Contains sky images taken at 40° from zenith and ~135° azimuthal viewing direction
- raw_water_imgs/: Contains all water images captured during the flight
- align_img/: Contains one image capture (5 .tif files, one per band) from raw_water_imgs/ used to compute the warp matrix for aligning all images
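A few lines of Python can scaffold this layout before copying images in (a convenience sketch, not part of the package; the function name is made up):

```python
from pathlib import Path

def make_dronewq_dirs(main_dir):
    """Create the folder layout DroneWQ expects under main_dir."""
    main = Path(main_dir)
    for sub in ("panel", "raw_sky_imgs", "raw_water_imgs", "align_img"):
        (main / sub).mkdir(parents=True, exist_ok=True)
    return main

root = make_dronewq_dirs("my_flight")
print(sorted(p.name for p in root.iterdir()))
```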
You can find a sample dataset (Lake Erie) at Zenodo DOI.
Before processing, configure the main directory path:
```python
import dronewq

# Configure the main directory containing your organized images
dronewq.configure(main_dir="/path/to/your/main_directory")
```

The `configure()` function automatically sets up all subdirectory paths based on the main directory.
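Under the hood this amounts to deriving subdirectory paths from a single root. A minimal stand-in for illustration (the class and attribute names are assumed, not the package's actual code):

```python
from pathlib import Path

class Settings:
    """Illustrative settings object: configure() derives subdirectory paths."""
    def configure(self, main_dir):
        self.main_dir = Path(main_dir)
        # Assumed subdirectory names, mirroring the documented layout
        self.panel_dir = self.main_dir / "panel"
        self.raw_water_dir = self.main_dir / "raw_water_imgs"
        self.rrs_dir = self.main_dir / "rrs_imgs"
        self.wq_dir = self.main_dir / "wq_imgs"

settings = Settings()
settings.configure("/data/flight1")
print(settings.rrs_dir)
```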
The main processing function converts raw imagery to calibrated remote sensing reflectance:
```python
# Process raw images to Rrs
dronewq.process_raw_to_rrs(
    output_csv_path="/path/to/metadata.csv",
    lw_method="mobley_rho_method",          # Water-leaving radiance method
    ed_method="dls_ed",                     # Downwelling irradiance method
    mask_pixels=True,                       # Apply pixel masking
    pixel_masking_method="value_threshold",
    nir_threshold=0.01,                     # NIR threshold for masking glint
    green_threshold=0.005,                  # Green threshold for masking shadows
    num_workers=4                           # Number of parallel workers
)
```

Processing workflow:
- Raw → Lt: Converts raw pixel values to radiance (Lt)
- Lt → Lw: Removes sky reflection to get water-leaving radiance (Lw)
- Lw → Rrs: Normalizes by downwelling irradiance (Ed) to get remote sensing reflectance (Rrs)
- Masking: Optionally masks pixels containing glint, shadows, or vegetation
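The chain can be sketched per pixel. With the Mobley approach, sky reflection is removed as Lw = Lt − ρ·Lsky, where ρ ≈ 0.028 for a 40° viewing angle and low wind (Mobley 1999), and then Rrs = Lw / Ed. A simplified numpy illustration (the radiance and irradiance values are made up):

```python
import numpy as np

rho = 0.028  # sky reflectance factor (Mobley 1999; 40 deg view, low wind)

# Made-up single-band values for three pixels
lt = np.array([5.0, 6.0, 7.0])       # total at-sensor radiance (Lt)
lsky = np.array([20.0, 20.0, 20.0])  # sky radiance from raw_sky_imgs
ed = 1500.0                          # downwelling irradiance (Ed)

lw = lt - rho * lsky                 # water-leaving radiance (Lw)
rrs = lw / ed                        # remote sensing reflectance (Rrs, sr^-1)
print(rrs)
```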
Apply bio-optical algorithms to retrieve water quality parameters:
```python
# Calculate chlorophyll-a using the Gitelson algorithm
dronewq.save_wq_imgs(
    wq_alg="chl_gitelson",  # Options: chl_gitelson, chl_hu, chl_ocx, chl_hu_ocx, nechad_tsm
    num_workers=4
)
```

Georeference individual images and create an orthomosaic:
```python
# Load metadata
import pandas as pd
metadata = pd.read_csv("/path/to/metadata.csv")

# Compute flight lines
flight_lines = dronewq.compute_flight_lines(
    captures_yaw=metadata['Yaw'].values,
    altitude=metadata['Altitude'].values[0],
    pitch=0,
    roll=0
)

# Georeference images
dronewq.georeference(
    metadata=metadata,
    input_dir=dronewq.settings.rrs_dir,
    output_dir="/path/to/georeferenced/",
    lines=flight_lines
)

# Create mosaic
dronewq.mosaic(
    input_dir="/path/to/georeferenced/",
    output_path="/path/to/mosaic.tif"
)
```

For detailed documentation on the processing theory and methods, please visit: https://dronewq.readthedocs.io/
Main function to process raw imagery to remote sensing reflectance (Rrs).
```python
dronewq.process_raw_to_rrs(
    output_csv_path: str,
    lw_method: str = "mobley_rho_method",
    mask_pixels: bool = False,
    random_n: int = 10,
    pixel_masking_method: str = "value_threshold",
    mask_std_factor: int = 1,
    nir_threshold: float = 0.01,
    green_threshold: float = 0.005,
    ed_method: str = "dls_ed",
    overwrite_lt_lw: bool = False,
    clean_intermediates: bool = True,
    num_workers: int = 4
)
```

Parameters:
- `output_csv_path` (str): Path to write the metadata CSV file
- `lw_method` (str): Method for calculating water-leaving radiance. Options:
  - `"mobley_rho_method"` (default): Uses Mobley's rho parameter
  - `"hedley_method"`: Hedley/Hochberg sky glint removal
  - `"blackpixel_method"`: Black pixel assumption method
- `ed_method` (str): Method for calculating downwelling irradiance. Options:
  - `"dls_ed"` (default): Uses DLS sensor data
  - `"panel_ed"`: Uses calibrated reflectance panel
  - `"dls_and_panel_ed"`: DLS corrected by panel
- `mask_pixels` (bool): Whether to apply pixel masking (default: False)
- `pixel_masking_method` (str): Masking method, `"value_threshold"` or `"std_threshold"`
- `nir_threshold` (float): NIR reflectance threshold for masking glint (default: 0.01)
- `green_threshold` (float): Green reflectance threshold for masking shadows (default: 0.005)
- `num_workers` (int): Number of parallel workers (default: 4)
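The `value_threshold` masking can be sketched as boolean tests against the NIR and green Rrs bands: bright NIR flags sun glint, dark green flags shadow. A hedged illustration of the idea (not the package code; the Rrs values are made up):

```python
import numpy as np

nir_threshold = 0.01     # glint: NIR Rrs above this is masked
green_threshold = 0.005  # shadow: green Rrs below this is masked

# Made-up Rrs values (sr^-1) for four pixels
rrs_nir = np.array([0.002, 0.020, 0.003, 0.004])
rrs_green = np.array([0.008, 0.009, 0.002, 0.010])

# Mask pixels flagged as glint (bright NIR) or shadow (dark green)
mask = (rrs_nir > nir_threshold) | (rrs_green < green_threshold)
rrs_green_masked = np.where(mask, np.nan, rrs_green)
print(mask)  # [False  True  True False]
print(rrs_green_masked)
```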
Calculate water quality parameters from Rrs images.
```python
dronewq.save_wq_imgs(
    wq_alg: str = "chl_gitelson",
    start: int = 0,
    count: int = 10000,
    num_workers: int = 4
)
```

Parameters:
- `wq_alg` (str): Water quality algorithm to apply. Options:
  - `"chl_gitelson"` (default): Gitelson et al. 2007, recommended for coastal waters
  - `"chl_hu"`: Hu et al. 2012, for low chlorophyll (<0.15 mg m⁻³)
  - `"chl_ocx"`: OCx algorithm, for higher chlorophyll (>0.2 mg m⁻³)
  - `"chl_hu_ocx"`: Blended NASA algorithm combining Hu and OCx
  - `"nechad_tsm"`: Nechad et al. 2010 for total suspended matter
- `start` (int): Starting image index (default: 0)
- `count` (int): Number of images to process (default: 10000)
- `num_workers` (int): Number of parallel workers (default: 4)
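The Gitelson et al. (2007) retrieval is a red/red-edge band-ratio model of the general form chl = a·(Rrs_rededge/Rrs_red) + b. A sketch of that form follows; the coefficients `a` and `b` below are purely illustrative placeholders, not the values used by `chl_gitelson`:

```python
import numpy as np

def chl_band_ratio(rrs_red, rrs_rededge, a=35.0, b=-19.0):
    """Two-band ratio chlorophyll model (illustrative coefficients only)."""
    return a * (rrs_rededge / rrs_red) + b

# Made-up Rrs values (sr^-1) for two pixels
rrs_red = np.array([0.004, 0.006])
rrs_rededge = np.array([0.003, 0.007])
print(chl_band_ratio(rrs_red, rrs_rededge))
```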
```python
# Gitelson (recommended for coastal/Case 2 waters)
chl = dronewq.chl_gitelson(Rrsred, Rrsrededge)

# Hu (for low chlorophyll concentrations)
chl = dronewq.chl_hu(Rrsblue, Rrsgreen, Rrsred)

# OCx (for higher chlorophyll concentrations)
chl = dronewq.chl_ocx(Rrsblue, Rrsgreen)

# Blended Hu-OCx (NASA standard)
chl = dronewq.chl_hu_ocx(Rrsblue, Rrsgreen, Rrsred)

# Nechad et al. 2010
tsm = dronewq.tsm_nechad(Rrsred)
```

Compute flight lines from capture metadata.
```python
flight_lines = dronewq.compute_flight_lines(
    captures_yaw: np.ndarray,
    altitude: float,
    pitch: float = 0,
    roll: float = 0,
    threshold: float = 10
)
```

Georeference images using camera metadata and flight parameters.
```python
dronewq.georeference(
    metadata: pd.DataFrame,
    input_dir: str,
    output_dir: str,
    lines: List[Dict] = None,
    altitude: float = None,
    pitch: float = 0,
    roll: float = 0,
    yaw: float = None,
    num_workers: int = 4
)
```

Create an orthomosaic from georeferenced images.
```python
dronewq.mosaic(
    input_dir: str,
    output_path: str,
    method: str = "mean"
)
```

Configure package settings (main directory and subdirectories).
```python
dronewq.configure(main_dir="/path/to/main_directory")
```

Load images and associated metadata.
```python
images, metadata = dronewq.retrieve_imgs_and_metadata(
    img_dir: str,
    count: int = 10000,
    start: int = 0,
    altitude_cutoff: float = 0,
    sky: bool = False,
    random: bool = False
)
```

Extract and save image metadata to CSV.
```python
dronewq.write_metadata_csv(
    img_set: ImageSet,
    csv_output_path: str
)
```

Core Modules (dronewq.core):
- `raw_to_rrs.py`: Main processing pipeline (raw → Rrs)
- `wq_calc.py`: Water quality algorithm implementations
- `georeference.py`: Image georeferencing functions
- `mosaic.py`: Orthomosaic creation
- `plot_map.py`: Visualization utilities
Water Leaving Radiance Methods (dronewq.lw_methods):
- `mobley_rho.py`: Mobley rho sky reflection removal
- `hedley.py`: Hedley/Hochberg method
- `blackpixel.py`: Black pixel assumption method
Downwelling Irradiance Methods (dronewq.ed_methods):
- `dls_ed.py`: DLS-based irradiance calculation
- `panel_ed.py`: Panel-based irradiance calculation
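Panel-based Ed retrieval typically uses the Lambertian relation Ed = π · L_panel / ρ_panel, where L_panel is the radiance measured over the calibration panel and ρ_panel is the panel's known reflectance. A hedged sketch of that relation (not the package's `panel_ed` code; the numbers are made up):

```python
import math

def panel_ed(l_panel, panel_reflectance):
    """Downwelling irradiance from a Lambertian calibration panel.

    l_panel: radiance measured over the panel (e.g. mW m^-2 nm^-1 sr^-1)
    panel_reflectance: known panel reflectance for the band (0-1)
    """
    return math.pi * l_panel / panel_reflectance

# Made-up numbers: panel radiance 250, panel reflectance 0.49
print(panel_ed(250.0, 0.49))
```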
Masking Methods (dronewq.masks):
- `threshold_masking.py`: Threshold-based pixel masking
- `std_masking.py`: Standard deviation-based masking
Utilities (dronewq.utils):
- `settings.py`: Configuration management
- `images.py`: Image loading and processing utilities
- `metadata.py`: Metadata extraction and management
See the primary_demo.ipynb notebook for a complete example workflow using the Lake Erie dataset. The notebook demonstrates:
- Setting up the workspace
- Extracting and viewing metadata
- Processing raw imagery to Rrs
- Applying bio-optical algorithms
- Georeferencing and creating mosaics
DroneWQ uses a singleton settings object to manage paths and configuration:
```python
import dronewq

# Set main directory (auto-populates subdirectories)
dronewq.configure(main_dir="/path/to/data")

# Access settings
print(dronewq.settings.main_dir)
print(dronewq.settings.rrs_dir)
print(dronewq.settings.wq_dir)

# Or use the settings object directly
from dronewq.utils.settings import settings
settings.configure(main_dir="/path/to/data")
```

- Parallel Processing: Adjust `num_workers` based on your CPU cores (default: 4)
- Batch Processing: Use the `start` and `count` parameters to process large datasets in batches
- Intermediate Cleanup: Set `clean_intermediates=True` to save disk space after processing
- Memory Management: For very large datasets, process images in smaller batches
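Batch processing with `start` and `count` amounts to walking the image list in fixed-size windows. A generic sketch of that pattern (the loop and numbers are illustrative):

```python
# Illustrative batching pattern: process n_images in windows of batch_size,
# as one might do with save_wq_imgs(start=..., count=...)
n_images = 2500
batch_size = 1000

batches = []
for start in range(0, n_images, batch_size):
    count = min(batch_size, n_images - start)
    batches.append((start, count))
    # e.g. dronewq.save_wq_imgs(wq_alg="chl_gitelson", start=start, count=count)

print(batches)  # [(0, 1000), (1000, 1000), (2000, 500)]
```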
Example orthomosaic of UAS images collected over Western Lake Erie, processed to chlorophyll a concentration using DroneWQ.

