torchgeo.transforms¶
TorchGeo transforms.
- class torchgeo.transforms.AppendBNDVI(index_nir, index_blue)[source]¶
Bases: AppendNormalizedDifferenceIndex
Blue Normalized Difference Vegetation Index (BNDVI).
Computes the following index:
\[\text{BNDVI} = \frac{\text{NIR} - \text{B}}{\text{NIR} + \text{B}}\]
If you use this index in your research, please cite the original paper.
New in version 0.3.
- class torchgeo.transforms.AppendGBNDVI(index_nir, index_green, index_blue)[source]¶
Bases: AppendTriBandNormalizedDifferenceIndex
Green-Blue Normalized Difference Vegetation Index (GBNDVI).
Computes the following index:
\[\text{GBNDVI} = \frac{\text{NIR} - (\text{G} + \text{B})}{\text{NIR} + (\text{G} + \text{B})}\]
If you use this index in your research, please cite the original paper.
New in version 0.3.
- class torchgeo.transforms.AppendGNDVI(index_nir, index_green)[source]¶
Bases: AppendNormalizedDifferenceIndex
Green Normalized Difference Vegetation Index (GNDVI).
Computes the following index:
\[\text{GNDVI} = \frac{\text{NIR} - \text{G}}{\text{NIR} + \text{G}}\]
If you use this index in your research, please cite the original paper.
New in version 0.3.
- class torchgeo.transforms.AppendGRNDVI(index_nir, index_green, index_red)[source]¶
Bases: AppendTriBandNormalizedDifferenceIndex
Green-Red Normalized Difference Vegetation Index (GRNDVI).
Computes the following index:
\[\text{GRNDVI} = \frac{\text{NIR} - (\text{G} + \text{R})}{\text{NIR} + (\text{G} + \text{R})}\]
If you use this index in your research, please cite the original paper.
New in version 0.3.
- class torchgeo.transforms.AppendNBR(index_nir, index_swir)[source]¶
Bases: AppendNormalizedDifferenceIndex
Normalized Burn Ratio (NBR).
Computes the following index:
\[\text{NBR} = \frac{\text{NIR} - \text{SWIR}}{\text{NIR} + \text{SWIR}}\]
If you use this index in your research, please cite the original paper.
New in version 0.2.
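A minimal usage sketch; the channel positions below are assumptions and depend entirely on your dataset's band ordering:
import torch
from torchgeo.transforms import AppendNBR

images = torch.rand(2, 13, 128, 128)               # hypothetical 13-band batch
transform = AppendNBR(index_nir=7, index_swir=11)  # assumed NIR/SWIR channel positions
output = transform(images)                         # NBR appended as an extra channel
print(output.shape)                                # torch.Size([2, 14, 128, 128])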
- class torchgeo.transforms.AppendNDBI(index_swir, index_nir)[source]¶
Bases: AppendNormalizedDifferenceIndex
Normalized Difference Built-up Index (NDBI).
Computes the following index:
\[\text{NDBI} = \frac{\text{SWIR} - \text{NIR}}{\text{SWIR} + \text{NIR}}\]
If you use this index in your research, please cite the original paper.
- class torchgeo.transforms.AppendNDRE(index_nir, index_vre1)[source]¶
Bases: AppendNormalizedDifferenceIndex
Normalized Difference Red Edge Vegetation Index (NDRE).
Computes the following index:
\[\text{NDRE} = \frac{\text{NIR} - \text{VRE1}}{\text{NIR} + \text{VRE1}}\]
If you use this index in your research, please cite the original paper.
New in version 0.3.
- class torchgeo.transforms.AppendNDSI(index_green, index_swir)[source]¶
Bases: AppendNormalizedDifferenceIndex
Normalized Difference Snow Index (NDSI).
Computes the following index:
\[\text{NDSI} = \frac{\text{G} - \text{SWIR}}{\text{G} + \text{SWIR}}\]
If you use this index in your research, please cite the original paper.
- class torchgeo.transforms.AppendNDVI(index_nir, index_red)[source]¶
Bases: AppendNormalizedDifferenceIndex
Normalized Difference Vegetation Index (NDVI).
Computes the following index:
\[\text{NDVI} = \frac{\text{NIR} - \text{R}}{\text{NIR} + \text{R}}\]
If you use this index in your research, please cite the original paper.
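For example, a minimal sketch that appends NDVI to a hypothetical batch (the red and NIR channel indices are assumptions; use the positions from your own sensor):
import torch
from torchgeo.transforms import AppendNDVI

images = torch.rand(4, 13, 64, 64)                # hypothetical 13-band batch
transform = AppendNDVI(index_nir=7, index_red=3)  # assumed band positions
output = transform(images)                        # NDVI appended as the last channel
print(output.shape)                               # torch.Size([4, 14, 64, 64])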
- class torchgeo.transforms.AppendNDWI(index_green, index_nir)[source]¶
Bases: AppendNormalizedDifferenceIndex
Normalized Difference Water Index (NDWI).
Computes the following index:
\[\text{NDWI} = \frac{\text{G} - \text{NIR}}{\text{G} + \text{NIR}}\]
If you use this index in your research, please cite the original paper.
- class torchgeo.transforms.AppendNormalizedDifferenceIndex(index_a, index_b)[source]¶
Bases: IntensityAugmentationBase2D
Append a normalized difference index as a channel to an image tensor.
Computes the following index:
\[\text{NDI} = \frac{A - B}{A + B}\]
New in version 0.2.
- apply_transform(input, params, flags, transform=None)[source]¶
Apply the transform.
- Parameters:
input (Tensor) – the input tensor
params (dict[str, torch.Tensor]) – generated parameters
transform (torch.Tensor | None) – the geometric transformation tensor
- Returns:
the augmented input
- Return type:
Tensor
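A minimal sketch of appending a custom two-band index with AppendNormalizedDifferenceIndex; the channel indices below are arbitrary placeholders:
import torch
from torchgeo.transforms import AppendNormalizedDifferenceIndex

images = torch.rand(1, 4, 32, 32)                    # hypothetical 4-band batch
ndi = AppendNormalizedDifferenceIndex(index_a=3, index_b=0)
output = ndi(images)                                 # (A - B) / (A + B) appended as a channel
print(output.shape)                                  # torch.Size([1, 5, 32, 32])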
- class torchgeo.transforms.AppendRBNDVI(index_nir, index_red, index_blue)[source]¶
Bases: AppendTriBandNormalizedDifferenceIndex
Red-Blue Normalized Difference Vegetation Index (RBNDVI).
Computes the following index:
\[\text{RBNDVI} = \frac{\text{NIR} - (\text{R} + \text{B})}{\text{NIR} + (\text{R} + \text{B})}\]
If you use this index in your research, please cite the original paper.
New in version 0.3.
- class torchgeo.transforms.AppendSWI(index_vre1, index_swir2)[source]¶
Bases: AppendNormalizedDifferenceIndex
Standardized Water-Level Index (SWI).
Computes the following index:
\[\text{SWI} = \frac{\text{VRE1} - \text{SWIR2}}{\text{VRE1} + \text{SWIR2}}\]
If you use this index in your research, please cite the original paper.
New in version 0.3.
- class torchgeo.transforms.AppendTriBandNormalizedDifferenceIndex(index_a, index_b, index_c)[source]¶
Bases: IntensityAugmentationBase2D
Append a normalized difference index involving three bands as a channel to an image tensor.
Computes the following index:
\[\text{TBNDI} = \frac{A - (B + C)}{A + (B + C)}\]
New in version 0.3.
- apply_transform(input, params, flags, transform=None)[source]¶
Apply the transform.
- Parameters:
input (Tensor) – the input tensor
params (dict[str, torch.Tensor]) – generated parameters
transform (torch.Tensor | None) – the geometric transformation tensor
- Returns:
the augmented input
- Return type:
Tensor
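A minimal sketch of appending a custom three-band index with AppendTriBandNormalizedDifferenceIndex; the channel indices are again arbitrary placeholders:
import torch
from torchgeo.transforms import AppendTriBandNormalizedDifferenceIndex

images = torch.rand(1, 4, 32, 32)                    # hypothetical 4-band batch
tbndi = AppendTriBandNormalizedDifferenceIndex(index_a=3, index_b=1, index_c=0)
output = tbndi(images)                               # (A - (B + C)) / (A + (B + C)) appended
print(output.shape)                                  # torch.Size([1, 5, 32, 32])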
- class torchgeo.transforms.RandomGrayscale(weights, p=0.1, same_on_batch=False, keepdim=False)[source]¶
Bases: IntensityAugmentationBase2D
Randomly convert an image to grayscale with probability p.
There is no single agreed-upon definition of grayscale for multispectral imagery (MSI). Some possibilities include:
Average of all bands: a weight of \(\frac{1}{C}\) for every channel, where \(C\) is the number of spectral channels.
RGB-only bands: \([0.299, 0.587, 0.114]\) for the RGB channels, 0 for all other channels.
PCA: the first principal component along the spectral axis, which minimizes redundant information.
The weight vector you provide will be automatically rescaled to sum to 1 in order to avoid changing the intensity of the image.
New in version 0.5.
- __init__(weights, p=0.1, same_on_batch=False, keepdim=False)[source]¶
Initialize a new RandomGrayscale instance.
- Parameters:
weights (Tensor) – Weights applied to each channel to compute a grayscale representation. Should be the same length as the number of channels.
p (float) – Probability of the image being converted to grayscale.
same_on_batch (bool) – Apply the same transformation across the batch.
keepdim (bool) – Whether to keep the output shape the same as input (True) or broadcast it to the batch form (False).
- apply_transform(input, params, flags, transform=None)[source]¶
Apply the transform.
- Parameters:
input (Tensor) – The input tensor.
params (dict[str, torch.Tensor]) – Generated parameters.
flags (dict[str, torch.Tensor]) – Static parameters.
transform (torch.Tensor | None) – The geometric transformation tensor.
- Returns:
The augmented input.
- Return type:
Tensor
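A minimal sketch applying RandomGrayscale to a 3-channel batch with the RGB-style weights listed above (p=1.0 here only so the conversion is always applied):
import torch
from torchgeo.transforms import RandomGrayscale

weights = torch.tensor([0.299, 0.587, 0.114])  # rescaled internally to sum to 1
aug = RandomGrayscale(weights, p=1.0)
images = torch.rand(2, 3, 64, 64)
gray = aug(images)                             # grayscale representation of the batch
print(gray.shape)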
- class torchgeo.transforms.Rearrange(*args, **kwargs)[source]¶
Bases: GeometricAugmentationBase3D
Rearrange tensor dimensions.
Examples
To insert a time dimension:
Rearrange('b (t c) h w -> b t c h w', c=1)
To collapse the time dimension:
Rearrange('b t c h w -> b (t c) h w')
- __init__(*args, **kwargs)[source]¶
Initialize a Rearrange instance.
- Parameters:
*args (Any) – Positional arguments for einops.rearrange().
**kwargs (Any) – Keyword arguments for einops.rearrange().
- apply_transform(input, params, flags, transform=None)[source]¶
Apply the rearrangement to the input tensor.
- Parameters:
input (Tensor) – the input tensor
params (dict[str, torch.Tensor]) – generated parameters
transform (torch.Tensor | None) – the geometric transformation tensor
- Returns:
The rearranged tensor.
- Return type:
Tensor
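To make the patterns above concrete, this sketch calls einops.rearrange() directly on a plain tensor, which is the operation the Rearrange augmentation wraps:
import torch
from einops import rearrange

x = torch.rand(2, 6, 32, 32)                       # (B, T*C, H, W); with c=1, t=6
y = rearrange(x, 'b (t c) h w -> b t c h w', c=1)  # insert a time dimension
print(y.shape)                                     # torch.Size([2, 6, 1, 32, 32])
z = rearrange(y, 'b t c h w -> b (t c) h w')       # collapse the time dimension again
print(z.shape)                                     # torch.Size([2, 6, 32, 32])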
- class torchgeo.transforms.SatSlideMix(gamma=1, beta=(0.0, 1.0), p=0.5)[source]¶
Bases: GeometricAugmentationBase2D
Applies the Sat-SlideMix augmentation to a batch of images and masks.
Sat-SlideMix rolls (circularly shifts) images along either the height or width axis by a random amount.
If you use this method in your research, please cite the original paper.
New in version 0.8.
- __init__(gamma=1, beta=(0.0, 1.0), p=0.5)[source]¶
Initialize a new SatSlideMix instance.
- Parameters:
gamma (int) – The number of augmented samples to create for each input image. The output batch size will be gamma * B.
beta (torch.Tensor | float | tuple[float, float] | list[float]) – The range, as a fraction (0.0 to 1.0) of the image height or width, from which the shift amount is sampled.
p (float) – Probability of applying the augmentation to each sample.
- Raises:
AssertionError – If gamma is not a positive integer.
- apply_transform(input, params, flags, transform=None)[source]¶
Apply the transform to the input image or mask.
- Parameters:
input (Tensor) – the input tensor image or mask
params (dict[str, torch.Tensor]) – generated parameters
transform (torch.Tensor | None) – the geometric transformation tensor
- Returns:
the augmented input
- Return type:
Tensor
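A minimal sketch on an image batch alone; per the parameter description above, gamma=2 is expected to double the batch dimension (the 4-band input is a placeholder):
import torch
from torchgeo.transforms import SatSlideMix

aug = SatSlideMix(gamma=2, beta=(0.0, 1.0), p=1.0)
images = torch.rand(4, 4, 64, 64)  # hypothetical 4-band batch
output = aug(images)               # each sample rolled along height or width
print(output.shape)                # expected (8, 4, 64, 64), i.e. gamma * B samples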