mmlearn.datasets.llvip
LLVIP dataset.
Classes
LLVIPDataset | Low-Light Visible-Infrared Pair (LLVIP) dataset.
- class LLVIPDataset(root_dir, train=True, transform=None)[source]
Low-Light Visible-Infrared Pair (LLVIP) dataset.
Loads pairs of RGB (visible) and thermal (infrared) images from the LLVIP dataset.
- Parameters:
root_dir (str) – Path to the root directory of the dataset. The directory should contain ‘visible’ and ‘infrared’ subdirectories.
train (bool, default=True) – Flag to indicate whether to load the training or test set.
transform (Optional[Callable[[PIL.Image], torch.Tensor]], optional, default=None) – A callable that takes in a PIL image and returns a transformed version of the image as a PyTorch tensor. This is applied to both RGB and thermal images.
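A minimal usage sketch under stated assumptions: the dataset root path is a placeholder, and indexing and `len()` are assumed to follow the usual map-style PyTorch dataset convention, which is not spelled out above. The transform shown is one possible callable matching the documented signature (PIL image in, tensor out).

```python
from torchvision import transforms

from mmlearn.datasets.llvip import LLVIPDataset

# Transform applied to both the RGB and thermal images: each PIL image is
# resized and converted to a torch.Tensor.
to_tensor = transforms.Compose(
    [
        transforms.Resize((256, 256)),
        transforms.ToTensor(),
    ]
)

# "/path/to/LLVIP" is a placeholder; the directory must contain the
# 'visible' and 'infrared' subdirectories described above.
train_set = LLVIPDataset(
    root_dir="/path/to/LLVIP",
    train=True,
    transform=to_tensor,
)

# The lines below assume the usual map-style dataset interface
# (__len__ / __getitem__); the exact structure of `example` is not
# specified in this page.
print(len(train_set))
example = train_set[0]
```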