Olbedo: An Albedo and Shading Aerial Dataset for Large-Scale Outdoor Environments

Comparison Videos

Abstract

Photogrammetry has revolutionized 3D content creation, yet it remains constrained by a fundamental bottleneck: real-world lighting conditions. Practitioners must capture data under overcast skies to avoid shadows and highlights that become permanently baked into texture maps, breaking physically-based rendering workflows and limiting capture windows. This weather dependency imposes severe logistical and financial burdens on scalable world digitization. We address this challenge through intrinsic image decomposition, specifically targeting albedo recovery for aerial photogrammetry.
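Intrinsic image decomposition is commonly formulated under a Lambertian assumption as a per-pixel factorization of the observed image into albedo (reflectance) and shading; recovering albedo then amounts to dividing out an estimated shading map. A minimal statement of this standard model (not necessarily the exact formulation used in this work) is:

```latex
% I: observed image, A: albedo, S: shading;
% \odot and \oslash denote element-wise product and division.
I(\mathbf{x}) \;=\; A(\mathbf{x}) \odot S(\mathbf{x}),
\qquad
\hat{A}(\mathbf{x}) \;=\; I(\mathbf{x}) \oslash \hat{S}(\mathbf{x})
```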

We present Olbedo, the first large-scale, real-world dataset designed for weather-agnostic 3D asset creation from aerial imagery. Olbedo comprises high-resolution outdoor scenes captured from unmanned aerial vehicles (UAVs), providing dense ground-truth albedo and shading maps alongside comprehensive auxiliary data including camera poses, depth, normals, and synchronized sky dome HDR captures.
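To make the per-view contents concrete, the sketch below lays out one plausible record for such a dataset as a Python dataclass. The field names, shapes, and grouping are illustrative assumptions, not Olbedo's published schema.

```python
# Hypothetical per-view record for an Olbedo-style capture (illustrative only;
# field names and shapes are assumptions, not the dataset's actual schema).
from dataclasses import dataclass
import numpy as np

@dataclass
class AerialView:
    image: np.ndarray       # H x W x 3, captured RGB frame
    albedo: np.ndarray      # H x W x 3, dense ground-truth albedo map
    shading: np.ndarray     # H x W x 1, dense ground-truth shading map
    depth: np.ndarray       # H x W x 1, per-pixel depth
    normals: np.ndarray     # H x W x 3, surface normals
    pose: np.ndarray        # 4 x 4, camera-to-world transform
    intrinsics: np.ndarray  # 3 x 3, pinhole camera intrinsics
    sky_hdr: np.ndarray     # equirectangular HDR sky-dome capture for the flight
```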

We generate accurate ground truth by applying an inverse rendering pipeline to multi-view aerial captures, addressing a critical gap in real-world outdoor training data. To demonstrate Olbedo's effectiveness, we develop a diffusion-based single-image albedo decomposition model, fine-tuned from pre-trained indoor networks using LoRA. Extensive evaluations on synthetic outdoor datasets and real aerial imagery show that our approach significantly outperforms prior physics-based and unsupervised methods in recovering intricate material properties under varying lighting conditions. By enabling robust de-lighting of photogrammetric textures, Olbedo unlocks flexible, weather-agnostic pipelines that dramatically accelerate the production of photorealistic, relightable digital twins of our world.
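The fine-tuning step mentioned above uses LoRA, which freezes a pre-trained network and learns only low-rank corrections to selected weight matrices. The PyTorch sketch below illustrates that mechanism in isolation; the wrapped layer, rank, and scaling are illustrative assumptions and do not reflect the authors' actual architecture or training code.

```python
# Minimal LoRA sketch in PyTorch (illustrative only): wraps a frozen linear
# projection with a trainable low-rank update  W x + (alpha / r) * B A x.
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    def __init__(self, base: nn.Linear, rank: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():   # freeze the pre-trained weights
            p.requires_grad = False
        # Standard LoRA init: A small random, B zero, so the initial update is zero.
        self.lora_A = nn.Parameter(torch.randn(rank, base.in_features) * 0.01)
        self.lora_B = nn.Parameter(torch.zeros(base.out_features, rank))
        self.scale = alpha / rank

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Frozen projection plus the scaled low-rank correction B A x.
        return self.base(x) + self.scale * (x @ self.lora_A.T @ self.lora_B.T)

# Example: adapt one projection layer of a (hypothetical) pre-trained
# indoor decomposition network without touching its original weights.
proj = nn.Linear(320, 320)                 # stands in for a pre-trained layer
adapted = LoRALinear(proj, rank=8, alpha=16.0)
out = adapted(torch.randn(2, 77, 320))
print(out.shape)  # torch.Size([2, 77, 320])
```

In a diffusion backbone, such adapters would typically be attached to the attention projections, so only the small A and B matrices are trained while the original indoor-trained weights stay untouched.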

Original vs Albedo (Ours) Texture
