No-Code SR Demo is now live!

 

Open In Colab

Super-Resolve Sentinel-2 in Your Browser — No Code Required 🚀

Have you ever looked at a 10 m Sentinel-2 scene and wished you could zoom in without downloading huge stacks of raw data or writing Python? Now you can. The brand-new LDSR-S2 “no-code” Colab notebook lets anyone—researcher, analyst or student—run state-of-the-art latent-diffusion super-resolution entirely in the browser on Google’s free GPUs.

Why it’s worth a try

  • Runs on free Colab hardware – one click, no installs, no GPU at home required.

  • Pick your AOI visually – scroll an OpenStreetMap basemap and click to capture lat/lon, or type coordinates if you already know them.

  • Define your time window & cloud filter – avoid cloudy scenes with a slider (e.g. “max 10 % CC”); the sketch after this list shows the query this maps to.

  • Interactive preview – the notebook shows the native 10 m RGB preview and asks “Does this look OK?” before spending GPU time.

  • Instant evaluation – in ~30 s you’ll see a side-by-side LR vs. SR PNG and get ready-to-download GeoTIFFs for both. Load them in QGIS/ArcGIS and decide if LDSR-S2 fits your pipeline.
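
For the curious: the date range, cloud-cover limit and AOI you pick in the GUI boil down to a STAC search against the Planetary Computer, roughly like the sketch below. You never have to write this yourself; it is only here to show what the notebook does on your behalf. The package names (pystac-client, planetary-computer) and the example coordinates, dates and threshold are illustrative assumptions, not copied from the notebook.

```python
# Sketch only: roughly what the GUI selections translate to.
# Assumes the pystac-client and planetary-computer packages are installed.
import planetary_computer
from pystac_client import Client

catalog = Client.open(
    "https://planetarycomputer.microsoft.com/api/stac/v1",
    modifier=planetary_computer.sign_inplace,  # signs asset URLs so they can be downloaded
)

search = catalog.search(
    collections=["sentinel-2-l2a"],
    intersects={"type": "Point", "coordinates": [-0.38, 39.47]},  # lon/lat clicked on the map (example)
    datetime="2024-06-01/2024-08-31",                             # your time window (example)
    query={"eo:cloud_cover": {"lt": 10}},                         # the “max 10 % CC” slider
)

print(f"{len(list(search.items()))} scenes match the filter")
```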

 

 

Open In Colab

 

Step-by-Step Guide

| Step | What to do | Tip |
|------|------------|-----|
| 1 | Open the notebook via the badge above. | You’ll land in Google Colab. |
| 2 | Switch to a GPU runtime: Runtime → Change runtime type → GPU. | Free-tier GPUs are fine. |
| 3 | Run every cell: Runtime → Run all. | Colab installs dependencies & loads the model automatically. |
| 4 | In the GUI, select start/end dates and the max cloud cover. | Narrow windows fetch faster. |
| 5 | Click “Enter coordinates” or “Select on map.” | Typing coords shows input boxes; the map lets you pan/zoom and click once. |
| 6 | Hit “Load Scene.” | The S2 10 m preview appears. |
| 7 | If the preview looks good, press “Use this scene.” | Otherwise, choose “Get different scene.” |
| 8 | Wait ~5 s while the GPU works. | |
| 9 | View the LR vs. SR comparison PNG. | The PNG is saved as example.png. |
| 10 | Download the GeoTIFFs: lr.tif (native 10 m) and sr.tif (2.5 m SR) via the Files sidebar → right-click → Download. | Both files are fully georeferenced and drop straight into QGIS/ArcGIS; see the snippet after this table for a quick Python check. |
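
If you prefer a quick sanity check in Python before (or instead of) opening QGIS, a rough sketch like the one below reproduces the side-by-side view and confirms the georeferencing. It assumes rasterio and matplotlib are installed and that the first three bands of each file are RGB; treat that band order as an assumption rather than a guarantee.

```python
# Sketch only: inspect the downloaded GeoTIFFs locally.
# Assumes rasterio and matplotlib are installed and bands 1-3 are RGB.
import rasterio
import matplotlib.pyplot as plt

fig, axes = plt.subplots(1, 2, figsize=(10, 5))
for ax, path, title in zip(axes, ["lr.tif", "sr.tif"], ["LR (10 m)", "SR (2.5 m)"]):
    with rasterio.open(path) as src:
        print(path, src.crs, src.res, f"{src.count} bands")  # georeferencing check
        rgb = src.read([1, 2, 3]).astype(float)
        ax.imshow((rgb / rgb.max()).transpose(1, 2, 0))      # crude stretch, display only
        ax.set_title(title)
        ax.axis("off")
plt.show()
```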

 

Under the Hood 🔧

  • Data provider: Microsoft Planetary Computer (Sentinel-2 L2A, BOA reflectance).

  • Model: LDSR-S2 v1.0 (latent diffusion).

  • Patch size: 128 × 128 px → 512 × 512 px (10 m → 2.5 m effective spatial resolution); the sketch after this list shows how the georeferencing carries over.

  • Multispectral: RGB + NIR.
 
  • Outputs:

    • lr.tif – original 10 m four-band patch.

    • sr.tif – super-resolved 2.5 m four-band patch.

    • example.png – side-by-side LR vs. SR preview.
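
And in case you are wondering how the 2.5 m sr.tif keeps the footprint and CRS of the original patch: conceptually, the affine transform of the 10 m input is scaled by the upscaling factor when the output is written. The sketch below illustrates just that bookkeeping with rasterio; the nearest-neighbour upsample is a placeholder for the actual LDSR-S2 inference (which the notebook runs for you on the GPU), and the output name sr_sketch.tif is made up.

```python
# Sketch only: how a 4x super-resolved patch stays georeferenced.
# The nearest-neighbour upsample stands in for the real LDSR-S2 inference step.
import numpy as np
import rasterio

with rasterio.open("lr.tif") as src:
    lr = src.read()                      # (4, 128, 128) RGB+NIR patch at 10 m
    profile = src.profile.copy()
    transform = src.transform

sr = np.kron(lr, np.ones((1, 4, 4), dtype=lr.dtype))   # placeholder SR: (4, 512, 512)

factor = sr.shape[-1] / lr.shape[-1]     # 512 / 128 = 4
profile.update(
    width=sr.shape[-1],
    height=sr.shape[-2],
    transform=transform * transform.scale(1 / factor, 1 / factor),  # 10 m pixels become 2.5 m
)

with rasterio.open("sr_sketch.tif", "w", **profile) as dst:
    dst.write(sr)                        # same footprint and CRS, finer pixel grid
```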

See if LDSR-S2 Fits Your Use-Case

  • Trying to classify roofs?

  • Need crisper disaster-response maps?

  • Curious if SR helps object detection?

Open the notebook, click a location, and find out in minutes—no coding, no local GPU, zero setup. If it works, grab the GeoTIFFs and drop them into your workflow; if not, change dates or pick another AOI and iterate instantly.

Happy super-resolving! Don’t forget to let us know if these results are useful for you!

Recent Posts

No-Code SR Demo is now live!

This demo, aimed at non-technical users, allows you to enter your coordinates and create a super-resolution product on your custom Sentinel-2 acquisition. Immediately judge whether SR can be useful for your application!

OpenSR Team @Living Planet Symposium

The OpenSR team joined ESA’s Living Planet Symposium 2025 to present our latest advances in Sentinel-2 super-resolution, dataset standards, and workflows. From latent diffusion models to FAIR-compliant data access with TACO, our tools aim to make high-resolution Earth observation more accessible and actionable.

New Release: OpenSR-UseCases Package

A lightweight validation toolkit to benchmark segmentation performance across low-, super-, and high-resolution imagery. Quantifies how well super-resolution models improve object detection and segmentation accuracy in real-world tasks. Ideal for researchers who want to go beyond visual inspection and measure actual downstream performance gains.

New Preprint: A Radiometrically and Spatially Consistent Super-Resolution Framework for Sentinel-2

We’ve published a new preprint presenting SEN2SR, a deep learning framework for super-resolving Sentinel-2 imagery with radiometric and spatial fidelity. The model leverages harmonized synthetic data, hard constraints, and xAI tools to achieve artifact-free enhancements at 2.5 m resolution.

RGB-NIR Latent Diffusion Super-Resolution Model Released!

Our latent diffusion model for the RGB-NIR bands of Sentinel-2 has been released, including weights.

New Publication: LDSR-S2 Model Paper

Our diffusion-based super-resolution model for Sentinel-2 imagery has been published in IEEE JSTARS! The open-access paper introduces a latent diffusion approach with pixelwise uncertainty maps—pushing the boundaries of trustworthy generative modeling in Earth observation.

SEN2NAIP v2.0 Released — A Major Boost for Sentinel-2 Super-Resolution

We’ve released SEN2NAIP v2.0, a large-scale dataset designed for training and validating super-resolution models on Sentinel-2 imagery. The dataset includes thousands of real and synthetic HR-LR image pairs, making it a cornerstone for future SR research in Earth Observation.

New Publication: SEN2NAIP published in ‘Scientific Data’

The dataset paper has been published in 'Scientific Data'.

The OpenSR team contributes to Flood Mapping for the Valencian Flash Floods

Our team at the University of Valencia has released an interactive satellite flood map of the recent Valencia flash floods, using Landsat-8 and Sentinel-2 imagery combined with a machine learning segmentation model. Leveraging super-resolution techniques, we enhanced Sentinel-2 data to 2.5m resolution, enabling more precise flood extent mapping for post-disaster analysis.

OpenSR-Utils Preview Released: A package to handle patching, tiling and overlapping for SR Products

We’ve released a preview of OpenSR-Utils, a Python package to apply super-resolution models on raw Sentinel-2 imagery. With multi-GPU support, georeferenced output, and automatic patching, it’s a practical toolkit for real-world remote sensing pipelines.

SUPERIX: Intercomparison Exercise

Presenting SUPERIX: a community-driven benchmark to rigorously compare super-resolution models for Sentinel-2 data. Using real-world datasets and tailored metrics, SUPERIX aims to uncover the true impact of SR techniques on remote sensing accuracy.

Team attends ESA SUREDOS Workshop in Frascati

Our team attended the ESA SUREDOS Workshop to discuss the role of super-resolution in enhancing Earth Observation data. The event explored cutting-edge deep learning techniques and the importance of reliable, domain-specific validation for scientific and operational EO applications.

New Publication: OpenSR-Test theoretical framework has been published

OpenSR-test is now published in IEEE GRSL! Our new paper introduces a rigorous benchmark for evaluating super-resolution in remote sensing with real-world datasets and meaningful metrics.

OpenSR-Test Framework and Datasets Released

Our framework to validate super-resolution results is now published. It can take any SR model and create sophisticated validation metrics over multiple datasets, enhancing the comparability of methodologies.

OpenSR-Degradation Released: Package to create Synthetic Training Data

We’ve released OpenSR-Degradation, a toolkit to generate synthetic Sentinel-2-like imagery from NAIP using statistical, deterministic, and variational models. This open-source pipeline enables large-scale training and benchmarking for cross-sensor super-resolution.