
CARS Tutorial¶

CARS Team

Outline¶

  • Tutorial preparation
  • Context
  • How CARS works
  • High level design
  • Quickstart
  • Command Line Interface examples
  • Step by step framework manipulation


Tutorial preparation¶

Preparation¶

Prerequisites (a quick check is sketched below):

python >= 3.8
python-venv or virtualenv
gcc
web access
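
A quick way to check these prerequisites (illustrative shell commands, assuming a standard Linux environment):

python3 --version                                 # should report 3.8 or higher
python3 -m venv --help > /dev/null && echo "venv available"
gcc --version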

CARS install from PyPI¶

Follow CARS install: https://cars.readthedocs.io/en/latest/install.html

VLFEAT install:

git clone https://github.com/CNES/vlfeat.git
cd vlfeat && make && cd ..
export CFLAGS="-I$PWD/vlfeat"
export LDFLAGS="-L$PWD/vlfeat/bin/glnxa64"
export LD_LIBRARY_PATH="$PWD/vlfeat/bin/glnxa64:$LD_LIBRARY_PATH"

CARS install:

python -m venv venv # virtualenv venv init
source ./venv/bin/activate # enter the virtualenv
pip install --upgrade "pip<=23.0.1" "numpy>=1.17.0" cython
pip install cars
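
To check that the installation went well, call the CARS entry point (the same options are used later in the quick start):

cars --version
cars -h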

Jupyter notebook preparation¶

Install Jupyter packages in previous virtualenv:

source ./venv/bin/activate # enter the virtualenv
pip install notebook rise bokeh
# add any tool you may need through pip install

Build Jupyter kernel:

python -m ipykernel install --sys-prefix --name=cars-kernel --display-name=cars-kernel

Launch the Jupyter environment:

jupyter notebook


Context¶

CARS in a nutshell¶

CARS produces Digital Surface Models (DSM) from satellite images by photogrammetry.

Main goals:

  • a robust and distributed tool for operational pipelines
  • capitalization of 3D developments
  • prototyping, tests, R&D evaluation

Be aware that CARS is still young and evolving towards maturity, following CNES roadmaps.

License: Apache-2.0


Web sites:

  • https://github.com/cnes/cars/
  • https://cars.readthedocs.io/

Projects context¶

  • CO3D project: a constellation of four small satellites to map the whole globe in 3D
  • AI4GEO: automatic production of 3D geospatial information based on AI technologies
  • Internal studies, internships, PhD theses, ...


Authors¶

  • David Youssefi david.youssefi@cnes.fr
  • Emmanuel Dubois emmanuel.dubois@cnes.fr
  • Emmanuelle Sarrazin emmanuelle.sarrazin@cnes.fr
  • Yoann Steux yoann.steux@csgroup.eu
  • Florian Douziech florian.douziech@csgroup.eu
  • Mathis Roux mathis.roux@csgroup.eu

See Authors.md in the GitHub repository for the full list of contributions.

Copyright¶

  • CNES copyright (with a Contributor License Agreement) to ease maintenance

Contributions¶

See Contributing.md

Glossary¶

DEM: Digital Elevation Model. Generic term covering all raster elevation models: DSM, DTM, ...

DSM: Digital Surface Model. Represents the earth’s surface and includes all objects on it. CARS generates DSMs.

DTM: Digital Terrain Model. Represents bare ground surface without any objects like plants and buildings.

ROI: Region of Interest. In CARS, a subpart of the DSM raster.


How CARS works¶

Satellite photogrammetry¶


Indirect measurement (like our eyes) by passive observation:
it needs at least 2 images!


Satellite photogrammetry¶


Indirect measurement (like our eyes) by passive observation:
it needs at least 2 images and geometric models:

Rational Polynomial Coefficients (RPCs) provide a compact representation of the ground-to-image geometry, giving a relationship (recalled below) between:

  • Image coordinates + altitude and ground coordinates (direct model: image to ground)
  • Ground coordinates + altitude and image coordinates (inverse model: ground to image)
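
As a reminder of the classical RPC convention (not specific to CARS), the inverse model expresses normalized image coordinates as ratios of cubic polynomials of the normalized ground coordinates:

$$ \mathrm{row}_n = \frac{P_1(\mathrm{lon}_n, \mathrm{lat}_n, \mathrm{alt}_n)}{P_2(\mathrm{lon}_n, \mathrm{lat}_n, \mathrm{alt}_n)} \qquad \mathrm{col}_n = \frac{P_3(\mathrm{lon}_n, \mathrm{lat}_n, \mathrm{alt}_n)}{P_4(\mathrm{lon}_n, \mathrm{lat}_n, \mathrm{alt}_n)} $$

where each P_i is a third-order polynomial with 20 coefficients and the subscript n denotes offset/scale normalization; the direct model (image to ground) is obtained by inverting this relation at a given altitude.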

Satellite photogrammetry¶


The photogrammetric processing can be performed without requiring a physical camera model.

These coefficients are classically contained in the RPC*.XML files.

Sensor to Dense DSM¶

  • Images are resampled in the same "eyes" (epipolar) geometry: corresponding lines are aligned
  • Performance driven: the correspondence search is reduced to 1 dimension

Sensor to Dense DSM¶

For each point in one image, find the corresponding point in the other image.


Sensor to Dense DSM¶

The resulting shifts are transformed into positions in the two images.
This allows us to deduce the lines of sight. The intersection of these lines gives a point in space: longitude.


Sensor to Dense DSM¶

The resulting shifts are transformed into positions in the two images.
This allows us to deduce the lines of sight. The intersection of these lines gives a point in space: latitude.


Sensor to Dense DSM¶

The resulting shifts are transformed into positions in the two images.
This allows us to deduce the lines of sight. The intersection of these lines gives a point in space: altitude.


Sensor to Dense DSM¶

The resulting shifts are transformed into positions in the two images.
This allows us to deduce the lines of sight. The intersection of these lines gives a point in space: altitude (single band pseudo color).


Sensor to Dense DSM¶

The resulting shifts are transformed into positions in the two images.
This allows us to deduce the lines of sight. The intersection of these lines gives a point in space: altitude (single band pseudo color and hillshade).


Sensor to Dense DSM¶

To obtain a raster image, the final step projects each point onto a 2D grid: the altitudes.


Sensor to Dense DSM¶

To obtain a raster image, the final step projects each point onto a 2D grid: the colors.


Sensor to Sparse DSM¶

Matching can also be performed with keypoints (SIFT).


Sensor to Sparse DSM¶

Matching can also be performed with keypoints (SIFT).


Sensor to Sparse DSM¶

The result is a sparse point cloud...


Sensor to Sparse DSM¶

The result is a sparse point cloud...


Sensor to Sparse DSM¶

... and a sparse digital surface model.



High level design¶

CARS characteristics¶

Objectives:

  • robust and performant methods for mass production
  • state-of-the-art algorithms
  • satellite data
  • distributed design
  • Python 3 and its ecosystem when possible

Technologies used:

  • Epipolar geometry
  • Input DTM
  • Scale Invariant Feature Transform (SIFT) sparse matching [1]
  • Semi-Global Matching (SGM) optimization [2]

[1] D. G. Lowe. Distinctive image features from scale-invariant keypoints. IJCV, 2(60):91-110, 2004.

[2] H. Hirschmuller, "Stereo Processing by Semiglobal Matching and Mutual Information," in IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 30, no. 2, pp. 328-341, Feb. 2008. doi: 10.1109/TPAMI.2007.1166

Main dependencies¶

  • Matching: Pandora and its libraries: libsgm, libmc-cnn, ...
  • Geometry: shareloc
  • Sparse matching: VLFeat
  • Image libraries: rasterio, pyproj, Fiona, Shapely, NetCDF4
  • Data libraries: NumPy, SciPy, pandas, Affine, Matplotlib
  • Distributed and structure libraries: xarray, Dask, Numba
  • Python packaging, code quality, documentation: setuptools, pylint, flake8, black, isort, pre-commit, sphinx, ...

Orchestrator and distributed computing¶

The Dask framework can be used locally (local_dask) or through PBS on an HPC cluster (pbs_dask). The orchestrator separates the 3D pipeline from the computing distribution. Features:

  • automatic computation of the tile size depending on available memory
  • creation of epipolar-tile and terrain-tile graphs to distribute the tiles on the nodes.

Command Line Interface¶


cars -h
    
usage: cars [-h] [--loglevel {DEBUG,INFO,PROGRESS,WARNING,ERROR,CRITICAL}] [--version] conf

CARS: CNES Algorithms to Reconstruct Surface

positional arguments:
  conf                  Inputs Configuration File

optional arguments:
  -h, --help            show this help message and exit
  --loglevel {DEBUG,INFO,PROGRESS,WARNING,ERROR,CRITICAL}
                        Logger level (default: WARNING. Should be one of (DEBUG, INFO, PROGRESS, WARNING, ERROR, CRITICAL)
  --version, -v         show program's version number and exit

CARS Configuration : JSON¶

{
    "inputs": {},
    "orchestrator": {},
    "applications": {},
    "output": {},
    "pipeline": "pipeline_to_use"
}

Pipelines¶

Two possibilities, selected through the top-level "pipeline" key (example below):

  • sensor_to_dense_dsm: main pipeline producing a dense high resolution DSM (detailed later); this is the default
  • sensor_to_sparse_dsm: produces a sparse low resolution DSM based on SIFT matches
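
For example, selecting the sparse pipeline explicitly only requires setting the top-level "pipeline" key of the configuration skeleton shown above (the dense pipeline is used when the key is omitted):

{
    "inputs": {},
    "output": {},
    "pipeline": "sensor_to_sparse_dsm"
}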

Inputs¶

Set sensors, geometric models, pairing, initial_elevation.

{
    "inputs": {
        "sensors" : {
            "one": {
                "image": "img1.tif",
                "geomodel": "img1.geom",
                "no_data": 0
            },
            "two": {
                "image": "img2.tif",
                "geomodel": "img2.geom",
                "no_data": 0
            }
        },
        "pairing": [["one", "two"]],
        "initial_elevation": "srtm_dir/N29E031_KHEOPS.tif"
    },

Applications¶

Allows redefining the default parameters of each application used by the pipeline, and thus tuning its behaviour (a minimal example is given below).
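
For instance, to override only the dense matching method (the same application is configured in the CLI examples later in this tutorial; other keys keep their defaults):

      "applications": {
              "dense_matching": {
                      "method": "census_sgm"
              }
      },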

Orchestrator¶

Define the orchestrator parameters that control the distributed computations (a minimal example follows this list):

  • mode: parallelization mode: "local_dask", "pbs_dask" or "mp"

  • nb_workers: number of workers

  • walltime: worker wall-clock time limit (depends on the mode)
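
A minimal orchestrator block could look like this (values are illustrative; the walltime format shown is an assumption based on PBS-style HH:MM:SS durations):

      "orchestrator": {
              "mode": "pbs_dask",
              "nb_workers": 10,
              "walltime": "00:59:00"
      },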

Output¶

Depends on the pipeline. Example for the main pipeline:

"output": {
      "out_dir": "myoutputfolder",
      "dsm_basename": "mydsm.tif"
}

CARS 3D specifics¶

  • First, SIFT sparse matching steps for each pair:

    • get the vertical epipolar distribution to correct the resampling
    • get the horizontal disparity distribution for the dense matching step
  • use an adapted epipolar geometry: the null disparity is based on a reference DTM (typically SRTM)


Pandora dense matching pipeline¶

  • Independent toolbox inspired by [1]
  • Python implementation, except for the SGM part implemented in C++
  • usable through an API or a CLI

Web site: https://github.com/CNES/pandora

[1] A Taxonomy and Evaluation of Dense Two-Frame Stereo Correspondence Algorithms, D. Scharstein and R. Szeliski, vol. 47, International Journal of Computer Vision, 2002

[Figure: Pandora overview]


Pandora dense matching pipeline details¶

[Figures: Pandora pipeline and matching methods]


Quickstart¶

Quickstart¶

Download the CARS quick start script:

mkdir /tmp/cars-tuto/
cd /tmp/cars-tuto/
wget https://raw.githubusercontent.com/CNES/cars/master/tutorials/quick_start_advanced.sh
chmod +x quick_start_advanced.sh # make the downloaded script executable

Warning: Internet access is needed to download the demo data.

Run the downloaded script:

./quick_start_advanced.sh
==== Demo CARS installed (advanced) =====
 
- Cars must be installed:
  # cars -v
cars 0.7.0
 
- Get and extract data samples from CARS repository [...]
- Launch CARS with sensor_to_full_resolution_dsm pipeline for img1+img2 and img1+img3 pairs:
  # cars configfile.json
23-06-23 21:35:56 :: PROGRESS :: Check configuration...
23-06-23 21:35:57 :: PROGRESS :: CARS pipeline is started.
23-06-23 21:35:59 :: PROGRESS :: Data list to process: [ epi_matches_left ] ...
Tiles processing: 100%|████████████████████████████████████████████████████████████| 16/16 [00:13<00:00,  1.21it/s]
23-06-23 21:36:16 :: PROGRESS :: Data list to process: [ epi_matches_left ] ...
Tiles processing: 100%|████████████████████████████████████████████████████████████| 16/16 [00:11<00:00,  1.41it/s]
23-06-23 21:36:30 :: PROGRESS :: Data list to process: [ dsm , color ] ...
Tiles processing: 100%|████████████████████████████████████████████████████████████| 4/4 [01:13<00:00, 18.35s/it]
23-06-23 21:37:43 :: PROGRESS :: CARS has successfully completed the pipeline.
- Show resulting DSM:
  # ls -al outresults/
total 37556
-rw-rw-r-- 1 youssefd youssefd      314 juin  23 21:37 23-06-23_21h35m_sensor_to_dense_dsm.log
-rw-rw-r-- 1 youssefd youssefd 25166744 juin  23 21:37 clr.tif
-rw-rw-r-- 1 youssefd youssefd     7268 juin  23 21:36 content.json
-rw-rw-r-- 1 youssefd youssefd     9643 juin  23 21:35 dask_config_unknown.yaml
-rw-rw-r-- 1 youssefd youssefd 16778119 juin  23 21:37 dsm.tif
drwxrwxr-x 2 youssefd youssefd     4096 juin  23 21:37 one_three
drwxrwxr-x 2 youssefd youssefd     4096 juin  23 21:37 one_two
-rw-rw-r-- 1 youssefd youssefd     8151 juin  23 21:36 used_conf.json
drwxrwxr-x 2 youssefd youssefd     4096 juin  23 21:35 workers_log

Quick start results¶

  • DSM: dsm.tif
  • Color: clr.tif
  • MIX: colored composition of clr and dsm

Quick start details¶

  1. See input data (a few inspection commands are sketched below)
    • sensor images + geometric models
    • initial DTM (SRTM tile)
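
For instance, assuming the quick start data was extracted under data_gizeh/ and that the GDAL command line tools are available (file names below are taken from the configuration file; exact paths may differ):

ls data_gizeh/
gdalinfo data_gizeh/img1.tif                      # sensor image
cat data_gizeh/img1.geom                          # geometric model
gdalinfo data_gizeh/srtm_dir/N29E031_KHEOPS.tif   # initial elevation (SRTM tile)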

Quick start details¶

  2. See configuration
cat data_gizeh/configfile.json
{
    "inputs": {
        "sensors" : {
            "one": {
                "image": "img1.tif",
                "geomodel": "img1.geom",
                "color": "color1.tif",
                "no_data": 0
            },
            "two": {
                "image": "img2.tif",
                "geomodel": "img2.geom",
                "no_data": 0
            },
            "three": {
                "image": "img3.tif",
                "geomodel": "img3.geom",
                "no_data": 0
            }
        },
        "pairing": [["one", "two"],["one", "three"]],
        "initial_elevation": "srtm_dir/N29E031_KHEOPS.tif"
    },
    "output": {
        "out_dir": "outresults"
    }
}


Command Line Interface examples¶

Logging¶

Run CARS with more information:

cars --loglevel INFO configfile.json

CARS orchestration modification : nb_workers¶

  • Add an orchestrator configuration to the input JSON file:

      "orchestrator": {
              "mode": "local_dask",
              "nb_workers": 4
      },
  • Run CARS again to see the 4 workers: cars --loglevel INFO configfile.json

CARS orchestration modification: sequential mode¶

  • Add an orchestrator configuration to the input JSON file:

      "orchestrator": {
              "mode": "sequential"
      },
  • Run CARS again: cars --loglevel INFO configfile.json

Application configuration: save disparity maps¶

  • Add an applications configuration to the input JSON file and define parameters for the dense matching application:

      "applications": {
              "dense_matching":{
                      "method": "census_sgm",
                      "loader": "pandora",
                      "save_disparity_map": true
                    }
      },
  • Run CARS again: cars --loglevel INFO configfile.json

Application configuration: save disparity maps¶

  • Show the resulting disparity maps:
  # ls -l data_gizeh/outresults/
     total 44580
      -rw-r--r-- 1 carcars carcars        0 août   6 00:42 22-08-05_22h42m_sensor_to_full_res_dsm.log
      -rw-r--r-- 1 carcars carcars 33555362 août   6 00:46 clr.tif
      -rw-r--r-- 1 carcars carcars     9120 août   6 00:43 content.json
      -rw-r--r-- 1 carcars carcars     7864 août   6 00:42 dask_config_unknown.yaml
      -rw-r--r-- 1 carcars carcars 16778119 août   6 00:46 dsm.tif
      drwxr-xr-x 2 carcars carcars     4096 août   6 00:46 one_three
      drwxr-xr-x 2 carcars carcars     4096 août   6 00:46 one_two

  # ls -l data_gizeh/outresults/one_two
      -rw-r--r-- 1 carcars carcars     9120 août   6 00:43 epi_disp_color_left.tif
      -rw-r--r-- 1 carcars carcars     7864 août   6 00:42 epi_disp_left.tif
      -rw-r--r-- 1 carcars carcars 16778119 août   6 00:46 epi_disp_mask_left.tif
  

Application configuration: rasterization parameters¶

  • Add an applications configuration to the input JSON file and define parameters for the rasterization application:

      "applications": {
           "point_cloud_rasterization": { 
              "method": "simple_gaussian",
              "dsm_radius": 3,
              "sigma": 0.3
           }
      },
  • Run CARS again: cars --loglevel INFO configfile.json


Step by step framework manipulation¶

Follow this link to access the slides: sensor_to_dense_dsm_step_by_step.slides.html