CARS Team
Follow CARS install: https://cars.readthedocs.io/en/latest/install.html
VLFEAT install:
git clone https://github.com/CNES/vlfeat.git
cd vlfeat && make && cd ..
export CFLAGS="-I$PWD/vlfeat"
export LDFLAGS="-L$PWD/vlfeat/bin/glnxa64"
export LD_LIBRARY_PATH="$PWD/vlfeat/bin/glnxa64:$LD_LIBRARY_PATH"
CARS install:
python -m venv venv # virtualenv venv init
source ./venv/bin/activate # enter the virtualenv
pip install --upgrade "pip<=23.0.1" "numpy>=1.17.0" cython
pip install cars
Install Jupyter packages in previous virtualenv:
source ./venv/bin/activate # enter the virtualenv
pip install notebook rise bokeh
# add any tool you may need through pip install
Build Jupyter kernel:
python -m ipykernel install --sys-prefix --name=cars-kernel --display-name=cars-kernel
Jupyter environment:
jupyter notebook
CARS produces Digital Surface Models from satellite imaging by photogrammetry.
Main goals:
Be aware that CARS is a young project, evolving toward maturity along CNES roadmaps
License: Apache-2.0
Web sites:
See Authors.md in the Github repository for the full list of contributors.
See Contributing.md
DEM: Digital Elevation Model. Usually means all elevation models in raster: DSM, DTM,…
DSM: Digital Surface Model. Represents the earth’s surface and includes all objects on it. CARS generates DSMs.
DTM: Digital Terrain Model. Represents bare ground surface without any objects like plants and buildings.
ROI: Region of Interest. In CARS, a subpart of the DSM raster.
Indirect measure (same as eyes) by passive observation:
Needs at least 2 images and geometric models:
Rational Polynomial Coefficients (RPCs) provide a compact representation of a ground-to-image geometry giving a relationship between:
The photogrammetric processing can be performed without requiring a physical camera model.
These coefficients are classically contained in the RPC*XML files
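The ground-to-image relationship is a ratio of two cubic polynomials in normalized longitude, latitude and altitude. A minimal sketch of the idea (the dictionary keys and the 20-term ordering below are illustrative assumptions, not CARS's API; real coefficients come from the RPC*XML files and ordering conventions vary between providers):

```python
# Hedged sketch of how RPCs map ground coordinates to image coordinates.
def poly20(coeffs, x, y, z):
    """Evaluate a 20-term cubic polynomial in normalized x (lon), y (lat), z (alt)."""
    terms = [1, x, y, z, x*y, x*z, y*z, x*x, y*y, z*z,
             x*y*z, x**3, x*y*y, x*z*z, x*x*y, y**3,
             y*z*z, x*x*z, y*y*z, z**3]
    return sum(c * t for c, t in zip(coeffs, terms))

def ground_to_image(lon, lat, alt, rpc):
    """Row and column are each a ratio of two polynomials (hypothetical keys)."""
    x = (lon - rpc["lon_off"]) / rpc["lon_scale"]
    y = (lat - rpc["lat_off"]) / rpc["lat_scale"]
    z = (alt - rpc["alt_off"]) / rpc["alt_scale"]
    row = poly20(rpc["line_num"], x, y, z) / poly20(rpc["line_den"], x, y, z)
    col = poly20(rpc["samp_num"], x, y, z) / poly20(rpc["samp_den"], x, y, z)
    return (row * rpc["line_scale"] + rpc["line_off"],
            col * rpc["samp_scale"] + rpc["samp_off"])
```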
The resulting shifts are transformed into positions in the two images.
This allows us to deduce the lines of sight. The intersection of these lines gives a point in space: longitude, latitude and altitude (displayed as a single band pseudo color, optionally with hillshade).
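Two lines of sight from different sensors rarely cross exactly, so a common way to "intersect" them is the midpoint of their shortest connecting segment. A toy least-squares sketch of that idea (illustrative only, not CARS's triangulation code):

```python
# Each line of sight is (origin o, direction d); we solve for the two
# parameters t1, t2 minimizing |o1 + t1*d1 - (o2 + t2*d2)| and return
# the midpoint of the closest segment.
def closest_point(o1, d1, o2, d2):
    def dot(a, b): return sum(x * y for x, y in zip(a, b))
    def sub(a, b): return [x - y for x, y in zip(a, b)]
    def add(a, b): return [x + y for x, y in zip(a, b)]
    def mul(a, s): return [x * s for x in a]
    r = sub(o1, o2)
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, r), dot(d2, r)
    denom = a * c - b * b  # zero if the lines are parallel
    t1 = (b * e - c * d) / denom
    t2 = (a * e - b * d) / denom
    p1 = add(o1, mul(d1, t1))  # closest point on line 1
    p2 = add(o2, mul(d2, t2))  # closest point on line 2
    return mul(add(p1, p2), 0.5)
```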
To obtain raster images, the final process projects each point onto a 2D grid: first the altitudes, then the colors.
Objectives:
Technologies used:
[1] D. G. Lowe. Distinctive image features from scale-invariant keypoints. IJCV, 2(60):91-110, 2004.
[2] H. Hirschmuller, "Stereo Processing by Semiglobal Matching and Mutual Information," in IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 30, no. 2, pp. 328-341, Feb. 2008. doi: 10.1109/TPAMI.2007.1166
DASK framework can be used locally (local_dask) or through PBS on a HPC (pbs_dask). The orchestrator framework separates 3D pipeline from computing distribution. Features:
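The separation between the 3D pipeline and the computing distribution can be sketched as follows. This is a toy illustration only (`run_tiles` is a hypothetical name, and a thread pool merely stands in for the real local_dask / pbs_dask / mp back ends):

```python
# Toy sketch: the pipeline is a function applied tile by tile; the
# "mode" only decides how the tiles are dispatched.
from concurrent.futures import ThreadPoolExecutor

def run_tiles(process_tile, tiles, mode="sequential", nb_workers=4):
    if mode == "sequential":
        return [process_tile(t) for t in tiles]
    # any distributed mode: hand the same function to a worker pool
    with ThreadPoolExecutor(max_workers=nb_workers) as pool:
        return list(pool.map(process_tile, tiles))
```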
cars -h
usage: cars [-h] [--loglevel {DEBUG,INFO,PROGRESS,WARNING,ERROR,CRITICAL}] [--version] conf
CARS: CNES Algorithms to Reconstruct Surface
positional arguments:
conf Inputs Configuration File
optional arguments:
-h, --help show this help message and exit
--loglevel {DEBUG,INFO,PROGRESS,WARNING,ERROR,CRITICAL}
Logger level (default: WARNING. Should be one of (DEBUG, INFO, PROGRESS, WARNING, ERROR, CRITICAL)
--version, -v show program's version number and exit
{
"inputs": {},
"orchestrator": {},
"applications": {},
"output": {},
"pipeline": "pipeline_to_use"
}
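Before launching CARS, the configuration file can be sanity-checked against this skeleton. A minimal sketch (which sections are mandatory is an assumption of this sketch, not documented CARS behavior):

```python
# Hedged sketch: verify a configuration carries only the five top-level
# sections shown above.
import json

REQUIRED = {"inputs", "pipeline"}            # assumed mandatory here
OPTIONAL = {"orchestrator", "applications", "output"}

def check_conf(text):
    conf = json.loads(text)
    missing = REQUIRED - conf.keys()
    unknown = conf.keys() - REQUIRED - OPTIONAL
    if missing:
        raise ValueError(f"missing sections: {sorted(missing)}")
    if unknown:
        raise ValueError(f"unknown sections: {sorted(unknown)}")
    return conf
```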
Two possibilities:
Set sensors, geometric models, pairing, initial_elevation.
{
"inputs": {
"sensors" : {
"one": {
"image": "img1.tif",
"geomodel": "img1.geom",
"no_data": 0
},
"two": {
"image": "img2.tif",
"geomodel": "img2.geom",
"no_data": 0
}
},
"pairing": [["one", "two"]],
"initial_elevation": "srtm_dir/N29E031_KHEOPS.tif"
},
Allows redefining the default parameters of each application used by the pipeline, and configuring the pipeline itself.
Define orchestrator parameters that control the distributed computations:
mode: Parallelization mode “local_dask”, “pbs_dask” or “mp”
nb_workers: Number of workers
walltime: dependent on the mode.
The output section depends on the pipeline. For the main pipeline, for example:
"output": {
"out_dir": "myoutputfolder",
"dsm_basename": "mydsm.tif"
}
First, SIFT sparse matching step for each pair:
use an adapted epipolar geometry: the null disparity is based on a reference DTM (typically SRTM)
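In epipolar geometry each sparse match yields one disparity, and a robust interval for the later dense matching can be derived from those values. A sketch of the idea, using percentiles plus a safety margin (the exact statistics and margin used by CARS are not shown here; these are illustrative assumptions):

```python
# Hedged sketch: derive [dmin, dmax] for dense matching from the
# disparities of sparse SIFT matches, trimming outliers by percentile.
def disparity_range(disparities, low_pct=2.0, high_pct=98.0, margin=5.0):
    s = sorted(disparities)
    def pct(p):  # linear-interpolated percentile
        i = (len(s) - 1) * p / 100.0
        lo, hi = int(i), min(int(i) + 1, len(s) - 1)
        return s[lo] + (s[hi] - s[lo]) * (i - lo)
    return pct(low_pct) - margin, pct(high_pct) + margin
```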
Web site: https://github.com/CNES/pandora
[1] A Taxonomy and Evaluation of Dense Two-Frame Stereo Correspondence Algorithms, D. Scharstein and R. Szeliski, vol. 47, International Journal of Computer Vision, 2002
Download CARS Quick Start
mkdir /tmp/cars-tuto/
cd /tmp/cars-tuto/
wget https://raw.githubusercontent.com/CNES/cars/master/tutorials/quick_start_advanced.sh
Warning: Internet needed to download demo data.
Run the downloaded script
./quick_start_advanced.sh
==== Demo CARS installed (advanced) =====
- Cars must be installed:
# cars -v
cars 0.7.0
- Get and extract data samples from CARS repository [...]
- Launch CARS with sensor_to_full_resolution_dsm pipeline for img1+img2 and img1+img3 pairs:
# cars configfile.json
23-06-23 21:35:56 :: PROGRESS :: Check configuration...
23-06-23 21:35:57 :: PROGRESS :: CARS pipeline is started.
23-06-23 21:35:59 :: PROGRESS :: Data list to process: [ epi_matches_left ] ...
Tiles processing: 100%|████████████████████████████████████████████████████████████| 16/16 [00:13<00:00, 1.21it/s]
23-06-23 21:36:16 :: PROGRESS :: Data list to process: [ epi_matches_left ] ...
Tiles processing: 100%|████████████████████████████████████████████████████████████| 16/16 [00:11<00:00, 1.41it/s]
23-06-23 21:36:30 :: PROGRESS :: Data list to process: [ dsm , color ] ...
Tiles processing: 100%|████████████████████████████████████████████████████████████| 4/4 [01:13<00:00, 18.35s/it]
23-06-23 21:37:43 :: PROGRESS :: CARS has successfully completed the pipeline.
- Show resulting DSM:
# ls -al outresults/
total 37556
-rw-rw-r-- 1 youssefd youssefd 314 juin 23 21:37 23-06-23_21h35m_sensor_to_dense_dsm.log
-rw-rw-r-- 1 youssefd youssefd 25166744 juin 23 21:37 clr.tif
-rw-rw-r-- 1 youssefd youssefd 7268 juin 23 21:36 content.json
-rw-rw-r-- 1 youssefd youssefd 9643 juin 23 21:35 dask_config_unknown.yaml
-rw-rw-r-- 1 youssefd youssefd 16778119 juin 23 21:37 dsm.tif
drwxrwxr-x 2 youssefd youssefd 4096 juin 23 21:37 one_three
drwxrwxr-x 2 youssefd youssefd 4096 juin 23 21:37 one_two
-rw-rw-r-- 1 youssefd youssefd 8151 juin 23 21:36 used_conf.json
drwxrwxr-x 2 youssefd youssefd 4096 juin 23 21:35 workers_log
dsm.tif | clr.tif | clr and dsm colored composition
cat data_gizeh/configfile.json
{
"inputs": {
"sensors" : {
"one": {
"image": "img1.tif",
"geomodel": "img1.geom",
"color": "color1.tif",
"no_data": 0
},
"two": {
"image": "img2.tif",
"geomodel": "img2.geom",
"no_data": 0
},
"three": {
"image": "img3.tif",
"geomodel": "img3.geom",
"no_data": 0
}
},
"pairing": [["one", "two"],["one", "three"]],
"initial_elevation": "srtm_dir/N29E031_KHEOPS.tif"
},
"output": {
"out_dir": "outresults"
}
}
Add orchestration configuration in input json file:
"orchestrator": {
"mode": "local_dask",
"nb_workers": 4
},
Run CARS again to see 4 workers: cars --loglevel INFO configfile.json
Add orchestration configuration in input json file:
"orchestrator": {
"mode": "sequential"
},
Run CARS again: cars --loglevel INFO configfile.json
Add an application configuration to the input json file and define parameters for the dense matching application
"applications": {
"dense_matching":{
"method": "census_sgm",
"loader": "pandora",
"save_disparity_map": true
}
},
Run CARS again: cars --loglevel INFO configfile.json
# ls -l data_gizeh/outresults/
total 44580
-rw-r--r-- 1 carcars carcars 0 août 6 00:42 22-08-05_22h42m_sensor_to_full_res_dsm.log
-rw-r--r-- 1 carcars carcars 33555362 août 6 00:46 clr.tif
-rw-r--r-- 1 carcars carcars 9120 août 6 00:43 content.json
-rw-r--r-- 1 carcars carcars 7864 août 6 00:42 dask_config_unknown.yaml
-rw-r--r-- 1 carcars carcars 16778119 août 6 00:46 dsm.tif
drwxr-xr-x 2 carcars carcars 4096 août 6 00:46 one_three
drwxr-xr-x 2 carcars carcars 4096 août 6 00:46 one_two
# ls -l data_gizeh/outresults/one_two
-rw-r--r-- 1 carcars carcars 9120 août 6 00:43 epi_disp_color_left.tif
-rw-r--r-- 1 carcars carcars 7864 août 6 00:42 epi_disp_left.tif
-rw-r--r-- 1 carcars carcars 16778119 août 6 00:46 epi_disp_mask_left.tif
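The "census_sgm" method configured above is handled for CARS by the Pandora library. For intuition only, a toy version of the census transform it is named after: each pixel's window becomes a bit string of neighbour-vs-centre comparisons, and candidate disparities are scored by Hamming distance.

```python
# Toy census transform and matching cost (illustrative, not Pandora code).
def census(img, r, c, half=1):
    """Bit string comparing each neighbour of (r, c) with the centre pixel."""
    centre, bits = img[r][c], 0
    for dr in range(-half, half + 1):
        for dc in range(-half, half + 1):
            if (dr, dc) == (0, 0):
                continue
            bits = (bits << 1) | (img[r + dr][c + dc] < centre)
    return bits

def cost(left, right, r, c, disparity):
    """Matching cost = Hamming distance between the two census signatures."""
    return bin(census(left, r, c) ^ census(right, r, c - disparity)).count("1")
```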
Add an application configuration to the input json file and define parameters for the rasterization application
"applications": {
"point_cloud_rasterization": {
"method": "simple_gaussian",
"dsm_radius": 3,
"sigma": 0.3
}
},
Run CARS again: cars --loglevel INFO configfile.json
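A "simple_gaussian" style rasterization can be pictured as follows: each DSM cell averages the altitudes of the points that fall within `dsm_radius`, weighted by a Gaussian of their distance to the cell centre with width `sigma`. This toy sketch is for intuition only, not CARS's implementation:

```python
import math

def rasterize_cell(points, cx, cy, dsm_radius=3, sigma=0.3):
    """Gaussian-weighted average altitude of the points near cell (cx, cy).

    points: iterable of (x, y, altitude) in grid units.
    """
    num = den = 0.0
    for x, y, alt in points:
        d = math.hypot(x - cx, y - cy)
        if d <= dsm_radius:
            w = math.exp(-d * d / (2 * sigma * sigma))
            num += w * alt
            den += w
    return num / den if den else float("nan")  # NaN when the cell is empty
```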
Follow this link to access the slides: sensor_to_dense_dsm_step_by_step.slides.html