Merge branch 'OpenDroneMap:master' into master

pull/1337/head
Saijin-Naib 2021-08-31 17:21:00 -04:00 committed by GitHub
commit 4ae2b7f335
No known key found for this signature in database
GPG Key ID: 4AEE18F83AFDEB23
65 changed files with 1549 additions and 435 deletions

View file

@ -13,6 +13,10 @@ jobs:
steps:
- name: Checkout
uses: actions/checkout@v2
- name: Set Swap Space
uses: pierotofy/set-swap-space@master
with:
swap-size-gb: 12
- name: Set up QEMU
uses: docker/setup-qemu-action@v1
- name: Set up Docker Buildx
@ -81,4 +85,8 @@ jobs:
run: |
echo "Docker image digest: ${{ steps.docker_build.outputs.digest }}"
echo "WSL AMD64 rootfs URL: ${{ steps.upload-amd64-wsl-rootfs.browser_download_url }}"
# Trigger NodeODM build
- name: Dispatch NodeODM Build Event
id: nodeodm_dispatch
run: |
curl -X POST -u "${{secrets.PAT_USERNAME}}:${{secrets.PAT_TOKEN}}" -H "Accept: application/vnd.github.everest-preview+json" -H "Content-Type: application/json" https://api.github.com/repos/OpenDroneMap/NodeODM/actions/workflows/publish-docker.yaml/dispatches --data '{"ref": "master"}'
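
For readability, here is a minimal Python sketch of the same `workflow_dispatch` call the curl step performs; `GH_USER` and `GH_TOKEN` are hypothetical environment variable names standing in for the `PAT_USERNAME`/`PAT_TOKEN` secrets:

```python
# Sketch of the NodeODM dispatch request, assuming a PAT with workflow scope
# is exposed via the hypothetical GH_USER/GH_TOKEN environment variables.
import base64, json, os, urllib.request

def dispatch_nodeodm_build(ref="master"):
    url = ("https://api.github.com/repos/OpenDroneMap/NodeODM"
           "/actions/workflows/publish-docker.yaml/dispatches")
    creds = "%s:%s" % (os.environ["GH_USER"], os.environ["GH_TOKEN"])
    req = urllib.request.Request(
        url,
        data=json.dumps({"ref": ref}).encode(),
        headers={
            "Authorization": "Basic " + base64.b64encode(creds.encode()).decode(),
            "Accept": "application/vnd.github.everest-preview+json",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status  # GitHub returns 204 No Content on success
```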

View file

@ -13,6 +13,10 @@ jobs:
steps:
- name: Checkout
uses: actions/checkout@v2
- name: Set Swap Space
uses: pierotofy/set-swap-space@master
with:
swap-size-gb: 12
- name: Set up QEMU
uses: docker/setup-qemu-action@v1
- name: Set up Docker Buildx
@ -30,3 +34,8 @@ jobs:
platforms: linux/amd64
push: true
tags: opendronemap/odm:gpu
# Trigger NodeODM build
- name: Dispatch NodeODM Build Event
id: nodeodm_dispatch
run: |
curl -X POST -u "${{secrets.PAT_USERNAME}}:${{secrets.PAT_TOKEN}}" -H "Accept: application/vnd.github.everest-preview+json" -H "Content-Type: application/json" https://api.github.com/repos/OpenDroneMap/NodeODM/actions/workflows/publish-docker-gpu.yaml/dispatches --data '{"ref": "master"}'

View file

@ -17,11 +17,14 @@ jobs:
steps:
- name: Checkout
uses: actions/checkout@v2
- name: Set Swap Space
uses: pierotofy/set-swap-space@master
with:
swap-size-gb: 12
- name: Build
id: build
uses: diddlesnaps/snapcraft-multiarch-action@v1
with:
snapcraft-args: --enable-experimental-package-repositories
architecture: ${{ matrix.architecture }}
- name: Review
uses: diddlesnaps/snapcraft-review-tools-action@v1

View file

@ -0,0 +1,56 @@
name: Publish Windows Setup
on:
push:
branches:
- master
tags:
- v*
jobs:
build:
runs-on: windows-latest
steps:
- name: Checkout
uses: actions/checkout@v2
- name: Setup Python
uses: actions/setup-python@v2
with:
python-version: '3.8.1'
architecture: 'x64'
- name: Setup Visual C++
uses: ilammy/msvc-dev-cmd@v1
with:
arch: x64
- name: Extract code signing cert
id: code_sign
uses: timheuer/base64-to-file@v1
with:
fileName: 'comodo.pfx'
encodedString: ${{ secrets.CODE_SIGNING_CERT }}
- name: Install venv
run: |
python -m pip install virtualenv
- name: Build sources
run: |
python configure.py build
- name: Create setup
env:
CODE_SIGN_CERT_PATH: ${{ steps.code_sign.outputs.filePath }}
run: |
python configure.py dist --signtool-path $((Get-Command signtool).Source) --code-sign-cert-path $env:CODE_SIGN_CERT_PATH
- name: Upload Setup File
uses: actions/upload-artifact@v2
with:
name: Setup
path: dist\*.exe
- name: Upload Setup to Release
uses: svenstaro/upload-release-action@v2
if: startsWith(github.ref, 'refs/tags/')
with:
repo_token: ${{ secrets.GITHUB_TOKEN }}
file: dist\*.exe
file_glob: true
tag: ${{ github.ref }}
overwrite: true

View file

@ -9,6 +9,10 @@ jobs:
steps:
- name: Checkout
uses: actions/checkout@v2
- name: Set Swap Space
uses: pierotofy/set-swap-space@master
with:
swap-size-gb: 12
- name: Set up QEMU
uses: docker/setup-qemu-action@v1
- name: Set up Docker Buildx
@ -29,14 +33,38 @@ jobs:
steps:
- name: Checkout
uses: actions/checkout@v2
- name: Set Swap Space
uses: pierotofy/set-swap-space@master
with:
swap-size-gb: 12
- name: Build
id: build
uses: diddlesnaps/snapcraft-multiarch-action@v1
with:
snapcraft-args: --enable-experimental-package-repositories
architecture: ${{ matrix.architecture }}
- name: Review
uses: diddlesnaps/snapcraft-review-tools-action@v1
with:
snap: ${{ steps.build.outputs.snap }}
isClassic: 'false'
windows:
runs-on: windows-latest
steps:
- name: Checkout
uses: actions/checkout@v2
- name: Setup Python
uses: actions/setup-python@v2
with:
python-version: '3.8.1'
architecture: 'x64'
- name: Setup Visual C++
uses: ilammy/msvc-dev-cmd@v1
with:
arch: x64
- name: Install venv
run: |
python -m pip install virtualenv
- name: Build sources
run: |
python configure.py build

View file

@ -15,7 +15,7 @@ If you would rather not type commands in a shell and are looking for a friendly
## Quickstart
The easiest way to run ODM is via docker. To install docker, see [docs.docker.com](https://docs.docker.com). Once you have docker installed and [working](https://docs.docker.com/get-started/#test-docker-installation), you can run ODM by placing some images (JPEGs or TIFFs) in a folder named “images” (for example `C:\Users\youruser\datasets\project\images` or `/home/youruser/datasets/project/images`) and simply run from a Command Prompt / Terminal:
The easiest way to run ODM is via docker. To install docker, see [docs.docker.com](https://docs.docker.com). Once you have docker installed and [working](https://docs.docker.com/get-started/#test-docker-installation), you can run ODM by placing some images (JPEGs or TIFFs) in a folder named “images” (for example `C:\Users\youruser\datasets\project\images` or `/home/youruser/datasets/project/images`) and simply run from a Command Prompt / Terminal:
```bash
# Windows
@ -73,7 +73,15 @@ See http://docs.opendronemap.org for tutorials and more guides.
## Forum
We have a vibrant [community forum](https://community.opendronemap.org/). You can [search it](https://community.opendronemap.org/search?expanded=true) for issues you might be having with ODM and you can post questions there. We encourage users of ODM to partecipate in the forum and to engage with fellow drone mapping users.
We have a vibrant [community forum](https://community.opendronemap.org/). You can [search it](https://community.opendronemap.org/search?expanded=true) for issues you might be having with ODM and you can post questions there. We encourage users of ODM to participate in the forum and to engage with fellow drone mapping users.
## Windows Setup
ODM can be installed natively on Windows. Just download the latest setup from the [releases](https://github.com/OpenDroneMap/ODM/releases) page. After opening the ODM Console you can process datasets by typing:
```bash
run C:\Users\youruser\datasets\project [--additional --parameters --here]
```
## Snap Package
@ -256,7 +264,13 @@ If you have questions, join the developer's chat at https://community.opendronem
2. Submit a pull request with detailed changes and test results
3. Have fun!
### Credits
### Troubleshooting
The dev environment makes use of `opendronemap/nodeodm` by default. You may want to run
`docker pull opendronemap/nodeodm` before running `./start-dev-env.sh` to avoid using an old cached version.
In order to make a clean build, remove `~/.odm-dev-home` and `ODM/.setupdevenv`.
## Credits
ODM makes use of [several libraries](https://github.com/OpenDroneMap/ODM/blob/master/snap/snapcraft.yaml#L36) and other awesome open source projects to perform its tasks. Among them we'd like to highlight:
@ -270,6 +284,6 @@ ODM makes use of [several libraries](https://github.com/OpenDroneMap/ODM/blob/ma
- [PoissonRecon](https://github.com/mkazhdan/PoissonRecon)
### Citation
## Citation
> *OpenDroneMap Authors* ODM - A command line toolkit to generate maps, point clouds, 3D models and DEMs from drone, balloon or kite images. **OpenDroneMap/ODM GitHub Page** 2020; [https://github.com/OpenDroneMap/ODM](https://github.com/OpenDroneMap/ODM)

View file

@ -10,15 +10,6 @@ endif()
# Setup SuperBuild root location
set(SB_ROOT_DIR ${CMAKE_CURRENT_SOURCE_DIR})
# Path to additional CMake modules
set(CMAKE_MODULE_PATH ${SB_ROOT_DIR}/cmake)
include(ExternalProject)
include(ExternalProject-Setup)
option(ODM_BUILD_SLAM "Build SLAM module" OFF)
################################
# Setup SuperBuild directories #
################################
@ -35,6 +26,7 @@ message(STATUS "SuperBuild files will be downloaded to: ${SB_DOWNLOAD_DIR}")
set(SB_SOURCE_DIR "${SB_ROOT_DIR}/src"
CACHE PATH "Location where source tar-balls are (will be).")
mark_as_advanced(SB_SOURCE_DIR)
set(SB_BUILD_DIR "${SB_ROOT_DIR}/build")
message(STATUS "SuperBuild source files will be extracted to: ${SB_SOURCE_DIR}")
@ -54,6 +46,40 @@ mark_as_advanced(SB_BINARY_DIR)
message(STATUS "SuperBuild binary files will be located to: ${SB_BINARY_DIR}")
if (WIN32)
if (NOT DEFINED CMAKE_TOOLCHAIN_FILE)
message(FATAL_ERROR "CMAKE_TOOLCHAIN_FILE not set. You need to set it to the path of vcpkg.cmake")
endif()
get_filename_component(CMAKE_TOOLCHAIN_DIR ${CMAKE_TOOLCHAIN_FILE} DIRECTORY)
get_filename_component(VCPKG_ROOT "${CMAKE_TOOLCHAIN_DIR}/../../" ABSOLUTE)
set(WIN32_CMAKE_ARGS "-DCMAKE_TOOLCHAIN_FILE=${CMAKE_TOOLCHAIN_FILE}")
set(PYTHON_HOME "${SB_ROOT_DIR}/../venv")
set(PYTHON_EXE_PATH "${PYTHON_HOME}/Scripts/python")
# Use the GDAL version that comes with pip
set(GDAL_ROOT "${PYTHON_HOME}/Lib/site-packages/osgeo")
set(GDAL_LIBRARY "${GDAL_ROOT}/lib/gdal_i.lib")
set(GDAL_INCLUDE_DIR "${GDAL_ROOT}/include/gdal")
# Also download missing headers :/
if (NOT EXISTS "${GDAL_INCLUDE_DIR}/ogrsf_frmts.h")
file(DOWNLOAD "https://raw.githubusercontent.com/OSGeo/gdal/release/3.2/gdal/ogr/ogrsf_frmts/ogrsf_frmts.h" "${GDAL_INCLUDE_DIR}/ogrsf_frmts.h")
endif()
message("Copying VCPKG DLLs...")
file(GLOB COPY_DLLS "${VCPKG_ROOT}/installed/x64-windows/bin/*.dll")
file(COPY ${COPY_DLLS} DESTINATION "${SB_INSTALL_DIR}/bin")
set(WIN32_GDAL_ARGS -DGDAL_FOUND=TRUE -DGDAL_LIBRARY=${GDAL_LIBRARY} -DGDAL_INCLUDE_DIR=${GDAL_INCLUDE_DIR})
else()
set(PYTHON_EXE_PATH "/usr/bin/python3")
endif()
# Path to additional CMake modules
set(CMAKE_MODULE_PATH ${SB_ROOT_DIR}/cmake)
include(ExternalProject)
include(ExternalProject-Setup)
#########################################
# Download and install third party libs #
@ -108,11 +134,25 @@ set(custom_libs OpenSfM
LASzip
PDAL
Untwine
Entwine
MvsTexturing
OpenMVS
)
# Build entwine only on Linux
if (NOT WIN32)
set(custom_libs ${custom_libs} Entwine)
endif()
externalproject_add(mve
GIT_REPOSITORY https://github.com/OpenDroneMap/mve.git
GIT_TAG 250
UPDATE_COMMAND ""
SOURCE_DIR ${SB_SOURCE_DIR}/mve
CMAKE_ARGS ${WIN32_CMAKE_ARGS}
BUILD_IN_SOURCE 1
INSTALL_COMMAND ""
)
foreach(lib ${custom_libs})
SETUP_EXTERNAL_PROJECT_CUSTOM(${lib})
endforeach()
@ -120,15 +160,23 @@ endforeach()
include(ProcessorCount)
ProcessorCount(nproc)
if (WIN32)
set (POISSON_BUILD_CMD ${CMAKE_MAKE_PROGRAM} ${SB_SOURCE_DIR}/PoissonRecon/PoissonRecon.vcxproj /p:configuration=${CMAKE_BUILD_TYPE} /p:PlatformToolset=${CMAKE_VS_PLATFORM_TOOLSET} /p:WindowsTargetPlatformVersion=${CMAKE_VS_WINDOWS_TARGET_PLATFORM_VERSION})
set (POISSON_BIN_PATH "x64/${CMAKE_BUILD_TYPE}/PoissonRecon.exe")
else()
set (POISSON_BUILD_CMD make -j${nproc} poissonrecon)
set (POISSON_BIN_PATH "Linux/PoissonRecon")
endif()
externalproject_add(poissonrecon
GIT_REPOSITORY https://github.com/mkazhdan/PoissonRecon.git
GIT_TAG ce5005ae3094d902d551a65a8b3131e06f45e7cf
GIT_REPOSITORY https://github.com/OpenDroneMap/PoissonRecon.git
GIT_TAG 257
PREFIX ${SB_BINARY_DIR}/PoissonRecon
SOURCE_DIR ${SB_SOURCE_DIR}/PoissonRecon
UPDATE_COMMAND ""
CONFIGURE_COMMAND ""
BUILD_IN_SOURCE 1
BUILD_COMMAND make -j${nproc} poissonrecon
INSTALL_COMMAND ""
BUILD_COMMAND ${POISSON_BUILD_CMD}
INSTALL_COMMAND ${CMAKE_COMMAND} -E copy ${SB_SOURCE_DIR}/PoissonRecon/Bin/${POISSON_BIN_PATH} ${SB_INSTALL_DIR}/bin
)
externalproject_add(dem2mesh
@ -137,6 +185,7 @@ externalproject_add(dem2mesh
PREFIX ${SB_BINARY_DIR}/dem2mesh
SOURCE_DIR ${SB_SOURCE_DIR}/dem2mesh
CMAKE_ARGS -DCMAKE_INSTALL_PREFIX:PATH=${SB_INSTALL_DIR}
${WIN32_GDAL_ARGS}
)
externalproject_add(dem2points
@ -145,6 +194,7 @@ externalproject_add(dem2points
PREFIX ${SB_BINARY_DIR}/dem2points
SOURCE_DIR ${SB_SOURCE_DIR}/dem2points
CMAKE_ARGS -DCMAKE_INSTALL_PREFIX:PATH=${SB_INSTALL_DIR}
${WIN32_GDAL_ARGS}
)
externalproject_add(odm_orthophoto
@ -154,17 +204,13 @@ externalproject_add(odm_orthophoto
PREFIX ${SB_BINARY_DIR}/odm_orthophoto
SOURCE_DIR ${SB_SOURCE_DIR}/odm_orthophoto
CMAKE_ARGS -DCMAKE_INSTALL_PREFIX:PATH=${SB_INSTALL_DIR}
${WIN32_CMAKE_ARGS} ${WIN32_GDAL_ARGS}
)
externalproject_add(lastools
GIT_REPOSITORY https://github.com/LAStools/LAStools.git
GIT_TAG 2ef44281645999ec7217facec84a5913bbbbe165
GIT_REPOSITORY https://github.com/OpenDroneMap/LAStools.git
GIT_TAG 250
PREFIX ${SB_BINARY_DIR}/lastools
SOURCE_DIR ${SB_SOURCE_DIR}/lastools
CONFIGURE_COMMAND ""
CMAKE_COMMAND ""
CMAKE_GENERATOR ""
UPDATE_COMMAND ""
BUILD_IN_SOURCE 1
BUILD_COMMAND make -C LASlib -j${nproc} CXXFLAGS='-std=c++11' && make -C src -j${nproc} CXXFLAGS='-std=c++11' lasmerge
INSTALL_COMMAND install -m755 -D -t ${SB_INSTALL_DIR}/bin ${SB_SOURCE_DIR}/lastools/bin/lasmerge
CMAKE_ARGS -DCMAKE_INSTALL_PREFIX:PATH=${SB_INSTALL_DIR}
)

View file

@ -18,7 +18,10 @@ ExternalProject_Add(${_proj_name}
-DCMAKE_CXX_FLAGS=-fPIC
-DBUILD_EXAMPLES=OFF
-DBUILD_TESTING=OFF
-DMINIGLOG=ON
-DMINIGLOG_MAX_LOG_LEVEL=-100
-DCMAKE_INSTALL_PREFIX:PATH=${SB_INSTALL_DIR}
${WIN32_CMAKE_ARGS}
#--Build step-----------------
BINARY_DIR ${_SB_BINARY_DIR}
#--Install step---------------

View file

@ -1,6 +1,10 @@
set(_proj_name entwine)
set(_SB_BINARY_DIR "${SB_BINARY_DIR}/${_proj_name}")
if (NOT WIN32)
set(EXTRA_CMAKE_ARGS -DCMAKE_CXX_FLAGS=-isystem\ ${SB_SOURCE_DIR}/pdal)
endif()
ExternalProject_Add(${_proj_name}
DEPENDS pdal
PREFIX ${_SB_BINARY_DIR}
@ -9,13 +13,13 @@ ExternalProject_Add(${_proj_name}
#--Download step--------------
DOWNLOAD_DIR ${SB_DOWNLOAD_DIR}
GIT_REPOSITORY https://github.com/OpenDroneMap/entwine/
GIT_TAG 2411
GIT_TAG 250
#--Update/Patch step----------
UPDATE_COMMAND ""
#--Configure step-------------
SOURCE_DIR ${SB_SOURCE_DIR}/${_proj_name}
CMAKE_ARGS
-DCMAKE_CXX_FLAGS=-isystem\ ${SB_SOURCE_DIR}/pdal
${EXTRA_CMAKE_ARGS}
-DADDITIONAL_LINK_DIRECTORIES_PATHS=${SB_INSTALL_DIR}/lib
-DWITH_TESTS=OFF
-DWITH_ZSTD=OFF

View file

@ -15,6 +15,7 @@ ExternalProject_Add(${_proj_name}
SOURCE_DIR ${SB_SOURCE_DIR}/${_proj_name}
CMAKE_ARGS
-DCMAKE_INSTALL_PREFIX:PATH=${SB_INSTALL_DIR}
${WIN32_GDAL_ARGS}
#--Build step-----------------
BINARY_DIR ${_SB_BINARY_DIR}
#--Install step---------------

View file

@ -2,14 +2,14 @@ set(_proj_name mvstexturing)
set(_SB_BINARY_DIR "${SB_BINARY_DIR}/${_proj_name}")
ExternalProject_Add(${_proj_name}
DEPENDS
DEPENDS mve
PREFIX ${_SB_BINARY_DIR}
TMP_DIR ${_SB_BINARY_DIR}/tmp
STAMP_DIR ${_SB_BINARY_DIR}/stamp
#--Download step--------------
DOWNLOAD_DIR ${SB_DOWNLOAD_DIR}/${_proj_name}
GIT_REPOSITORY https://github.com/OpenDroneMap/mvs-texturing
GIT_TAG 221
GIT_TAG 250
#--Update/Patch step----------
UPDATE_COMMAND ""
#--Configure step-------------
@ -18,6 +18,7 @@ ExternalProject_Add(${_proj_name}
-DRESEARCH=OFF
-DCMAKE_BUILD_TYPE:STRING=Release
-DCMAKE_INSTALL_PREFIX:PATH=${SB_INSTALL_DIR}
${WIN32_CMAKE_ARGS}
#--Build step-----------------
BINARY_DIR ${_SB_BINARY_DIR}
#--Install step---------------

View file

@ -1,78 +0,0 @@
set(_proj_name orb_slam2)
set(_SB_BINARY_DIR "${SB_BINARY_DIR}/${_proj_name}")
ExternalProject_Add(${_proj_name}
DEPENDS opencv pangolin
PREFIX ${_SB_BINARY_DIR}
TMP_DIR ${_SB_BINARY_DIR}/tmp
STAMP_DIR ${_SB_BINARY_DIR}/stamp
#--Download step--------------
DOWNLOAD_DIR ${SB_DOWNLOAD_DIR}
URL https://github.com/paulinus/ORB_SLAM2/archive/7c11f186a53a75560cd17352d327b0bc127a82de.zip
#--Update/Patch step----------
UPDATE_COMMAND ""
#--Configure step-------------
SOURCE_DIR ${SB_SOURCE_DIR}/${_proj_name}
CMAKE_ARGS
-DCMAKE_INSTALL_PREFIX:PATH=${SB_INSTALL_DIR}
#--Build step-----------------
BINARY_DIR ${_SB_BINARY_DIR}
#--Install step---------------
INSTALL_COMMAND ""
#--Output logging-------------
LOG_DOWNLOAD OFF
LOG_CONFIGURE OFF
LOG_BUILD OFF
)
# DBoW2
set(DBoW2_BINARY_DIR "${SB_BINARY_DIR}/DBoW2")
file(MAKE_DIRECTORY "${DBoW2_BINARY_DIR}")
ExternalProject_Add_Step(${_proj_name} build_DBoW2
COMMAND make -j2
DEPENDEES configure_DBoW2
DEPENDERS configure
WORKING_DIRECTORY ${DBoW2_BINARY_DIR}
ALWAYS 1
)
ExternalProject_Add_Step(${_proj_name} configure_DBoW2
COMMAND ${CMAKE_COMMAND} <SOURCE_DIR>/Thirdparty/DBoW2
-DOpenCV_DIR=${SB_INSTALL_DIR}/share/OpenCV
-DCMAKE_BUILD_TYPE=Release
DEPENDEES download
DEPENDERS build_DBoW2
WORKING_DIRECTORY ${DBoW2_BINARY_DIR}
ALWAYS 1
)
# g2o
set(g2o_BINARY_DIR "${SB_BINARY_DIR}/g2o")
file(MAKE_DIRECTORY "${g2o_BINARY_DIR}")
ExternalProject_Add_Step(${_proj_name} build_g2o
COMMAND make -j2
DEPENDEES configure_g2o
DEPENDERS configure
WORKING_DIRECTORY ${g2o_BINARY_DIR}
ALWAYS 1
)
ExternalProject_Add_Step(${_proj_name} configure_g2o
COMMAND ${CMAKE_COMMAND} <SOURCE_DIR>/Thirdparty/g2o
-DCMAKE_BUILD_TYPE=Release
DEPENDEES download
DEPENDERS build_g2o
WORKING_DIRECTORY ${g2o_BINARY_DIR}
ALWAYS 1
)
# Uncompress Vocabulary
ExternalProject_Add_Step(${_proj_name} uncompress_vocabulary
COMMAND tar -xf ORBvoc.txt.tar.gz
DEPENDEES download
DEPENDERS configure
WORKING_DIRECTORY <SOURCE_DIR>/Vocabulary
ALWAYS 1
)

View file

@ -1,6 +1,15 @@
set(_proj_name opencv)
set(_SB_BINARY_DIR "${SB_BINARY_DIR}/${_proj_name}")
if (WIN32)
set(WIN32_CMAKE_EXTRA_ARGS -DPYTHON3_NUMPY_INCLUDE_DIRS=${PYTHON_HOME}/lib/site-packages/numpy/core/include
-DPYTHON3_PACKAGES_PATH=${PYTHON_HOME}/lib/site-packages
-DPYTHON3_EXECUTABLE=${PYTHON_EXE_PATH}
-DWITH_MSMF=OFF
-DOPENCV_LIB_INSTALL_PATH=${SB_INSTALL_DIR}/lib
-DOPENCV_BIN_INSTALL_PATH=${SB_INSTALL_DIR}/bin)
endif()
ExternalProject_Add(${_proj_name}
PREFIX ${_SB_BINARY_DIR}
TMP_DIR ${_SB_BINARY_DIR}/tmp
@ -25,10 +34,10 @@ ExternalProject_Add(${_proj_name}
-DBUILD_opencv_objdetect=ON
-DBUILD_opencv_photo=ON
-DBUILD_opencv_legacy=ON
-DBUILD_opencv_python=ON
-DWITH_FFMPEG=${ODM_BUILD_SLAM}
-DBUILD_opencv_python3=ON
-DWITH_FFMPEG=OFF
-DWITH_CUDA=OFF
-DWITH_GTK=${ODM_BUILD_SLAM}
-DWITH_GTK=OFF
-DWITH_VTK=OFF
-DWITH_EIGEN=OFF
-DWITH_OPENNI=OFF
@ -50,6 +59,8 @@ ExternalProject_Add(${_proj_name}
-DOPENCV_ALLOCATOR_STATS_COUNTER_TYPE=int64_t
-DCMAKE_BUILD_TYPE:STRING=Release
-DCMAKE_INSTALL_PREFIX:PATH=${SB_INSTALL_DIR}
${WIN32_CMAKE_ARGS}
${WIN32_CMAKE_EXTRA_ARGS}
#--Build step-----------------
BINARY_DIR ${_SB_BINARY_DIR}
#--Install step---------------

View file

@ -20,7 +20,7 @@ ExternalProject_Add(${_proj_name}
#--Download step--------------
DOWNLOAD_DIR ${SB_DOWNLOAD_DIR}
GIT_REPOSITORY https://github.com/OpenDroneMap/openMVS
GIT_TAG 2412
GIT_TAG 256
#--Update/Patch step----------
UPDATE_COMMAND ""
#--Configure step-------------
@ -30,6 +30,7 @@ ExternalProject_Add(${_proj_name}
-DVCG_ROOT=${SB_SOURCE_DIR}/vcg
-DCMAKE_BUILD_TYPE=Release
-DCMAKE_INSTALL_PREFIX=${SB_INSTALL_DIR}
${WIN32_CMAKE_ARGS}
#--Build step-----------------
BINARY_DIR ${_SB_BINARY_DIR}
#--Install step---------------

View file

@ -1,5 +1,15 @@
set(_proj_name opensfm)
set(_SB_BINARY_DIR "${SB_BINARY_DIR}/${_proj_name}")
include(ProcessorCount)
ProcessorCount(nproc)
if(WIN32)
set(OpenCV_DIR "${SB_INSTALL_DIR}/x64/vc16/lib")
set(BUILD_CMD ${CMAKE_COMMAND} --build "${SB_BUILD_DIR}/opensfm" --config "${CMAKE_BUILD_TYPE}")
else()
set(OpenCV_DIR "${SB_INSTALL_DIR}/lib/cmake/opencv4")
set(BUILD_CMD make "-j${nproc}")
endif()
ExternalProject_Add(${_proj_name}
DEPENDS ceres opencv gflags
@ -9,17 +19,19 @@ ExternalProject_Add(${_proj_name}
#--Download step--------------
DOWNLOAD_DIR ${SB_DOWNLOAD_DIR}
GIT_REPOSITORY https://github.com/OpenDroneMap/OpenSfM/
GIT_TAG 2410
GIT_TAG 261
#--Update/Patch step----------
UPDATE_COMMAND git submodule update --init --recursive
#--Configure step-------------
SOURCE_DIR ${SB_SOURCE_DIR}/${_proj_name}
CONFIGURE_COMMAND cmake <SOURCE_DIR>/${_proj_name}/src
SOURCE_DIR ${SB_INSTALL_DIR}/bin/${_proj_name}
CONFIGURE_COMMAND ${CMAKE_COMMAND} <SOURCE_DIR>/${_proj_name}/src
-DCERES_ROOT_DIR=${SB_INSTALL_DIR}
-DOpenCV_DIR=${SB_INSTALL_DIR}/lib/cmake/opencv4
-DOpenCV_DIR=${OpenCV_DIR}
-DADDITIONAL_INCLUDE_DIRS=${SB_INSTALL_DIR}/include
-DOPENSFM_BUILD_TESTS=off
-DPYTHON_EXECUTABLE=/usr/bin/python3
-DPYTHON_EXECUTABLE=${PYTHON_EXE_PATH}
${WIN32_CMAKE_ARGS}
BUILD_COMMAND ${BUILD_CMD}
#--Build step-----------------
BINARY_DIR ${_SB_BINARY_DIR}
#--Install step---------------

View file

@ -42,6 +42,8 @@ ExternalProject_Add(${_proj_name}
-DCMAKE_BUILD_TYPE=Release
-DPCL_VERBOSITY_LEVEL=Error
-DCMAKE_INSTALL_PREFIX:PATH=${SB_INSTALL_DIR}
-DPCL_BUILD_WITH_FLANN_DYNAMIC_LINKING_WIN32=ON
${WIN32_CMAKE_ARGS}
#--Build step-----------------
BINARY_DIR ${_SB_BINARY_DIR}
#--Install step---------------

View file

@ -1,6 +1,12 @@
set(_proj_name pdal)
set(_SB_BINARY_DIR "${SB_BINARY_DIR}/${_proj_name}")
if (WIN32)
set(LASZIP_LIB "${SB_INSTALL_DIR}/lib/laszip.lib")
else()
set(LASZIP_LIB "${SB_INSTALL_DIR}/lib/liblaszip.so")
endif()
ExternalProject_Add(${_proj_name}
DEPENDS hexer laszip
PREFIX ${_SB_BINARY_DIR}
@ -8,7 +14,7 @@ ExternalProject_Add(${_proj_name}
STAMP_DIR ${_SB_BINARY_DIR}/stamp
#--Download step--------------
DOWNLOAD_DIR ${SB_DOWNLOAD_DIR}
URL https://github.com/PDAL/PDAL/archive/2.2.0.zip
URL https://github.com/PDAL/PDAL/archive/refs/tags/2.3RC1.zip
#--Update/Patch step----------
UPDATE_COMMAND ""
#--Configure step-------------
@ -34,13 +40,15 @@ ExternalProject_Add(${_proj_name}
-DWITH_GEOTIFF=ON
-DWITH_LASZIP=ON
-DLASZIP_FOUND=TRUE
-DLASZIP_LIBRARIES=${SB_INSTALL_DIR}/lib/liblaszip.so
-DLASZIP_LIBRARIES=${LASZIP_LIB}
-DLASZIP_VERSION=3.1.1
-DLASZIP_INCLUDE_DIR=${SB_INSTALL_DIR}/include
-DLASZIP_LIBRARY=${SB_INSTALL_DIR}/lib/liblaszip.so
-DLASZIP_LIBRARY=${LASZIP_LIB}
-DWITH_TESTS=OFF
-DCMAKE_BUILD_TYPE=Release
-DCMAKE_INSTALL_PREFIX:PATH=${SB_INSTALL_DIR}
${WIN32_CMAKE_ARGS}
${WIN32_GDAL_ARGS}
#--Build step-----------------
BINARY_DIR ${_SB_BINARY_DIR}
#--Install step---------------

View file

@ -1,29 +0,0 @@
set(_proj_name pangolin)
set(_SB_BINARY_DIR "${SB_BINARY_DIR}/${_proj_name}")
ExternalProject_Add(${_proj_name}
PREFIX ${_SB_BINARY_DIR}
TMP_DIR ${_SB_BINARY_DIR}/tmp
STAMP_DIR ${_SB_BINARY_DIR}/stamp
#--Download step--------------
DOWNLOAD_DIR ${SB_DOWNLOAD_DIR}
URL https://github.com/paulinus/Pangolin/archive/b7c66570b336e012bf3124e2a7411d417a1d35f7.zip
URL_MD5 9b7938d1045d26b27a637b663e647aef
#--Update/Patch step----------
UPDATE_COMMAND ""
#--Configure step-------------
SOURCE_DIR ${SB_SOURCE_DIR}/${_proj_name}
CMAKE_ARGS
-DCPP11_NO_BOOST=1
-DCMAKE_INSTALL_PREFIX:PATH=${SB_INSTALL_DIR}
#--Build step-----------------
BINARY_DIR ${_SB_BINARY_DIR}
#--Install step---------------
INSTALL_DIR ${SB_INSTALL_DIR}
#--Output logging-------------
LOG_DOWNLOAD OFF
LOG_CONFIGURE OFF
LOG_BUILD OFF
)

View file

@ -8,8 +8,8 @@ ExternalProject_Add(${_proj_name}
STAMP_DIR ${_SB_BINARY_DIR}/stamp
#--Download step--------------
DOWNLOAD_DIR ${SB_DOWNLOAD_DIR}
GIT_REPOSITORY https://github.com/pierotofy/untwine/
GIT_TAG insttgt
GIT_REPOSITORY https://github.com/hobu/untwine/
GIT_TAG 20243113fc7e9a3056f4ec727cc1f69202669156
#--Update/Patch step----------
UPDATE_COMMAND ""
#--Configure step-------------

View file

@ -1 +1 @@
2.4.13
2.6.1

configure.py 100644 (+207 lines)
View file

@ -0,0 +1,207 @@
import sys, platform
if sys.platform != 'win32':
print("This script is for Windows only! Use configure.sh instead.")
exit(1)
if sys.version_info.major != 3 or sys.version_info.minor != 8:
print("You neeed to use Python 3.8.x (due to the requirements.txt). You are using %s instead." % platform.python_version())
exit(1)
import argparse
import subprocess
import os
import stat
import urllib.request
import shutil
import zipfile
from venv import EnvBuilder
parser = argparse.ArgumentParser(description='ODM Windows Configure Script')
parser.add_argument('action',
type=str,
choices=["build", "clean", "dist", "vcpkg_export"],
help='Action: %(choices)s')
parser.add_argument('--build-vcpkg',
type=bool,
help='Build VCPKG environment from scratch instead of downloading prebuilt one.')
parser.add_argument('--vcpkg-archive-url',
type=str,
default='https://github.com/OpenDroneMap/windows-deps/releases/download/2.5.0/vcpkg-export-250.zip',
required=False,
help='Path to VCPKG export archive')
parser.add_argument('--code-sign-cert-path',
type=str,
default='',
required=False,
help='Path to pfx code signing certificate')
parser.add_argument('--signtool-path',
type=str,
default='',
required=False,
help='Path to signtool.exe')
args = parser.parse_args()
def run(cmd, cwd=os.getcwd()):
env = os.environ.copy()
print(cmd)
p = subprocess.Popen(cmd, shell=True, env=env, cwd=cwd)
retcode = p.wait()
if retcode != 0:
raise Exception("Command returned %s" % retcode)
# https://izziswift.com/shutil-rmtree-fails-on-windows-with-access-is-denied/
def rmtree(top):
for root, dirs, files in os.walk(top, topdown=False):
for name in files:
filename = os.path.join(root, name)
os.chmod(filename, stat.S_IWUSR)
os.remove(filename)
for name in dirs:
os.rmdir(os.path.join(root, name))
os.rmdir(top)
def vcpkg_requirements():
with open("vcpkg-requirements.txt") as f:
pckgs = list(filter(lambda l: len(l) > 0, map(str.strip, f.read().split("\n"))))
return pckgs
def build():
# Create python virtual env
if not os.path.isdir("venv"):
print("Creating virtual env --> venv/")
ebuilder = EnvBuilder(with_pip=True)
ebuilder.create("venv")
run("venv\\Scripts\\pip install --ignore-installed -r requirements.txt")
# Download / build VCPKG environment
if not os.path.isdir("vcpkg"):
if args.build_vcpkg:
print("TODO")
# git clone vcpkg repo
# bootstrap
# install requirements
else:
if not os.path.exists("vcpkg-env.zip"):
print("Downloading %s" % args.vcpkg_archive_url)
with urllib.request.urlopen(args.vcpkg_archive_url) as response, open( "vcpkg-env.zip", 'wb') as out_file:
shutil.copyfileobj(response, out_file)
if not os.path.exists("vcpkg"):
print("Extracting vcpkg-env.zip --> vcpkg/")
with zipfile.ZipFile("vcpkg-env.zip") as z:
top_dir = z.namelist()[0]
z.extractall(".")
if os.path.exists(top_dir):
os.rename(top_dir, "vcpkg")
else:
print("Warning! Something looks wrong in the VCPKG archive... check the vcpkg/ directory.")
if not os.path.exists(os.path.join("SuperBuild", "build")) or not os.path.exists(os.path.join("SuperBuild", "install")):
print("Compiling SuperBuild")
build_dir = os.path.join("SuperBuild", "build")
if not os.path.isdir(build_dir):
os.mkdir(build_dir)
toolchain_file = os.path.join(os.getcwd(), "vcpkg", "scripts", "buildsystems", "vcpkg.cmake")
run("cmake .. -DCMAKE_TOOLCHAIN_FILE=\"%s\"" % toolchain_file, cwd=build_dir)
run("cmake --build . --config Release", cwd=build_dir)
def vcpkg_export():
if not os.path.exists("vcpkg"):
print("vcpkg directory does not exist. Did you build the environment?")
exit(1)
pkgs = vcpkg_requirements()
out = "vcpkg-export-%s" % odm_version().replace(".", "")
run("vcpkg\\vcpkg export %s --output=%s --zip" % (" ".join(pkgs), out))
def odm_version():
with open("VERSION") as f:
return f.read().split("\n")[0].strip()
def safe_remove(path):
if os.path.isdir(path):
rmtree(path)
elif os.path.isfile(path):
os.remove(path)
def clean():
safe_remove("vcpkg-download.zip")
safe_remove("vcpkg")
safe_remove("venv")
safe_remove(os.path.join("SuperBuild", "build"))
safe_remove(os.path.join("SuperBuild", "download"))
safe_remove(os.path.join("SuperBuild", "src"))
safe_remove(os.path.join("SuperBuild", "install"))
def dist():
if not os.path.exists("SuperBuild\\download"):
print("You need to run configure.py build before you can run dist")
exit(1)
# Download VC++ runtime
vcredist_path = os.path.join("SuperBuild", "download", "vc_redist.x64.zip")
if not os.path.isfile(vcredist_path):
vcredist_url = "https://github.com/OpenDroneMap/windows-deps/releases/download/2.5.0/VC_redist.x64.zip"
print("Downloading %s" % vcredist_url)
with urllib.request.urlopen(vcredist_url) as response, open(vcredist_path, 'wb') as out_file:
shutil.copyfileobj(response, out_file)
print("Extracting --> vc_redist.x64.exe")
with zipfile.ZipFile(vcredist_path) as z:
z.extractall(os.path.join("SuperBuild", "download"))
# Download portable python
if not os.path.isdir("python38"):
pythonzip_path = os.path.join("SuperBuild", "download", "python38.zip")
python_url = "https://github.com/OpenDroneMap/windows-deps/releases/download/2.5.0/python-3.8.1-embed-amd64-less-pth.zip"
if not os.path.exists(pythonzip_path):
print("Downloading %s" % python_url)
with urllib.request.urlopen(python_url) as response, open( pythonzip_path, 'wb') as out_file:
shutil.copyfileobj(response, out_file)
os.mkdir("python38")
print("Extracting --> python38/")
with zipfile.ZipFile(pythonzip_path) as z:
z.extractall("python38")
# Download innosetup
if not os.path.isdir("innosetup"):
innosetupzip_path = os.path.join("SuperBuild", "download", "innosetup.zip")
innosetup_url = "https://github.com/OpenDroneMap/windows-deps/releases/download/2.5.0/innosetup-portable-win32-6.0.5-3.zip"
if not os.path.exists(innosetupzip_path):
print("Downloading %s" % innosetup_url)
with urllib.request.urlopen(innosetup_url) as response, open(innosetupzip_path, 'wb') as out_file:
shutil.copyfileobj(response, out_file)
os.mkdir("innosetup")
print("Extracting --> innosetup/")
with zipfile.ZipFile(innosetupzip_path) as z:
z.extractall("innosetup")
# Run
cs_flags = ""
if args.code_sign_cert_path and args.signtool_path:
cs_flags = '"/Ssigntool=%s sign /f %s /t http://timestamp.sectigo.com $f"' % (args.signtool_path, args.code_sign_cert_path)
run("innosetup\\iscc /Qp " + cs_flags + " \"innosetup.iss\"")
print("Done! Setup created in dist/")
if args.action == 'build':
build()
elif args.action == 'vcpkg_export':
vcpkg_export()
elif args.action == 'dist':
dist()
elif args.action == 'clean':
clean()
else:
parser.print_help()
exit(1)
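
For reference, the Publish Windows Setup workflow above drives this script with `python configure.py build` followed by `python configure.py dist --signtool-path ... --code-sign-cert-path ...`; the remaining actions, `clean` and `vcpkg_export`, reset the working tree and re-export the vcpkg dependency archive respectively.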

Wyświetl plik

@ -97,8 +97,8 @@ installruntimedepsonly() {
installdepsfromsnapcraft runtime openmvs
}
install() {
installreqs() {
cd /code
## Set up library paths
@ -123,6 +123,10 @@ install() {
if [ ! -z "$GPU_INSTALL" ]; then
pip install --ignore-installed -r requirements.gpu.txt
fi
}
install() {
installreqs
if [ ! -z "$PORTABLE_INSTALL" ]; then
echo "Replacing g++ and gcc with our scripts for portability..."
@ -166,29 +170,9 @@ reinstall() {
clean() {
rm -rf \
${RUNPATH}/SuperBuild/build/opencv \
${RUNPATH}/SuperBuild/build \
${RUNPATH}/SuperBuild/download \
${RUNPATH}/SuperBuild/src/ceres \
${RUNPATH}/SuperBuild/src/untwine \
${RUNPATH}/SuperBuild/src/entwine \
${RUNPATH}/SuperBuild/src/gflags \
${RUNPATH}/SuperBuild/src/hexer \
${RUNPATH}/SuperBuild/src/lastools \
${RUNPATH}/SuperBuild/src/laszip \
${RUNPATH}/SuperBuild/src/mvstexturing \
${RUNPATH}/SuperBuild/src/opencv \
${RUNPATH}/SuperBuild/src/opengv \
${RUNPATH}/SuperBuild/src/pcl \
${RUNPATH}/SuperBuild/src/pdal \
${RUNPATH}/SuperBuild/src/dem2mesh \
${RUNPATH}/SuperBuild/build/dem2mesh \
${RUNPATH}/SuperBuild/src/dem2points \
${RUNPATH}/SuperBuild/build/dem2points \
${RUNPATH}/SuperBuild/src/openmvs \
${RUNPATH}/SuperBuild/build/openmvs \
${RUNPATH}/SuperBuild/src/odm_orthophoto \
${RUNPATH}/SuperBuild/build/odm_orthophoto \
${RUNPATH}/SuperBuild/src/vcg
${RUNPATH}/SuperBuild/src
# find in /code and delete static libraries and intermediate object files
find ${RUNPATH} -type f -name "*.a" -delete -or -type f -name "*.o" -delete
@ -196,7 +180,7 @@ clean() {
usage() {
echo "Usage:"
echo "bash configure.sh <install|update|uninstall|help> [nproc]"
echo "bash configure.sh <install|update|uninstall|installreqs|help> [nproc]"
echo "Subcommands:"
echo " install"
echo " Installs all dependencies and modules for running OpenDroneMap"
@ -206,6 +190,8 @@ usage() {
echo " Removes SuperBuild and build modules, then re-installs them. Note this does not update OpenDroneMap to the latest version. "
echo " uninstall"
echo " Removes SuperBuild and build modules. Does not uninstall dependencies"
echo " installreqs"
echo " Only installs the requirements (does not build SuperBuild)"
echo " clean"
echo " Cleans the SuperBuild directory by removing temporary files. "
echo " help"
@ -213,7 +199,7 @@ usage() {
echo "[nproc] is an optional argument that can set the number of processes for the make -j tag. By default it uses $(nproc)"
}
if [[ $1 =~ ^(install|installruntimedepsonly|reinstall|uninstall|clean)$ ]]; then
if [[ $1 =~ ^(install|installruntimedepsonly|reinstall|uninstall|installreqs|clean)$ ]]; then
RUNPATH="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
"$1"
else

console.bat 100644 (+8 lines)
View file

@ -0,0 +1,8 @@
@echo off
setlocal
call win32env.bat
start "ODM Console" cmd /k "echo ____________________________ && echo / ____ _____ __ __ \ && echo ^| / __ \ ^| __ \ ^| \/ ^| ^| && echo ^| ^| ^| ^| ^| ^| ^| ^| ^| ^| \ / ^| ^| && echo ^| ^| ^| ^| ^| ^| ^| ^| ^| ^| ^|\/^| ^| ^| && echo ^| ^| ^|__^| ^| ^| ^|__^| ^| ^| ^| ^| ^| ^| && echo ^| \____/ ^|_____/ ^|_^| ^|_^| ^| && echo \____________________________/ && @echo off && FOR /F %%i in (VERSION) do echo version: %%i && @echo on && echo. && run --help
endlocal

innosetup.iss 100644 (+151 lines)
View file

@ -0,0 +1,151 @@
; Script generated by the Inno Setup Script Wizard.
; SEE THE DOCUMENTATION FOR DETAILS ON CREATING INNO SETUP SCRIPT FILES!
#define MyAppName "ODM"
#define VerFile FileOpen("VERSION")
#define MyAppVersion FileRead(VerFile)
#expr FileClose(VerFile)
#undef VerFile
#define MyAppPublisher "OpenDroneMap"
#define MyAppURL "https://opendronemap.org"
[Setup]
; NOTE: The value of AppId uniquely identifies this application.
; Do not use the same AppId value in installers for other applications.
; (To generate a new GUID, click Tools | Generate GUID inside the IDE.)
AppId={{443998BA-9F8F-4A69-9A96-0D8FBC8C6393}
AppName={#MyAppName}
AppVersion={#MyAppVersion}
AppPublisher={#MyAppPublisher}
AppPublisherURL={#MyAppURL}
AppSupportURL={#MyAppURL}
AppUpdatesURL={#MyAppURL}
DefaultDirName=C:\ODM
DefaultGroupName={#MyAppName}
AllowNoIcons=yes
LicenseFile=LICENSE
OutputDir=dist
OutputBaseFilename=ODM_Setup_{#MyAppVersion}
Compression=lzma
SolidCompression=yes
ArchitecturesAllowed=x64
SignTool=signtool
PrivilegesRequired=lowest
UsePreviousAppDir=no
;SetupIconFile=setup.ico
[Languages]
Name: "english"; MessagesFile: "compiler:Default.isl"
[Files]
Source: "contrib\*"; DestDir: "{app}\contrib"; Flags: ignoreversion recursesubdirs createallsubdirs
Source: "licenses\*"; DestDir: "{app}\licenses"; Flags: ignoreversion recursesubdirs createallsubdirs
Source: "opendm\*"; DestDir: "{app}\opendm"; Excludes: "__pycache__"; Flags: ignoreversion recursesubdirs createallsubdirs
Source: "stages\*"; DestDir: "{app}\stages"; Excludes: "__pycache__"; Flags: ignoreversion recursesubdirs createallsubdirs
Source: "SuperBuild\install\bin\*"; DestDir: "{app}\SuperBuild\install\bin"; Excludes: "__pycache__"; Flags: ignoreversion recursesubdirs createallsubdirs
Source: "venv\*"; DestDir: "{app}\venv"; Excludes: "__pycache__,pyvenv.cfg"; Flags: ignoreversion recursesubdirs createallsubdirs
Source: "python38\*"; DestDir: "{app}\python38"; Excludes: "__pycache__"; Flags: ignoreversion recursesubdirs createallsubdirs
Source: "console.bat"; DestDir: "{app}"; Flags: ignoreversion
Source: "VERSION"; DestDir: "{app}"; Flags: ignoreversion
Source: "LICENSE"; DestDir: "{app}"; Flags: ignoreversion
Source: "run.bat"; DestDir: "{app}"; Flags: ignoreversion
Source: "run.py"; DestDir: "{app}"; Flags: ignoreversion
Source: "settings.yaml"; DestDir: "{app}"; Flags: ignoreversion
Source: "win32env.bat"; DestDir: "{app}"; Flags: ignoreversion
Source: "winrun.bat"; DestDir: "{app}"; Flags: ignoreversion
Source: "SuperBuild\download\vc_redist.x64.exe"; DestDir: {tmp}; Flags: dontcopy
[Icons]
Name: {group}\ODM Console; Filename: "{app}\console.bat"; WorkingDir: "{app}"
Name: "{userdesktop}\ODM Console"; Filename: "{app}\console.bat"; WorkingDir: "{app}"; Tasks: desktopicon
[Tasks]
Name: "desktopicon"; Description: "{cm:CreateDesktopIcon}"; GroupDescription: "{cm:AdditionalIcons}"; Flags: unchecked
[Run]
Filename: "{tmp}\vc_redist.x64.exe"; StatusMsg: "Installing Visual C++ Redistributable Packages for Visual Studio 2019"; Parameters: "/quiet"; Check: VC2019RedistNeedsInstall ; Flags: waituntilterminated
Filename: "{app}\console.bat"; Description: {cm:LaunchProgram,ODM Console}; Flags: nowait postinstall skipifsilent
[Code]
function VC2019RedistNeedsInstall: Boolean;
var
Version: String;
begin
if RegQueryStringValue(HKEY_LOCAL_MACHINE,
'SOFTWARE\Microsoft\VisualStudio\14.0\VC\Runtimes\x64', 'Version', Version) then
begin
// Is the installed version at least 14.14 ?
Log('VC Redist Version check : found ' + Version);
Result := (CompareStr(Version, 'v14.14.26429.03')<0);
end
else
begin
// Not even an old version installed
Result := True;
end;
if (Result) then
begin
ExtractTemporaryFile('vc_redist.x64.exe');
end;
end;
function GetUninstallString(): String;
var
sUnInstPath: String;
sUnInstallString: String;
begin
sUnInstPath := ExpandConstant('Software\Microsoft\Windows\CurrentVersion\Uninstall\{#emit SetupSetting("AppId")}_is1');
sUnInstallString := '';
if not RegQueryStringValue(HKLM, sUnInstPath, 'UninstallString', sUnInstallString) then
RegQueryStringValue(HKCU, sUnInstPath, 'UninstallString', sUnInstallString);
Result := sUnInstallString;
end;
function IsUpgrade(): Boolean;
begin
Result := (GetUninstallString() <> '');
end;
function UnInstallOldVersion(): Integer;
var
sUnInstallString: String;
iResultCode: Integer;
begin
{ Return Values: }
{ 1 - uninstall string is empty }
{ 2 - error executing the UnInstallString }
{ 3 - successfully executed the UnInstallString }
{ default return value }
Result := 0;
{ get the uninstall string of the old app }
sUnInstallString := GetUninstallString();
if sUnInstallString <> '' then begin
sUnInstallString := RemoveQuotes(sUnInstallString);
if Exec(sUnInstallString, '/SILENT /NORESTART /SUPPRESSMSGBOXES','', SW_HIDE, ewWaitUntilTerminated, iResultCode) then
Result := 3
else
Result := 2;
end else
Result := 1;
end;
procedure CurStepChanged(CurStep: TSetupStep);
begin
if (CurStep=ssInstall) then
begin
if (IsUpgrade()) then
begin
UnInstallOldVersion();
end;
end;
end;
[UninstallDelete]
Type: filesandordirs; Name: "{app}\SuperBuild"
Type: filesandordirs; Name: "{app}\contrib"
Type: filesandordirs; Name: "{app}\licenses"
Type: filesandordirs; Name: "{app}\opendm"
Type: filesandordirs; Name: "{app}\stages"
Type: filesandordirs; Name: "{app}\venv"

opendm/cogeo.py 100644 (+66 lines)
View file

@ -0,0 +1,66 @@
import os
import shutil
from opendm import system
from opendm.concurrency import get_max_memory
from opendm import io
from opendm import log
def convert_to_cogeo(src_path, blocksize=256, max_workers=1, compression="DEFLATE"):
"""
Guarantee that the .tif passed as an argument is a Cloud Optimized GeoTIFF (cogeo)
The file is destructively converted into a cogeo.
If the file cannot be converted, the function does not change the file
:param src_path: path to GeoTIFF
:return: True on success
"""
if not os.path.isfile(src_path):
log.ODM_WARNING("Cannot convert to cogeo: %s (file does not exist)" % src_path)
return False
log.ODM_INFO("Optimizing %s as Cloud Optimized GeoTIFF" % src_path)
tmpfile = io.related_file_path(src_path, postfix='_cogeo')
swapfile = io.related_file_path(src_path, postfix='_cogeo_swap')
kwargs = {
'threads': max_workers if max_workers else 'ALL_CPUS',
'blocksize': blocksize,
'max_memory': get_max_memory(),
'src_path': src_path,
'tmpfile': tmpfile,
'compress': compression,
'predictor': '2' if compression in ['LZW', 'DEFLATE'] else '1',
}
try:
system.run("gdal_translate "
"-of COG "
"-co NUM_THREADS={threads} "
"-co BLOCKSIZE={blocksize} "
"-co COMPRESS={compress} "
"-co PREDICTOR={predictor} "
"-co BIGTIFF=IF_SAFER "
"-co RESAMPLING=NEAREST "
"--config GDAL_CACHEMAX {max_memory}% "
"--config GDAL_NUM_THREADS {threads} "
"\"{src_path}\" \"{tmpfile}\" ".format(**kwargs))
except Exception as e:
log.ODM_WARNING("Cannot create Cloud Optimized GeoTIFF: %s" % str(e))
if os.path.isfile(tmpfile):
shutil.move(src_path, swapfile) # Move to swap location
try:
shutil.move(tmpfile, src_path)
except IOError as e:
log.ODM_WARNING("Cannot move %s to %s: %s" % (tmpfile, src_path, str(e)))
shutil.move(swapfile, src_path) # Attempt to restore
if os.path.isfile(swapfile):
os.remove(swapfile)
return True
else:
return False
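
A hypothetical usage sketch of `convert_to_cogeo()`; it assumes an ODM Python environment where the `opendm` package and `gdal_translate` are available, and an `orthophoto.tif` that exists on disk:

```python
# Illustrative only: optimizes an existing GeoTIFF in place, falling back
# to the untouched original if gdal_translate fails.
from opendm.cogeo import convert_to_cogeo

if convert_to_cogeo("orthophoto.tif", blocksize=256, max_workers=4):
    print("orthophoto.tif is now a Cloud Optimized GeoTIFF")
else:
    print("Conversion failed; the original file was left unchanged")
```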

View file

@ -11,19 +11,12 @@ import sys
# parse arguments
processopts = ['dataset', 'split', 'merge', 'opensfm', 'openmvs', 'odm_filterpoints',
'odm_meshing', 'mvs_texturing', 'odm_georeferencing',
'odm_dem', 'odm_orthophoto', 'odm_report']
'odm_dem', 'odm_orthophoto', 'odm_report', 'odm_postprocess']
with open(os.path.join(context.root_path, 'VERSION')) as version_file:
__version__ = version_file.read().strip()
def alphanumeric_string(string):
import re
if re.match('^[a-zA-Z0-9_-]+$', string) is None:
msg = '{0} is not a valid name. Must use alphanumeric characters.'.format(string)
raise argparse.ArgumentTypeError(msg)
return string
def path_or_json_string(string):
try:
return io.path_or_json_string_to_dict(string)
@ -68,9 +61,14 @@ def config(argv=None, parser=None):
if args is not None and argv is None:
return args
if sys.platform == 'win32':
usage_bin = 'run'
else:
usage_bin = 'run.sh'
if parser is None:
parser = SettingsParser(description='ODM',
usage='%(prog)s [options] <project name>',
parser = SettingsParser(description='ODM is a command line toolkit to generate maps, point clouds, 3D models and DEMs from drone, balloon or kite images.',
usage='%s [options] <dataset name>' % usage_bin,
yaml_file=open(context.settings_path))
parser.add_argument('--project-path',
@ -78,9 +76,9 @@ def config(argv=None, parser=None):
action=StoreValue,
help='Path to the project folder. Your project folder should contain subfolders for each dataset. Each dataset should have an "images" folder.')
parser.add_argument('name',
metavar='<project name>',
metavar='<dataset name>',
action=StoreValue,
type=alphanumeric_string,
type=str,
default='code',
nargs='?',
help='Name of dataset (i.e subfolder name within project folder). Default: %(default)s')
@ -97,7 +95,7 @@ def config(argv=None, parser=None):
parser.add_argument('--end-with', '-e',
metavar='<string>',
action=StoreValue,
default='odm_report',
default='odm_postprocess',
choices=processopts,
help='End processing at this stage. Can be one of: %(choices)s. Default: %(default)s')
@ -279,7 +277,7 @@ def config(argv=None, parser=None):
'Default: %(default)s'))
parser.add_argument('--mesh-octree-depth',
metavar='<positive integer>',
metavar='<integer: 1 <= x <= 14>',
action=StoreValue,
default=11,
type=int,
@ -364,6 +362,13 @@ def config(argv=None, parser=None):
help='Reduce the memory usage needed for depthmap fusion by splitting large scenes into tiles. Turn this on if your machine doesn\'t have much RAM and/or you\'ve set --pc-quality to high or ultra. Experimental. '
'Default: %(default)s')
parser.add_argument('--pc-geometric',
action=StoreTrue,
nargs=0,
default=False,
help='Improve the accuracy of the point cloud by computing geometrically consistent depthmaps. This increases processing time, but can improve results in urban scenes. '
'Default: %(default)s')
parser.add_argument('--smrf-scalar',
metavar='<positive float>',
action=StoreValue,
@ -583,12 +588,24 @@ def config(argv=None, parser=None):
default=False,
help='Build orthophoto overviews for faster display in programs such as QGIS. Default: %(default)s')
parser.add_argument('--cog',
action=StoreTrue,
nargs=0,
default=False,
help='Create Cloud-Optimized GeoTIFFs instead of normal GeoTIFFs. Default: %(default)s')
parser.add_argument('--verbose', '-v',
action=StoreTrue,
nargs=0,
default=False,
help='Print additional messages to the console. '
'Default: %(default)s')
parser.add_argument('--copy-to',
metavar='<path>',
action=StoreValue,
help='Copy output results to this folder after processing.')
parser.add_argument('--time',
action=StoreTrue,

View file

@ -13,19 +13,16 @@ superbuild_bin_path = os.path.join(superbuild_path, 'install', 'bin')
python_packages_paths = [os.path.join(superbuild_path, p) for p in [
'install/lib/python3.8/dist-packages',
'install/lib/python3/dist-packages',
'src/opensfm'
'install/bin/opensfm'
]]
for p in python_packages_paths:
sys.path.append(p)
# define opensfm path
opensfm_path = os.path.join(superbuild_path, "src/opensfm")
opensfm_path = os.path.join(superbuild_bin_path, "opensfm")
# define orb_slam2 path
orb_slam2_path = os.path.join(superbuild_path, "src/orb_slam2")
poisson_recon_path = os.path.join(superbuild_path, 'src', 'PoissonRecon', 'Bin', 'Linux', 'PoissonRecon')
poisson_recon_path = os.path.join(superbuild_bin_path, 'PoissonRecon')
dem2mesh_path = os.path.join(superbuild_bin_path, 'dem2mesh')
dem2points_path = os.path.join(superbuild_bin_path, 'dem2points')
@ -36,14 +33,6 @@ mvstex_path = os.path.join(superbuild_bin_path, "texrecon")
omvs_densify_path = os.path.join(superbuild_bin_path, "OpenMVS", "DensifyPointCloud")
omvs_reconstructmesh_path = os.path.join(superbuild_bin_path, "OpenMVS", "ReconstructMesh")
# define txt2las path
txt2las_path = os.path.join(superbuild_path, 'src/las-tools/bin')
pdal_path = os.path.join(superbuild_path, 'build/pdal/bin')
# define odm modules path
odm_modules_path = os.path.join(root_path, "build/bin")
odm_modules_src_path = os.path.join(root_path, "modules")
odm_orthophoto_path = os.path.join(superbuild_bin_path, "odm_orthophoto")
settings_path = os.path.join(root_path, 'settings.yaml')

View file

@ -37,7 +37,7 @@ class Cropper:
# ext = .tif
original_geotiff = os.path.join(path, "{}.original{}".format(basename, ext))
os.rename(geotiff_path, original_geotiff)
os.replace(geotiff_path, original_geotiff)
try:
kwargs = {
@ -64,7 +64,7 @@ class Cropper:
log.ODM_WARNING('Something went wrong while cropping: {}'.format(e))
# Revert rename
os.rename(original_geotiff, geotiff_path)
os.replace(original_geotiff, geotiff_path)
return geotiff_path
@ -148,7 +148,7 @@ class Cropper:
boundary_file_path = self.path('boundary.json')
run('pdal info --boundary --filters.hexbin.edge_size=1 --filters.hexbin.threshold=0 {0} > {1}'.format(decimated_pointcloud_path, boundary_file_path))
run('pdal info --boundary --filters.hexbin.edge_size=1 --filters.hexbin.threshold=0 "{0}" > "{1}"'.format(decimated_pointcloud_path, boundary_file_path))
pc_geojson_boundary_feature = None
@ -159,8 +159,8 @@ class Cropper:
if pc_geojson_boundary_feature is None: raise RuntimeError("Could not determine point cloud boundaries")
# Write bounds to GeoJSON
bounds_geojson_path = self.path('bounds.geojson')
with open(bounds_geojson_path, "w") as f:
tmp_bounds_geojson_path = self.path('tmp-bounds.geojson')
with open(tmp_bounds_geojson_path, "w") as f:
f.write(json.dumps({
"type": "FeatureCollection",
"features": [{
@ -172,7 +172,7 @@ class Cropper:
# Create a convex hull around the boundary
# as to encompass the entire area (no holes)
driver = ogr.GetDriverByName('GeoJSON')
ds = driver.Open(bounds_geojson_path, 0) # read-only
ds = driver.Open(tmp_bounds_geojson_path, 0) # read-only
layer = ds.GetLayer()
# Collect all Geometry
@ -202,7 +202,7 @@ class Cropper:
# Save to a new file
bounds_geojson_path = self.path('bounds.geojson')
if os.path.exists(bounds_geojson_path):
driver.DeleteDataSource(bounds_geojson_path)
os.remove(bounds_geojson_path)
out_ds = driver.CreateDataSource(bounds_geojson_path)
layer = out_ds.CreateLayer("convexhull", geom_type=ogr.wkbPolygon)
@ -219,6 +219,10 @@ class Cropper:
# Remove decimated point cloud
if os.path.exists(decimated_pointcloud_path):
os.remove(decimated_pointcloud_path)
# Remove tmp bounds
if os.path.exists(tmp_bounds_geojson_path):
os.remove(tmp_bounds_geojson_path)
return bounds_geojson_path

View file

@ -4,6 +4,7 @@ import rasterio
import fiona
import numpy as np
import math
import sys
from opendm import log
from opendm import io
from opendm import concurrency
@ -13,9 +14,15 @@ from opendm import system
from skimage.feature import canny
from skimage.draw import line
from skimage.graph import route_through_array
import shapely
from shapely.geometry import LineString, mapping, shape
from shapely.ops import polygonize, unary_union
if sys.platform == 'win32':
# Temporary fix for: ValueError: GEOSGeom_createLinearRing_r returned a NULL pointer
# https://github.com/Toblerity/Shapely/issues/1005
shapely.speedups.disable()
def write_raster(data, file):
profile = {
'driver': 'GTiff',
@ -45,7 +52,7 @@ def compute_cutline(orthophoto_file, crop_area_file, destination, max_concurrenc
system.run("gdal_translate -outsize {}% 0 "
"-co NUM_THREADS={} "
"--config GDAL_CACHEMAX {}% "
"{} {}".format(
'"{}" "{}"'.format(
scale * 100,
max_concurrency,
concurrency.get_max_memory(),
@ -71,6 +78,7 @@ def compute_cutline(orthophoto_file, crop_area_file, destination, max_concurrenc
if len(crop_f) == 0:
log.ODM_WARNING("Crop area is empty, cannot compute cutline")
return
crop_poly = shape(crop_f[1]['geometry'])
crop_f.close()

View file

@ -12,6 +12,7 @@ from opendm import system
from opendm.concurrency import get_max_memory, parallel_map
from scipy import ndimage
from datetime import datetime
from osgeo.utils.gdal_fillnodata import main as gdal_fillnodata
from opendm import log
try:
import Queue as queue
@ -30,7 +31,7 @@ def classify(lasFile, scalar, slope, threshold, window, verbose=False):
except:
log.ODM_WARNING("Error creating classified file %s" % lasFile)
log.ODM_INFO('Created %s in %s' % (os.path.relpath(lasFile), datetime.now() - start))
log.ODM_INFO('Created %s in %s' % (lasFile, datetime.now() - start))
return lasFile
def rectify(lasFile, debug=False, reclassify_threshold=5, min_area=750, min_points=500):
@ -70,7 +71,7 @@ def rectify(lasFile, debug=False, reclassify_threshold=5, min_area=750, min_poin
except Exception as e:
raise Exception("Error rectifying ground in file %s: %s" % (lasFile, str(e)))
log.ODM_INFO('Created %s in %s' % (os.path.relpath(lasFile), datetime.now() - start))
log.ODM_INFO('Created %s in %s' % (lasFile, datetime.now() - start))
return lasFile
error = None
@ -188,7 +189,12 @@ def create_dem(input_point_cloud, dem_type, output_type='max', radiuses=['0.56']
# Create virtual raster
tiles_vrt_path = os.path.abspath(os.path.join(outdir, "tiles.vrt"))
run('gdalbuildvrt "%s" "%s"' % (tiles_vrt_path, '" "'.join(map(lambda t: t['filename'], tiles))))
tiles_file_list = os.path.abspath(os.path.join(outdir, "tiles_list.txt"))
with open(tiles_file_list, 'w') as f:
for t in tiles:
f.write(t['filename'] + '\n')
run('gdalbuildvrt -input_file_list "%s" "%s" ' % (tiles_file_list, tiles_vrt_path))
merged_vrt_path = os.path.abspath(os.path.join(outdir, "merged.vrt"))
geotiff_tmp_path = os.path.abspath(os.path.join(outdir, 'tiles.tmp.tif'))
@ -216,7 +222,7 @@ def create_dem(input_point_cloud, dem_type, output_type='max', radiuses=['0.56']
'-co NUM_THREADS={threads} '
'-co BIGTIFF=IF_SAFER '
'--config GDAL_CACHEMAX {max_memory}% '
'{tiles_vrt} {geotiff_tmp}'.format(**kwargs))
'"{tiles_vrt}" "{geotiff_tmp}"'.format(**kwargs))
# Scale to 10% size
run('gdal_translate '
@ -224,17 +230,17 @@ def create_dem(input_point_cloud, dem_type, output_type='max', radiuses=['0.56']
'-co BIGTIFF=IF_SAFER '
'--config GDAL_CACHEMAX {max_memory}% '
'-outsize 10% 0 '
'{geotiff_tmp} {geotiff_small}'.format(**kwargs))
'"{geotiff_tmp}" "{geotiff_small}"'.format(**kwargs))
# Fill scaled
run('gdal_fillnodata.py '
'-co NUM_THREADS={threads} '
'-co BIGTIFF=IF_SAFER '
'--config GDAL_CACHEMAX {max_memory}% '
'-b 1 '
'-of GTiff '
'{geotiff_small} {geotiff_small_filled}'.format(**kwargs))
gdal_fillnodata(['.',
'-co', 'NUM_THREADS=%s' % kwargs['threads'],
'-co', 'BIGTIFF=IF_SAFER',
'--config', 'GDAL_CACHEMAX', str(kwargs['max_memory']) + '%',
'-b', '1',
'-of', 'GTiff',
kwargs['geotiff_small'], kwargs['geotiff_small_filled']])
# Merge filled scaled DEM with unfilled DEM using bilinear interpolation
run('gdalbuildvrt -resolution highest -r bilinear "%s" "%s" "%s"' % (merged_vrt_path, geotiff_small_filled_path, geotiff_tmp_path))
run('gdal_translate '
@ -243,7 +249,7 @@ def create_dem(input_point_cloud, dem_type, output_type='max', radiuses=['0.56']
'-co BIGTIFF=IF_SAFER '
'-co COMPRESS=DEFLATE '
'--config GDAL_CACHEMAX {max_memory}% '
'{merged_vrt} {geotiff}'.format(**kwargs))
'"{merged_vrt}" "{geotiff}"'.format(**kwargs))
else:
run('gdal_translate '
'-co NUM_THREADS={threads} '
@ -251,25 +257,25 @@ def create_dem(input_point_cloud, dem_type, output_type='max', radiuses=['0.56']
'-co BIGTIFF=IF_SAFER '
'-co COMPRESS=DEFLATE '
'--config GDAL_CACHEMAX {max_memory}% '
'{tiles_vrt} {geotiff}'.format(**kwargs))
'"{tiles_vrt}" "{geotiff}"'.format(**kwargs))
if apply_smoothing:
median_smoothing(geotiff_path, output_path)
os.remove(geotiff_path)
else:
os.rename(geotiff_path, output_path)
os.replace(geotiff_path, output_path)
if os.path.exists(geotiff_tmp_path):
if not keep_unfilled_copy:
os.remove(geotiff_tmp_path)
else:
os.rename(geotiff_tmp_path, io.related_file_path(output_path, postfix=".unfilled"))
os.replace(geotiff_tmp_path, io.related_file_path(output_path, postfix=".unfilled"))
for cleanup_file in [tiles_vrt_path, merged_vrt_path, geotiff_small_path, geotiff_small_filled_path]:
for cleanup_file in [tiles_vrt_path, tiles_file_list, merged_vrt_path, geotiff_small_path, geotiff_small_filled_path]:
if os.path.exists(cleanup_file): os.remove(cleanup_file)
for t in tiles:
if os.path.exists(t['filename']): os.remove(t['filename'])
log.ODM_INFO('Completed %s in %s' % (output_file, datetime.now() - start))
@ -331,6 +337,6 @@ def median_smoothing(geotiff_path, output_path, smoothing_iterations=1):
with rasterio.open(output_path, 'w', **img.profile) as imgout:
imgout.write(arr, 1)
log.ODM_INFO('Completed smoothing to create %s in %s' % (os.path.relpath(output_path), datetime.now() - start))
log.ODM_INFO('Completed smoothing to create %s in %s' % (output_path, datetime.now() - start))
return output_path

View file

@ -31,13 +31,14 @@
# Library functions for creating DEMs from Lidar data
import os
import sys
import json as jsonlib
import tempfile
from opendm import system
from opendm import log
from opendm.utils import double_quote
from datetime import datetime
from pipes import quote
""" JSON Functions """
@ -156,7 +157,7 @@ def run_pipeline(json, verbose=False):
'pipeline',
'-i %s' % jsonfile
]
if verbose:
if verbose or sys.platform == 'win32':
system.run(' '.join(cmd))
else:
system.run(' '.join(cmd) + ' > /dev/null 2>&1')
@ -190,7 +191,7 @@ def merge_point_clouds(input_files, output_file, verbose=False):
cmd = [
'pdal',
'merge',
' '.join(map(quote, input_files + [output_file])),
' '.join(map(double_quote, input_files + [output_file])),
]
if verbose:
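
The switch from `pipes.quote` to a `double_quote` helper matters because POSIX shell quoting (single quotes) is not understood by the Windows shell. A minimal sketch of such a helper, assuming plain wrapping in double quotes is sufficient (paths containing embedded quotes would need escaping):

```python
# Hypothetical stand-in for opendm.utils.double_quote: wrap each path in
# double quotes so spaces survive on both cmd.exe and POSIX shells.
def double_quote(path):
    return '"%s"' % path

print(" ".join(map(double_quote, ["C:\\my data\\a.las", "b.las"])))
# prints: "C:\my data\a.las" "b.las"
```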

View file

@ -1,6 +1,7 @@
import os
import sys
import shutil
from pipes import quote
from opendm.utils import double_quote
from opendm import io
from opendm import log
from opendm import system
@ -27,13 +28,17 @@ def build(input_point_cloud_files, output_path, max_concurrency=8, rerun=False):
if rerun:
dir_cleanup()
# Attempt with entwine (faster, more memory hungry)
try:
build_entwine(input_point_cloud_files, tmpdir, output_path, max_concurrency=max_concurrency)
except Exception as e:
log.ODM_WARNING("Cannot build EPT using entwine (%s), attempting with untwine..." % str(e))
dir_cleanup()
# On Windows we always use Untwine
if sys.platform == 'win32':
build_untwine(input_point_cloud_files, tmpdir, output_path, max_concurrency=max_concurrency)
else:
# Attempt with entwine (faster, more memory hungry)
try:
build_entwine(input_point_cloud_files, tmpdir, output_path, max_concurrency=max_concurrency)
except Exception as e:
log.ODM_WARNING("Cannot build EPT using entwine (%s), attempting with untwine..." % str(e))
dir_cleanup()
build_untwine(input_point_cloud_files, tmpdir, output_path, max_concurrency=max_concurrency)
if os.path.exists(tmpdir):
shutil.rmtree(tmpdir)
@ -43,7 +48,7 @@ def build_entwine(input_point_cloud_files, tmpdir, output_path, max_concurrency=
kwargs = {
'threads': max_concurrency,
'tmpdir': tmpdir,
'all_inputs': "-i " + " ".join(map(quote, input_point_cloud_files)),
'all_inputs': "-i " + " ".join(map(double_quote, input_point_cloud_files)),
'outputdir': output_path
}
@ -56,7 +61,7 @@ def build_untwine(input_point_cloud_files, tmpdir, output_path, max_concurrency=
kwargs = {
# 'threads': max_concurrency,
'tmpdir': tmpdir,
'files': "--files " + " ".join(map(quote, input_point_cloud_files)),
'files': "--files " + " ".join(map(double_quote, input_point_cloud_files)),
'outputdir': output_path
}
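
A hypothetical call into the EPT builder above; it assumes an ODM environment with entwine and/or untwine installed and two input point clouds on disk:

```python
# Illustrative only: on Windows this goes straight to untwine; elsewhere it
# tries entwine first and falls back to untwine if entwine fails.
from opendm.entwine import build

build(["pc1.las", "pc2.las"], "entwine_pointcloud", max_concurrency=8)
```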

View file

@ -5,6 +5,7 @@ import math
from repoze.lru import lru_cache
from opendm import log
def rounded_gsd(reconstruction_json, default_value=None, ndigits=0, ignore_gsd=False):
"""
:param reconstruction_json path to OpenSfM's reconstruction.json
@ -61,12 +62,15 @@ def image_scale_factor(target_resolution, reconstruction_json, gsd_error_estimat
return 1
def cap_resolution(resolution, reconstruction_json, gsd_error_estimate = 0.1, ignore_gsd=False, ignore_resolution=False, has_gcp=False):
def cap_resolution(resolution, reconstruction_json, gsd_error_estimate = 0.1, gsd_scaling = 1.0, ignore_gsd=False,
ignore_resolution=False, has_gcp=False):
"""
:param resolution resolution in cm / pixel
:param reconstruction_json path to OpenSfM's reconstruction.json
:param gsd_error_estimate percentage of estimated error in the GSD calculation to set an upper bound on resolution.
:param gsd_scaling scaling of estimated GSD.
:param ignore_gsd when set to True, forces the function to just return resolution.
:param ignore_resolution when set to True, forces the function to return a value based on GSD.
:return The max value between resolution and the GSD computed from the reconstruction.
If a GSD cannot be computed, or ignore_gsd is set to True, it just returns resolution. Units are in cm / pixel.
"""
@ -76,14 +80,16 @@ def cap_resolution(resolution, reconstruction_json, gsd_error_estimate = 0.1, ig
gsd = opensfm_reconstruction_average_gsd(reconstruction_json, use_all_shots=has_gcp or ignore_resolution)
if gsd is not None:
gsd = gsd * (1 - gsd_error_estimate)
gsd = gsd * (1 - gsd_error_estimate) * gsd_scaling
if gsd > resolution or ignore_resolution:
log.ODM_WARNING('Maximum resolution set to GSD - {}% ({} cm / pixel, requested resolution was {} cm / pixel)'.format(gsd_error_estimate * 100, round(gsd, 2), round(resolution, 2)))
log.ODM_WARNING('Maximum resolution set to {} * (GSD - {}%) '
'({:.2f} cm / pixel, requested resolution was {:.2f} cm / pixel)'
.format(gsd_scaling, gsd_error_estimate * 100, gsd, resolution))
return gsd
else:
return resolution
else:
log.ODM_WARNING('Cannot calculate GSD, using requested resolution of {}'.format(round(resolution, 2)))
log.ODM_WARNING('Cannot calculate GSD, using requested resolution of {:.2f}'.format(resolution))
return resolution
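A quick worked example of the new gsd_scaling parameter (all numbers are illustrative):

gsd = 3.0                   # estimated GSD, cm / pixel
gsd_error_estimate = 0.1    # 10% estimated error in the GSD calculation
gsd_scaling = 2.0           # e.g. DEMs are produced at half the photo resolution
cap = gsd * (1 - gsd_error_estimate) * gsd_scaling   # 3.0 * 0.9 * 2.0 = 5.4
requested = 2.0             # requested resolution, cm / pixel
resolution = max(requested, cap)                     # 5.4 -- the request is capped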
@ -134,6 +140,7 @@ def opensfm_reconstruction_average_gsd(reconstruction_json, use_all_shots=False)
return None
def calculate_gsd(sensor_width, flight_height, focal_length, image_width):
"""
:param sensor_width in millimeters
@ -154,6 +161,7 @@ def calculate_gsd(sensor_width, flight_height, focal_length, image_width):
else:
return None
def calculate_gsd_from_focal_ratio(focal_ratio, flight_height, image_width):
"""
:param focal_ratio focal length (mm) / sensor_width (mm)

View file

@ -1,22 +1,127 @@
import sys
HEADER = '\033[95m'
OKBLUE = '\033[94m'
OKGREEN = '\033[92m'
DEFAULT = '\033[39m'
WARNING = '\033[93m'
FAIL = '\033[91m'
ENDC = '\033[0m'
import threading
import os
import json
import datetime
import dateutil.parser
import shutil
import multiprocessing
from opendm.loghelpers import double_quote, args_to_dict
from vmem import virtual_memory
if sys.platform == 'win32':
# No colors on Windows, sorry!
HEADER = ''
OKBLUE = ''
OKGREEN = ''
DEFAULT = ''
WARNING = ''
FAIL = ''
ENDC = ''
else:
HEADER = '\033[95m'
OKBLUE = '\033[94m'
OKGREEN = '\033[92m'
DEFAULT = '\033[39m'
WARNING = '\033[93m'
FAIL = '\033[91m'
ENDC = '\033[0m'
lock = threading.Lock()
def odm_version():
with open(os.path.join(os.path.dirname(__file__), "..", "VERSION")) as f:
return f.read().split("\n")[0].strip()
def memory():
mem = virtual_memory()
return {
'total': round(mem.total / 1024 / 1024),
'available': round(mem.available / 1024 / 1024)
}
# logging has too many quirks...
class ODMLogger:
def __init__(self):
self.show_debug = False
self.json = None
self.json_output_file = None
self.start_time = datetime.datetime.now()
def log(self, startc, msg, level_name):
level = ("[" + level_name + "]").ljust(9)
print("%s%s %s%s" % (startc, level, msg, ENDC))
sys.stdout.flush()
with lock:
print("%s%s %s%s" % (startc, level, msg, ENDC))
sys.stdout.flush()
if self.json is not None:
self.json['stages'][-1]['messages'].append({
'message': msg,
'type': level_name.lower()
})
def init_json_output(self, output_files, args):
self.json_output_files = output_files
self.json_output_file = output_files[0]
self.json = {}
self.json['odmVersion'] = odm_version()
self.json['memory'] = memory()
self.json['cpus'] = multiprocessing.cpu_count()
self.json['images'] = -1
self.json['options'] = args_to_dict(args)
self.json['startTime'] = self.start_time.isoformat()
self.json['stages'] = []
self.json['processes'] = []
self.json['success'] = False
def log_json_stage_run(self, name, start_time):
if self.json is not None:
self.json['stages'].append({
'name': name,
'startTime': start_time.isoformat(),
'messages': [],
})
def log_json_images(self, count):
if self.json is not None:
self.json['images'] = count
def log_json_stage_error(self, error, exit_code, stack_trace = ""):
if self.json is not None:
self.json['error'] = {
'code': exit_code,
'message': error
}
self.json['stackTrace'] = list(map(str.strip, stack_trace.split("\n")))
self._log_json_end_time()
def log_json_success(self):
if self.json is not None:
self.json['success'] = True
self._log_json_end_time()
def log_json_process(self, cmd, exit_code, output = []):
if self.json is not None:
d = {
'command': cmd,
'exitCode': exit_code,
}
if output:
d['output'] = output
self.json['processes'].append(d)
def _log_json_end_time(self):
if self.json is not None:
end_time = datetime.datetime.now()
self.json['endTime'] = end_time.isoformat()
self.json['totalTime'] = round((end_time - self.start_time).total_seconds(), 2)
if self.json['stages']:
last_stage = self.json['stages'][-1]
last_stage['endTime'] = end_time.isoformat()
start_time = dateutil.parser.isoparse(last_stage['startTime'])
last_stage['totalTime'] = round((end_time - start_time).total_seconds(), 2)
def info(self, msg):
self.log(DEFAULT, msg, "INFO")
@ -33,6 +138,16 @@ class ODMLogger:
if self.show_debug:
self.log(OKGREEN, msg, "DEBUG")
def close(self):
if self.json is not None and self.json_output_file is not None:
try:
with open(self.json_output_file, 'w') as f:
f.write(json.dumps(self.json, indent=4))
for f in self.json_output_files[1:]:
shutil.copy(self.json_output_file, f)
except Exception as e:
print("Cannot write log.json: %s" % str(e))
logger = ODMLogger()
ODM_INFO = logger.info
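For reference, a minimal sketch of the log.json document that ODMLogger assembles (every value below is illustrative, not taken from a real run):

log_json = {
    "odmVersion": "2.5.0",                            # read from the VERSION file
    "memory": {"total": 32768, "available": 24576},   # MB, from vmem.virtual_memory()
    "cpus": 8,
    "images": 120,
    "options": {"dsm": True, "orthophoto_resolution": 5.0},
    "startTime": "2021-08-31T17:21:00",
    "stages": [{"name": "dataset",
                "startTime": "2021-08-31T17:21:01",
                "messages": [{"message": "Loading dataset", "type": "info"}]}],
    "processes": [{"command": "pdal info", "exitCode": 0}],
    "success": True,
    "endTime": "2021-08-31T17:40:00",
    "totalTime": 1140.0
}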

View file

@ -0,0 +1,28 @@
from shlex import _find_unsafe
def double_quote(s):
"""Return a shell-escaped version of the string *s*."""
if not s:
return '""'
if _find_unsafe(s) is None:
return s
# use double quotes, and prefix double quotes with a \
# the string $"b is then quoted as "$\"b"
return '"' + s.replace('"', '\\\"') + '"'
def args_to_dict(args):
args_dict = vars(args)
result = {}
for k in sorted(args_dict.keys()):
# Skip _is_set keys
if k.endswith("_is_set"):
continue
# Don't leak token
if k == 'sm_cluster' and args_dict[k] is not None:
result[k] = True
else:
result[k] = args_dict[k]
return result
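The point of this new helper is that shlex.quote (the old pipes.quote) wraps arguments in single quotes, which cmd.exe on Windows does not treat as quoting characters; double quotes work in both POSIX shells and cmd.exe, which is why pipes.quote calls are swapped for double_quote throughout this commit. A few illustrative calls (note that _find_unsafe is a private shlex helper, so it may move between Python versions):

from opendm.loghelpers import double_quote

double_quote("")                 # '""'
double_quote("odm_orthophoto")   # 'odm_orthophoto'  (nothing unsafe, returned as-is)
double_quote("point cloud.laz")  # '"point cloud.laz"'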

View file

@ -1,9 +1,10 @@
from __future__ import absolute_import
import os, shutil, sys, struct, random, math
import os, shutil, sys, struct, random, math, platform
from opendm.dem import commands
from opendm import system
from opendm import log
from opendm import context
from opendm import concurrency
from scipy import signal
import numpy as np
@ -64,8 +65,8 @@ def dem_to_points(inGeotiff, outPointCloud, verbose=False):
'verbose': '-verbose' if verbose else ''
}
system.run('{bin} -inputFile {infile} '
'-outputFile {outfile} '
system.run('"{bin}" -inputFile "{infile}" '
'-outputFile "{outfile}" '
'-skirtHeightThreshold 1.5 '
'-skirtIncrements 0.2 '
'-skirtHeightCap 100 '
@ -99,8 +100,8 @@ def dem_to_mesh_gridded(inGeotiff, outMesh, maxVertexCount, verbose=False, maxCo
'maxConcurrency': maxConcurrency,
'verbose': '-verbose' if verbose else ''
}
system.run('{bin} -inputFile {infile} '
'-outputFile {outfile} '
system.run('"{bin}" -inputFile "{infile}" '
'-outputFile "{outfile}" '
'-maxTileLength 2000 '
'-maxVertexCount {maxVertexCount} '
'-maxConcurrency {maxConcurrency} '
@ -123,9 +124,9 @@ def dem_to_mesh_gridded(inGeotiff, outMesh, maxVertexCount, verbose=False, maxCo
'max_faces': maxVertexCount * 2
}
system.run('{reconstructmesh} -i "{infile}" '
system.run('"{reconstructmesh}" -i "{infile}" '
'-o "{outfile}" '
'--remove-spikes 0 --remove-spurious 0 --smooth 0 '
'--remove-spikes 0 --remove-spurious 20 --smooth 0 '
'--target-face-num {max_faces} '.format(**cleanupArgs))
# Delete intermediate results
@ -145,39 +146,67 @@ def screened_poisson_reconstruction(inPointCloud, outMesh, depth = 8, samples =
# ext = .ply
outMeshDirty = os.path.join(mesh_path, "{}.dirty{}".format(basename, ext))
if os.path.isfile(outMeshDirty):
os.remove(outMeshDirty)
# Since PoissonRecon has some kind of a race condition on ppc64le, forcing single-threaded operation helps...
if platform.machine() == 'ppc64le':
log.ODM_WARNING("ppc64le platform detected, forcing single-threaded operation for PoissonRecon")
threads = 1
poissonReconArgs = {
'bin': context.poisson_recon_path,
'outfile': outMeshDirty,
'infile': inPointCloud,
'depth': depth,
'samples': samples,
'pointWeight': pointWeight,
'threads': threads,
'verbose': '--verbose' if verbose else ''
}
while True:
poissonReconArgs = {
'bin': context.poisson_recon_path,
'outfile': outMeshDirty,
'infile': inPointCloud,
'depth': depth,
'samples': samples,
'pointWeight': pointWeight,
'threads': int(threads),
'memory': int(concurrency.get_max_memory_mb(4, 0.8) // 1024),
'verbose': '--verbose' if verbose else ''
}
# Run PoissonRecon
try:
system.run('"{bin}" --in "{infile}" '
'--out "{outfile}" '
'--depth {depth} '
'--pointWeight {pointWeight} '
'--samplesPerNode {samples} '
'--threads {threads} '
'--maxMemory {memory} '
'--bType 2 '
'--linearFit '
'{verbose}'.format(**poissonReconArgs))
except Exception as e:
log.ODM_WARNING(str(e))
if os.path.isfile(outMeshDirty):
break # Done!
else:
# PoissonRecon will sometimes fail due to race conditions
# on certain machines, especially on Windows
if threads > 1:
log.ODM_WARNING("PoissonRecon failed with %s threads, let's retry with %s..." % (threads, threads // 2))
threads //= 2
else:
break
# Run PoissonRecon
system.run('{bin} --in {infile} '
'--out {outfile} '
'--depth {depth} '
'--pointWeight {pointWeight} '
'--samplesPerNode {samples} '
'--threads {threads} '
'--linearFit '
'{verbose}'.format(**poissonReconArgs))
# Cleanup and reduce vertex count if necessary
cleanupArgs = {
'reconstructmesh': context.omvs_reconstructmesh_path,
'outfile': outMesh,
'infile': outMeshDirty,
'infile':outMeshDirty,
'max_faces': maxVertexCount * 2
}
system.run('{reconstructmesh} -i "{infile}" '
system.run('"{reconstructmesh}" -i "{infile}" '
'-o "{outfile}" '
'--remove-spikes 0 --remove-spurious 0 --smooth 0 '
'--remove-spikes 0 --remove-spurious 20 --smooth 0 '
'--target-face-num {max_faces} '.format(**cleanupArgs))
# Delete intermediate results
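The retry loop above halves the thread count whenever PoissonRecon exits without producing the dirty mesh. A minimal standalone sketch of the same back-off pattern, assuming a hypothetical run_once(threads) callable that raises on failure:

def run_with_thread_backoff(run_once, threads):
    """Try run_once(threads); halve the thread count and retry on failure."""
    while threads >= 1:
        try:
            run_once(threads)   # hypothetical callable that raises on failure
            return True
        except Exception:
            threads //= 2       # race conditions are less likely with fewer threads
    return False                # even the single-threaded attempt failed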

View file

@ -12,6 +12,7 @@ from rasterio.transform import Affine, rowcol
from rasterio.mask import mask
from opendm import io
from opendm.tiles.tiler import generate_orthophoto_tiles
from opendm.cogeo import convert_to_cogeo
from osgeo import gdal
@ -31,7 +32,7 @@ def build_overviews(orthophoto_file):
kwargs = {'orthophoto': orthophoto_file}
# Run gdaladdo
system.run('gdaladdo -ro -r average '
system.run('gdaladdo -r average '
'--config BIGTIFF_OVERVIEW IF_SAFER '
'--config COMPRESS_OVERVIEW JPEG '
'{orthophoto} 2 4 8 16'.format(**kwargs))
@ -72,7 +73,7 @@ def post_orthophoto_steps(args, bounds_file_path, orthophoto_file, orthophoto_ti
if args.crop > 0:
Cropper.crop(bounds_file_path, orthophoto_file, get_orthophoto_vars(args), keep_original=not args.optimize_disk_space, warp_options=['-dstalpha'])
if args.build_overviews:
if args.build_overviews and not args.cog:
build_overviews(orthophoto_file)
if args.orthophoto_png:
@ -84,6 +85,8 @@ def post_orthophoto_steps(args, bounds_file_path, orthophoto_file, orthophoto_ti
if args.tiles:
generate_orthophoto_tiles(orthophoto_file, orthophoto_tiles_dir, args.max_concurrency)
if args.cog:
convert_to_cogeo(orthophoto_file, max_workers=args.max_concurrency, compression=args.orthophoto_compression)
def compute_mask_raster(input_raster, vector_mask, output_raster, blend_distance=20, only_max_coords_feature=False):
if not os.path.exists(input_raster):

View file

@ -4,11 +4,15 @@ OpenSfM related utils
import os, shutil, sys, json, argparse
import yaml
import numpy as np
import pyproj
from pyproj import CRS
from opendm import io
from opendm import log
from opendm import system
from opendm import context
from opendm import camera
from opendm import location
from opendm.utils import get_depthmap_resolution
from opendm.photo import find_largest_photo_dim
from opensfm.large import metadataset
@ -18,14 +22,17 @@ from opensfm.dataset import DataSet
from opensfm import report
from opendm.multispectral import get_photos_by_band
from opendm.gpu import has_gpus
from opensfm import multiview
from opensfm.actions.export_geocoords import _get_transformation
class OSFMContext:
def __init__(self, opensfm_project_path):
self.opensfm_project_path = opensfm_project_path
def run(self, command):
system.run('%s/bin/opensfm %s "%s"' %
(context.opensfm_path, command, self.opensfm_project_path))
osfm_bin = os.path.join(context.opensfm_path, 'bin', 'opensfm')
system.run('%s %s "%s"' %
(osfm_bin, command, self.opensfm_project_path))
def is_reconstruction_done(self):
tracks_file = os.path.join(self.opensfm_project_path, 'tracks.csv')
@ -49,13 +56,12 @@ class OSFMContext:
# Check that a reconstruction file has been created
if not self.reconstructed():
log.ODM_ERROR("The program could not process this dataset using the current settings. "
raise system.ExitException("The program could not process this dataset using the current settings. "
"Check that the images have enough overlap, "
"that there are enough recognizable features "
"and that the images are in focus. "
"You could also try to increase the --min-num-features parameter."
"The program will now exit.")
exit(1)
def setup(self, args, images_path, reconstruction, append_config = [], rerun=False):
@ -193,6 +199,7 @@ class OSFMContext:
"optimize_camera_parameters: %s" % ('no' if args.use_fixed_camera_params or args.cameras else 'yes'),
"undistorted_image_format: tif",
"bundle_outlier_filtering_type: AUTO",
"sift_peak_threshold: 0.066",
"align_orientation_prior: vertical",
"triangulation_type: ROBUST",
"retriangulation_ratio: 2",
@ -254,6 +261,10 @@ class OSFMContext:
config_filename = self.get_config_file_path()
with open(config_filename, 'w') as fout:
fout.write("\n".join(config))
# We impose our own reference_lla
if reconstruction.is_georeferenced():
self.write_reference_lla(reconstruction.georef.utm_east_offset, reconstruction.georef.utm_north_offset, reconstruction.georef.proj4())
else:
log.ODM_WARNING("%s already exists, not rerunning OpenSfM setup" % list_path)
@ -348,7 +359,7 @@ class OSFMContext:
# (containing only the primary band)
if os.path.exists(self.recon_file()):
os.remove(self.recon_file())
os.rename(self.recon_backup_file(), self.recon_file())
os.replace(self.recon_backup_file(), self.recon_file())
log.ODM_INFO("Restored reconstruction.json")
def backup_reconstruction(self):
@ -427,6 +438,71 @@ class OSFMContext:
log.ODM_WARNING("Report could not be generated")
else:
log.ODM_WARNING("Report %s already exported" % report_path)
def write_reference_lla(self, offset_x, offset_y, proj4):
reference_lla = self.path("reference_lla.json")
longlat = CRS.from_epsg("4326")
lon, lat = location.transform2(CRS.from_proj4(proj4), longlat, offset_x, offset_y)
with open(reference_lla, 'w') as f:
f.write(json.dumps({
'latitude': lat,
'longitude': lon,
'altitude': 0.0
}, indent=4))
log.ODM_INFO("Wrote reference_lla.json")
def ground_control_points(self, proj4):
"""
Load ground control point information.
"""
ds = DataSet(self.opensfm_project_path)
gcps = ds.load_ground_control_points()
if not gcps:
return []
reconstructions = ds.load_reconstruction()
reference = ds.load_reference()
projection = pyproj.Proj(proj4)
t = _get_transformation(reference, projection, (0, 0))
A, b = t[:3, :3], t[:3, 3]
result = []
for gcp in gcps:
if not gcp.coordinates.has_value:
continue
triangulated = None
for rec in reconstructions:
triangulated = multiview.triangulate_gcp(gcp, rec.shots, 1.0, 0.1)
if triangulated is None:
continue
else:
break
if triangulated is None:
continue
triangulated_topocentric = np.dot(A.T, triangulated)
coordinates_topocentric = np.array(gcp.coordinates.value)
coordinates = np.dot(A, coordinates_topocentric) + b
triangulated = triangulated + b
result.append({
'id': gcp.id,
'observations': [obs.shot_id for obs in gcp.observations],
'triangulated': triangulated,
'coordinates': coordinates,
'error': np.abs(triangulated_topocentric - coordinates_topocentric)
})
return result
def name(self):
return os.path.basename(os.path.abspath(self.path("..")))
@ -450,11 +526,20 @@ def get_submodel_argv(args, submodels_path = None, submodel_name = None):
reading the contents of --cameras
"""
assure_always = ['orthophoto_cutline', 'dem_euclidean_map', 'skip_3dmodel', 'skip_report']
remove_always = ['split', 'split_overlap', 'rerun_from', 'rerun', 'gcp', 'end_with', 'sm_cluster', 'rerun_all', 'pc_csv', 'pc_las', 'pc_ept', 'tiles']
remove_always = ['split', 'split_overlap', 'rerun_from', 'rerun', 'gcp', 'end_with', 'sm_cluster', 'rerun_all', 'pc_csv', 'pc_las', 'pc_ept', 'tiles', 'copy-to', 'cog']
read_json_always = ['cameras']
argv = sys.argv
result = [argv[0]] # Startup script (/path/to/run.py)
# Startup script (/path/to/run.py)
startup_script = argv[0]
# On Windows, make sure we always invoke the "run.bat" file
if sys.platform == 'win32':
startup_script_dir = os.path.dirname(startup_script)
startup_script = os.path.join(startup_script_dir, "run")
result = [startup_script]
args_dict = vars(args).copy()
set_keys = [k[:-len("_is_set")] for k in args_dict.keys() if k.endswith("_is_set")]

View file

@ -201,6 +201,7 @@ class ODM_Photo:
'@Camera:RigCameraIndex', # Parrot Sequoia, Sentera 21244-00_3.2MP-GS-0001
'Camera:RigCameraIndex', # MicaSense Altum
])
self.set_attr_from_xmp_tag('radiometric_calibration', tags, [
'MicaSense:RadiometricCalibration',
])
@ -233,7 +234,8 @@ class ODM_Photo:
], float)
self.set_attr_from_xmp_tag('capture_uuid', tags, [
'@drone-dji:CaptureUUID'
'@drone-dji:CaptureUUID', # DJI
'@Camera:ImageUniqueID', # sentera 6x
])
# Phantom 4 RTK
@ -425,7 +427,7 @@ class ODM_Photo:
def get_photometric_exposure(self):
# H ~= (exposure_time) / (f_number^2)
if self.fnumber is not None and self.exposure_time > 0:
if self.fnumber is not None and self.exposure_time is not None and self.exposure_time > 0 and self.fnumber > 0:
return self.exposure_time / (self.fnumber * self.fnumber)
def get_horizontal_irradiance(self):
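A worked example of the guarded formula H = exposure_time / fnumber^2 (illustrative values):

exposure_time = 1 / 500                   # seconds
fnumber = 5.6
H = exposure_time / (fnumber * fnumber)   # 0.002 / 31.36 ~= 6.4e-05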

View file

@ -6,7 +6,7 @@ from opendm.system import run
from opendm import entwine
from opendm import io
from opendm.concurrency import parallel_map
from pipes import quote
from opendm.utils import double_quote
def ply_info(input_ply):
if not os.path.exists(input_ply):
@ -60,7 +60,7 @@ def split(input_point_cloud, outdir, filename_template, capacity, dims=None):
if filename_template.endswith(".ply"):
cmd += ("--writers.ply.sized_types=false "
"--writers.ply.storage_mode='little endian' ")
"--writers.ply.storage_mode=\"little endian\" ")
if dims is not None:
cmd += '--writers.ply.dims="%s"' % dims
system.run(cmd)
@ -151,7 +151,7 @@ def filter(input_point_cloud, output_point_cloud, standard_deviation=2.5, meank=
"-o \"{outputFile}\" "
"{stages} "
"--writers.ply.sized_types=false "
"--writers.ply.storage_mode='little endian' "
"--writers.ply.storage_mode=\"little endian\" "
"--writers.ply.dims=\"{dims}\" "
"").format(**filterArgs)
@ -159,13 +159,13 @@ def filter(input_point_cloud, output_point_cloud, standard_deviation=2.5, meank=
cmd += "--filters.sample.radius={} ".format(sample_radius)
if 'outlier' in filters:
cmd += ("--filters.outlier.method='statistical' "
cmd += ("--filters.outlier.method=\"statistical\" "
"--filters.outlier.mean_k={} "
"--filters.outlier.multiplier={} ").format(meank, standard_deviation)
if 'range' in filters:
# Remove outliers
cmd += "--filters.range.limits='Classification![7:7]' "
cmd += "--filters.range.limits=\"Classification![7:7]\" "
system.run(cmd)
@ -189,14 +189,14 @@ def get_extent(input_point_cloud):
# We know PLY files do not have --summary support
if input_point_cloud.lower().endswith(".ply"):
fallback = True
run('pdal info {0} > {1}'.format(input_point_cloud, json_file))
run('pdal info "{0}" > "{1}"'.format(input_point_cloud, json_file))
try:
if not fallback:
run('pdal info --summary {0} > {1}'.format(input_point_cloud, json_file))
run('pdal info --summary "{0}" > "{1}"'.format(input_point_cloud, json_file))
except:
fallback = True
run('pdal info {0} > {1}'.format(input_point_cloud, json_file))
run('pdal info "{0}" > "{1}"'.format(input_point_cloud, json_file))
bounds = {}
with open(json_file, 'r') as f:
@ -240,7 +240,7 @@ def merge(input_point_cloud_files, output_file, rerun=False):
os.remove(output_file)
kwargs = {
'all_inputs': " ".join(map(quote, input_point_cloud_files)),
'all_inputs': " ".join(map(double_quote, input_point_cloud_files)),
'output': output_file
}
@ -312,7 +312,7 @@ def merge_ply(input_point_cloud_files, output_file, dims=None):
'--writers.ply.sized_types=false',
'--writers.ply.storage_mode="little endian"',
('--writers.ply.dims="%s"' % dims) if dims is not None else '',
' '.join(map(quote, input_point_cloud_files + [output_file])),
' '.join(map(double_quote, input_point_cloud_files + [output_file])),
]
system.run(' '.join(cmd))

View file

@ -13,7 +13,7 @@ from pyodm import Node, exceptions
from pyodm.utils import AtomicCounter
from pyodm.types import TaskStatus
from opendm.osfm import OSFMContext, get_submodel_args_dict, get_submodel_argv
from pipes import quote
from opendm.utils import double_quote
try:
import queue
@ -45,8 +45,7 @@ class LocalRemoteExecutor:
log.ODM_WARNING("LRE: The node seems to be offline! We'll still process the dataset, but it's going to run entirely locally.")
self.node_online = False
except Exception as e:
log.ODM_ERROR("LRE: An unexpected problem happened while opening the node connection: %s" % str(e))
exit(1)
raise system.ExitException("LRE: An unexpected problem happened while opening the node connection: %s" % str(e))
def set_projects(self, paths):
self.project_paths = paths
@ -474,7 +473,7 @@ class ToolchainTask(Task):
argv = get_submodel_argv(config.config(), submodels_path, submodel_name)
# Re-run the ODM toolchain on the submodel
system.run(" ".join(map(quote, map(str, argv))), env_vars=os.environ.copy())
system.run(" ".join(map(double_quote, map(str, argv))), env_vars=os.environ.copy())
# This will only get executed if the command above succeeds
self.touch(completed_file)

View file

@ -6,6 +6,8 @@ import sys
import subprocess
import string
import signal
import io
from collections import deque
from opendm import context
from opendm import log
@ -15,6 +17,9 @@ class SubprocessException(Exception):
super().__init__(msg)
self.errorCode = errorCode
class ExitException(Exception):
pass
def get_ccd_widths():
"""Return the CCD Width of the camera listed in the JSON defs file."""
with open(context.ccd_widths_path) as f:
@ -47,7 +52,10 @@ def exit_gracefully():
for sp in running_subprocesses:
log.ODM_WARNING("Sending TERM signal to PID %s..." % sp.pid)
os.killpg(os.getpgid(sp.pid), signal.SIGTERM)
if sys.platform == 'win32':
os.kill(sp.pid, signal.CTRL_C_EVENT)
else:
os.killpg(os.getpgid(sp.pid), signal.SIGTERM)
os._exit(1)
@ -63,18 +71,34 @@ def run(cmd, env_paths=[context.superbuild_bin_path], env_vars={}, packages_path
log.ODM_INFO('running %s' % cmd)
env = os.environ.copy()
sep = ":"
if sys.platform == 'win32':
sep = ";"
if len(env_paths) > 0:
env["PATH"] = env["PATH"] + ":" + ":".join(env_paths)
env["PATH"] = env["PATH"] + sep + sep.join(env_paths)
if len(packages_paths) > 0:
env["PYTHONPATH"] = env.get("PYTHONPATH", "") + ":" + ":".join(packages_paths)
env["PYTHONPATH"] = env.get("PYTHONPATH", "") + sep + sep.join(packages_paths)
for k in env_vars:
env[k] = str(env_vars[k])
p = subprocess.Popen(cmd, shell=True, env=env, preexec_fn=os.setsid)
p = subprocess.Popen(cmd, shell=True, env=env, start_new_session=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
running_subprocesses.append(p)
lines = deque()
for line in io.TextIOWrapper(p.stdout):
print(line, end="")
lines.append(line.strip())
if len(lines) == 11:
lines.popleft()
retcode = p.wait()
log.logger.log_json_process(cmd, retcode, list(lines))
running_subprocesses.remove(p)
if retcode < 0:
raise SubprocessException("Child was terminated by signal {}".format(-retcode), -retcode)
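The rewritten run() now streams the child's combined stdout/stderr and keeps only the last 10 lines for the JSON log (the deque is trimmed once it reaches 11 entries). A condensed sketch of that tail-buffer pattern; deque(maxlen=...) is an equivalent alternative to the manual popleft:

import io
import subprocess
from collections import deque

def run_and_tail(cmd, tail=10):
    """Run cmd in a shell, echo its output live, return (exit code, last lines)."""
    p = subprocess.Popen(cmd, shell=True, start_new_session=True,
                         stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
    lines = deque(maxlen=tail)  # old lines are dropped automatically
    for line in io.TextIOWrapper(p.stdout):
        print(line, end="")
        lines.append(line.strip())
    return p.wait(), list(lines)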

View file

@ -1,11 +1,12 @@
import os
import sys
from opendm import log
from opendm import system
from opendm import io
def generate_tiles(geotiff, output_dir, max_concurrency):
gdal2tiles = os.path.join(os.path.dirname(__file__), "gdal2tiles.py")
system.run('python3 "%s" --processes %s -z 5-21 -n -w none "%s" "%s"' % (gdal2tiles, max_concurrency, geotiff, output_dir))
system.run('%s "%s" --processes %s -z 5-21 -n -w none "%s" "%s"' % (sys.executable, gdal2tiles, max_concurrency, geotiff, output_dir))
def generate_orthophoto_tiles(geotiff, output_dir, max_concurrency):
try:
@ -29,7 +30,7 @@ def generate_colored_hillshade(geotiff):
system.run('gdaldem color-relief "%s" "%s" "%s" -alpha -co ALPHA=YES' % (geotiff, relief_file, colored_dem))
system.run('gdaldem hillshade "%s" "%s" -z 1.0 -s 1.0 -az 315.0 -alt 45.0' % (geotiff, hillshade_dem))
system.run('python3 "%s" "%s" "%s" "%s"' % (hsv_merge_script, colored_dem, hillshade_dem, colored_hillshade_dem))
system.run('%s "%s" "%s" "%s" "%s"' % (sys.executable, hsv_merge_script, colored_dem, hillshade_dem, colored_hillshade_dem))
return outputs
except Exception as e:

View file

@ -314,8 +314,10 @@ class ODM_Stage:
def run(self, outputs = {}):
start_time = system.now_raw()
log.ODM_INFO('Running %s stage' % self.name)
log.logger.log_json_stage_run(self.name, start_time)
log.ODM_INFO('Running %s stage' % self.name)
self.process(self.args, outputs)
# The tree variable should always be populated at this point
@ -357,6 +359,13 @@ class ODM_Stage:
progressbc.send_update(self.previous_stages_progress() +
(self.delta_progress() / 100.0) * float(progress))
def last_stage(self):
if self.next_stage:
return self.next_stage.last_stage()
else:
return self
def process(self, args, outputs):
raise NotImplementedError

View file

@ -1,6 +1,8 @@
import os, shutil
from opendm import log
from opendm.photo import find_largest_photo_dim
from osgeo import gdal
from opendm.loghelpers import double_quote
def get_depthmap_resolution(args, photos):
if 'depthmap_resolution_is_set' in args:
@ -39,4 +41,45 @@ def get_raster_stats(geotiff):
'stddev': s[3]
})
return stats
return stats
def get_processing_results_paths():
return [
"odm_georeferencing",
"odm_orthophoto",
"odm_dem",
"odm_report",
"odm_texturing",
"entwine_pointcloud",
"dsm_tiles",
"dtm_tiles",
"orthophoto_tiles",
"images.json",
"cameras.json",
"log.json",
]
def copy_paths(paths, destination, rerun):
if not os.path.isdir(destination):
os.makedirs(destination)
for p in paths:
basename = os.path.basename(p)
dst_path = os.path.join(destination, basename)
if rerun:
try:
if os.path.isfile(dst_path) or os.path.islink(dst_path):
os.remove(dst_path)
elif os.path.isdir(dst_path):
shutil.rmtree(dst_path)
except Exception as e:
log.ODM_WARNING("Cannot remove file %s: %s, skipping..." % (dst_path, str(e)))
if not os.path.exists(dst_path):
if os.path.isfile(p):
log.ODM_INFO("Copying %s --> %s" % (p, dst_path))
shutil.copy(p, dst_path)
elif os.path.isdir(p):
shutil.copytree(p, dst_path)
log.ODM_INFO("Copying %s --> %s" % (p, dst_path))

View file

@ -4,7 +4,8 @@ beautifulsoup4==4.9.3
cloudpickle==1.6.0
edt==2.0.2
ExifRead==2.3.2
Fiona==1.8.17
Fiona==1.8.17 ; sys_platform == 'linux' or sys_platform == 'darwin'
https://github.com/OpenDroneMap/windows-deps/raw/main/Fiona-1.8.19-cp38-cp38-win_amd64.whl ; sys_platform == 'win32'
joblib==0.17.0
laspy==1.7.0
lxml==4.6.1
@ -12,17 +13,19 @@ matplotlib==3.3.3
networkx==2.5
numpy==1.19.4
Pillow==8.0.1
vmem==1.0.0
pyodm==1.5.5
vmem==1.0.1
pyodm==1.5.6
pyproj==3.0.0.post1
Pysolar==0.9
pytz==2020.4
PyYAML==5.1
rasterio==1.1.8
rasterio==1.1.8 ; sys_platform == 'linux' or sys_platform == 'darwin'
https://github.com/OpenDroneMap/windows-deps/raw/main/rasterio-1.2.3-cp38-cp38-win_amd64.whl ; sys_platform == 'win32'
https://github.com/OpenDroneMap/windows-deps/raw/main/GDAL-3.2.3-cp38-cp38-win_amd64.whl ; sys_platform == 'win32'
repoze.lru==0.7
scikit-learn==0.23.2
scikit-image==0.17.2
scipy==1.5.4
xmltodict==0.12.0
fpdf2==2.2.0rc2
Shapely==1.7.1
Shapely==1.7.1

run.bat (new file, mode 100644, +9 lines)
View file

@ -0,0 +1,9 @@
@echo off
rem Bypass "Terminate Batch Job" prompt.
setlocal
cd /d %~dp0
winrun.bat %* <NUL
endlocal

run.py (36 lines changed)
View file

@ -11,9 +11,10 @@ from opendm import config
from opendm import system
from opendm import io
from opendm.progress import progressbc
from opendm.utils import double_quote, get_processing_results_paths
from opendm.loghelpers import args_to_dict
import os
from pipes import quote
from stages.odm_app import ODMApp
@ -23,18 +24,10 @@ if __name__ == '__main__':
log.ODM_INFO('Initializing ODM - %s' % system.now())
# Print args
args_dict = vars(args)
args_dict = args_to_dict(args)
log.ODM_INFO('==============')
for k in sorted(args_dict.keys()):
# Skip _is_set keys
if k.endswith("_is_set"):
continue
# Don't leak token
if k == 'sm_cluster' and args_dict[k] is not None:
log.ODM_INFO('%s: True' % k)
else:
log.ODM_INFO('%s: %s' % (k, args_dict[k]))
for k in args_dict.keys():
log.ODM_INFO('%s: %s' % (k, args_dict[k]))
log.ODM_INFO('==============')
progressbc.set_project_name(args.name)
@ -49,19 +42,12 @@ if __name__ == '__main__':
if args.rerun_all:
log.ODM_INFO("Rerun all -- Removing old data")
os.system("rm -rf " +
" ".join([
quote(os.path.join(args.project_path, "odm_georeferencing")),
quote(os.path.join(args.project_path, "odm_meshing")),
quote(os.path.join(args.project_path, "odm_orthophoto")),
quote(os.path.join(args.project_path, "odm_dem")),
quote(os.path.join(args.project_path, "odm_report")),
quote(os.path.join(args.project_path, "odm_texturing")),
quote(os.path.join(args.project_path, "opensfm")),
quote(os.path.join(args.project_path, "odm_filterpoints")),
quote(os.path.join(args.project_path, "odm_texturing_25d")),
quote(os.path.join(args.project_path, "openmvs")),
quote(os.path.join(args.project_path, "entwine_pointcloud")),
quote(os.path.join(args.project_path, "submodels")),
" ".join([double_quote(os.path.join(args.project_path, p)) for p in get_processing_results_paths()] + [
double_quote(os.path.join(args.project_path, "odm_meshing")),
double_quote(os.path.join(args.project_path, "opensfm")),
double_quote(os.path.join(args.project_path, "odm_texturing_25d")),
double_quote(os.path.join(args.project_path, "odm_filterpoints")),
double_quote(os.path.join(args.project_path, "submodels")),
]))
app = ODMApp(args)

View file

@ -23,8 +23,6 @@ architectures:
- build-on: amd64
run-on: amd64
# Requires snapcraft to be called with --enable-experimental-package-repositories
# until the feature is released
package-repositories:
- type: apt
ppa: ubuntugis/ubuntugis-unstable
@ -187,27 +185,9 @@ parts:
chmod -R u=rwX,go=rX $PYTHONUSERBASE/lib/python*
stage:
# strip the temporary build files and sources
- -odm/SuperBuild/build/opencv
- -odm/SuperBuild/build/openmvs
- -odm/SuperBuild/build/odm_orthophoto
- -odm/SuperBuild/build
- -odm/SuperBuild/download
- -odm/SuperBuild/src/ceres
- -odm/SuperBuild/src/untwine
- -odm/SuperBuild/src/entwine
- -odm/SuperBuild/src/gflags
- -odm/SuperBuild/src/hexer
- -odm/SuperBuild/src/lastools
- -odm/SuperBuild/src/laszip
- -odm/SuperBuild/src/mvstexturing
- -odm/SuperBuild/src/opencv
- -odm/SuperBuild/src/opengv
- -odm/SuperBuild/src/openmvs
- -odm/SuperBuild/src/pcl
- -odm/SuperBuild/src/pdal
- -odm/SuperBuild/src/vcg
- -odm/SuperBuild/src/dem2mesh
- -odm/SuperBuild/src/dem2points
- -odm/SuperBuild/src/odm_orthophoto
- -odm/SuperBuild/src
prime:
# remove any static-libraries
- -**/*.a

View file

@ -90,6 +90,9 @@ class ODMLoadDatasetStage(types.ODM_Stage):
# check if we rerun cell or not
images_database_file = os.path.join(tree.root_path, 'images.json')
if not io.file_exists(images_database_file) or self.rerun():
if not os.path.exists(images_dir):
raise system.ExitException("There are no images in %s! Make sure that your project path and dataset name is correct. The current is set to: %s" % (images_dir, args.project_path))
files, rejects = get_images(images_dir)
if files:
# create ODMPhoto list
@ -126,13 +129,13 @@ class ODMLoadDatasetStage(types.ODM_Stage):
# Save image database for faster restart
save_images_database(photos, images_database_file)
else:
log.ODM_ERROR('Not enough supported images in %s' % images_dir)
exit(1)
raise system.ExitException('Not enough supported images in %s' % images_dir)
else:
# We have an images database, just load it
photos = load_images_database(images_database_file)
log.ODM_INFO('Found %s usable images' % len(photos))
log.logger.log_json_images(len(photos))
# Create reconstruction object
reconstruction = types.ODM_Reconstruction(photos)

View file

@ -105,7 +105,7 @@ class ODMMvsTexStage(types.ODM_Stage):
shutil.rmtree(mvs_tmp_dir)
# run texturing binary
system.run('{bin} {nvm_file} {model} {out_dir} '
system.run('"{bin}" "{nvm_file}" "{model}" "{out_dir}" '
'-d {dataTerm} -o {outlierRemovalType} '
'-t {toneMapping} '
'{intermediate} '

View file

@ -1,4 +1,4 @@
import os, traceback
import os, traceback, sys
from opendm import context
from opendm import types
@ -18,6 +18,8 @@ from stages.odm_filterpoints import ODMFilterPoints
from stages.splitmerge import ODMSplitStage, ODMMergeStage
from stages.odm_report import ODMReport
from stages.odm_postprocess import ODMPostProcess
class ODMApp:
def __init__(self, args):
@ -27,6 +29,12 @@ class ODMApp:
if args.debug:
log.logger.show_debug = True
json_log_paths = [os.path.join(args.project_path, "log.json")]
if args.copy_to:
json_log_paths.append(args.copy_to)
log.logger.init_json_output(json_log_paths, args)
dataset = ODMLoadDatasetStage('dataset', args, progress=5.0,
verbose=args.verbose)
split = ODMSplitStage('split', args, progress=75.0)
@ -36,7 +44,7 @@ class ODMApp:
filterpoints = ODMFilterPoints('odm_filterpoints', args, progress=52.0)
meshing = ODMeshingStage('odm_meshing', args, progress=60.0,
max_vertex=args.mesh_size,
oct_tree=args.mesh_octree_depth,
oct_tree=max(1, min(14, args.mesh_octree_depth)),
samples=1.0,
point_weight=4.0,
max_concurrency=args.max_concurrency,
@ -55,7 +63,9 @@ class ODMApp:
max_concurrency=args.max_concurrency,
verbose=args.verbose)
orthophoto = ODMOrthoPhotoStage('odm_orthophoto', args, progress=98.0)
report = ODMReport('odm_report', args, progress=100.0)
report = ODMReport('odm_report', args, progress=99.0)
postprocess = ODMPostProcess('odm_postprocess', args, progress=100.0)
# Normal pipeline
self.first_stage = dataset
@ -76,21 +86,25 @@ class ODMApp:
.connect(georeferencing) \
.connect(dem) \
.connect(orthophoto) \
.connect(report)
.connect(report) \
.connect(postprocess)
def execute(self):
try:
self.first_stage.run()
log.logger.log_json_success()
return 0
except system.SubprocessException as e:
print("")
print("===== Dumping Info for Geeks (developers need this to fix bugs) =====")
print(str(e))
traceback.print_exc()
stack_trace = traceback.format_exc()
print(stack_trace)
print("===== Done, human-readable information to follow... =====")
print("")
code = e.errorCode
log.logger.log_json_stage_error(str(e), code, stack_trace)
if code == 139 or code == 134 or code == 1:
# Segfault
@ -107,3 +121,12 @@ class ODMApp:
# TODO: more?
return code
except system.ExitException as e:
log.ODM_ERROR(str(e))
log.logger.log_json_stage_error(str(e), 1, traceback.format_exc())
sys.exit(1)
except Exception as e:
log.logger.log_json_stage_error(str(e), 1, traceback.format_exc())
raise e
finally:
log.logger.close()

View file

@ -11,6 +11,8 @@ from opendm.dem import commands, utils
from opendm.cropper import Cropper
from opendm import pseudogeo
from opendm.tiles.tiler import generate_dem_tiles
from opendm.cogeo import convert_to_cogeo
class ODMDEMStage(types.ODM_Stage):
def process(self, args, outputs):
@ -27,11 +29,15 @@ class ODMDEMStage(types.ODM_Stage):
ignore_resolution = True
pseudo_georeference = True
# It is probably not reasonable to have accurate DEMs at the same resolution as the source photos, so reduce it
# by a factor!
gsd_scaling = 2.0
resolution = gsd.cap_resolution(args.dem_resolution, tree.opensfm_reconstruction,
gsd_error_estimate=-3,
ignore_gsd=args.ignore_gsd,
ignore_resolution=ignore_resolution,
has_gcp=reconstruction.has_gcp())
gsd_scaling=gsd_scaling,
ignore_gsd=args.ignore_gsd,
ignore_resolution=ignore_resolution,
has_gcp=reconstruction.has_gcp())
log.ODM_INFO('Classify: ' + str(args.pc_classify))
log.ODM_INFO('Create DSM: ' + str(args.dsm))
@ -126,6 +132,9 @@ class ODMDEMStage(types.ODM_Stage):
if args.tiles:
generate_dem_tiles(dem_geotiff_path, tree.path("%s_tiles" % product), args.max_concurrency)
if args.cog:
convert_to_cogeo(dem_geotiff_path, max_workers=args.max_concurrency)
progress += 30
self.update_progress(progress)
else:

View file

@ -31,5 +31,6 @@ class ODMFilterPoints(types.ODM_Stage):
log.ODM_WARNING('Found a valid point cloud file in: %s' %
tree.filtered_point_cloud)
if args.optimize_disk_space:
os.remove(inputPointCloud)
if args.optimize_disk_space and inputPointCloud:
if os.path.isfile(inputPointCloud):
os.remove(inputPointCloud)

View file

@ -1,6 +1,9 @@
import os
import struct
import pipes
import fiona
import fiona.crs
from collections import OrderedDict
from opendm import io
from opendm import log
@ -10,12 +13,70 @@ from opendm import context
from opendm.cropper import Cropper
from opendm import point_cloud
from opendm.multispectral import get_primary_band_name
from opendm.osfm import OSFMContext
class ODMGeoreferencingStage(types.ODM_Stage):
def process(self, args, outputs):
tree = outputs['tree']
reconstruction = outputs['reconstruction']
# Export GCP information if available
gcp_export_file = tree.path("odm_georeferencing", "ground_control_points.gpkg")
gcp_gml_export_file = tree.path("odm_georeferencing", "ground_control_points.gml")
if reconstruction.has_gcp() and (not io.file_exists(gcp_export_file) or self.rerun()):
octx = OSFMContext(tree.opensfm)
gcps = octx.ground_control_points(reconstruction.georef.proj4())
if len(gcps):
gcp_schema = {
'geometry': 'Point',
'properties': OrderedDict([
('id', 'str'),
('observations_count', 'int'),
('observations_list', 'str'),
('triangulated_x', 'float'),
('triangulated_y', 'float'),
('triangulated_z', 'float'),
('error_x', 'float'),
('error_y', 'float'),
('error_z', 'float'),
])
}
# Write GeoPackage
with fiona.open(gcp_export_file, 'w', driver="GPKG",
crs=fiona.crs.from_string(reconstruction.georef.proj4()),
schema=gcp_schema) as f:
for gcp in gcps:
f.write({
'geometry': {
'type': 'Point',
'coordinates': gcp['coordinates'],
},
'properties': OrderedDict([
('id', gcp['id']),
('observations_count', len(gcp['observations'])),
('observations_list', ",".join(gcp['observations'])),
('triangulated_x', gcp['triangulated'][0]),
('triangulated_y', gcp['triangulated'][1]),
('triangulated_z', gcp['triangulated'][2]),
('error_x', gcp['error'][0]),
('error_y', gcp['error'][1]),
('error_z', gcp['error'][2]),
])
})
# Write GML
try:
system.run('ogr2ogr -of GML "{}" "{}"'.format(gcp_gml_export_file, gcp_export_file))
except Exception as e:
log.ODM_WARNING("Cannot generate ground control points GML file: %s" % str(e))
else:
log.ODM_WARNING("GCPs could not be loaded for writing to %s" % gcp_export_file)
if not io.file_exists(tree.odm_georeferencing_model_laz) or self.rerun():
cmd = ('pdal translate -i "%s" -o \"%s\"' % (tree.filtered_point_cloud, tree.odm_georeferencing_model_laz))
stages = ["ferry"]
@ -35,6 +96,12 @@ class ODMGeoreferencingStage(types.ODM_Stage):
'--writers.las.offset_z=0',
'--writers.las.a_srs="%s"' % reconstruction.georef.proj4()
]
if reconstruction.has_gcp() and io.file_exists(gcp_gml_export_file):
log.ODM_INFO("Embedding GCP info in point cloud")
params += [
'--writers.las.vlrs="{\\\"filename\\\": \\\"%s\\\", \\\"user_id\\\": \\\"ODM_GCP\\\", \\\"description\\\": \\\"Ground Control Points (GML)\\\"}"' % gcp_gml_export_file
]
system.run(cmd + ' ' + ' '.join(stages) + ' ' + ' '.join(params))
@ -68,7 +135,7 @@ class ODMGeoreferencingStage(types.ODM_Stage):
else:
log.ODM_WARNING('Found a valid georeferenced model in: %s'
% tree.odm_georeferencing_model_laz)
if args.optimize_disk_space and io.file_exists(tree.odm_georeferencing_model_laz) and io.file_exists(tree.filtered_point_cloud):
os.remove(tree.filtered_point_cloud)
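Unescaped, the --writers.las.vlrs value built above is just a small JSON document, shown here for readability (the filename is an example):

vlr = {
    "filename": "/path/to/ground_control_points.gml",   # example path
    "user_id": "ODM_GCP",
    "description": "Ground Control Points (GML)"
}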

View file

@ -9,10 +9,11 @@ from opendm import gsd
from opendm import orthophoto
from opendm.concurrency import get_max_memory
from opendm.cutline import compute_cutline
from pipes import quote
from opendm.utils import double_quote
from opendm import pseudogeo
from opendm.multispectral import get_primary_band_name
class ODMOrthoPhotoStage(types.ODM_Stage):
def process(self, args, outputs):
tree = outputs['tree']
@ -23,18 +24,11 @@ class ODMOrthoPhotoStage(types.ODM_Stage):
system.mkdir_p(tree.odm_orthophoto)
if not io.file_exists(tree.odm_orthophoto_tif) or self.rerun():
gsd_error_estimate = 0.1
ignore_resolution = False
if not reconstruction.is_georeferenced():
# Match DEMs
gsd_error_estimate = -3
ignore_resolution = True
resolution = 1.0 / (gsd.cap_resolution(args.orthophoto_resolution, tree.opensfm_reconstruction,
gsd_error_estimate=gsd_error_estimate,
ignore_gsd=args.ignore_gsd,
ignore_resolution=ignore_resolution,
has_gcp=reconstruction.has_gcp()) / 100.0)
ignore_gsd=args.ignore_gsd,
ignore_resolution=not reconstruction.is_georeferenced(),
has_gcp=reconstruction.has_gcp()) / 100.0)
# odm_orthophoto definitions
kwargs = {
@ -63,16 +57,16 @@ class ODMOrthoPhotoStage(types.ODM_Stage):
if not primary:
subdir = band['name'].lower()
models.append(os.path.join(base_dir, subdir, model_file))
kwargs['bands'] = '-bands %s' % (','.join([quote(b['name']) for b in reconstruction.multi_camera]))
kwargs['bands'] = '-bands %s' % (','.join([double_quote(b['name']) for b in reconstruction.multi_camera]))
else:
models.append(os.path.join(base_dir, model_file))
kwargs['models'] = ','.join(map(quote, models))
kwargs['models'] = ','.join(map(double_quote, models))
# run odm_orthophoto
system.run('{odm_ortho_bin} -inputFiles {models} '
'-logFile {log} -outputFile {ortho} -resolution {res} {verbose} '
'-outputCornerFile {corners} {bands}'.format(**kwargs))
system.run('"{odm_ortho_bin}" -inputFiles {models} '
'-logFile "{log}" -outputFile "{ortho}" -resolution {res} {verbose} '
'-outputCornerFile "{corners}" {bands}'.format(**kwargs))
# Create georeferenced GeoTiff
geotiffcreated = False
@ -114,7 +108,7 @@ class ODMOrthoPhotoStage(types.ODM_Stage):
'-a_srs \"{proj}\" '
'--config GDAL_CACHEMAX {max_memory}% '
'--config GDAL_TIFF_INTERNAL_MASK YES '
'{input} {output} > {log}'.format(**kwargs))
'"{input}" "{output}" > "{log}"'.format(**kwargs))
bounds_file_path = os.path.join(tree.odm_georeferencing, 'odm_georeferenced_model.bounds.gpkg')
@ -147,7 +141,7 @@ class ODMOrthoPhotoStage(types.ODM_Stage):
if io.file_exists(tree.odm_orthophoto_render):
pseudogeo.add_pseudo_georeferencing(tree.odm_orthophoto_render)
log.ODM_INFO("Renaming %s --> %s" % (tree.odm_orthophoto_render, tree.odm_orthophoto_tif))
os.rename(tree.odm_orthophoto_render, tree.odm_orthophoto_tif)
os.replace(tree.odm_orthophoto_render, tree.odm_orthophoto_tif)
else:
log.ODM_WARNING("Could not generate an orthophoto (it did not render)")
else:

View file

@ -0,0 +1,51 @@
import os
from osgeo import gdal
from opendm import io
from opendm import log
from opendm import types
from opendm.utils import copy_paths, get_processing_results_paths
class ODMPostProcess(types.ODM_Stage):
def process(self, args, outputs):
tree = outputs['tree']
reconstruction = outputs['reconstruction']
log.ODM_INFO("Post Processing")
if not outputs['large']:
# TODO: support for split-merge?
# Embed GCP info in 2D results via
# XML metadata fields
gcp_gml_export_file = tree.path("odm_georeferencing", "ground_control_points.gml")
if reconstruction.has_gcp() and io.file_exists(gcp_gml_export_file):
skip_embed_gcp = False
gcp_xml = ""
with open(gcp_gml_export_file) as f:
gcp_xml = f.read()
for product in [tree.odm_orthophoto_tif,
tree.path("odm_dem", "dsm.tif"),
tree.path("odm_dem", "dtm.tif")]:
if os.path.isfile(product):
ds = gdal.Open(product)
if ds is not None:
if ds.GetMetadata('xml:GROUND_CONTROL_POINTS') is None or self.rerun():
ds.SetMetadata(gcp_xml, 'xml:GROUND_CONTROL_POINTS')
ds = None
log.ODM_INFO("Wrote xml:GROUND_CONTROL_POINTS metadata to %s" % product)
else:
skip_embed_gcp = True
log.ODM_WARNING("Already embedded ground control point information")
break
else:
log.ODM_WARNING("Cannot open %s for writing, skipping GCP embedding" % product)
if args.copy_to:
try:
copy_paths([os.path.join(args.project_path, p) for p in get_processing_results_paths()], args.copy_to, self.rerun())
except Exception as e:
log.ODM_WARNING("Cannot copy to %s: %s" % (args.copy_to, str(e)))

View file

@ -195,4 +195,4 @@ class ODMReport(types.ODM_Stage):
else:
log.ODM_WARNING("Cannot generate overlap diagram, point cloud stats missing")
octx.export_report(os.path.join(tree.odm_report, "report.pdf"), odm_stats, self.rerun())
octx.export_report(os.path.join(tree.odm_report, "report.pdf"), odm_stats, self.rerun())

View file

@ -20,8 +20,7 @@ class ODMOpenMVSStage(types.ODM_Stage):
octx = OSFMContext(tree.opensfm)
if not photos:
log.ODM_ERROR('Not enough photos in photos array to start OpenMVS')
exit(1)
raise system.ExitException('Not enough photos in photos array to start OpenMVS')
# check if reconstruction was done before
if not io.file_exists(tree.openmvs_model) or self.rerun():
@ -73,6 +72,9 @@ class ODMOpenMVSStage(types.ODM_Stage):
if args.pc_tile:
config.append("--fusion-mode 1")
if not args.pc_geometric:
config.append("--geometric-iters 0")
system.run('%s "%s" %s' % (context.omvs_densify_path,
openmvs_scene_file,
@ -96,8 +98,7 @@ class ODMOpenMVSStage(types.ODM_Stage):
scene_files = glob.glob(os.path.join(tree.openmvs, "scene_[0-9][0-9][0-9][0-9].mvs"))
if len(scene_files) == 0:
log.ODM_ERROR("No OpenMVS scenes found. This could be a bug, or the reconstruction could not be processed.")
exit(1)
raise system.ExitException("No OpenMVS scenes found. This could be a bug, or the reconstruction could not be processed.")
log.ODM_INFO("Fusing depthmaps for %s scenes" % len(scene_files))
@ -144,7 +145,7 @@ class ODMOpenMVSStage(types.ODM_Stage):
log.ODM_ERROR("Could not compute dense point cloud (no PLY files available).")
if len(scene_ply_files) == 1:
# Simply rename
os.rename(scene_ply_files[0], tree.openmvs_model)
os.replace(scene_ply_files[0], tree.openmvs_model)
log.ODM_INFO("%s --> %s"% (scene_ply_files[0], tree.openmvs_model))
else:
# Merge
@ -159,8 +160,7 @@ class ODMOpenMVSStage(types.ODM_Stage):
]
system.run('%s %s' % (context.omvs_densify_path, ' '.join(config)))
else:
log.ODM_WARNING("Cannot find scene_dense.mvs, dense reconstruction probably failed. Exiting...")
exit(1)
raise system.ExitException("Cannot find scene_dense.mvs, dense reconstruction probably failed. Exiting...")
# TODO: add support for image masks

View file

@ -17,6 +17,8 @@ from opendm import thermal
from opendm import nvm
from opendm.photo import find_largest_photo
from opensfm.undistort import add_image_format_extension
class ODMOpenSfMStage(types.ODM_Stage):
def process(self, args, outputs):
tree = outputs['tree']
@ -24,8 +26,7 @@ class ODMOpenSfMStage(types.ODM_Stage):
photos = reconstruction.photos
if not photos:
log.ODM_ERROR('Not enough photos in photos array to start OpenSfM')
exit(1)
raise system.ExitException('Not enough photos in photos array to start OpenSfM')
octx = OSFMContext(tree.opensfm)
octx.setup(args, tree.dataset_raw, reconstruction=reconstruction, rerun=self.rerun())
@ -39,7 +40,7 @@ class ODMOpenSfMStage(types.ODM_Stage):
def cleanup_disk_space():
if args.optimize_disk_space:
for folder in ["features", "matches", "exif", "reports"]:
for folder in ["features", "matches", "reports"]:
folder_path = octx.path(folder)
if os.path.exists(folder_path):
if os.path.islink(folder_path):
@ -70,7 +71,7 @@ class ODMOpenSfMStage(types.ODM_Stage):
geocoords_flag_file = octx.path("exported_geocoords.txt")
if reconstruction.is_georeferenced() and (not io.file_exists(geocoords_flag_file) or self.rerun()):
octx.run('export_geocoords --reconstruction --proj \'%s\' --offset-x %s --offset-y %s' %
octx.run('export_geocoords --reconstruction --proj "%s" --offset-x %s --offset-y %s' %
(reconstruction.georef.proj4(), reconstruction.georef.utm_east_offset, reconstruction.georef.utm_north_offset))
# Destructive
shutil.move(tree.opensfm_geocoords_reconstruction, tree.opensfm_reconstruction)
@ -159,6 +160,7 @@ class ODMOpenSfMStage(types.ODM_Stage):
# We finally restore the original files later
added_shots_file = octx.path('added_shots_done.txt')
s2p, p2s = None, None
if not io.file_exists(added_shots_file) or self.rerun():
primary_band_name = multispectral.get_primary_band_name(reconstruction.multi_camera, args.primary_band)
@ -214,12 +216,12 @@ class ODMOpenSfMStage(types.ODM_Stage):
# Primary band maps to itself
if band['name'] == primary_band_name:
img_map[fname + '.tif'] = fname + '.tif'
img_map[add_image_format_extension(fname, 'tif')] = add_image_format_extension(fname, 'tif')
else:
band_filename = next((p.filename for p in p2s[fname] if p.band_name == band['name']), None)
if band_filename is not None:
img_map[fname + '.tif'] = band_filename + '.tif'
img_map[add_image_format_extension(fname, 'tif')] = add_image_format_extension(band_filename, 'tif')
else:
log.ODM_WARNING("Cannot find %s band equivalent for %s" % (band, fname))

View file

@ -17,8 +17,9 @@ from opendm.concurrency import get_max_memory
from opendm.remote import LocalRemoteExecutor
from opendm.shots import merge_geojson_shots
from opendm import point_cloud
from pipes import quote
from opendm.utils import double_quote
from opendm.tiles.tiler import generate_dem_tiles
from opendm.cogeo import convert_to_cogeo
class ODMSplitStage(types.ODM_Stage):
def process(self, args, outputs):
@ -43,9 +44,9 @@ class ODMSplitStage(types.ODM_Stage):
log.ODM_INFO("Large dataset detected (%s photos) and split set at %s. Preparing split merge." % (len(photos), args.split))
config = [
"submodels_relpath: ../submodels/opensfm",
"submodel_relpath_template: ../submodels/submodel_%04d/opensfm",
"submodel_images_relpath_template: ../submodels/submodel_%04d/images",
"submodels_relpath: " + os.path.join("..", "submodels", "opensfm"),
"submodel_relpath_template: " + os.path.join("..", "submodels", "submodel_%04d", "opensfm"),
"submodel_images_relpath_template: " + os.path.join("..", "submodels", "submodel_%04d", "images"),
"submodel_size: %s" % args.split,
"submodel_overlap: %s" % args.split_overlap,
]
@ -219,7 +220,7 @@ class ODMSplitStage(types.ODM_Stage):
argv = get_submodel_argv(args, tree.submodels_path, sp_octx.name())
# Re-run the ODM toolchain on the submodel
system.run(" ".join(map(quote, map(str, argv))), env_vars=os.environ.copy())
system.run(" ".join(map(double_quote, map(str, argv))), env_vars=os.environ.copy())
else:
lre.set_projects([os.path.abspath(os.path.join(p, "..")) for p in submodel_paths])
lre.run_toolchain()
@ -242,8 +243,8 @@ class ODMMergeStage(types.ODM_Stage):
if outputs['large']:
if not os.path.exists(tree.submodels_path):
log.ODM_ERROR("We reached the merge stage, but %s folder does not exist. Something must have gone wrong at an earlier stage. Check the log and fix possible problem before restarting?" % tree.submodels_path)
exit(1)
raise system.ExitException("We reached the merge stage, but %s folder does not exist. Something must have gone wrong at an earlier stage. Check the log and fix possible problem before restarting?" % tree.submodels_path)
# Merge point clouds
if args.merge in ['all', 'pointcloud']:
@ -337,6 +338,9 @@ class ODMMergeStage(types.ODM_Stage):
if args.tiles:
generate_dem_tiles(dem_file, tree.path("%s_tiles" % human_name.lower()), args.max_concurrency)
if args.cog:
convert_to_cogeo(dem_file, max_workers=args.max_concurrency)
else:
log.ODM_WARNING("Cannot merge %s, %s was not created" % (human_name, dem_file))
@ -363,8 +367,9 @@ class ODMMergeStage(types.ODM_Stage):
else:
log.ODM_WARNING("Found merged shots.geojson in %s" % tree.odm_report)
# Stop the pipeline short! We're done.
self.next_stage = None
# Stop the pipeline short by skipping to the postprocess stage.
# Afterwards, we're done.
self.next_stage = self.last_stage()
else:
log.ODM_INFO("Normal dataset, nothing to merge.")
self.progress = 0.0

View file

@ -55,7 +55,7 @@ if [ "$1" = "--setup" ]; then
# Misc aliases
echo "alias pdal=/code/SuperBuild/install/bin/pdal" >> $HOME/.bashrc
echo "alias opensfm=/code/SuperBuild/src/opensfm/bin/opensfm" >> $HOME/.bashrc
echo "alias opensfm=/code/SuperBuild/install/bin/opensfm/bin/opensfm" >> $HOME/.bashrc
su -c bash $2

View file

@ -0,0 +1,24 @@
eigen3:x64-windows
suitesparse:x64-windows
lapack:x64-windows
tbb:x64-windows
glog:x64-windows
curl:x64-windows
libxml2:x64-windows
zlib:x64-windows
libpng:x64-windows
libjpeg-turbo:x64-windows
tiff:x64-windows
flann:x64-windows
boost-filesystem:x64-windows
boost-date-time:x64-windows
boost-iostreams:x64-windows
boost-foreach:x64-windows
boost-signals2:x64-windows
boost-interprocess:x64-windows
boost-graph:x64-windows
boost-asio:x64-windows
boost-program-options:x64-windows
libgeotiff:x64-windows
cgal:x64-windows
yasm-tool:x86-windows

win32env.bat (new file, mode 100644, +51 lines)
View file

@ -0,0 +1,51 @@
@echo off
rem This file is UTF-8 encoded, so we need to update the current code page while executing it
for /f "tokens=2 delims=:." %%a in ('"%SystemRoot%\System32\chcp.com"') do (
set _OLD_CODEPAGE=%%a
)
if defined _OLD_CODEPAGE (
"%SystemRoot%\System32\chcp.com" 65001 > nul
)
set ODMBASE=%~dp0
set GDALBASE=%ODMBASE%venv\Lib\site-packages\osgeo
set OSFMBASE=%ODMBASE%SuperBuild\install\bin\opensfm\bin
set SBBIN=%ODMBASE%SuperBuild\install\bin
set PATH=%GDALBASE%;%SBBIN%;%OSFMBASE%
set PROJ_LIB=%GDALBASE%\data\proj
set VIRTUAL_ENV=%ODMBASE%venv
set PYTHONPATH=%VIRTUAL_ENV%
set PYENVCFG=%VIRTUAL_ENV%\pyvenv.cfg
rem Hot-patching pyvenv.cfg
echo home = %ODMBASE%\python38> %PYENVCFG%
echo include-system-site-packages = false>> %PYENVCFG%
rem Hot-patching cv2 extension configs
echo BINARIES_PATHS = [r"%SBBIN%"] + BINARIES_PATHS> venv\Lib\site-packages\cv2\config.py
echo PYTHON_EXTENSIONS_PATHS = [r'%VIRTUAL_ENV%\lib\site-packages\cv2\python-3.8'] + PYTHON_EXTENSIONS_PATHS> venv\Lib\site-packages\cv2\config-3.8.py
if not defined PROMPT set PROMPT=$P$G
if defined _OLD_VIRTUAL_PROMPT set PROMPT=%_OLD_VIRTUAL_PROMPT%
if defined _OLD_VIRTUAL_PYTHONHOME set PYTHONHOME=%_OLD_VIRTUAL_PYTHONHOME%
set _OLD_VIRTUAL_PROMPT=%PROMPT%
set PROMPT=(venv) %PROMPT%
if defined PYTHONHOME set _OLD_VIRTUAL_PYTHONHOME=%PYTHONHOME%
set PYTHONHOME=
if defined _OLD_VIRTUAL_PATH set PATH=%_OLD_VIRTUAL_PATH%
if not defined _OLD_VIRTUAL_PATH set _OLD_VIRTUAL_PATH=%PATH%
set PATH=%VIRTUAL_ENV%\Scripts;%PATH%
:END
if defined _OLD_CODEPAGE (
"%SystemRoot%\System32\chcp.com" %_OLD_CODEPAGE% > nul
set _OLD_CODEPAGE=
)

winrun.bat (new file, mode 100644, +12 lines)
View file

@ -0,0 +1,12 @@
@echo off
setlocal
call win32env.bat
python "%ODMBASE%\run.py" %*
endlocal
if defined ODM_NONINTERACTIVE (
exit
)