From ff0d2e1e643e860cf1e4eee1d3dc93b31edf53bf Mon Sep 17 00:00:00 2001
From: boxanm <47394922+boxanm@users.noreply.github.com>
Date: Mon, 9 Dec 2024 13:03:44 -0500
Subject: [PATCH 1/3] Add the DREAM lab to the list of libpointmatcher applications
(#592)
---
README.md | 27 ++++++++++++++-------------
1 file changed, 14 insertions(+), 13 deletions(-)
diff --git a/README.md b/README.md
index d879a3de..4901ed71 100644
--- a/README.md
+++ b/README.md
@@ -30,7 +30,7 @@ The library is written in C++ for efficiency with [bindings in Python](doc/index
[](https://github.com/ahundt/awesome-robotics#point-clouds)
-
+
LIDAR
@@ -48,9 +48,9 @@ The library is written in C++ for efficiency with [bindings in Python](doc/index
### Supported OS And Architecture
_libpointmatcher_ is tested on our build system under the following architecture and OS:
-- Ubuntu bionic (18.04), focal (20.04) and jammy (22.04)
+- Ubuntu bionic (18.04), focal (20.04) and jammy (22.04)
- x86 and arm64/v8
-
+
Note:
- _libpointmatcher_ reportedly works on MacOs OsX (latest) and Windows (latest)
@@ -66,7 +66,7 @@ Execute the following to clone the repository with its submodule:
```shell
git clone --recurse-submodules https://github.com/norlab-ulaval/libpointmatcher.git
```
-If _libpointmatcher_ was previously cloned, execute the following to fetch its new submodule
+If _libpointmatcher_ was previously cloned, execute the following to fetch its new submodule
```shell
git submodule update --remote --recursive --init
```
@@ -79,12 +79,12 @@ on your workstation to speed up your local development workflow.
[//]: # (====Body=================================================================================)
# Documentation and Tutorials
-
+
**Quick link for the tutorial pages: [Tutorials](http://libpointmatcher.readthedocs.org/)**.
-Those tutorials are written using Markdown syntax and stored in the project's `/doc` folder. Their scope ranges from introductory material on performing point cloud registration to instructions for the more experienced developer on how to extend the library's codebase.
+Those tutorials are written using Markdown syntax and stored in the project's `/doc` folder. Their scope ranges from introductory material on performing point cloud registration to instructions for the more experienced developer on how to extend the library's codebase.
-Libpointmatcher's source code is fully documented based on doxygen to provide an easy API to developers. An example of this API can be found [here](https://norlab.ulaval.ca/libpointmatcher-doc/), but it is suggested to use the one build for your version in `doc/html`.
+Libpointmatcher's source code is fully documented using Doxygen to provide an easy API to developers. An example of this API can be found [here](https://norlab.ulaval.ca/libpointmatcher-doc/), but it is suggested to use the one built for your version in `doc/html`.
libpointmatcher was originally developed by [François Pomerleau](mailto:f.pomerleau@gmail.com) and [Stéphane Magnenat](http://stephane.magnenat.net) as part of our work at [ASL-ETH](http://www.asl.ethz.ch).
It is now maintained by the Northern Robotics Laboratory ([Norlab](https://norlab.ulaval.ca/)), led by François Pomerleau.
@@ -102,7 +102,7 @@ The library has a light dependency list:
* [Eigen] version 3, a modern C++ matrix and linear-algebra library,
* [boost] version 1.48 and up, portable C++ source libraries,
* [libnabo] version 1.0.7, a fast K Nearest Neighbour library for low-dimensional spaces,
-
+
and was compiled on:
* Ubuntu ([see how](/doc/CompilationUbuntu.md))
* Mac OS X ([see how](/doc/CompilationMac.md))
@@ -131,9 +131,9 @@ bash lpm_install_docker_tools.bash
```
-### Compilation & Installation
+### Compilation & Installation
-For beginner users unfamiliar with compiling and installing a library in Linux, go [here](doc/CompilationUbuntu.md) for detailed instructions on compiling libpointmatcher from the source code.
+For beginner users unfamiliar with compiling and installing a library in Linux, go [here](doc/CompilationUbuntu.md) for detailed instructions on compiling libpointmatcher from the source code.
For convenience, you can use the provided installer script for Ubuntu
```shell
@@ -214,7 +214,7 @@ and/or
If you are interested in learning more about different registration algorithms, we recently put together a literature review surveying multiple solutions. The review is organized in the same way as the library and many examples are provided based on real deployments.
-F. Pomerleau, F. Colas and R. Siegwart (2015), "_A Review of Point Cloud Registration Algorithms for Mobile Robotics_", __Foundations and Trends® in Robotics__: Vol. 4: No. 1, pp 1-104. https://doi.org/10.1561/2300000035
+F. Pomerleau, F. Colas and R. Siegwart (2015), "_A Review of Point Cloud Registration Algorithms for Mobile Robotics_", __Foundations and Trends® in Robotics__: Vol. 4: No. 1, pp 1-104. https://doi.org/10.1561/2300000035
If you don't have access to the journal, you can download it from [here](https://www.researchgate.net/publication/277558596_A_Review_of_Point_Cloud_Registration_Algorithms_for_Mobile_Robotics).
@@ -226,7 +226,7 @@ We also produced those freely available data sets to test different registration

-You can download the files in CSV or VTK formats, which are directly supported by the library I/O module.
+You can download the files in CSV or VTK formats, which are directly supported by the library I/O module.
# Projects and Partners
@@ -241,7 +241,8 @@ If you are using libpointmatcher in your project and you would like to have it l
* [Norlab](https://norlab.ulaval.ca/) is maintaining and using the library for its research on autonomous navigation in harsh environments.
* [ANYbotics AG](https://www.anybotics.com) is investigating autonomous navigation algorithms using this library.
- * [Point Laz Mining LiDAR Expert](https://www.pointlaz.com/) is scanning mine shafts to ensure infrastructure safety.
+ * [Point Laz Mining LiDAR Expert](https://www.pointlaz.com/) is scanning mine shafts to ensure infrastructure safety.
+ * [DREAM lab](https://dream.georgiatech-metz.fr/research/woodseer/) uses libpointmatcher to reconstruct wood logs in 3D.
For a larger list of work realized with libpointmatcher, please see the page [Applications And Publications](/doc/ApplicationsAndPub.md).
From 8e180b2d9f8a44591d544a7f83f799cee09f3010 Mon Sep 17 00:00:00 2001
From: boxanm <47394922+boxanm@users.noreply.github.com>
Date: Tue, 10 Dec 2024 09:23:14 -0500
Subject: [PATCH 2/3] Add support for binary PLY files (#588)
* Add support for binary-PLY file handling
* Add a utest for binary PLY files
* Basic float data type loading
* Adjust whitespace in build-python.yaml
* Update docs
---
.github/workflows/build-python.yaml | 28 ++---
doc/ImportExport.md | 12 +-
pointmatcher/IO.cpp | 187 ++++++++++++++++++++++------
pointmatcher/IO.h | 45 +++++--
python/pointmatcher/io.cpp | 100 +++++++--------
utest/ui/IO.cpp | 5 +
6 files changed, 256 insertions(+), 121 deletions(-)
diff --git a/.github/workflows/build-python.yaml b/.github/workflows/build-python.yaml
index d5870fca..1e8cdfdd 100644
--- a/.github/workflows/build-python.yaml
+++ b/.github/workflows/build-python.yaml
@@ -32,15 +32,15 @@ jobs:
fail-fast: false
matrix:
python_version: ['3.8', '3.9', '3.10', '3.11']
- os:
+ os:
# - windows-2019 cannot compile on Windows
# (!) TODO: The workflow contains code for Windows building, but we cannot compile it with MS MPI. Changes in the source code are needed.
- ubuntu-20.04
-
+
permissions:
contents: write
-
+
runs-on: ${{ matrix.os }}
steps:
@@ -54,10 +54,10 @@ jobs:
run: |
echo "BOOST_DIR=boost_${{ env.BOOST_MAJOR_VERSION }}_${{ env.BOOST_MINOR_VERSION }}_${{ env.BOOST_PATCH_VERSION }}" >> $GITHUB_ENV
echo "BOOST_VERSION=${{ env.BOOST_MAJOR_VERSION }}.${{ env.BOOST_MINOR_VERSION }}.${{ env.BOOST_PATCH_VERSION }}" >> $GITHUB_ENV
-
+
- name: Init env variable on Linux
if: ${{ runner.os == 'Linux' }}
- run: |
+ run: |
echo "BOOST_ARCHIVE_NAME=${{ env.BOOST_DIR }}.7z" >> $GITHUB_ENV
- name: Init base env variable on Windows
@@ -65,12 +65,12 @@ jobs:
run: |
echo "BOOST_DIR=boost_${{ env.BOOST_MAJOR_VERSION }}_${{ env.BOOST_MINOR_VERSION }}_${{ env.BOOST_PATCH_VERSION }}" | Out-File -FilePath $env:GITHUB_ENV -Append
echo "BOOST_VERSION=${{ env.BOOST_MAJOR_VERSION }}.${{ env.BOOST_MINOR_VERSION }}.${{ env.BOOST_PATCH_VERSION }}" | Out-File -FilePath $env:GITHUB_ENV -Append
-
+
- name: Init env variable on Windows
if: ${{ runner.os == 'Windows' }}
- run: |
+ run: |
echo "BOOST_ARCHIVE_NAME=${{ env.BOOST_DIR }}.zip" | Out-File -FilePath $env:GITHUB_ENV -Append
-
+
- name: Cache boost
id: cache-boost
if: ${{ runner.os == 'Linux' }}
@@ -100,7 +100,7 @@ jobs:
with:
path: ${{ env.PYBIND11_INSTALL_PATH }}
key: ${{ runner.os }}-pybind11-cache--${{ env.PYBIND11_VERSION }}-python-${{ matrix.python_version }}
-
+
- name: Make dirs on Linux
if: ${{ runner.os == 'Linux' }}
run: |
@@ -155,7 +155,7 @@ jobs:
run: |
python -m pip install -U 'pip>=23.0'
pip install 'numpy>=1.20' wheel 'setuptools>=61.0' 'build~=0.10'
-
+
- name: Install dependencies on Windows
if: ${{ runner.os == 'Windows' }}
run: |
@@ -229,7 +229,7 @@ jobs:
run: |
git clone -b ${{ env.PYBIND11_VERSION }} --single-branch https://github.com/pybind/pybind11.git
-
+
- name: Install pybind11 ${{ env.PYBIND11_VERSION }} on Linux
working-directory: ${{ env.PYBIND11_SRC_DIR }}
if: ${{ steps.cache-pybind11.outputs.cache-hit != 'true' && runner.os == 'Linux' }}
@@ -242,7 +242,7 @@ jobs:
-S . -B ./build
cmake --build ./build --target install
rm -r ./build
-
+
- name: Install pybind11 ${{ env.PYBIND11_VERSION }} on Windows
working-directory: ${{ env.PYBIND11_SRC_DIR }}
if: ${{ steps.cache-pybind11.outputs.cache-hit != 'true' && runner.os == 'Windows' }}
@@ -288,7 +288,7 @@ jobs:
-DCMAKE_BUILD_TYPE="RelWithDebInfo" `
-S . -B ${{ env.BUILD_DIR }}
cmake --build ${{ env.BUILD_DIR }} --target install
-
+
- name: Build python wheel
working-directory: ./python
run: |
@@ -298,7 +298,7 @@ jobs:
working-directory: ./python
run: |
pip install ${{ env.PYTHON_WHEEL_DIR }}/*.whl
-
+
- name: Test import
working-directory: ${{ runner.temp }}
run: python -c "from pypointmatcher import *"
diff --git a/doc/ImportExport.md b/doc/ImportExport.md
index 309e64c0..0e56e132 100644
--- a/doc/ImportExport.md
+++ b/doc/ImportExport.md
@@ -6,12 +6,12 @@ There exists a myriad of [graphics file formats](http://en.wikipedia.org/wiki/Ca
## Table of Supported File Formats
-| File Type | Extension | Versions Supported | Descriptors Supported | Additional Information |
-| --------- |:---------:|:------------------:|:---------------------:|---------|
-| Comma Separated Values | .csv | NA | yes (see [table of descriptor labels](#descmaptable)) | |
-| Visualization Toolkit Files | .vtk | Legacy format versions 3.0 and lower (ASCII only) | yes | Only polydata and unstructured grid VTK Datatypes supported. More information can be found [here](http://www.vtk.org/VTK/img/file-formats.pdf).|
-| Polygon File Format | .ply | 1.0 (ASCII only) | yes (see [table of descriptor labels](#descmaptable)) | |
-| Point Cloud Library Format | .pcd | 0.7 (ASCII only) | yes (see [table of descriptor labels](#descmaptable)) | |
+| File Type | Extension | Versions Supported | Descriptors Supported | Additional Information |
+| --------- |:---------:|:-------------------------------------------------:|:---------------------:|---------------------------------------------------------------------------------------------------------------------------------------------------|
+| Comma Separated Values | .csv | NA | yes (see [table of descriptor labels](#descmaptable)) | |
+| Visualization Toolkit Files | .vtk | Legacy format versions 3.0 and lower (ASCII only) | yes | Only polydata and unstructured grid VTK Datatypes supported. More information can be found [here](http://www.vtk.org/VTK/img/file-formats.pdf). |
+| Polygon File Format | .ply | 1.0 (ASCII and binary) | yes (see [table of descriptor labels](#descmaptable)) | Users are encouraged to execute `libpointmatcher` using `double` as the floating precision format to prevent overflows. |
+| Point Cloud Library Format | .pcd | 0.7 (ASCII only) | yes (see [table of descriptor labels](#descmaptable)) | |
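To make the table's note on floating precision concrete, here is a minimal, hypothetical sketch of loading a binary PLY file through the double-precision instantiation of the library. The file name is a placeholder, and the snippet assumes libpointmatcher and its headers are installed.

```cpp
#include <pointmatcher/PointMatcher.h>
#include <iostream>

int main()
{
    // double limits precision loss when PLY properties use wide integer types
    typedef PointMatcher<double> PM;

    // DataPoints::load() dispatches on the file extension; with this patch,
    // binary little-endian PLY files are accepted in addition to ASCII ones.
    PM::DataPoints cloud = PM::DataPoints::load("cloud.bin.ply");  // placeholder path

    std::cout << cloud.getNbPoints() << " points, "
              << cloud.getDescriptorDim() << " descriptor rows" << std::endl;
    return 0;
}
```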
## Comma Separated Values (CSV) Files
diff --git a/pointmatcher/IO.cpp b/pointmatcher/IO.cpp
index b9d3c455..3108275e 100644
--- a/pointmatcher/IO.cpp
+++ b/pointmatcher/IO.cpp
@@ -799,14 +799,14 @@ void PointMatcher::DataPoints::save(const std::string& fileName, bool binary,
const string& ext(path.extension().string());
if (boost::iequals(ext, ".vtk"))
return PointMatcherIO::saveVTK(*this, fileName, binary, precision);
+ else if (boost::iequals(ext, ".ply"))
+ return PointMatcherIO::savePLY(*this, fileName, binary, precision);
if (binary)
- throw runtime_error("save(): Binary writing is not supported together with extension \"" + ext + "\". Currently binary writing is only supported with \".vtk\".");
+ throw runtime_error("save(): Binary writing is not supported together with extension \"" + ext + "\". Currently binary writing is only supported with \".vtk\" and \".ply\".");
if (boost::iequals(ext, ".csv"))
return PointMatcherIO::saveCSV(*this, fileName, precision);
- else if (boost::iequals(ext, ".ply"))
- return PointMatcherIO::savePLY(*this, fileName, precision);
else if (boost::iequals(ext, ".pcd"))
return PointMatcherIO::savePCD(*this, fileName, precision);
else
@@ -1337,6 +1337,7 @@ typename PointMatcherIO::DataPoints PointMatcherIO::loadPLY(std::istream&
PLYElement* current_element = NULL;
bool skip_props = false; // flag to skip properties if element is not supported
unsigned elem_offset = 0; // keep track of line position of elements that are supported
+ bool is_binary = false;
string line;
safeGetLine(is, line);
@@ -1374,8 +1375,10 @@ typename PointMatcherIO::DataPoints PointMatcherIO::loadPLY(std::istream&
if (format_str != "ascii" && format_str != "binary_little_endian" && format_str != "binary_big_endian")
throw runtime_error(string("PLY parse error: format <") + format_str + string("> is not supported"));
- if (format_str == "binary_little_endian" || format_str == "binary_big_endian")
- throw runtime_error(string("PLY parse error: binary PLY files are not supported"));
+ if (format_str == "binary_little_endian")
+ is_binary = true;
+ if (format_str == "binary_big_endian")
+ throw runtime_error(string("PLY parse error: only little endian binary PLY files are currently supported"));
if (version_str != "1.0")
{
throw runtime_error(string("PLY parse error: version <") + version_str + string("> of ply is not supported"));
@@ -1386,8 +1389,6 @@ typename PointMatcherIO::DataPoints PointMatcherIO::loadPLY(std::istream&
}
else if (keyword == "element")
{
-
-
string elem_name, elem_num_s;
stringstream >> elem_name >> elem_num_s;
@@ -1528,9 +1529,9 @@ typename PointMatcherIO::DataPoints PointMatcherIO::loadPLY(std::istream&
it->pmRowID = rowIdTime;
timeLabelGen.add(supLabel.internalName);
rowIdTime++;
+ break;
default:
throw runtime_error(string("PLY Implementation Error: encounter a type different from FEATURE, DESCRIPTOR and TIME. Implementation not supported. See the definition of 'enum PMPropTypes'"));
- break;
}
// we stop searching once we have a match
@@ -1573,21 +1574,94 @@ typename PointMatcherIO::DataPoints PointMatcherIO::loadPLY(std::istream&
for(int i=0; i<nbValues; i++)
{
T value;
- if(!(is >> value))
- {
- throw runtime_error(
- (boost::format("PLY parse error: expected %1% values (%2% points with %3% properties) but only found %4% values.") % nbValues % nbPoints % nbProp % i).str());
- }
- else
- {
- const int row = vertex->properties[propID].pmRowID;
- const PMPropTypes type = vertex->properties[propID].pmType;
-
- // rescale color from [0,254] to [0, 1[
- // FIXME: do we need that?
- if (vertex->properties[propID].name == "red" || vertex->properties[propID].name == "green" || vertex->properties[propID].name == "blue" || vertex->properties[propID].name == "alpha") {
- value /= 255.0;
- }
+ if (is_binary)
+ {
+ switch(vertex->properties[propID].type)
+ {
+ case PLYProperty::PLYPropertyType::INT8:
+ {
+ int8_t temp;
+ is.read(reinterpret_cast<char*>(&temp), sizeof(int8_t));
+ value = static_cast<T>(temp);
+ break;
+ }
+ case PLYProperty::PLYPropertyType::UINT8:
+ {
+ uint8_t temp;
+ is.read(reinterpret_cast<char*>(&temp), sizeof(uint8_t));
+ value = static_cast<T>(temp);
+ break;
+ }
+ case PLYProperty::PLYPropertyType::INT16:
+ {
+ int16_t temp;
+ is.read(reinterpret_cast<char*>(&temp), sizeof(int16_t));
+ value = static_cast<T>(temp);
+ break;
+ }
+ case PLYProperty::PLYPropertyType::UINT16:
+ {
+ uint16_t temp;
+ is.read(reinterpret_cast<char*>(&temp), sizeof(uint16_t));
+ value = static_cast<T>(temp);
+ break;
+ }
+ case PLYProperty::PLYPropertyType::INT32:
+ {
+ // TODO what happens if T is float and we overflow?
+ int32_t temp;
+ is.read(reinterpret_cast<char*>(&temp), sizeof(int32_t));
+ value = static_cast<T>(temp);
+ break;
+ }
+ case PLYProperty::PLYPropertyType::UINT32:
+ {
+ // TODO what happens if T is float and we overflow?
+ uint32_t temp;
+ is.read(reinterpret_cast<char*>(&temp), sizeof(uint32_t));
+ value = static_cast<T>(temp);
+ break;
+ }
+ case PLYProperty::PLYPropertyType::FLOAT32:
+ {
+ float temp;
+ is.read(reinterpret_cast<char*>(&temp), sizeof(float));
+ value = static_cast<T>(temp); // Directly assign if T is float, or cast if T is double
+ break;
+ }
+ case PLYProperty::PLYPropertyType::FLOAT64:
+ {
+ // TODO what happens if T is float and we read a double?
+ double temp;
+ is.read(reinterpret_cast<char*>(&temp), sizeof(double));
+ value = static_cast<T>(temp); // Directly assign if T is double, or cast if T is float
+ break;
+ }
+ default:
+ {
+ throw runtime_error(
+ (boost::format("Unsupported data type in binary mode %1%.") % static_cast(vertex->properties[propID].type)).str());
+ }
+ }
+ }
+ if ((is_binary && !is) || (!is_binary && !(is >> value)))
+ {
+ throw runtime_error(
+ (boost::format("PLY parse error: expected %1% values (%2% points with %3% properties) but only found %4% values.") % nbValues % nbPoints % nbProp %
+ i).str());
+ }
+ else
+ {
+ const int row = vertex->properties[propID].pmRowID;
+ const PMPropTypes type = vertex->properties[propID].pmType;
+
+ // rescale color from [0,255] to [0, 1[
+ // FIXME: do we need that?
+ if(vertex->properties[propID].name == "red" || vertex->properties[propID].name == "green" || vertex->properties[propID].name == "blue" ||
+ vertex->properties[propID].name == "alpha")
+ {
+ value /= 255.0;
+ }
switch (type)
{
@@ -1615,8 +1689,6 @@ typename PointMatcherIO::DataPoints PointMatcherIO::loadPLY(std::istream&
}
}
-
-
///////////////////////////
// 5- ASSEMBLE FINAL DATAPOINTS
@@ -1646,7 +1718,7 @@ typename PointMatcherIO::DataPoints PointMatcherIO::loadPLY(std::istream&
template<typename T>
void PointMatcherIO<T>::savePLY(const DataPoints& data,
- const std::string& fileName, unsigned precision)
+ const std::string& fileName, bool binary, unsigned precision)
{
//typedef typename DataPoints::Labels Labels;
@@ -1666,20 +1738,36 @@ void PointMatcherIO::savePLY(const DataPoints& data,
return;
}
- ofs << "ply\n" <<"format ascii 1.0\n";
+ if (binary)
+ ofs << "ply\n" <<"format binary_little_endian 1.0\n";
+ else
+ ofs << "ply\n" <<"format ascii 1.0\n";
+
+ ofs << "comment File created with libpointmatcher\n";
ofs << "element vertex " << pointCount << "\n";
+ std::string dataType;
+
+ using ValueType = typename std::conditional<std::is_same<T, float>::value, float, double>::type;
+ if (std::is_same<ValueType, float>::value)
+ dataType = "float";
+ else
+ dataType = "double";
+
for (int f=0; f <(featCount-1); f++)
{
- ofs << "property float " << data.featureLabels[f].text << "\n";
+ ofs << "property " << dataType << " " << data.featureLabels[f].text << "\n";
}
for (size_t i = 0; i < data.descriptorLabels.size(); i++)
{
Label lab = data.descriptorLabels[i];
for (size_t s = 0; s < lab.span; s++)
-
{
- ofs << "property float " << getColLabel(lab,s) << "\n";
+ std::string label = getColLabel(lab,s);
+ if (label == "red" || label == "green" || label == "blue" || label == "alpha")
+ ofs << "property uchar " << label << "\n";
+ else
+ ofs << "property " << dataType << " " << label << "\n";
}
}
@@ -1690,9 +1778,16 @@ void PointMatcherIO::savePLY(const DataPoints& data,
{
for (int f = 0; f < featCount - 1; ++f)
{
- ofs << data.features(f, p);
- if(!(f == featCount-2 && descRows == 0))
- ofs << " ";
+ if (binary)
+ {
+ ofs.write(reinterpret_cast<const char*>(&data.features(f, p)), sizeof(T));
+ }
+ else
+ {
+ ofs << data.features(f, p);
+ if(!(f == featCount - 2 && descRows == 0))
+ ofs << " ";
+ }
}
bool datawithColor = data.descriptorExists("color");
@@ -1701,34 +1796,44 @@ void PointMatcherIO::savePLY(const DataPoints& data,
for (int d = 0; d < descRows; ++d)
{
if (datawithColor && d >= colorStartingRow && d < colorEndRow) {
- ofs << static_cast<unsigned>(data.descriptors(d, p) * 255.0);
+ if (binary)
+ {
+ char value = static_cast<char>(data.descriptors(d, p) * 255.0);
+ ofs.write(reinterpret_cast<char*>(&value), sizeof(char));
+ }
+ else
+ ofs << static_cast<unsigned>(data.descriptors(d, p) * 255.0);
} else {
- ofs << data.descriptors(d, p);
+ if (binary)
+ ofs.write(reinterpret_cast<const char*>(&data.descriptors(d, p)), sizeof(T));
+ else
+ ofs << data.descriptors(d, p);
}
- if(d != descRows-1)
+ if(d != descRows-1 && !binary)
ofs << " ";
}
- ofs << "\n";
+ if(!binary)
+ ofs << "\n";
}
ofs.close();
}
template
-void PointMatcherIO<float>::savePLY(const DataPoints& data, const std::string& fileName, unsigned precision);
+void PointMatcherIO<float>::savePLY(const DataPoints& data, const std::string& fileName, bool binary, unsigned precision);
template
-void PointMatcherIO<double>::savePLY(const DataPoints& data, const std::string& fileName, unsigned precision);
+void PointMatcherIO<double>::savePLY(const DataPoints& data, const std::string& fileName, bool binary, unsigned precision);
//! @(brief) Regular PLY property constructor
template<typename T>
PointMatcherIO<T>::PLYProperty::PLYProperty(const std::string& type,
const std::string& name, const unsigned pos) :
name(name),
- type(type),
pos(pos)
{
if (plyPropTypeValid(type))
{
+ this->type = get_type_from_string(type);
is_list = false;
}
else
@@ -1749,12 +1854,12 @@ template
PointMatcherIO<T>::PLYProperty::PLYProperty(const std::string& idx_type,
const std::string& type, const std::string& name, const unsigned pos) :
name(name),
- type(type),
idx_type(idx_type),
pos(pos)
{
if (plyPropTypeValid(idx_type) && plyPropTypeValid(type))
{
+ this->type = get_type_from_string(type);
is_list = true;
}
else
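Reading the IO.cpp changes above together: `DataPoints::save()` now forwards `.ply` to `savePLY()` before the binary check, and `savePLY()` gained a `binary` flag, so binary output can be requested for PLY just as for VTK. A hypothetical usage sketch follows; the output path is a placeholder and the precision argument is kept explicit.

```cpp
#include <pointmatcher/PointMatcher.h>

typedef PointMatcher<float> PM;

void writeBinaryPly(const PM::DataPoints& cloud)
{
    // binary = true selects the "format binary_little_endian 1.0" writer added above;
    // the same call with a .csv file name would still throw, since binary writing
    // remains limited to .vtk and .ply.
    cloud.save("cloud_out.bin.ply", true, 7);  // placeholder output path
}
```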
diff --git a/pointmatcher/IO.h b/pointmatcher/IO.h
index 1089f1b0..0aaae940 100644
--- a/pointmatcher/IO.h
+++ b/pointmatcher/IO.h
@@ -57,9 +57,9 @@ struct PointMatcherIO
//! Map to associate common descriptor sublabels to PM descriptor matrix row and labels
//! ex: nx, ny, nz are associated with (0,normals) (1,normals) (2,normals) respectively
typedef std::map<std::string, LabelAssociationPair> SublabelAssociationMap;
-
+
static std::string getColLabel(const Label& label, const int row); //!< convert a descriptor label to an appropriate sub-label
-
+
//! Type of information in a DataPoints. Each type is stored in its own dense matrix.
enum PMPropTypes
{
@@ -112,7 +112,7 @@ struct PointMatcherIO
};
//! Vector containing the mapping of all external names to PointMatcher representation.
- //! The order is important (i.e., nx before ny). This can also be used to remap
+ //! The order is important (i.e., nx before ny). This can also be used to remap
//! 1D descriptor name to a better one.
static const SupportedLabels & getSupportedExternalLabels()
{
@@ -218,7 +218,7 @@ struct PointMatcherIO
static DataPoints loadPLY(const std::string& fileName);
static DataPoints loadPLY(std::istream& is);
- static void savePLY(const DataPoints& data, const std::string& fileName, unsigned precision); //!< save datapoints to PLY point cloud format
+ static void savePLY(const DataPoints& data, const std::string& fileName, bool binary, unsigned precision); //!< save datapoints to PLY point cloud format
// PCD
static DataPoints loadPCD(const std::string& fileName);
@@ -266,16 +266,29 @@ struct PointMatcherIO
//! Interface for PLY property
struct PLYProperty
{
+ enum class PLYPropertyType : uint8_t
+ {
+ INVALID,
+ INT8,
+ UINT8,
+ INT16,
+ UINT16,
+ INT32,
+ UINT32,
+ FLOAT32,
+ FLOAT64
+ };
+
//PLY information:
std::string name; //!< name of PLY property
- std::string type; //!< type of PLY property
+ PLYPropertyType type; //!< type of PLY property
std::string idx_type; //!< for list properties, type of number of elements
unsigned pos; //!< index of the property in element
bool is_list; //!< member is true of property is a list
-
+
//PointMatcher information:
PMPropTypes pmType; //!< type of information in PointMatcher
- int pmRowID; //!< row id used in a DataPoints
+ int pmRowID; //!< row id used in a DataPoints
PLYProperty() { } //!< Default constructor. If used member values must be filled later.
@@ -286,19 +299,31 @@ struct PointMatcherIO
PLYProperty(const std::string& idx_type, const std::string& type, const std::string& name, const unsigned pos); //! list prop ctor
bool operator==(const PLYProperty& other) const; //! compare with other property
+ inline PLYPropertyType get_type_from_string(const std::string &s)
+ {
+ if (s == "int8" || s == "char") return PLYPropertyType::INT8;
+ else if (s == "uint8" || s == "uchar") return PLYPropertyType::UINT8;
+ else if (s == "int16" || s == "short") return PLYPropertyType::INT16;
+ else if (s == "uint16" || s == "ushort") return PLYPropertyType::UINT16;
+ else if (s == "int32" || s == "int") return PLYPropertyType::INT32;
+ else if (s == "uint32" || s == "uint") return PLYPropertyType::UINT32;
+ else if (s == "float32" || s == "float") return PLYPropertyType::FLOAT32;
+ else if (s == "float64" || s == "double") return PLYPropertyType::FLOAT64;
+ return PLYPropertyType::INVALID;
+ }
};
//! Map from a descriptor name to a list PLY property
//! ex: "normals" -> nx, ny ,nz
typedef std::map<std::string, std::vector<PLYProperty> > PLYDescPropMap;
-
+
//! Vector of properties specific to PLY files
typedef std::vector<PLYProperty> PLYProperties;
//! Iterator for a vector of PLY properties
typedef typename PLYProperties::iterator it_PLYProp;
- //! Interface for all PLY elements.
+ //! Interface for all PLY elements.
class PLYElement
{
public:
@@ -367,7 +392,7 @@ struct PointMatcherIO
unsigned int size; //!< Size of the property in bytes
char type; //!< Type: I: signed, U: unsigned, F: float
unsigned int count; //!< number of dimension
-
+
//PointMatcher information:
PMPropTypes pmType; //!< type of information in PointMatcher
int pmRowID; //!< row id used in a DataPoints
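The new `PLYPropertyType` enum and `get_type_from_string()` shown in this header are what let the binary reader pick a fixed-width read per property. Below is a small, hypothetical check of that mapping, assuming `pointmatcher/IO.h` is on the include path; the property names and positions are made up.

```cpp
#include <pointmatcher/IO.h>
#include <cassert>

int main()
{
    using PLYProperty = PointMatcherIO<float>::PLYProperty;

    // Both PLY spellings of a type ("float"/"float32", "uchar"/"uint8", ...)
    // are normalised to the same enum value by the constructor.
    PLYProperty x("float", "x", 0);
    PLYProperty red("uchar", "red", 3);

    assert(x.type == PLYProperty::PLYPropertyType::FLOAT32);
    assert(red.type == PLYProperty::PLYPropertyType::UINT8);
    assert(!x.is_list);
    return 0;
}
```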
diff --git a/python/pointmatcher/io.cpp b/python/pointmatcher/io.cpp
index 37cbc327..d04ee010 100644
--- a/python/pointmatcher/io.cpp
+++ b/python/pointmatcher/io.cpp
@@ -74,7 +74,7 @@ The order is important (i.e., nx before ny). This can also be used to remap 1D d
.def_static("saveVTK", (void (*)(const DataPoints&, const std::string&, bool, unsigned precision)) &PMIO::saveVTK, py::arg("data"), py::arg("fileName"), py::arg("binary") = false, py::arg("precision") = 7)
.def_static("loadPLY", (DataPoints (*)(const std::string&)) &PMIO::loadPLY, py::arg("fileName"))
- .def_static("savePLY", (void (*)(const DataPoints&, const std::string&, unsigned precision)) &PMIO::savePLY, py::arg("data"), py::arg("fileName"), py::arg("precision"), "save datapoints to PLY point cloud format")
+ .def_static("savePLY", (void (*)(const DataPoints&, const std::string&, bool binary, unsigned precision)) &PMIO::savePLY, py::arg("data"), py::arg("fileName"), py::arg("binary"), py::arg("precision"), "save datapoints to PLY point cloud format")
.def_static("loadPCD", (DataPoints (*)(const std::string&)) &PMIO::loadPCD, py::arg("fileName"))
.def_static("savePCD", (void (*)(const DataPoints&, const std::string&, unsigned precision)) &PMIO::savePCD, py::arg("data"), py::arg("fileName"), py::arg("precision"), "save datapoints to PCD point cloud format");
@@ -119,59 +119,59 @@ Note that the header must at least contain "reading".
pyPointMatcherIO
.def_static("plyPropTypeValid", &PMIO::plyPropTypeValid, "Check that property defined by type is a valid PLY type note: type must be lowercase");
- using PLYProperty = PMIO::PLYProperty;
- py::class_(pyPointMatcherIO, "PLYProperty", "Interface for PLY property")
- .def_readwrite("name", &PLYProperty::name, "name of PLY property")
- .def_readwrite("type", &PLYProperty::type, "type of PLY property")
- .def_readwrite("idx_type", &PLYProperty::idx_type, "for list properties, type of number of elements")
- .def_readwrite("pos", &PLYProperty::pos, "index of the property in element")
- .def_readwrite("is_list", &PLYProperty::is_list, "member is true of property is a list")
- .def_readwrite("pmType", &PLYProperty::pmType, "type of information in PointMatcher")
- .def_readwrite("pmRowID", &PLYProperty::pmRowID, "row id used in a DataPoints")
-
- .def(py::init<>(), "Default constructor. If used member values must be filled later.")
- .def(py::init(), py::arg("type"), py::arg("name"), py::arg("pos"), "regular property")
- .def(py::init(), py::arg("idx_type"), py::arg("type"), py::arg("name"), py::arg("pos"), "list property")
-
- .def("__eq__", &PLYProperty::operator==, "compare with other property");
-
- using PLYProperties = PMIO::PLYProperties;
- py::bind_vector(pyPointMatcherIO, "PLYProperties", "Vector of properties specific to PLY files");
+// using PLYProperty = PMIO::PLYProperty;
+// py::class_(pyPointMatcherIO, "PLYProperty", "Interface for PLY property")
+// .def_readwrite("name", &PLYProperty::name, "name of PLY property")
+// .def_readwrite("type", &PLYProperty::type, "type of PLY property")
+// .def_readwrite("idx_type", &PLYProperty::idx_type, "for list properties, type of number of elements")
+// .def_readwrite("pos", &PLYProperty::pos, "index of the property in element")
+// .def_readwrite("is_list", &PLYProperty::is_list, "member is true of property is a list")
+// .def_readwrite("pmType", &PLYProperty::pmType, "type of information in PointMatcher")
+// .def_readwrite("pmRowID", &PLYProperty::pmRowID, "row id used in a DataPoints")
+//
+// .def(py::init<>(), "Default constructor. If used member values must be filled later.")
+// .def(py::init(), py::arg("type"), py::arg("name"), py::arg("pos"), "regular property")
+// .def(py::init(), py::arg("idx_type"), py::arg("type"), py::arg("name"), py::arg("pos"), "list property")
+//
+// .def("__eq__", &PLYProperty::operator==, "compare with other property");
+//
+// using PLYProperties = PMIO::PLYProperties;
+// py::bind_vector(pyPointMatcherIO, "PLYProperties", "Vector of properties specific to PLY files");
using PLYDescPropMap = PMIO::PLYDescPropMap;
py::bind_map(pyPointMatcherIO, "PLYDescPropMap", "Map from a descriptor name to a list PLY property\nex: \"normals\" -> nx, ny ,nz");
- using PLYElement = PMIO::PLYElement;
- py::class_(pyPointMatcherIO, "PLYElement", "Interface for all PLY elements.")
- .def_readwrite("name", &PLYElement::name, "name identifying the PLY element")
- .def_readwrite("num", &PLYElement::num, "number of occurences of the element")
- .def_readwrite("total_props", &PLYElement::total_props, "total number of properties in PLY element")
- .def_readwrite("offset", &PLYElement::offset, "line at which data starts")
- .def_readwrite("properties", &PLYElement::properties, "all properties found in the header")
- .def_readwrite("nbFeatures", &PLYElement::nbFeatures, "number of valid features found in the header")
- .def_readwrite("nbDescriptors", &PLYElement::nbDescriptors, "number of valid descriptors found in the header")
-
- .def(py::init(), py::arg("name"), py::arg("num"), py::arg("offset"), R"pbdoc(
-PLY Element constructor
-
-@param name name of the ply element (case-sensitive)
-@param num number of times the element appears in the file
-@param offset if there are several elements, the line offset at which this element begins. Note that, as of writing, only one (vertex) element is supported.
-
-This object holds information about a PLY element contained in the file.
-It is filled out when reading the header and used when parsing the data.
-)pbdoc").def("__eq__", &PLYElement::operator==, "comparison operator for elements");
-
- using PLYVertex = PMIO::PLYVertex;
- py::class_(pyPointMatcherIO, "PLYVertex", "Implementation of PLY vertex element")
- .def(py::init(), py::arg("num"), py::arg("offset"), R"pbdoc(
-Constructor
-
-@param num number of times the element appears in the file
-@param offset if there are several elements, the line offset at which this element begins. Note that, as of writing, only one (vertex) element is supported.
-
-Implementation of PLY element interface for the vertex element
-)pbdoc");
+// using PLYElement = PMIO::PLYElement;
+// py::class_(pyPointMatcherIO, "PLYElement", "Interface for all PLY elements.")
+// .def_readwrite("name", &PLYElement::name, "name identifying the PLY element")
+// .def_readwrite("num", &PLYElement::num, "number of occurences of the element")
+// .def_readwrite("total_props", &PLYElement::total_props, "total number of properties in PLY element")
+// .def_readwrite("offset", &PLYElement::offset, "line at which data starts")
+// .def_readwrite("properties", &PLYElement::properties, "all properties found in the header")
+// .def_readwrite("nbFeatures", &PLYElement::nbFeatures, "number of valid features found in the header")
+// .def_readwrite("nbDescriptors", &PLYElement::nbDescriptors, "number of valid descriptors found in the header")
+//
+// .def(py::init(), py::arg("name"), py::arg("num"), py::arg("offset"), R"pbdoc(
+//PLY Element constructor
+//
+//@param name name of the ply element (case-sensitive)
+//@param num number of times the element appears in the file
+//@param offset if there are several elements, the line offset at which this element begins. Note that, as of writing, only one (vertex) element is supported.
+//
+//This object holds information about a PLY element contained in the file.
+//It is filled out when reading the header and used when parsing the data.
+//)pbdoc").def("__eq__", &PLYElement::operator==, "comparison operator for elements");
+//
+// using PLYVertex = PMIO::PLYVertex;
+// py::class_(pyPointMatcherIO, "PLYVertex", "Implementation of PLY vertex element")
+// .def(py::init(), py::arg("num"), py::arg("offset"), R"pbdoc(
+//Constructor
+//
+//@param num number of times the element appears in the file
+//@param offset if there are several elements, the line offset at which this element begins. Note that, as of writing, only one (vertex) element is supported.
+//
+//Implementation of PLY element interface for the vertex element
+//)pbdoc");
// FIXME : Generate undefined symbol error for "elementSupported" or "createElement" method when importing the module
// using PLYElementF = PMIO::PLYElementF;
diff --git a/utest/ui/IO.cpp b/utest/ui/IO.cpp
index a76d986e..a4fcea8c 100644
--- a/utest/ui/IO.cpp
+++ b/utest/ui/IO.cpp
@@ -463,6 +463,11 @@ TEST_F(IOLoadSaveTest, PLY)
loadSaveTest(dataPath + "unit_test.ply", true);
}
+TEST_F(IOLoadSaveTest, PLYBinary)
+{
+ loadSaveTest(dataPath + "unit_test.bin.ply", true, 1, true);
+}
+
TEST_F(IOLoadSaveTest, PCD)
{
loadSaveTest(dataPath + "unit_test.pcd");
From 164b07fa28162a34715cb56079e4f0f943a0ba0e Mon Sep 17 00:00:00 2001
From: boxanm <47394922+boxanm@users.noreply.github.com>
Date: Tue, 10 Dec 2024 14:00:01 -0500
Subject: [PATCH 3/3] Remove a note mentioning that yaml-cpp is optional from
documentation
---
pointmatcher/Documentation.dox | 6 +++---
1 file changed, 3 insertions(+), 3 deletions(-)
diff --git a/pointmatcher/Documentation.dox b/pointmatcher/Documentation.dox
index 1fbbb3ae..1b85ce96 100644
--- a/pointmatcher/Documentation.dox
+++ b/pointmatcher/Documentation.dox
@@ -62,7 +62,7 @@ You can list the available modules with:
pmicp -l
\endcode
-If you have compiled libpointmatcher with \ref yaml-cpp enabled, you can configure the ICP chain without any recompilation by passing a configuration file to the \c pmicp command using the \c --config switch. An example file is available in \c examples/data/default.yaml.
+Thanks to \ref yaml-cpp , you can configure the ICP chain without any recompilation by passing a configuration file to the \c pmicp command using the \c --config switch. An example file is available in \c examples/data/default.yaml.
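The same YAML-driven configuration is available from C++ as well; the following is a brief, hypothetical sketch, where the config path assumes the repository root as working directory.

```cpp
#include <pointmatcher/PointMatcher.h>
#include <fstream>

typedef PointMatcher<float> PM;

void configureIcp(PM::ICP& icp)
{
    std::ifstream config("examples/data/default.yaml");
    if (config.good())
        icp.loadFromYaml(config);  // same chain as `pmicp --config examples/data/default.yaml`
    else
        icp.setDefault();          // fall back to the compiled-in default chain
}
```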
\section Understanding Understanding libpointmatcher
@@ -75,7 +75,7 @@ In libpointmatcher, every module is a class that can describe its own possible p
This text-based configuration aids to explicit parameters used and eases the sharing of working setups with others, which ultimately allows for reproducibility and reusability of the solutions.
The ICP chain takes as input two point clouds, in 2D or 3D, and estimates the translation and the rotation parameters that minimize the alignment error.
-We called the first point cloud the \e reference and the second the \e reading.
+We called the first point cloud the \e reference and the second the \e reading.
The ICP algorithm tries to align the reading onto the reference.
To do so, it first applies filtering (PointMatcher::DataPointsFilters) to the point clouds, and then it iterates through a sequence of processing blocks.
For each iteration, it associates points in reading to points in reference (PointMatcher::Matcher), rejects outliers (PointMatcher::OutlierFilters) and finds a transformation (PointMatcher::TransformationParameters) of reading that minimizes the alignment error (PointMatcher::ErrorMinimizer).
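As a rough, hypothetical end-to-end sketch of the chain described above (default configuration, placeholder file names):

```cpp
#include <pointmatcher/PointMatcher.h>
#include <iostream>

typedef PointMatcher<float> PM;

int main()
{
    const PM::DataPoints reading = PM::DataPoints::load("reading.csv");      // placeholder
    const PM::DataPoints reference = PM::DataPoints::load("reference.csv");  // placeholder

    PM::ICP icp;
    icp.setDefault();  // default filters, matcher, outlier filters and error minimizer

    // The returned transformation aligns the reading onto the reference.
    const PM::TransformationParameters T = icp(reading, reference);
    std::cout << "Final transformation:\n" << T << std::endl;
    return 0;
}
```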
@@ -126,7 +126,7 @@ You have to modify 3 files to add a new \ref PointMatcher::DataPointsFilter "Dat
- The types of the parameters are used to properly cast the string value and can be: \c &P::Comp, \c &P::Comp, \c &P::Comp, etc. See \c DataPointsFiltersImpl.h for examples.
- Uncomment and rename the constructor.
-- In \c DataPointsFiltersImpl.cpp, copy the implementation of \c IdentityDataPointsFilter at the end of the file including the explicit instantiation (i.e. \c template \c struct \c DataPointsFiltersImpl::YourFilter; and \c template \c struct
+- In \c DataPointsFiltersImpl.cpp, copy the implementation of \c IdentityDataPointsFilter at the end of the file including the explicit instantiation (i.e. \c template \c struct \c DataPointsFiltersImpl::YourFilter; and \c template \c struct
\c DataPointsFiltersImpl::YourFilter;).
- Add the constructor if needed.
- At this stage, you should let the implementation of the filter function to hold only the statement that returns the input.