copc-lib

copc-lib is a library that provides an easy-to-use reader and writer interface for COPC point clouds. It offers a complete interface for handling COPC files, so no additional LAZ or LAS libraries are required.

Build from source

Dependencies

copc-lib has the following dependencies:

  • laz-perf>=3.0.0
  • Catch2
  • Pybind11

To install dependencies:

conda install -c conda-forge "laz-perf>=3.0" Catch2 pybind11

C++

git clone https://github.com/RockRobotic/copc-lib.git
cd copc-lib
mkdir build && cd build
cmake ..
cmake --build .
sudo cmake --install .

Python

git clone https://github.com/RockRobotic/copc-lib.git
pip install ./copc-lib

Usage

The Reader and Writer objects provide the primary means of interfacing with your COPC files. For more complex use cases, we also provide additional objects such as LAZ Compressors and Decompressors (see example/example-writer.cpp).

For common use cases, see the example and test folders for full examples.

C++

copc-lib is compatible with CMake. Assuming copc-lib and lazperf are installed on the system, you can link against them in your CMakeLists.txt:

find_package(COPCLIB REQUIRED)
find_package(LAZPERF REQUIRED)

add_executable(funlib fun-main.cpp)
target_link_libraries(funlib COPCLIB::copc-lib LAZPERF::lazperf)

Example Files & Unit Tests

To build the copc-lib examples and unit tests along with the main library, you must enable them:

mkdir build && cd build
cmake .. -DWITH_TESTS=ON
cmake --build .

Then, you can run the unit tests and the examples:

ctest # All tests should pass

cd bin
./example_reader
./example_writer

Alternatively, you can build the test and example files from their respective CMakeLists, assuming copc-lib is already installed.

Python

import copclib as copc

# Create a reader object
reader = copc.FileReader("autzen-classified.copc.laz")

# Get the node metadata from the hierarchy
node = reader.FindNode(copc.VoxelKey(0, 0, 0, 0))
# Fetch the points of a node
points = reader.GetPoints(node)

# Iterate through each point
for point in points.Get():
    print(point)
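
Nodes are addressed by octree VoxelKeys (depth, x, y, z). As a minimal sketch of the spec's parent/child relationship — using plain tuples rather than copc.VoxelKey, and a helper name that is illustrative, not part of the copclib API:

```python
def child_keys(d, x, y, z):
    # Per the COPC spec, each octree node (d, x, y, z) has up to eight
    # children at depth d + 1, with each coordinate doubled plus 0 or 1.
    return [(d + 1, 2 * x + i, 2 * y + j, 2 * z + k)
            for i in (0, 1) for j in (0, 1) for k in (0, 1)]

child_keys(0, 0, 0, 0)  # the eight depth-1 keys under the root (0, 0, 0, 0)
```

In a real file, only the children that actually contain points exist; FindNode returns an invalid node for absent keys.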

Coming Soon

  • Basic C++ Reader Interface
  • Return Point structures from the reader rather than raw char* arrays, to support a more familiar laspy-like interface.
  • Add writer for COPC data
  • Python bindings
  • JavaScript bindings (not currently planned)
  • Spatial querying for nodes (given spatial coordinates, retrieve the appropriate Entry object)
  • Conda and pip packages

Conformity to Spec

This version of copc-lib is pinned to a draft of the COPC specification, reflecting its state at COPC.io.

Extended Stats VLR

User ID       Record ID
rock_robotic  10001

We additionally add an extended stats extents VLR to store mean and (population) variance values for each dimension. This VLR can be parsed in the same way as the extents VLR defined by the COPC spec.

struct CopcExtentExtended
{
    double mean; // replaces minimum
    double var;  // replaces maximum
};

This VLR is optional, so existing COPC files can still be processed without it. If present, mean and variance are set appropriately for each dimension in CopcExtents; if not, CopcExtents falls back to default values of mean=0 and var=1.

This VLR is only written by the Writer if the flag has_extended_stats is true in CopcConfigWriter::CopcExtents.
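
Since the payload is just two little-endian doubles per dimension, it can be decoded without copc-lib. A minimal sketch in Python — the function name and sample payload here are hypothetical, for illustration only:

```python
import struct

def parse_extended_extents(payload: bytes):
    # Each dimension contributes one CopcExtentExtended record:
    # two little-endian doubles (mean, var), 16 bytes total.
    assert len(payload) % 16 == 0
    return [struct.unpack_from("<dd", payload, offset)
            for offset in range(0, len(payload), 16)]

# Hypothetical payload covering two dimensions:
payload = struct.pack("<dddd", 1.5, 0.25, -3.0, 2.0)
parse_extended_extents(payload)  # [(1.5, 0.25), (-3.0, 2.0)]
```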

Contributing

Pull requests are welcome. For major changes, please open an issue first to discuss what you would like to change.

Please make sure to update tests as appropriate.

Naming Convention

C++

We mostly use Google's Style Guide.

#include <string>

namespace name
{
class ClassName
{
  public:
    // Default constructor
    ClassName() = default;
    ClassName(int public_variable, int private_variable, bool private_read_only)
        : public_variable(public_variable), private_variable_(private_variable),
          private_read_only_(private_read_only) {}

    int public_variable{};

    // Getters and Setters
    void PrivateVariable(int private_variable) { private_variable_ = private_variable; }
    int PrivateVariable() const { return private_variable_; }
    bool PrivateReadOnly() const { return private_read_only_; }

    // Any other function
    int PrivateVariablePlusOne() const { return private_variable_ + 1; }
    int SumPublicAndPrivate() const { return public_variable + private_variable_; }
    static std::string ReturnEmptyString() { return {}; }

  private:
    int private_variable_{};
    bool private_read_only_{false};
};
} // namespace name

Python

test_class = ClassName(public_variable=1, private_variable=2, private_read_only=True)

# using pybind .def_readwrite
test_class.public_variable = 4
assert test_class.public_variable == 4

# using pybind .def_property
test_class.private_variable = 5
assert test_class.private_variable == 5

# using pybind .def_property_readonly
assert test_class.private_read_only == True

# using pybind .def
assert test_class.PrivateVariablePlusOne() == 6
assert test_class.SumPublicAndPrivate() == 9

# using pybind .def_static
assert ClassName.ReturnEmptyString() == ""

Note that dimension names for points follow the laspy naming scheme, with the exception of scan_angle.

License

Please see LICENSE.md

Credits

copc-lib is created and maintained by Chris Lee, Leo Stanislas and other members of RockRobotic.

The COPC file format is created and maintained by HOBU Inc. Some code has been adopted from PDAL and lazperf, both of which are maintained by HOBU Inc.

Comments
  • cannot build python bindings on ubuntu 20.04LTS

    I managed to build the C++ library, but I cannot build the Python wrapper. I installed pybind11 (sudo apt install python3-pybind11), but not Catch2, since I want to skip the tests.

    My python version is 3.8.10

    pip install ./copc-lib
    ..................
    [ 88%] Linking CXX shared library ../../lib.linux-x86_64-3.8/libcopc-lib.so
        [ 88%] Built target copc-lib
        Scanning dependencies of target copclib
        [ 94%] Building CXX object python/CMakeFiles/copclib.dir/bindings.cpp.o
        /tmp/pip-req-build-sl2zyo4s/python/bindings.cpp: In function ‘void pybind11_init_copclib(pybind11::module&)’:
        /tmp/pip-req-build-sl2zyo4s/python/bindings.cpp:72:27: error: could not convert ‘pybind11::detail::self’ from ‘const pybind11::detail::self_t’ to ‘pybind11::handle’
           72 |         .def(py::hash(py::self))
              |                       ~~~~^~~~
              |                           |
              |                           const pybind11::detail::self_t
        make[2]: *** [python/CMakeFiles/copclib.dir/build.make:63: python/CMakeFiles/copclib.dir/bindings.cpp.o] Error 1
        make[1]: *** [CMakeFiles/Makefile2:170: python/CMakeFiles/copclib.dir/all] Error 2
        make: *** [Makefile:130: all] Error 2
        Traceback (most recent call last):
    
  • Configure failure

    I assume something needs to be committed?

    #!/bin/bash
    
    rm -rf build
    mkdir build
    cd build
    cmake -G Ninja .. -DCMAKE_BUILD_TYPE=Release -Dgtest_force_shared_crt=ON -DCMAKE_INSTALL_PREFIX=$CONDA_PREFIX
    

    I installed catch2 via conda-forge, and I'm on OSX x86-64.

    
    -- The C compiler identification is Clang 11.1.0
    -- The CXX compiler identification is Clang 11.1.0
    -- Detecting C compiler ABI info
    -- Detecting C compiler ABI info - done
    -- Check for working C compiler: /Users/hobu/miniconda3/envs/pdal-build/bin/x86_64-apple-darwin13.4.0-clang - skipped
    -- Detecting C compile features
    -- Detecting C compile features - done
    -- Detecting CXX compiler ABI info
    -- Detecting CXX compiler ABI info - done
    -- Check for working CXX compiler: /Users/hobu/miniconda3/envs/pdal-build/bin/x86_64-apple-darwin13.4.0-clang++ - skipped
    -- Detecting CXX compile features
    -- Detecting CXX compile features - done
    CMake Error at CMakeLists.txt:44 (add_subdirectory):
      The source directory
    
        /Users/hobu/dev/git/copc-lib/lib/Catch2
    
      does not contain a CMakeLists.txt file.
    
    
    CMake Error at CMakeLists.txt:55 (include):
      include could not find requested file:
    
        Catch
    
    
    CMake Error at CMakeLists.txt:56 (catch_discover_tests):
      Unknown CMake command "catch_discover_tests".
    
    
    
  • Required extents vlr

    Hi,

    Thank you for this library, I've been using it for the past few weeks and it's working great.

    I was wondering what your plan was for the extents vlr. From what I understand, it's been removed from the spec, but it seems to be required when opening a COPC file. I found a related discussion here: https://github.com/copcio/copcio.github.io/pull/50

    I've been working with COPC files generated from Untwine and PDAL, but both of these don't produce an extents vlr, so the following error happens when I try to open the files with copc-lib:

    RuntimeError: Reader::ReadCopcExtentsVlr: No COPC Extents VLR found in file.
    

    To work with these files, I removed the extents vlr entirely in this branch : https://github.com/davidcaron/copc-lib/tree/remove-extents-vlr.

    I guess my question is: did you plan on making the extents vlr optional, or to remove it?

  • release 2.1.3 broken: inconsistent laz-perf version 2.1 or 3.0

    Hi, thanks for sharing this work

    When building the latest release of copc-lib (2.1.3), the README says that laz-perf >= commit 4819611 is a prerequisite.

    The current master README says laz-perf>=3.0.0 is a prerequisite.

    However, when building with the env var export LAZPERF_DIR=/home/opt/laz-perf-3.0.0/build/CMakeFiles/Export/lib/cmake/LAZPERF set, I get an error saying that I should use LAZPERF 2.1.0:

    -- Detecting C compile features - done
    CMake Error at CMakeLists.txt:84 (find_package):
      Could not find a configuration file for package "LAZPERF" that is
      compatible with requested version "2.1.0".
    
      The following configuration files were considered but not accepted:
    
        /usr/local/lib/cmake/LAZPERF/lazperf-config.cmake, version: 3.0.0
    
    
    
    -- Configuring incomplete, errors occurred!
    See also "/home/opt/copc-lib-2.1.3/build/CMakeFiles/CMakeOutput.log".
    

    When I try to build with laz-perf 2.1 the configure step is ok, but I got error at compile time:

    [  6%] Building CXX object cpp/CMakeFiles/copc-lib.dir/src/copc/info.cpp.o
    In file included from /home/opt/copc-lib-2.1.3/cpp/src/copc/info.cpp:1:
    /home/opt/copc-lib-2.1.3/cpp/include/copc-lib/copc/info.hpp:15:29: error: ‘copc_info_vlr’ in namespace ‘lazperf’ does not name a type; did you mean ‘copc_vlr’?
       15 |     CopcInfo(const lazperf::copc_info_vlr &copc_info_vlr);
          |                             ^~~~~~~~~~~~~
          |                             copc_vlr
    
    

    When I try to build with laz-perf 3.0 the configure step is ok, but I got error at compile time:

    cmake --build .
    Scanning dependencies of target copc-lib
    [  6%] Building CXX object cpp/CMakeFiles/copc-lib.dir/src/copc/info.cpp.o
    [ 12%] Building CXX object cpp/CMakeFiles/copc-lib.dir/src/copc/extents.cpp.o
    In file included from /home/opt/copc-lib-2.1.3/cpp/include/copc-lib/copc/extents.hpp:10,
                     from /home/opt/copc-lib-2.1.3/cpp/src/copc/extents.cpp:1:
    /home/opt/copc-lib-2.1.3/cpp/include/copc-lib/las/vlr.hpp:15:33: error: ‘copc_extents_vlr’ in namespace ‘lazperf’ does not name a type; did you mean ‘copc_info_vlr’?
       15 | using CopcExtentsVlr = lazperf::copc_extents_vlr;
          |                                 ^~~~~~~~~~~~~~~~
          |                                 copc_info_vlr
    
    

    When I try to build copc-lib master branch with laz-perf 3.0 the configure step is OK, and compilation is OK.

  • Writer

    Adds COPC writer.

    TODO:

    • [backlog] Set las header min/max and other attributes
    • [backlog] Implement filename reader/writer
    • Integrate point object to writer
    • Fix example files
    • [done] Call "close" on destructor of writer
    • [done] Support EB
  • Add scaled x/y/z support

    @leo-stan Let me know your thoughts on this. If we want to be able to get the scaled x/y/z values within the point class, we'll need to either store the lasheader or store the offset/scale factors (maybe store a pointer or ref so there's no overhead?)

  • Use scikit-build for Python extension?

    The scikit-build library has great support for running CMake. Additionally, it plays nicer with modern Python packaging configuration like pyproject.toml, etc.

    See PDAL's Python bindings for a scikit-build configuration that mixes CMake and Python https://github.com/PDAL/python

  • FEAT-437: General polishing of copclib

    • [X] Catch up with new specs (prdf and statistics)
    • [X] Make LasConfig a subset of LasHeader
    • [X] Make Tostring for LasHeader/LasConfig
    • [ ] Make all low level classes picklable
    • [X] Make VoxelKey convertible to tuple and list in python
    • [x] Other remaining github issues
    • [x] Make naming and function signatures consistent
    • [ ] conda/pip install
    • [ ] Clear TODOs
    • [X] General clean-up after limiting point format
    • [ ] Add Point attributes to Points
    • [ ] Run profiler on copclib
    • [ ] Make CPP test cases and section consistent throughout lib
    • [ ] Make smaller test file to run longer tests
    • [ ] Remove const refs for things less than 8 bytes
    • [ ] Make runtime_error messages more clear (e.g. which class is throwing)
    • [X] Merge LasHeader, COPC info, etc
    • [X] Add checks to AddPoints that points are within node spatial bounds
    • [ ] Make a verbose option for copclib to output warnings for certain things (e.g. scan angle value check)
  • memory leak in copc::FileReader

    The constructor of copc::FileReader calls initialization functions that may throw an exception. When an exception is thrown, construction of the copc::FileReader object is interrupted and its destructor is never called. Thus, the memory of the std::fstream object 'f_stream' leaks.

  • Make Vector3 Iterable

    >>> list(reader.copc_config.las_header.min)
    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
    TypeError: 'copclib._core.Vector3' object is not iterable
    
  • Better Error for Int Overflow

    point.red = 65537
    

    The above code will throw this error:

    TypeError: (): incompatible function arguments. The following argument types are supported: 1. (arg0: copclib._core.Point, arg1: int) -> None

    The error message should better describe the issue

  • python example should show how to read a LAS or LAZ and convert it to COPC

    Hi, I'd just like to try copc-lib by reading a LAS file and writing it back out as COPC. Since I want to stick to a minimal set of dependencies, I'd expect laz-perf, which is included in copc-lib, to be able to read a LAS or LAZ file.

    However, there is no doc and no example on how to do this :(

    >>> import copclib as copc
    >>> reader = copc.FileReader("/home/jlu/Downloads/autzen-classified.laz")
    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
    RuntimeError: Reader::InitReader: Either Info or Hierarchy VLR missing, make sure you are loading a COPC file.
    

    In addition, if I read a LAS file using, for example, the laspy module, is it still required to:

    • manually generate a COPC header
    • manually generate the slippy-tile hierarchy?

    Do you have a simple LAS -> COPC python converter to share ?
