Coordinates in LIDAR data

Hi. I tried to read a LIDAR file into Matlab, which looks as follows:

base_path = '…\Desktop\data\sets\nuscenes\samples\LIDAR_TOP';
lidar_data = fopen([base_path 'n015-2018-10-02-10-50-40+0800__LIDAR_TOP__1538448748547277.pcd.bin']);
pointcloud = fread(lidar_data);

As far as I know, the pointcloud is structured in the following way:
For each point, there are the (x,y,z)-coordinates, the intensity, as well as a ring index.
I am confused that I only get integer values in the pointcloud.
For example, the first point yields: x = 233, y = 148, z = 59, i = 192, ri = 120.

Do you have any idea where these integer values come from? I don't think those values are just rounded to the next integer; rather, I think I messed something up.

Thanks for your help.

Hi. Unfortunately I don't have Matlab here to test this. Looking at the documentation (https://www.mathworks.com/help/matlab/ref/fread.html#btp1twt-1-precision), it seems the default precision is uint8=>double, which means fread reads each single byte as its own value instead of interpreting every 4 bytes as one 32-bit float. Try fread(lidar_data, 'float32') instead.
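To illustrate what's going on (a minimal numpy sketch with made-up example values and a made-up file name, not the actual nuScenes file): reading the same bytes as uint8 versus float32 gives completely different numbers.

```python
import numpy as np

# Write four float32 values to a binary file (made-up example data).
values = np.array([-2.94, -0.40, -1.75, 3.0], dtype=np.float32)
values.tofile("example.bin")

# Reading the file byte-by-byte (what fread's default uint8 precision does)
# yields one integer in 0..255 per byte -- 16 integers for 4 floats.
as_bytes = np.fromfile("example.bin", dtype=np.uint8)
print(as_bytes)

# Interpreting every 4 bytes as one 32-bit float recovers the original values.
as_floats = np.fromfile("example.bin", dtype=np.float32)
print(as_floats)
```

The integers in 0..255 you saw (233, 148, 59, …) are exactly what the byte-wise reading produces.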

Hello Sir,
I am facing a similar issue while using C++ to read lidar points.
When I take float as the datatype for a lidar point, I am able to read the first lidar point correctly, but I run into problems from the second point onwards: the second point's x is read as 0.

With Python using numpy:
first point - [-2.9395928 -0.39564633 -1.7501146 3. ]
second point - [-2.9533656 -0.39456582 -1.6662344 10. ]
third point - [-3.1270466 -0.39405033 -1.669569 22. ]

With C++ considering lidar point as float:
first point - [-2.9395928 -0.39564633 -1.7501146 3. ]
second point - [0, -2.9533656 -0.39456582 -1.6662344 ]
third point - [10, 1, -3.1270466 -0.39405033 ]

With C++ considering lidar point as double:
first point - [-3.1675e-06, 32, -29.0154, -0.332469]

Could you please suggest how I can read the lidar data files correctly?

Thank You

Hi,
unfortunately I never implemented that myself. I think you are better off using existing libraries, e.g.


Thank you Sir for your reply, I will look into it.