Generating nuScenes-like point clouds for testing purposes with my custom data

Hi, I am trying to test nuScenes-trained models on my own raw, timestamped, successive point clouds.

Since I am only interested in testing, I don't want/have any annotations in my data (i.e., I don't care about samples or the mechanism for sharing annotations between sample and sweep point clouds).

The model I am running inference with is MMDetection3D's CenterPoint, and its input is a 5-dimensional point cloud, where dimensions 1-4 are x, y, z and intensity, and the 5th dimension I believe comes from nuScenes.

My question is: what is this 5th dimension in nuScenes? I have read the paper, and there isn't much explanation of it.

When testing a model for the nuScenes challenge, what format is the test point cloud in? Does it have 5 dimensions, and if so, what is the 5th?

TL;DR: I would like to generate the 5th lidar dimension of nuScenes (which I believe comes from the multi-sweep aggregation) for my own data.

If that is not possible or is cumbersome, what "default" value should I populate this 5th dimension with? All zeros?

Thank you in advance!

Hi @albertc, in nuScenes, a point cloud has shape N x 4 (where N is the number of points in the point cloud), and the four dimensions are x, y, z and intensity.

As a quick example:

import os.path as osp

from nuscenes.nuscenes import NuScenes
from nuscenes.utils.data_classes import LidarPointCloud


# Load the nuScenes dataset (mini-split, in this case).
nusc = NuScenes(version='v1.0-mini', dataroot='/data/sets/nuscenes', verbose=False)

# Pick a sample and get the token of its LIDAR_TOP sample_data.
sample_record = nusc.get('sample', 'ca9a282c9e77460f8360f564131a8af5')
ref_sd_token = sample_record['data']['LIDAR_TOP']
ref_sd_record = nusc.get('sample_data', ref_sd_token)

# Load the point cloud.
pcl_path = osp.join(nusc.dataroot, ref_sd_record['filename'])
pc = LidarPointCloud.from_file(pcl_path)
points = pc.points.T
print(points.shape)  # (34688, 4)
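
For context on where those four dimensions come from: the raw nuScenes lidar `.bin` files store five float32 values per point (the 5th being the laser ring index), and `LidarPointCloud.from_file` keeps only the first four. Here is a minimal self-contained sketch of that loading step, using a synthetic file so it runs without the dataset:

```python
import numpy as np

# Build a tiny synthetic .bin file in the nuScenes lidar layout:
# five float32 values per point (x, y, z, intensity, ring index).
points_5d = np.array([[1.0, 2.0, 3.0, 0.5, 7.0],
                      [4.0, 5.0, 6.0, 0.9, 8.0]], dtype=np.float32)
points_5d.tofile('/tmp/fake_lidar.bin')

# Mirror what LidarPointCloud.from_file does: read the flat float32
# buffer, reshape to (-1, 5), and keep only x, y, z, intensity.
scan = np.fromfile('/tmp/fake_lidar.bin', dtype=np.float32)
points = scan.reshape((-1, 5))[:, :4]
print(points.shape)  # (2, 4)
```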

If you saw a 5th dimension in MMDetection3D, it might be best to check with the authors of that repo directly.
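
That said, on the default-value question: as far as I understand, in MMDetection3D's nuScenes pipeline the 5th channel is the relative timestamp of each aggregated sweep, which is 0.0 for the key frame, so padding a single-sweep cloud with zeros is a reasonable starting point. Treat that interpretation as an assumption to verify with the MMDetection3D authors. A quick sketch:

```python
import numpy as np

# Hypothetical 4-dim cloud: x, y, z, intensity.
points_4d = np.random.rand(1000, 4).astype(np.float32)

# Append a 5th column of zeros. Assumption: the 5th channel is the
# relative sweep timestamp, which is 0.0 for the key frame.
time_lag = np.zeros((points_4d.shape[0], 1), dtype=np.float32)
points_5d = np.hstack([points_4d, time_lag])
print(points_5d.shape)  # (1000, 5)
```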

Thank you very much!