How to get camera translation and rotation for each sample_data

I have been trying to get the translation and rotation of the camera for each sample_data.
It seems the values in calibrated_sensor are relative to the position and heading of the ego vehicle.
So, would the translation and rotation of the camera in the global frame be computed from the values in calibrated_sensor and ego_pose?
If so, does anyone have a suggestion for how to calculate them?

Thank you!

@chanwutk the translation and rotation of a given camera can be retrieved via the calibrated_sensor record - here’s a concrete example of how this is used: https://github.com/nutonomy/nuscenes-devkit/blob/28765b8477dbd3331bacd922fada867c2c4db1d7/python-sdk/nuscenes/nuscenes.py#L299-L300
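
If you want the camera pose in the global frame (rather than relative to the ego vehicle), one way is to compose the sensor-to-ego transform from calibrated_sensor with the ego-to-global transform from ego_pose. A minimal sketch, assuming a local nuScenes v1.0-mini install at a dataroot of your choosing (not the exact devkit code):

```python
import numpy as np
from pyquaternion import Quaternion
from nuscenes.nuscenes import NuScenes

# Assumed version and dataroot; adjust to your setup.
nusc = NuScenes(version='v1.0-mini', dataroot='/path/to/nuscenes', verbose=False)

sd_token = nusc.sample_data[0]['token']  # e.g. the first sample_data record
sd = nusc.get('sample_data', sd_token)
cs = nusc.get('calibrated_sensor', sd['calibrated_sensor_token'])  # sensor -> ego
ego = nusc.get('ego_pose', sd['ego_pose_token'])                   # ego -> global

# Camera rotation in the global frame: apply the ego rotation after the sensor rotation.
cam_rot_global = Quaternion(ego['rotation']) * Quaternion(cs['rotation'])

# Camera translation in the global frame: rotate the sensor offset into the
# global frame, then add the ego position.
cam_trans_global = np.array(ego['translation']) + \
    Quaternion(ego['rotation']).rotate(np.array(cs['translation']))

print(cam_trans_global, cam_rot_global)
```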

Thank you so much!
This is very helpful :slight_smile:
