The Point Cloud panel renders LiDAR scans, radar returns, and other 3D point data in an interactive 3D viewport. It supports six visualization modes for coloring points, multiple camera perspectives, camera projection overlay, and real-time interaction with large datasets.

Supported Formats

| Schema | Format | Notes |
| --- | --- | --- |
| sensor_msgs/PointCloud2 | ROS point cloud | Standard LiDAR format with configurable fields |
| foxglove.PointCloud | Foxglove point cloud | Indexed points with typed fields |
| sensor_msgs/LaserScan | ROS laser scan | 2D LiDAR rendered as a 3D ring (requires keyword in topic name) |

Auto-Detection

A topic is assigned to the Point Cloud panel when:
  • Its schema is sensor_msgs/PointCloud2 or foxglove.PointCloud (schema-matched), or
  • Its topic name contains lidar, pointcloud, velodyne, or points
sensor_msgs/LaserScan and radar_msgs/RadarScan are not schema-matched — they are routed to this panel only when the topic name contains one of the keywords above.
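The routing rules above can be sketched as a small predicate. This is an illustrative reimplementation, not the panel's actual API; the function and constant names are assumptions.

```typescript
// Schemas that are matched directly, regardless of topic name.
const SCHEMA_MATCHED = new Set([
  "sensor_msgs/PointCloud2",
  "foxglove.PointCloud",
]);

// Keywords that route a topic here by name alone (e.g. LaserScan, RadarScan).
const NAME_KEYWORDS = ["lidar", "pointcloud", "velodyne", "points"];

function routesToPointCloudPanel(topicName: string, schema: string): boolean {
  if (SCHEMA_MATCHED.has(schema)) return true;
  const lower = topicName.toLowerCase();
  return NAME_KEYWORDS.some((kw) => lower.includes(kw));
}
```

Note that a `sensor_msgs/LaserScan` topic named `/scan` would not be picked up, while `/velodyne_points` would.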

Visualization Modes

The panel supports six modes for coloring points, each designed for a different analysis task. Modes can be switched in real time without reloading data.
| Mode | Coloring | Best For |
| --- | --- | --- |
| Neutral | Uniform single color | Structural overview |
| Intensity | Three-band gradient by return intensity | Surface material analysis |
| Rainbow | 6 evenly-spaced hues cycling per frame | Temporal frame distinction |
| Label | Deterministic color per semantic class | Segmentation review |
| Panoptic | Hashed color per instance ID | Instance segmentation review |
| Image Projection | Camera RGB projected onto points | Camera-LiDAR fusion |
See Visualization Modes for detailed documentation of each mode, including the color mapping algorithms and recommended use cases.
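To make two of these concrete, here is a sketch of what "6 evenly-spaced hues cycling per frame" and "deterministic color per semantic class" could look like. The golden-angle spacing for labels is an assumption; the panel's actual mapping algorithms are documented in Visualization Modes.

```typescript
// Rainbow mode: six hues 60 degrees apart, cycling with the frame index.
function rainbowHue(frameIndex: number): number {
  return (frameIndex % 6) * 60; // degrees on the HSL color wheel
}

// Label mode: a deterministic hue per semantic class, so class 7 is
// always the same color across frames and sessions. Golden-angle
// spacing (illustrative choice) keeps adjacent class IDs visually distinct.
function labelColor(classId: number): [number, number, number] {
  const hue = (classId * 137.508) % 360;
  return hslToRgb(hue, 0.7, 0.5);
}

// Standard HSL-to-RGB conversion, returning 8-bit channel values.
function hslToRgb(h: number, s: number, l: number): [number, number, number] {
  const c = (1 - Math.abs(2 * l - 1)) * s;
  const x = c * (1 - Math.abs(((h / 60) % 2) - 1));
  const m = l - c / 2;
  const [r, g, b] =
    h < 60 ? [c, x, 0] : h < 120 ? [x, c, 0] : h < 180 ? [0, c, x]
    : h < 240 ? [0, x, c] : h < 300 ? [x, 0, c] : [c, 0, x];
  return [
    Math.round((r + m) * 255),
    Math.round((g + m) * 255),
    Math.round((b + m) * 255),
  ];
}
```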

View Perspectives

The panel provides preset camera perspectives for common analysis tasks:
| View | Description | Use Case |
| --- | --- | --- |
| Perspective | Free-orbit 3D camera | General exploration and annotation |
| Bird’s-eye | Top-down orthographic view | Overhead scene layout, object spacing |
| Side view | Lateral orthographic view | Height verification, ground plane alignment |
Switch between views using the view selector in the panel toolbar or keyboard shortcuts.

Camera Controls

| Action | Control |
| --- | --- |
| Orbit | Left-click + drag |
| Pan | Right-click + drag (or middle-click + drag) |
| Zoom | Scroll wheel |
| Reset view | Press R |

Camera Projection Overlay

When camera calibration data is available, the panel can overlay camera imagery onto the point cloud using the Image Projection visualization mode. The system projects each point into all available cameras and selects the one where the point falls closest to the image center (principal point), then maps that camera’s pixel color onto the point. This produces a photorealistic colored point cloud that combines the spatial accuracy of LiDAR with the visual detail of cameras. See Camera Projection for technical details on how the projection works.
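The camera-selection rule can be sketched with a pinhole model. This is a simplified illustration: the `Camera` shape and names are assumptions, and a real pipeline would first transform each LiDAR point into every camera's frame using extrinsic calibration, which is omitted here for brevity.

```typescript
interface Camera {
  name: string;
  fx: number; fy: number;       // focal lengths (pixels)
  cx: number; cy: number;       // principal point (pixels)
  width: number; height: number;
}

// Pinhole projection of a point already in the camera frame (z forward).
// Returns null if the point is behind the camera or outside the image.
function project(cam: Camera, p: [number, number, number]): [number, number] | null {
  const [x, y, z] = p;
  if (z <= 0) return null;
  const u = cam.fx * (x / z) + cam.cx;
  const v = cam.fy * (y / z) + cam.cy;
  if (u < 0 || u >= cam.width || v < 0 || v >= cam.height) return null;
  return [u, v];
}

// Pick the camera whose projection lands closest to the principal point;
// that camera's pixel at (u, v) then colors the point.
function bestCamera(cams: Camera[], p: [number, number, number]): Camera | null {
  let best: Camera | null = null;
  let bestDist = Infinity;
  for (const cam of cams) {
    const uv = project(cam, p);
    if (!uv) continue;
    const d = Math.hypot(uv[0] - cam.cx, uv[1] - cam.cy);
    if (d < bestDist) { bestDist = d; best = cam; }
  }
  return best;
}
```

Choosing the camera closest to the principal point minimizes the lens distortion and perspective stretch affecting the sampled pixel.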

Rendering Pipeline

The Point Cloud panel uses the full rendering pipeline described in Point Cloud Rendering:
  • Octree spatial indexing for hierarchical culling and level-of-detail
  • Chunk-based loading for progressive streaming of large datasets
  • Background workers for off-main-thread octree construction and point processing
  • GPU acceleration via WebGPU when available (compute shaders, frustum culling, LOD selection)
The rendering pipeline adapts to your hardware automatically. WebGPU-capable browsers get GPU-accelerated compute and rendering. Other browsers fall back to WebGL with CPU-based processing.
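The fallback decision can be sketched as below. Capability detection is stubbed out here (a browser build would probe `navigator.gpu` and attempt WebGL context creation); the function names are illustrative, not the viewer's actual internals.

```typescript
type Backend = "webgpu" | "webgl";

interface Capabilities {
  webgpu: boolean; // e.g. "gpu" in navigator plus a successful adapter request
  webgl: boolean;  // e.g. canvas.getContext("webgl2") succeeded
}

function selectBackend(caps: Capabilities): Backend {
  // Prefer WebGPU: compute shaders run frustum culling and LOD selection on the GPU.
  if (caps.webgpu) return "webgpu";
  // Fallback: WebGL rendering with CPU-based culling and LOD.
  if (caps.webgl) return "webgl";
  throw new Error("No supported rendering backend");
}
```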

Interaction

Point Selection

Click on individual points to inspect their attributes (position, intensity, label, instance ID). The viewer uses the octree spatial index for fast nearest-neighbor queries, enabling responsive point picking even in dense clouds.
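The core of point picking is a nearest-neighbor test within a pick radius. The sketch below uses a linear scan to show just the distance test; the actual viewer prunes candidates with the octree so it never visits most points.

```typescript
type Point = { x: number; y: number; z: number };

// Return the point nearest to `target` within `maxDist`, or null if none.
function nearestPoint(points: Point[], target: Point, maxDist: number): Point | null {
  let best: Point | null = null;
  let bestSq = maxDist * maxDist;
  for (const p of points) {
    const dx = p.x - target.x, dy = p.y - target.y, dz = p.z - target.z;
    const dSq = dx * dx + dy * dy + dz * dz; // squared distance avoids a sqrt per point
    if (dSq < bestSq) { bestSq = dSq; best = p; }
  }
  return best;
}
```

With an octree, whole subtrees whose bounding boxes lie farther than the current best distance are skipped, which is what keeps picking responsive in dense clouds.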

Synchronized Timeline

The Point Cloud panel is synchronized with all other panels through the shared timeline. When you seek, step, or play the recording, the point cloud updates to show the LiDAR scan at the current timestamp.
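One plausible way to resolve "the scan at the current timestamp" is a binary search for the latest scan at or before the seek time. This is an assumed strategy for illustration; the panel may snap or select differently.

```typescript
// Given scan timestamps in ascending order, return the index of the
// latest scan with timestamp <= seekTime, or -1 if the seek time
// precedes the first scan.
function scanIndexAt(timestamps: number[], seekTime: number): number {
  let lo = 0, hi = timestamps.length - 1, ans = -1;
  while (lo <= hi) {
    const mid = (lo + hi) >> 1;
    if (timestamps[mid] <= seekTime) { ans = mid; lo = mid + 1; }
    else { hi = mid - 1; }
  }
  return ans;
}
```

The O(log n) lookup matters during playback, where the query runs on every timeline tick.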

Annotation Overlay

When viewing annotated datasets, 3D annotations (cuboids, segmentation masks) are rendered as overlays on the point cloud. The annotations are interactive — click to select, drag to reposition, and use handles to resize.

Common Use Cases

Scene exploration

Navigate large point clouds to understand the 3D environment, identify objects, and assess data quality.

Annotation review

Verify 3D cuboid placement, segmentation coverage, and label accuracy across frames.

Calibration verification

Use Image Projection mode to check that camera-LiDAR calibration produces accurate alignment.

Multi-sensor analysis

View point clouds alongside camera, IMU, and GPS panels for full scene understanding.