Before annotating, you need to understand what is in your data. Avala’s data exploration tools let you browse datasets, navigate recordings with synchronized multi-panel playback, switch between visualization modes to inspect different aspects of sensor data, and filter items to find exactly the frames you need.

Browsing Datasets and Sequences

Dataset List

The Datasets page shows all datasets in your organization. Each entry displays the dataset name, data type, item count, and last-modified date. Navigate the list with:
  • Up Arrow / Down Arrow: Navigate between datasets
  • Enter: Open the selected dataset
  • Cmd + Enter: Open in a new tab

Sequence View

Datasets that contain temporal data (video, MCAP, point cloud sequences) group items into sequences. Opening a dataset shows its sequences, each representing a continuous recording or collection run. Click a sequence to open it in the multi-sensor viewer, where all items in the sequence are laid out on the timeline. Once inside a recording, the multi-sensor viewer provides full playback and inspection controls.

Timeline Navigation

The timeline bar at the bottom of the viewer spans the full recording duration. Use it to:
  • Play/pause continuous playback at adjustable speeds (0.25x to 4x)
  • Step frame-by-frame with arrow keys for precise inspection
  • Scrub by clicking and dragging on the timeline
  • Jump to timestamps by clicking the timestamp display and entering a specific time
All panels stay synchronized as you navigate. See Timeline Navigation for the full controls reference.

Panel-Level Inspection

Each panel supports independent zoom, pan, and interaction while remaining locked to the shared timeline:
  • Image: Zoom, pan, inspect pixel values
  • 3D / Point Cloud: Rotate, pan, zoom, switch visualization modes
  • Plot: Hover for data values, zoom into time ranges
  • Map: Pan, zoom, follow vehicle position
  • Raw Messages: Expand nested fields, copy values
  • Log: Scroll through timestamped entries
  • Gauge: View current reading and range
  • State: View transition history

Switching Visualization Modes

The 3D / Point Cloud panel supports six visualization modes, each revealing different information about the same data:
  • Neutral: Uniform color for all points. Use when inspecting point cloud density and coverage.
  • Intensity: LiDAR return signal strength. Use when distinguishing materials (metal vs. fabric vs. pavement).
  • Rainbow: Cycling hue per frame. Use when distinguishing temporal frames and verifying alignment.
  • Label: Semantic label color per annotation class. Use when reviewing labeled data and checking class assignments.
  • Panoptic: Unique color per annotated instance. Use when verifying instance separation and tracking IDs.
  • Image Projection: Camera pixel colors projected onto LiDAR points. Use when correlating 3D geometry with visual appearance.
Switch modes using the toolbar in the 3D panel or keyboard shortcuts:
  • 1: Label color view
  • 2: Intensity view
  • 3: Image projection view
Use Image Projection mode to visually confirm that your LiDAR-to-camera calibration is accurate. If projected colors do not align with the underlying 3D geometry, your extrinsic calibration may need adjustment.
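Image Projection mode works by mapping each LiDAR point through the extrinsic (LiDAR-to-camera) and intrinsic (camera) calibration to find the pixel whose color it should take. Avala handles this internally; the sketch below illustrates the underlying pinhole projection with assumed calibration values, so you can reason about why a bad extrinsic shifts the projected colors off the geometry:

```python
def project_point(point, extrinsic, intrinsic):
    """Project a 3D LiDAR point into camera pixel coordinates.

    extrinsic: 3x4 row-major matrix [R | t] mapping LiDAR to camera frame.
    intrinsic: (fx, fy, cx, cy) pinhole parameters.
    Returns (u, v) or None if the point is behind the camera.
    """
    x, y, z = point
    # Transform into the camera frame: p_cam = R @ p + t
    cam = [
        extrinsic[r][0] * x + extrinsic[r][1] * y + extrinsic[r][2] * z + extrinsic[r][3]
        for r in range(3)
    ]
    if cam[2] <= 0:  # behind the image plane; no valid pixel
        return None
    fx, fy, cx, cy = intrinsic
    u = fx * cam[0] / cam[2] + cx
    v = fy * cam[1] / cam[2] + cy
    return (u, v)

# Assumed calibration for illustration: identity rotation, camera offset 2 m
# along the optical axis, and a 1280x720-style intrinsic.
extrinsic = [
    [1.0, 0.0, 0.0, 0.0],
    [0.0, 1.0, 0.0, 0.0],
    [0.0, 0.0, 1.0, 2.0],
]
intrinsic = (1000.0, 1000.0, 640.0, 360.0)

# A point straight ahead should land at the image center.
uv = project_point((0.0, 0.0, 8.0), extrinsic, intrinsic)
```

An error in the extrinsic (a small rotation or translation offset) shifts every projected pixel, which is why misaligned colors in Image Projection mode are a reliable symptom of calibration drift.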

Cross-Referencing Sensor Streams

One of the most effective exploration techniques is cross-referencing data across panels. The synchronized viewer makes this straightforward:
  • LiDAR + Camera: Open the 3D panel alongside camera panels. As you step through frames, observe how 3D structures in the point cloud correspond to objects in the camera images. Enable LiDAR-to-camera projection for a direct overlay.
  • LiDAR + Plot: Add a plot panel for IMU or velocity data. Correlate vehicle dynamics (acceleration, yaw rate) with what you see in the point cloud or camera views.
  • Camera + Map: Pair camera views with the map panel to understand the geographic context of what the camera is seeing. Useful for fleet data where location matters.
  • Camera + Log: View diagnostic logs alongside camera feeds to correlate software events with sensor observations.
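Cross-referencing works because the viewer keeps every panel locked to one shared timeline even though each sensor records at its own rate. Conceptually, each panel shows the frame whose timestamp is nearest the playhead. A minimal sketch of that matching (the 30 Hz camera and 10 Hz LiDAR rates are assumed for illustration, not taken from Avala):

```python
import bisect

def nearest_frame(timestamps, t):
    """Return the index of the timestamp closest to t.

    timestamps must be sorted ascending, as sensor streams usually are.
    """
    i = bisect.bisect_left(timestamps, t)
    if i == 0:
        return 0
    if i == len(timestamps):
        return len(timestamps) - 1
    # Choose whichever neighbor is closer to t.
    return i if timestamps[i] - t < t - timestamps[i - 1] else i - 1

# Camera at 30 Hz, LiDAR at 10 Hz (assumed rates; times in seconds).
camera_ts = [k / 30 for k in range(300)]
lidar_ts = [k / 10 for k in range(100)]

# For the LiDAR frame at t = 5.0 s, find the matching camera frame.
idx = nearest_frame(camera_ts, 5.0)
```

Because matching is by nearest timestamp rather than frame index, panels for sensors with different rates stay aligned as you scrub or step through the recording.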

Filtering and Searching

Query Language

Avala provides a structured query language for filtering dataset items. Use the search bar to write filter expressions. Filter by annotation label:
annotation.label = "car"
Filter by metadata:
metadata.weather = "rainy" AND metadata.scene_type = "highway"
Filter by slice:
slice = "validation"
Combine conditions:
(annotation.label = "car" OR annotation.label = "truck") AND annotation.attribute.occluded = "false"
See the Query Language Reference for the full syntax, including operators, logical combinators, and supported field types.
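Conditions from different categories compose freely. For example, following the syntax shown above, a single expression can narrow a slice by both metadata and annotation contents (field names reused from the earlier examples):

```
slice = "validation" AND metadata.weather = "rainy" AND annotation.label = "car"
```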

Using Slices

Slices are saved subsets of a dataset. Use them to organize data for exploration:
  • By scenario: highway, intersection, parking-lot
  • By condition: rainy, nighttime, heavy-traffic
  • By quality: needs-review, edge-cases, golden-set
  • By split: training, validation, test
Slices are virtual — they reference existing items without duplicating data.

AutoTag

AutoTag automatically groups visually similar items using embedding-based similarity. This is useful for discovering patterns in your data without manual tagging:
  • Find clusters of similar scenes (all highway on-ramps, all parking lots)
  • Identify near-duplicates that may skew model training
  • Discover underrepresented scenarios that need more data collection
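AutoTag's embedding model is internal to Avala, but the underlying idea, grouping items whose embedding vectors lie close together, can be illustrated with a small sketch (the toy vectors, the greedy grouping strategy, and the 0.95 threshold are all assumptions for illustration):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def group_similar(embeddings, threshold=0.95):
    """Greedily group items whose embeddings exceed a similarity threshold.

    Returns a list of groups, each a list of item indices.
    """
    groups = []
    for i, emb in enumerate(embeddings):
        for group in groups:
            # Compare against the group's first member (its anchor item).
            if cosine_similarity(emb, embeddings[group[0]]) >= threshold:
                group.append(i)
                break
        else:
            groups.append([i])
    return groups

# Toy 3-D embeddings: items 0 and 2 are near-duplicates, item 1 differs.
embeddings = [
    [1.0, 0.0, 0.1],
    [0.0, 1.0, 0.0],
    [1.0, 0.02, 0.1],
]
groups = group_similar(embeddings)
```

Near-duplicate groups found this way are exactly the clusters worth reviewing before training: large groups of near-identical frames can skew a model toward a few over-sampled scenes.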

Exploration Workflow

A typical data exploration workflow before starting annotation:
1. Browse the dataset: Open the dataset and scan through sequences to understand the scope and variety of the data.
2. Play back representative recordings: Open a few sequences in the viewer and play them at 1x or 2x speed to get an overall sense of the data.
3. Switch visualization modes: Toggle between Neutral, Intensity, and Rainbow modes to understand point cloud quality and coverage.
4. Cross-reference sensors: Open camera and LiDAR panels side by side. Enable LiDAR projection to verify calibration accuracy.
5. Filter for specific scenarios: Use the query language or slices to find items matching conditions relevant to your annotation task (e.g., nighttime scenes, crowded intersections).
6. Define your annotation strategy: Based on what you found, set up your project's label taxonomy, annotation guidelines, and quality control configuration.

Next Steps