Technology: Traffic light tester.
- Timofey Uvarov
- Apr 12
- 5 min read
Updated: Apr 17
As autonomous driving systems evolve, so too must the testing infrastructure that supports them. While much attention is paid to lane detection, object tracking, and general scene understanding, traffic light recognition remains a critical — yet under-tested — aspect of visual autonomy.
At Fourier Image Lab, we set out to change that.
A Purpose-Built Simulation System
We’ve developed a traffic light simulation and testing platform designed from the ground up to evaluate camera perception systems under real-world signal conditions and failure modes.
The product consists of a matrix of 40 LED modules, each containing red, yellow, green, and white emitters. The brightness of each individual row is independently controllable, allowing us to reproduce the entire range of lighting scenarios — from faint incandescent bulbs found in legacy intersections to high-intensity directional LED signals.

This granular control is crucial, because while modern HDR sensors may advertise 100+ dB dynamic range using 16- or even 24-bit containers, the data is ultimately tone-mapped to 8-bit before reaching the neural network. That tone mapping process introduces compression artifacts and quantization noise, often causing a loss of critical precision where it matters most — in small, bright objects like traffic lights.
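To illustrate the scale of that loss, here is a minimal sketch assuming a 20-bit linear container and a simple gamma-style tone curve (both illustrative choices, not any particular ISP's pipeline). It counts how many distinct 8-bit output codes survive in the brightest part of the range, which is exactly where traffic lights live:

```python
import numpy as np

# Illustrative sketch (not a production ISP): map 20-bit linear HDR
# values to 8-bit with a simple gamma tone curve, then count how many
# distinct codes survive in the bright region.
BITS_IN, BITS_OUT, GAMMA = 20, 8, 1 / 2.4

def tone_map(linear: np.ndarray) -> np.ndarray:
    normalized = linear / (2**BITS_IN - 1)
    return np.round((normalized ** GAMMA) * (2**BITS_OUT - 1)).astype(np.uint8)

# Top 5% of the input range: bright emitters such as LED signals.
bright = np.arange(int(0.95 * 2**BITS_IN), 2**BITS_IN)
print(f"{bright.size} distinct input codes -> "
      f"{np.unique(tone_map(bright)).size} output codes")
# ~52,000 input levels collapse into a handful of 8-bit codes, so small
# brightness differences between signal colors are quantized away.
```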
Saturation, Hue Drift, Flicker, and Signal Failure
In real-world traffic scenarios, one of the most frequent failure modes is spectral channel saturation — where a red or yellow signal overwhelms one color channel but not the others. This can make it difficult or impossible to distinguish between lights of different colors, especially when the image is tone-mapped and passed through ISP pipelines.
To further challenge perception systems, our platform also allows precise control over flickering parameters, including frequency and duty cycle. This enables testing under regional signal conditions — from low-frequency flickering incandescent bulbs used in parts of the U.S. to high-frequency PWM-modulated LED signals typical in Asia and Europe. By mimicking these flicker styles, we can identify how various sensors and algorithms respond to temporal instability — a major source of missed or ghosted signals.
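As a back-of-the-envelope illustration of why flicker matters, the sketch below (assuming a 90 Hz PWM signal at 30% duty cycle; both numbers are illustrative, not any regional standard) computes what fraction of the LED's on-time an exposure window actually captures, depending on where it lands in the PWM cycle:

```python
import numpy as np

# Sketch: fraction of LED on-time captured by an exposure window that
# starts at a random phase of the PWM cycle.
def captured_on_fraction(pwm_hz, duty, exposure_s, start_s, steps=100_000):
    t = start_s + np.linspace(0.0, exposure_s, steps)
    on = (t * pwm_hz) % 1.0 < duty          # True while the LED is lit
    return on.mean()

pwm_hz, duty = 90.0, 0.30                    # illustrative PWM-dimmed LED
for exposure_ms in (0.5, 2.0, 11.0):
    fractions = [captured_on_fraction(pwm_hz, duty, exposure_ms / 1e3, s)
                 for s in np.random.default_rng(0).uniform(0, 1 / pwm_hz, 200)]
    print(f"{exposure_ms:5.1f} ms exposure: on-fraction "
          f"{min(fractions):.2f} .. {max(fractions):.2f}")
# Short exposures can land entirely in the off-phase (fraction 0.0),
# which is exactly the missed or "ghosted" signal failure mode.
```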
Our system monitors:
- Which channels saturate first under varying light intensities
- The hue and saturation values of red vs. yellow signals across the entire brightness range
- Delta thresholds between hue/saturation values, triggering alerts when color signals become ambiguous or fail classification standards
- Signal dropout or misinterpretation under flickering conditions, especially in multi-exposure or rolling-shutter sensor architectures
We treat channel saturation and flicker-induced dropout as functional failures, and flag them accordingly. This helps developers not only benchmark their sensor and ISP stack, but also improve resilience in real-world driving environments.
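A minimal sketch of these checks, assuming 8-bit mean RGB values per signal; the threshold constants are illustrative placeholders, not our shipping criteria:

```python
import colorsys

# Illustrative thresholds, not a shipping spec.
SAT_LEVEL = 250          # channel value treated as saturated
MIN_HUE_DELTA = 15.0     # degrees; red vs. yellow must stay separable

def channel_saturation(rgb):
    return [name for name, v in zip("RGB", rgb) if v >= SAT_LEVEL]

def hue_degrees(rgb):
    h, _, _ = colorsys.rgb_to_hsv(*(v / 255.0 for v in rgb))
    return h * 360.0

# Example measurements of a washed-out red next to a yellow signal.
red_rgb, yellow_rgb = (252, 210, 40), (250, 235, 60)
for name, rgb in (("red", red_rgb), ("yellow", yellow_rgb)):
    if clipped := channel_saturation(rgb):
        print(f"FUNCTIONAL FAILURE: {name} signal saturates {clipped}")
if abs(hue_degrees(red_rgb) - hue_degrees(yellow_rgb)) < MIN_HUE_DELTA:
    print("FUNCTIONAL FAILURE: red/yellow hue delta below threshold")
```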
Image Analysis and Signal Verification
The platform supports both automated and manual workflows for extracting meaningful metrics from camera-captured frames.
Automatic Detection

The system detects light positions automatically and aligns them with the known signal grid geometry.
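One plausible implementation of this stage (a sketch, not necessarily our exact detector) is Otsu thresholding followed by connected-component centroids:

```python
import cv2
import numpy as np

# Sketch: threshold the bright emitters, then take connected-component
# centroids as candidate signal positions to align against the grid.
def detect_signal_centers(frame_bgr, min_area=20):
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    n, _, stats, centroids = cv2.connectedComponentsWithStats(mask)
    # Skip label 0 (background) and reject tiny specks of noise.
    return [tuple(centroids[i]) for i in range(1, n)
            if stats[i, cv2.CC_STAT_AREA] >= min_area]
```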
Manual Corner Assistance

In edge cases where automatic detection fails, users can manually annotate the grid corners, and the system will interpolate the signal points.
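A sketch of how that interpolation can work, assuming a regular rows-by-columns grid and four annotated corners (top-left, top-right, bottom-right, bottom-left):

```python
import numpy as np

# Bilinearly interpolate the expected center of every signal in a
# rows x cols matrix from four user-annotated corners.
def interpolate_grid(corners, rows, cols):
    tl, tr, br, bl = (np.asarray(c, dtype=float) for c in corners)
    points = np.empty((rows, cols, 2))
    for r in range(rows):
        v = r / (rows - 1)
        left, right = tl + v * (bl - tl), tr + v * (br - tr)
        for c in range(cols):
            points[r, c] = left + (c / (cols - 1)) * (right - left)
    return points

grid = interpolate_grid([(10, 12), (410, 15), (408, 310), (8, 305)],
                        rows=10, cols=4)
print(grid[0, 0], grid[9, 3])   # recovers the annotated TL and BR corners
```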
RGB Overlay Map

RGB values are computed by averaging pixel values within a small region around each detected signal center.
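A minimal sketch of that computation; the circular region and its radius are illustrative assumptions:

```python
import numpy as np

# Average RGB inside a small circular region around a signal center.
def mean_rgb(frame_rgb, center, radius=6):
    h, w = frame_rgb.shape[:2]
    yy, xx = np.mgrid[:h, :w]
    mask = (xx - center[0]) ** 2 + (yy - center[1]) ** 2 <= radius ** 2
    return frame_rgb[mask].mean(axis=0)   # one averaged (R, G, B) triple
```

Combined with the interpolated grid above, the overlay map is then just `[[mean_rgb(frame, p) for p in row] for row in grid]`.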
Side-by-Side Comparison of Two Cameras

Naked-Eye Observations
In the left image, color fidelity is preserved across a wide dynamic range — with most of the signal rows retaining accurate hue and separation. Saturation only occurs at the bottom-most row, where white levels clip as expected under extreme intensity.
In contrast, the right image exhibits early saturation and color breakdown. From the 5th row downward, red, yellow, and green signals begin to blur into each other, and meaningful color distinction is lost. Only the top 3–4 rows appear properly exposed.
Another noticeable artifact is color bleeding, particularly visible on the top (red) and bottom (white) rows. The red signal appears orange-yellow, and the white signal exhibits an unexpected greenish tint. While the exact cause of this distortion is unclear, it may stem from local tone mapping or histogram equalization techniques (such as CLAHE) that operate non-uniformly across image regions — potentially skewing channel balance and local contrast.
These results underscore the importance of controlled signal testing under known brightness and flicker conditions. The visual inconsistencies between sensors highlight how critical both sensor behavior and ISP tone mapping are in preserving safety-critical color information.
Quantitative Analysis: RGB Channel Response to Signal Intensity
To complement the visual comparison, we analyzed the RGB values of red and yellow signals across increasing intensity levels (rows 1–10) from each camera. These values were obtained by averaging pixel intensities from the center of each illuminated signal.
Left Camera (Better Performer)

For the red signal, all three color channels (R, G, B) show gradual progression with saturation beginning only at row 7 — which corresponds with the white-out observed in the last row of the photo. This indicates a healthy dynamic response.
For the yellow signal, however, the green channel saturates first, followed by blue, and finally red. This staggered saturation is a critical insight, which becomes apparent when viewing cropped sections of the red and yellow rows from this camera.

We can observe the red signal temporarily adopting a yellow/orange hue before becoming fully saturated. This shift can fool a classifier into reading red as yellow — a safety-critical misclassification scenario.
Right Camera (Underperformer)

Here, the data tells a more alarming story. For the red signal, all RGB channels saturate fully by row 3, offering no gradient or usable color discrimination beyond that point.
The yellow signal is even more problematic: green is saturated from the very first row, blue reaches its maximum level by row 3, and red climbs before saturating around row 5.
This kind of response indicates that the camera is not capturing the color signal faithfully at all, and that color balancing or tone-mapping inside the ISP may be distorting or flattening the output. In fact, none of the signals were represented correctly — making this camera (an actual automotive-grade sensor with auto-exposure enabled) highly unsuitable for traffic light detection.
These plots, combined with our structured simulator and visualization tools, highlight the urgency for rigorous camera qualification in perception systems. Even among sensors marketed as “HDR-ready,” internal ISP design choices, tone-mapping, and exposure logic can lead to catastrophic misinterpretations of safety-critical cues like traffic signals.
From Beta to Benchmark: Shaping the Future of Traffic Signal Testing
Fourier Image Lab is building a comprehensive database of automotive camera evaluations, with a focus on critical aspects like traffic signal detection accuracy, color fidelity, flicker resilience, and saturation behavior. These datasets are being curated from structured lab tests and will be available to partners and customers on demand.
The testing system itself is designed for automation and repeatability. It integrates with a programmable test framework that enables scripted signal sequences and synchronized image capture, making it possible to validate the entire visual pipeline — from optics and sensors to perception algorithms — using known, reproducible ground-truth patterns.
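A hypothetical sketch of such a scripted sequence; `SignalMatrix` and `Camera` are stand-ins for whatever control and capture APIs a given lab exposes, not a published interface:

```python
import time

# Hypothetical scripted sequence: step every row through a brightness
# ramp under flicker, capturing one frame per step against known
# ground-truth settings.
def brightness_sweep(matrix, camera, rows=10, levels=(16, 64, 128, 255)):
    captures = []
    for level in levels:
        for row in range(rows):
            matrix.set_row_brightness(row, level)            # hypothetical API
        matrix.set_flicker(frequency_hz=90, duty_cycle=0.3)  # hypothetical API
        time.sleep(0.1)                    # let the panel and auto-exposure settle
        captures.append((level, camera.capture_frame()))     # hypothetical API
    return captures                        # (ground truth, frame) pairs
```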
Despite being in beta, the system is already in use by multiple labs — including one led by a CEO famously fond of leather jackets. Demand for rigorous traffic light testing is growing, and this platform is already helping teams benchmark edge cases, diagnose failure modes, and raise the safety bar for autonomous platforms.
To learn more about our lighting solutions, please visit our lighting product page.