libcamera @ELCE 2025¶
The libcamera team is presenting at this year's Embedded Linux Conference Europe with an update on recent developments in the project. Demonstrations include a Camera Synchronisation demo, the Software ISP with GPU acceleration, new development tools including Camshark, and new platform enablement including the Renesas R-Car V4H with the Dream Chip RPP-X1 ISP.
Camera Synchronisation¶
A long-term goal has been to make libcamera's internal functionality extensible, to support interactions which are common to applications but sit above the libcamera interface.
To support this, recent work posted by core-team developer Paul Elder introduces libcamera Layers, a plugin architecture that extends libcamera with use-case-specific features. This could include future support for features such as Zero Shutter Lag, or extended debugging, tuning and introspection of the camera pipeline.
The first layer being developed in this new system is a Camera Synchronisation layer. Camera synchronisation was originally developed by the team at Raspberry Pi, implemented directly in their IPA module. However, that design cannot be supported in isolated IPA modules and does not work across platforms.
The Synchronisation Layer will bring this feature to all supported libcamera platforms and facilitate cross-platform synchronisation of heterogeneous cameras connected across a network. The implementation relies on time synchronisation between the devices using linux-ptp, and can then align captured frames across many cameras. This can run directly on a single platform to lock two cameras together for stereoscopic capture, or across a larger set of devices for the famous ‘bullet time’ effect.
Presently the synchronisation accuracy is around one image line time, measured in microseconds, as the algorithm adjusts the vertical blanking period. Sub-line precision could be achieved in the future by also adjusting the horizontal blanking period.
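To illustrate the approach, the sketch below shows one way a per-frame correction could be derived and applied through the public libcamera API. It assumes the device clocks are already aligned over PTP, that every camera agrees on a nominal frame duration, and that each correction is bounded by how far the vertical blanking can be stretched in a single frame. The helper names and the clamped-correction heuristic are illustrative assumptions, not the implementation posted on the mailing list.

```cpp
/*
 * Minimal sketch of phase alignment by frame duration adjustment.
 * Assumes PTP-aligned clocks and a shared nominal frame duration (µs).
 * Illustrative only; not the design from the libcamera-devel series.
 */
#include <algorithm>
#include <cstdint>

#include <libcamera/libcamera.h>

using namespace libcamera;

/*
 * Derive the frame duration to request next, so that capture drifts
 * towards the shared grid of nominalUs-spaced target times.
 */
int64_t nextFrameDuration(Request *completed, int64_t nominalUs,
			  int64_t maxCorrectionUs)
{
	/* Start-of-exposure timestamp reported in the metadata, in ns. */
	int64_t timestampUs =
		completed->metadata().get(controls::SensorTimestamp).value_or(0) / 1000;

	/* Phase error against the ideal capture grid shared by all cameras. */
	int64_t phaseError = timestampUs % nominalUs;
	if (phaseError > nominalUs / 2)
		phaseError -= nominalUs;	/* correct in the shorter direction */

	/*
	 * Absorb a bounded part of the error into the next frame, as the
	 * vertical blanking can only be stretched or shrunk so far per frame.
	 */
	int64_t correction = std::clamp(-phaseError, -maxCorrectionUs, maxCorrectionUs);
	return nominalUs + correction;
}

/* Apply the corrected duration to the next request before queueing it. */
void applyDuration(Request *next, int64_t durationUs)
{
	const int64_t limits[2] = { durationUs, durationUs };
	next->controls().set(controls::FrameDurationLimits,
			     Span<const int64_t, 2>(limits));
}
```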
Software ISP and GPU acceleration¶
Development of libcamera has always targeted supporting ‘every camera’ on ‘every platform’, while remaining committed to focussing support on devices with upstream drivers. This can leave a large gap in support for consumer devices, where many laptop and mobile phone vendors have not yet committed to providing the driver development required to control the ISP in the Linux kernel.
The Software ISP has been under continuous development over the last 18-24 months, working to bridge that gap by implementing a fallback path that processes RAW Bayer images as part of the imaging pipeline. This development initially started with CPU processing only. A team of developers from Linaro, Red Hat, and Ideas on Board has been collaborating to extend the Soft-ISP integration in libcamera and improve its performance.
New upcoming features include multi-stream support for the simple pipeline handler, allowing raw frames to be captured while using the Soft-ISP. The Soft-ISP is also gaining improvements to its 3A algorithm integration, and in particular work has been progressing towards a GPU-based Software ISP.
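From the application side this is exercised through the normal configuration API, by requesting both a processed role and the raw role. The minimal sketch below shows the request; whether both roles are accepted by the simple pipeline handler depends on the in-progress multi-stream work, and buffer allocation and request handling are omitted.

```cpp
/*
 * Request a processed stream and the raw Bayer stream in one configuration
 * through the public libcamera API. Error handling is kept minimal.
 */
#include <memory>

#include <libcamera/libcamera.h>

using namespace libcamera;

int main()
{
	CameraManager cm;
	cm.start();
	if (cm.cameras().empty())
		return 1;

	std::shared_ptr<Camera> camera = cm.cameras()[0];
	camera->acquire();

	/* Ask for a processed viewfinder stream plus the unprocessed raw stream. */
	std::unique_ptr<CameraConfiguration> config =
		camera->generateConfiguration({ StreamRole::Viewfinder, StreamRole::Raw });
	if (!config || config->validate() == CameraConfiguration::Invalid)
		return 1;

	camera->configure(config.get());

	/* ... allocate buffers, create and queue requests, start the camera ... */

	camera->release();
	cm.stop();
	return 0;
}
```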
The GPU-ISP, now at version 2 on the libcamera-devel mailing list, aims to use GPU APIs with the widest possible platform support and currently targets OpenGL ES 2.0. It makes use of the existing debayering shaders available in the libcamera project, and provides both performance improvements and power reductions over CPU-based processing. While power consumption will never be as low as with a dedicated hardware ISP, this implementation will help improve the user experience of camera processing on otherwise unsupportable platforms.
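As a rough illustration of what such a shader computes per pixel, the CPU-side sketch below reconstructs RGB at a red site of an RGGB pattern using bilinear interpolation of the neighbouring samples. The real shaders in the project handle every Bayer site, image borders and bit depths, so this is only a sketch of the idea and not the project's GLSL code.

```cpp
/*
 * CPU reference of the bilinear interpolation a debayering shader performs,
 * shown for a pixel on a red site of an RGGB pattern. Illustrative only.
 */
#include <cstdint>

struct RGB {
	float r, g, b;
};

/*
 * raw: single-channel Bayer plane of the given width, RGGB order.
 * (x, y) must address a red site (both coordinates even), away from borders.
 */
RGB debayerRedSite(const uint16_t *raw, unsigned int width,
		   unsigned int x, unsigned int y)
{
	auto at = [&](unsigned int px, unsigned int py) {
		return static_cast<float>(raw[py * width + px]);
	};

	RGB out;
	out.r = at(x, y);
	/* Green neighbours sit directly above, below, left and right. */
	out.g = (at(x - 1, y) + at(x + 1, y) + at(x, y - 1) + at(x, y + 1)) / 4.0f;
	/* Blue neighbours sit on the four diagonals. */
	out.b = (at(x - 1, y - 1) + at(x + 1, y - 1) +
		 at(x - 1, y + 1) + at(x + 1, y + 1)) / 4.0f;
	return out;
}
```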
Development Tools¶
As more and more features are enabled in libcamera, including the recent WDR (Wide Dynamic Range) and HDR (High Dynamic Range) processing on the i.MX8MP ISP, visualising the pipeline in real time and graphing the metadata and statistics live while viewing or manipulating the scene has become a real advantage for developers.
A new tool, ‘Camshark’, developed by Stefan Klug of Ideas on Board, brings remote camera access and control to libcamera, and operates easily across networked devices using SSH as a transport mechanism. See Camshark on gitlab.freedesktop.org.
Camshark packs features that help develop and tune camera processing algorithms, and that support the tuning and integration of new camera modules. Colour tuning can be validated and inspected with a live colour chart (Macbeth chart) detector and the calculation of Delta E colour differences, to observe the impact of AWB and CCM corrections. Live graphs make it easy to see the effect of the AGC, WDR and HDR algorithms as they adjust sensor gain and exposure times.
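For reference, the simplest form of that metric, the CIE76 Delta E, is just the Euclidean distance between two colours in CIE Lab space, as sketched below. Whether Camshark uses this form or a newer variant such as CIEDE2000 is not specified here.

```cpp
/*
 * CIE76 Delta E colour difference between two CIE Lab colours: the Euclidean
 * distance between a reference patch and the measured value.
 */
#include <cmath>

struct Lab {
	double L, a, b;
};

double deltaE76(const Lab &reference, const Lab &measured)
{
	double dL = measured.L - reference.L;
	double da = measured.a - reference.a;
	double db = measured.b - reference.b;
	return std::sqrt(dL * dL + da * da + db * db);
}
```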
Camshark puts image quality first and transmits images uncompressed, allowing the user to zoom in and compare the impact of processing at the pixel level.
New Hardware Enablement¶
More platforms continue to receive direct support in the libcamera architecture, with the ARM Mali-C55, found on the Renesas RZ/V2H SoC, now supported. We’re looking forward to more ARM ISP support in the future; with some continued effort, the implementation could also be used to support older ARM ISP generations such as the Mali-C52 found in consumer SBCs.
Also covering more Renesas platforms is ongoing development to enable the DreamChip RPP-X1 ISP found in the Renesas R-Car V4H. Renesas continue to provide fantastic upstream support for their platforms, and the DreamChip RPP-X1 support will facilitate more embedded and automotive camera integrations with Linux on their latest range of SoCs.
NXP have implemented full feature support for the Neo ISP on their newest i.MX95 with libcamera, and have released their Camera Porting Guide, initially extending support to the Omnivision OX03C10, OX05B12 and OS08A20 cameras. Thanks to the libcamera architecture, this can easily be extended to make use of any other existing camera module support.
Camera Sensor Support¶
Camera sensor support in libcamera continually grows and can always be extended to support custom devices or configurations. Although some integration work may be required, all supported cameras can be used on all supported platforms, provided the physical hardware connections are correctly made.
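As a quick check of which sensors libcamera has enumerated on a given platform, a small application can list the cameras and read their model property through the public API, as in the minimal sketch below (error handling omitted).

```cpp
/* List the cameras libcamera enumerated and the sensor model behind each. */
#include <iostream>
#include <memory>

#include <libcamera/libcamera.h>

using namespace libcamera;

int main()
{
	CameraManager cm;
	cm.start();

	for (const std::shared_ptr<Camera> &camera : cm.cameras()) {
		const auto model = camera->properties().get(properties::Model);
		std::cout << camera->id() << " ("
			  << model.value_or("unknown sensor") << ")" << std::endl;
	}

	cm.stop();
	return 0;
}
```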
Existing support includes the following camera sensors:
| Vendor | Models |
|---|---|
| Sony | IMX214, IMX219, IMX258, IMX283, IMX290, IMX296, IMX327, IMX335, IMX415, IMX462, IMX477, IMX519, IMX708 |
| Omnivision | OV2685, OV2740, OV4689, OV5640, OV5647, OV5670, OV5675, OV5693, OV7251, OV8858, OV8865, OV9281, OV13858, OV64A40 |
| On-Semi | AR0144, AR0521 |
| ST-Microelectronics | VD56G3 |
| Galaxy Core | GC05A2, GC08A3 |