- Kevin Paul
- January 6, 2021
Android Codec 2.0: Developing Multimedia Applications for Newer Android Platforms
Tightening security by accelerating the rollout of fixes for security bugs, along with new features, has become one of the prime necessities of the connected world: the faster updates reach devices, the faster vulnerabilities can be plugged. At the same time, the intent is to keep the media framework extensible, so that partners can continue to add their own media codecs and media format extractors.
Prior to Android N (released in 2016), all the modules of the Android media framework, such as playback, recording, camera, and audio, ran inside a single media server process. Following the Stagefright security vulnerabilities, the media server was split into separate processes such as the media codec service, media extractor service, camera server, and audio server.
In 2017, Google came up with Project Treble, with the aim of reducing the problem of fragmentation. To understand more about Project Treble, please go through the article on Google’s official developer blog. As part of Google’s plan for a more modularized framework, Google provided a new MediaCodec implementation framework, Codec 2.0, in Android Q (released in 2019).
What is Codec 2.0?
Prior to Android Q, the two main modules of the multimedia framework were MediaPlayer and MediaCodec, with MediaCodec responsible only for encoding/decoding and rendering. MediaCodec reached third-party codecs through OpenMAX IL via the ACodec layer to implement hardware encoding/decoding, so chip manufacturers only needed to support the OpenMAX interface defined by Khronos to plug their hardware codecs into MediaCodec. With Android Q, Google launched Codec 2.0 to replace ACodec and OpenMAX; it can be viewed as a new set of middleware that sits between MediaCodec and the codec implementations.
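As a point of reference, here is a minimal sketch of creating an AVC decoder through the public NDK MediaCodec API; the helper function name is illustrative. On Android Q and later, codecs registered through Codec 2.0 are reached via CCodec instead of ACodec, but application-level code like this does not change.

```cpp
// Sketch only: create and start an AVC decoder via the NDK MediaCodec API.
// On Android Q+, a Codec 2.0-backed codec is selected transparently; the
// application still talks to the same MediaCodec/AMediaCodec interface.
#include <media/NdkMediaCodec.h>
#include <media/NdkMediaFormat.h>

AMediaCodec* createAvcDecoder(int32_t width, int32_t height) {
    AMediaFormat* format = AMediaFormat_new();
    AMediaFormat_setString(format, AMEDIAFORMAT_KEY_MIME, "video/avc");
    AMediaFormat_setInt32(format, AMEDIAFORMAT_KEY_WIDTH, width);
    AMediaFormat_setInt32(format, AMEDIAFORMAT_KEY_HEIGHT, height);

    AMediaCodec* codec = AMediaCodec_createDecoderByType("video/avc");
    if (codec != nullptr) {
        // Surface and crypto are omitted in this sketch; a real player would
        // pass an output surface for direct rendering.
        AMediaCodec_configure(codec, format, /*surface=*/nullptr,
                              /*crypto=*/nullptr, /*flags=*/0);
        AMediaCodec_start(codec);
    }
    AMediaFormat_delete(format);
    return codec;
}
```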
Why Codec 2.0?
One of the major issues with the earlier framework was the fragmentation of the Android ecosystem. Framework changes were cumbersome and varied from customer to customer, depending on the specific customization each device required. To overcome this challenge, Google introduced the concept of mainline modules, which manufacturers are not allowed to modify. Mainline modules can be upgraded through the Google Play Store, and changes to them can only be made by submitting code to the mainline.
Google’s Codec 2.0 belongs to a mainline module, so manufacturers cannot modify its code. As a result, Android devices from different manufacturers behave consistently in the multimedia middleware layer. If a manufacturer wants to add a feature to this layer or fix a bug in it, it can only do so by submitting code or patches to Google’s mainline. Keeping the mainline merged and up to date benefits Google and the other vendors alike.
Second, compared with the ACodec + OMX middleware framework, Codec 2.0 brings several new features: component chaining, filters, and configuration querying.
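To make the configuration-querying feature concrete, below is a hedged sketch of reading a typed parameter through the Codec 2.0 core API; the helper name, the choice of C2StreamPictureSizeInfo, and the assumption that a C2ComponentInterface is already at hand are all illustrative, and exact signatures may differ between Android releases.

```cpp
// Sketch only: query the output picture size from a component interface
// using the Codec 2.0 core API. Parameters are strongly typed C2Param
// structures rather than OMX index/struct pairs.
#include <C2Component.h>
#include <C2Config.h>
#include <memory>
#include <vector>

bool readOutputPictureSize(const std::shared_ptr<C2ComponentInterface>& intf,
                           int32_t* width, int32_t* height) {
    C2StreamPictureSizeInfo::output size(0u, 0, 0);    // stream 0
    std::vector<std::unique_ptr<C2Param>> heapParams;  // unused here
    if (intf->query_vb({&size}, {}, C2_DONT_BLOCK, &heapParams) != C2_OK) {
        return false;
    }
    *width = size.width;
    *height = size.height;
    return true;
}
```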
Third, in terms of performance, Codec 2.0 has advantages over the ACodec + OMX framework. The buffer management mechanism that the Codec 2.0 decoding and encoding components rely on is zero-copy, avoiding the performance degradation caused by copying large blocks of data.
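A hedged sketch of that buffer model is shown below, assuming a simple linear (bitstream-style) output: the component fetches a block from a C2BlockPool, writes into a mapped view, and shares the block with the framework by reference; the helper name and usage flags are illustrative.

```cpp
// Sketch only: produce an output buffer without copying. The C2LinearBlock
// is allocated from a pool, filled in place, and handed over by reference.
#include <C2Buffer.h>
#include <memory>

std::shared_ptr<C2Buffer> produceOutput(
        const std::shared_ptr<C2BlockPool>& pool, uint32_t payloadSize) {
    std::shared_ptr<C2LinearBlock> block;
    C2MemoryUsage usage = { C2MemoryUsage::CPU_READ, C2MemoryUsage::CPU_WRITE };
    if (pool->fetchLinearBlock(payloadSize, usage, &block) != C2_OK) {
        return nullptr;
    }

    C2WriteView view = block->map().get();
    if (view.error() != C2_OK) {
        return nullptr;
    }
    // A real component would write the encoded/decoded payload into
    // view.data() here.

    // share() hands out a reference-counted, read-only view of the block;
    // no payload copy takes place.
    return C2Buffer::CreateLinearBuffer(block->share(0, payloadSize, C2Fence()));
}
```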
Codec 2.0 Framework
The code for Codec 2.0 is located under frameworks/av/media/codec2. The entry point to the framework is sfplugin/CCodec.cpp: as the name sfplugin suggests, CCodec acts as the plugin interface to Stagefright, which essentially makes CCodec.cpp the Codec 2.0 counterpart of ACodec.cpp in libstagefright. The APIs provided by CCodec are consistent with the ACodec interface, which makes CCodec easy to integrate with MediaCodec.
The important member objects of the CCodec class are:
- mChannel: a CCodecBufferChannel object, mainly responsible for buffer delivery between MediaCodec and the Codec 2.0 component.
- mClient: a Codec2Client object that lets CCodec interact with the Codec 2.0 components. The entry point is Codec2Client::CreateFromService(), which creates a Codec2Client; from it, interface and component objects can be created by calling createInterface() and createComponent(). A Component represents a specific decoder/encoder, while its ComponentInterface provides the interface for configuration and parameter exchange between CCodec and the component. Both the component and its interface are created through a vendor-specific ComponentStore (see the sketch after this list).
- mClientListener: a listener object used for input, output, and error callbacks.
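Below is a hedged sketch of that client-side flow; the exact method signatures vary between Android releases, and the store name "default" and component name "c2.android.avc.decoder" are simply the stock software examples.

```cpp
// Sketch only: connect to a Codec 2.0 component store and create a
// component plus its interface, roughly what CCodec does internally.
#include <codec2/hidl/client.h>
#include <memory>

using ::android::Codec2Client;

void connectToAvcDecoder(const std::shared_ptr<Codec2Client::Listener>& listener) {
    // "default" is the software component store; vendors register their own
    // IComponentStore instances for hardware codecs.
    std::shared_ptr<Codec2Client> client =
            Codec2Client::CreateFromService("default");
    if (!client) {
        return;
    }

    // Interface only: used for parameter query/config without instantiating
    // a full codec.
    std::shared_ptr<Codec2Client::Interface> intf;
    client->createInterface("c2.android.avc.decoder", &intf);

    // Full component: this is what actually performs decoding and reports
    // work completion through the listener callbacks.
    std::shared_ptr<Codec2Client::Component> comp;
    client->createComponent("c2.android.avc.decoder", listener, &comp);
}
```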
Sequence Diagrams:
1. Component Interface creation
2. Component creation
3. Playback sequence
Ignitarium’s Expertise on Codec 2.0
For the last several years, Ignitarium has been developing and supporting various multimedia solutions on Linux and Android for its customers. We work with product companies, MNCs, and startups to architect and design multimedia products. The Ignitarium team has a thorough understanding of the Android multimedia framework and has undertaken several R&D projects on Codec 2.0. A few customer implementations are listed here:
1) Custom system service creation for interaction between application and framework
2) HIDL interface development from scratch
3) Vendor service creation for interaction with system services through HIDL interface
4) SE policy changes
The Ignitarium team is capable of developing Codec 2.0-compliant components for various hardware decoders. It has also designed mechanisms to attach pre/post-processing modules to any of the decoders without depending on the hardware component vendors, while complying with the Treble requirements. This design enables the use of accelerated processing such as OpenGL.
If you are looking for expertise in Android Multimedia Framework and Codec2 specifically, do reach out to us for a discussion.