[Android 10 source code] Understanding MediaCodec hardware decoding initialization in depth

Android's hardware decoding API is exposed through MediaCodec, which reaches the hardware step by step; decoding is usually carried out by the VPU (video processing unit).

The following is typical hardware decoding initialization code; exception handling is included for better fault tolerance. (A sketch of the decode loop that usually follows this initialization appears right after the class.)

  1. Create a decoder for the MIME type (video/avc) by calling createDecoderByType;
  2. Create a MediaFormat from the video width, height and MIME type (optionally setting the decoding color format, AVC profile, level, etc.);
  3. Pass the MediaFormat and the Surface that decoded frames are rendered to (it may be null) into the decoder's configure() method.

public class H264Decoder {
    private static final String TAG = "H264Decoder";
    private final static String MIME_TYPE = "video/avc"; // H.264 Advanced Video

    private MediaCodec mDecoderMediaCodec;
   
    private int mFps;
    private Surface mSurface;
    private int mWidth;
    private int mHeight;
    private int mInitMediaCodecTryTimes = 0;

    public H264Decoder(int width, int height, int fps, Surface surface) {
        mWidth = width;
        mHeight = height;
        mFps = fps;
        mSurface = surface;

        initMediaCodec(width, height, surface);
    }

    private void initMediaCodec(int width, int height, Surface surface) {
        try {
            mDecoderMediaCodec = MediaCodec.createDecoderByType(MIME_TYPE);
            //Create configuration
            MediaFormat mediaFormat = MediaFormat.createVideoFormat(MIME_TYPE, width, height);
            mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Flexible);
            //mediaFormat.setInteger(MediaFormat.KEY_PROFILE, MediaCodecInfo.CodecProfileLevel.AVCProfileBaseline);
            //mediaFormat.setInteger("level", MediaCodecInfo.CodecProfileLevel.AVCLevel4); // Level 4

            mDecoderMediaCodec.configure(mediaFormat, surface, null, 0);
        } catch (Exception e) {
            e.printStackTrace();
            //Failed to create the decoder
            Log.e(TAG, "init MediaCodec fail.");
            // Retry up to six times
            if (mInitMediaCodecTryTimes < 6) {
                mInitMediaCodecTryTimes++;
                try {
                    Thread.sleep(20);
                } catch (InterruptedException ie) {
                    ie.printStackTrace();
                }
                initMediaCodec(mWidth, mHeight, mSurface);
            }
        }
    }
    ......
}
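
For completeness, here is a minimal sketch of the synchronous feed/drain loop that typically follows the configuration above. It is not part of the original class: the start() and feedAndDrain() methods and the mBufferInfo field are names added only for illustration.

    // Hypothetical continuation of H264Decoder: a minimal synchronous decode loop after configure().
    private final MediaCodec.BufferInfo mBufferInfo = new MediaCodec.BufferInfo();

    public void start() {
        mDecoderMediaCodec.start();
    }

    // Feed one encoded H.264 access unit and drain any decoded frames.
    public void feedAndDrain(byte[] h264Data, long ptsUs) {
        int inIndex = mDecoderMediaCodec.dequeueInputBuffer(10_000 /* us */);
        if (inIndex >= 0) {
            java.nio.ByteBuffer input = mDecoderMediaCodec.getInputBuffer(inIndex);
            input.clear();
            input.put(h264Data);
            mDecoderMediaCodec.queueInputBuffer(inIndex, 0, h264Data.length, ptsUs, 0);
        }

        int outIndex = mDecoderMediaCodec.dequeueOutputBuffer(mBufferInfo, 10_000 /* us */);
        while (outIndex >= 0) {
            // true = render the buffer to the Surface passed to configure()
            mDecoderMediaCodec.releaseOutputBuffer(outIndex, mSurface != null);
            outIndex = mDecoderMediaCodec.dequeueOutputBuffer(mBufferInfo, 0);
        }
    }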

Now for the first step of the analysis: how does createDecoderByType find and instantiate the right MediaCodec? Its documentation describes it as follows:

Instantiates the preferred decoder that supports the given mime type.

The following is a partial list of mime types defined and their semantics:

"video/x-vnd.on2. vp8" - VP8 video (e.g. video in webm)

"video/x-vnd.on2. vp9" - VP9 video (e.g. video in webm)

"video/avc" - H.264/AVC video

"video/hevc" - H.265/HEVC video

"video/mp4v-es" - MPEG4 video

"video/3gpp" - H.263 video

"audio/3gpp" - AMR narrowband audio

"Audio / AMR WB" - AMR broadband audio

"audio/mpeg" - MPEG1/2 Audio Layer III

"Audio / mp4a LATM" - aac audio (note that this is the original AAC package, not encapsulated in LATM)

"audio/vorbis" - vorbis audio

"Audio / g711 alaw" - G.711 alaw audio

"Audio / g711 mlaw" - G.711 ulaw audio

Internally it simply calls the MediaCodec constructor with the last two parameters hard-coded: nameIsType is true (the name argument is a MIME type rather than a component name, as the name suggests) and encoder is false (we want a decoder, not an encoder). A sketch contrasting this with creation by component name follows the snippet below.

frameworks/base/media/java/android/media/MediaCodec.java

final public class MediaCodec {
    ......
    @NonNull
    public static MediaCodec createDecoderByType(@NonNull String type)
            throws IOException {
        return new MediaCodec(type, true /* nameIsType */, false /* encoder */);
    }
    ......
}
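
For contrast, when an app selects a concrete component itself, creation goes through the name-based path (nameIsType = false, via createByCodecName). A minimal sketch, assuming API level 21+; the component name in the comment is just the example seen later on the RK3399 platform.

import android.media.MediaCodec;
import android.media.MediaCodecList;
import android.media.MediaFormat;

public class DecoderByName {
    public static MediaCodec create(int width, int height) throws java.io.IOException {
        MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
        MediaCodecList list = new MediaCodecList(MediaCodecList.REGULAR_CODECS);
        // e.g. "OMX.rk.video_decoder.avc" on the RK3399 platform
        String name = list.findDecoderForFormat(format);
        if (name == null) {
            // fall back to the type-based path analyzed in this article
            return MediaCodec.createDecoderByType("video/avc");
        }
        // nameIsType = false inside the MediaCodec constructor
        return MediaCodec.createByCodecName(name);
    }
}
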
The MediaCodec constructor (shown below) then does the following:

  1. Obtain a Looper: if called from a thread that has a Looper, Looper.myLooper() is non-null and is used to construct the EventHandler; otherwise Looper.getMainLooper() (the main thread's Looper) is used instead;
  2. Assign both mCallbackHandler (for callbacks) and mOnFrameRenderedHandler (for frame-rendered notifications) to the EventHandler just created. This EventHandler is what processes events; the specific events it handles are analyzed later (a sketch of how it surfaces asynchronous callbacks to an app follows the constructor below);
  3. Create mBufferLock, the object used to synchronize buffer access;
  4. mNameAtCreation stores the name used at creation; since nameIsType is true here, it is set to null because the name passed in is only a MIME type, not a real component name;
  5. Finally, call native_setup to continue initialization through JNI.

frameworks/base/media/java/android/media/MediaCodec.java

final public class MediaCodec {
    ......
    private EventHandler mEventHandler;
    private EventHandler mOnFrameRenderedHandler;
    private EventHandler mCallbackHandler;
    ......
    final private Object mBufferLock;
    ......
    private MediaCodec(
            @NonNull String name, boolean nameIsType, boolean encoder) {
        Looper looper;
        if ((looper = Looper.myLooper()) != null) {
            mEventHandler = new EventHandler(this, looper);
        } else if ((looper = Looper.getMainLooper()) != null) {
            mEventHandler = new EventHandler(this, looper);
        } else {
            mEventHandler = null;
        }
        mCallbackHandler = mEventHandler;
        mOnFrameRenderedHandler = mEventHandler;

        mBufferLock = new Object();

        // save name used at creation
        mNameAtCreation = nameIsType ? null : name;

        native_setup(name, nameIsType, encoder);
    }
    
    private String mNameAtCreation;
    ......
}
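
The handlers set up above are what deliver codec events back to application code; in asynchronous mode, for example, the callbacks below are dispatched through them. A minimal app-side sketch (not framework source; the class and method names are illustrative only):

import android.media.MediaCodec;
import android.media.MediaFormat;
import android.view.Surface;

public class AsyncDecoderSetup {
    // Callbacks registered here are dispatched via the handler(s) created in the constructor above.
    public static void configureAsync(MediaCodec codec, MediaFormat format, Surface surface) {
        codec.setCallback(new MediaCodec.Callback() {
            @Override
            public void onInputBufferAvailable(MediaCodec mc, int index) {
                // input buffer 'index' is ready: fill it and call queueInputBuffer() here
            }
            @Override
            public void onOutputBufferAvailable(MediaCodec mc, int index, MediaCodec.BufferInfo info) {
                mc.releaseOutputBuffer(index, true /* render to the Surface */);
            }
            @Override
            public void onError(MediaCodec mc, MediaCodec.CodecException e) { /* handle error */ }
            @Override
            public void onOutputFormatChanged(MediaCodec mc, MediaFormat fmt) { /* handle change */ }
        });
        // setCallback() must be called before configure() for asynchronous mode to take effect.
        codec.configure(format, surface, null, 0);
        codec.start();
    }
}
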
native_setup lands in android_media_MediaCodec_native_setup (shown below), which does the following:

  1. Create a JMediaCodec object;
  2. Check whether creating the JMediaCodec failed; on error, branch to the corresponding case and throw an exception back to the Java layer (a sketch of how an app can react to these exceptions follows the JNI code below);
  3. Call JMediaCodec's registerSelf() to register itself;
  4. Call setMediaCodec(...) to store the pointer to the JMediaCodec object in the Java-layer field MediaCodec.mNativeContext.

frameworks/base/media/jni/android_media_MediaCodec.cpp

static void android_media_MediaCodec_native_setup(
        JNIEnv *env, jobject thiz,
        jstring name, jboolean nameIsType, jboolean encoder) {
    if (name == NULL) {
        jniThrowException(env, "java/lang/NullPointerException", NULL);
        return;
    }

    const char *tmp = env->GetStringUTFChars(name, NULL);

    if (tmp == NULL) {
        return;
    }

    sp<JMediaCodec> codec = new JMediaCodec(env, thiz, tmp, nameIsType, encoder);

    const status_t err = codec->initCheck();
    if (err == NAME_NOT_FOUND) {
        // fail and do not try again.
        jniThrowException(env, "java/lang/IllegalArgumentException",
                String8::format("Failed to initialize %s, error %#x", tmp, err));
        env->ReleaseStringUTFChars(name, tmp);
        return;
    } if (err == NO_MEMORY) {
        throwCodecException(env, err, ACTION_CODE_TRANSIENT,
                String8::format("Failed to initialize %s, error %#x", tmp, err));
        env->ReleaseStringUTFChars(name, tmp);
        return;
    } else if (err != OK) {
        // believed possible to try again
        jniThrowException(env, "java/io/IOException",
                String8::format("Failed to find matching codec %s, error %#x", tmp, err));
        env->ReleaseStringUTFChars(name, tmp);
        return;
    }

    env->ReleaseStringUTFChars(name, tmp);

    codec->registerSelf();

    setMediaCodec(env,thiz, codec);
}
......
static const JNINativeMethod gMethods[] = {
    ......
    { "native_setup", "(Ljava/lang/String;ZZ)V",
      (void *)android_media_MediaCodec_native_setup },
    ......
};
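
On the Java side these native errors surface as different exception types, which an app can use to decide whether retrying (as H264Decoder does above) makes sense. A minimal sketch; DecoderFactory and retryLater() are illustrative names only:

import android.media.MediaCodec;
import java.io.IOException;

public class DecoderFactory {
    public static MediaCodec tryCreate(String mimeType) {
        try {
            return MediaCodec.createDecoderByType(mimeType);
        } catch (IllegalArgumentException e) {
            // NAME_NOT_FOUND: no such codec, retrying will not help
            return null;
        } catch (MediaCodec.CodecException e) {
            // e.g. NO_MEMORY is thrown with ACTION_CODE_TRANSIENT, so a later retry may succeed
            return e.isTransient() ? retryLater(mimeType) : null;
        } catch (IOException e) {
            // "Failed to find matching codec": treated by the JNI layer as possibly temporary
            return null;
        }
    }

    private static MediaCodec retryLater(String mimeType) {
        // placeholder for the caller's retry strategy (e.g. the sleep-and-retry in H264Decoder)
        return null;
    }
}
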
The JMediaCodec constructor then:

  1. Caches some JNI objects (classes, method and field IDs) for later use;
  2. Creates an ALooper and starts it; runOnCallingThread is false, so events are handled on a newly spawned looper thread rather than on the calling thread;
  3. Since nameIsType is true, calls the native MediaCodec::CreateByType to create the native MediaCodec object.

frameworks/base/media/jni/android_media_MediaCodec.cpp

JMediaCodec::JMediaCodec(
        JNIEnv *env, jobject thiz,
        const char *name, bool nameIsType, bool encoder)
    : mClass(NULL),
      mObject(NULL) {
    jclass clazz = env->GetObjectClass(thiz);
    CHECK(clazz != NULL);

    mClass = (jclass)env->NewGlobalRef(clazz);
    mObject = env->NewWeakGlobalRef(thiz);

    cacheJavaObjects(env);

    mLooper = new ALooper;
    mLooper->setName("MediaCodec_looper");

    mLooper->start(
            false,      // runOnCallingThread
            true,       // canCallJava
            ANDROID_PRIORITY_VIDEO);

    if (nameIsType) {
        mCodec = MediaCodec::CreateByType(mLooper, name, encoder, &mInitStatus);
        if (mCodec == nullptr || mCodec->getName(&mNameAtCreation) != OK) {
            mNameAtCreation = "(null)";
        }
    } else {
        mCodec = MediaCodec::CreateByComponentName(mLooper, name, &mInitStatus);
        mNameAtCreation = name;
    }
    CHECK((mCodec != NULL) != (mInitStatus != OK));
}

When called from JMediaCodec, the last two arguments, pid and uid, are left at their defaults (kNoPid and kNoUid, i.e. -1). CreateByType then:

  1. Calls MediaCodecList::findMatchingCodecs(...) to find all matching decoders;
  2. Traverses the component names in the returned Vector, creating a native MediaCodec object for each one and calling its init(...) with that component name;
  3. Returns the MediaCodec object as soon as one initializes without error.

frameworks/av/media/libstagefright/MediaCodec.cpp

sp<MediaCodec> MediaCodec::CreateByType(
        const sp<ALooper> &looper, const AString &mime, bool encoder, status_t *err, pid_t pid,
        uid_t uid) {
    Vector<AString> matchingCodecs;

    MediaCodecList::findMatchingCodecs(
            mime.c_str(),
            encoder,
            0,
            &matchingCodecs);

    if (err != NULL) {
        *err = NAME_NOT_FOUND;
    }
    for (size_t i = 0; i < matchingCodecs.size(); ++i) {
        sp<MediaCodec> codec = new MediaCodec(looper, pid, uid);
        AString componentName = matchingCodecs[i];
        status_t ret = codec->init(componentName);
        if (err != NULL) {
            *err = ret;
        }
        if (ret == OK) {
            return codec;
        }
        ALOGD("Allocating component '%s' failed (%d), try next one.",
                componentName.c_str(), ret);
    }
    return NULL;
}
MediaCodecList::findMatchingCodecs(...) works as follows:

  1. Call getInstance() to get the IMediaCodecList (here a BpMediaCodecList proxy);
  2. Call its findCodecByType(...), which is ultimately handled by MediaCodecList::findCodecByType(...) and returns a matching index; when matchIndex is less than 0, exit the loop;
  3. Call its getCodecInfo(...) to obtain the MediaCodecInfo at that index;
  4. Get the codec component name via MediaCodecInfo::getCodecName();
  5. Since the flags argument here is 0, the component name is pushed straight into the matches container;
  6. Finally, if kPreferSoftwareCodecs is set or the debug.stagefright.swcodec property is true, sort the matches so that software codecs come first.

frameworks/av/media/libstagefright/MediaCodecList.cpp

void MediaCodecList::findMatchingCodecs(
        const char *mime, bool encoder, uint32_t flags,
        Vector<AString> *matches) {
    matches->clear();

    const sp<IMediaCodecList> list = getInstance();
    if (list == nullptr) {
        return;
    }

    size_t index = 0;
    for (;;) {
        ssize_t matchIndex =
            list->findCodecByType(mime, encoder, index);

        if (matchIndex < 0) {
            break;
        }

        index = matchIndex + 1;

        const sp<MediaCodecInfo> info = list->getCodecInfo(matchIndex);
        CHECK(info != nullptr);
        AString componentName = info->getCodecName();

        if ((flags & kHardwareCodecsOnly) && isSoftwareCodec(componentName)) {
            ALOGV("skipping SW codec '%s'", componentName.c_str());
        } else {
            matches->push(componentName);
            ALOGV("matching '%s'", componentName.c_str());
        }
    }

    if (flags & kPreferSoftwareCodecs ||
            property_get_bool("debug.stagefright.swcodec", false)) {
        matches->sort(compareSoftwareCodecsFirst);
    }
}

getInstance() looks up the service named media.player, which is actually MediaPlayerService, and then obtains the IMediaCodecList (a BpMediaCodecList proxy) through its getCodecList(). This is cross-process communication: the call reaches the remote MediaPlayerService through BpMediaPlayerService.

frameworks/av/media/libstagefright/MediaCodecList.cpp

sp<IMediaCodecList> MediaCodecList::getInstance() {
    Mutex::Autolock _l(sRemoteInitMutex);
    if (sRemoteList == nullptr) {
        sp<IBinder> binder =
            defaultServiceManager()->getService(String16("media.player"));
        sp<IMediaPlayerService> service =
            interface_cast<IMediaPlayerService>(binder);
        if (service.get() != nullptr) {
            sRemoteList = service->getCodecList();
            if (sRemoteList != nullptr) {
                sBinderDeathObserver = new BinderDeathObserver();
                binder->linkToDeath(sBinderDeathObserver.get());
            }
        }
        if (sRemoteList == nullptr) {
            // if failed to get remote list, create local list
            sRemoteList = getLocalInstance();
        }
    }
    return sRemoteList;
}

BpMediaPlayerService::getCodecList() uses the binder mechanism to write the request and wait for the remote reply.

frameworks/av/media/libmedia/IMediaPlayerService.cpp

class BpMediaPlayerService: public BpInterface<IMediaPlayerService>
{
public:
    ......
    virtual sp<IMediaCodecList> getCodecList() const {
        Parcel data, reply;
        data.writeInterfaceToken(IMediaPlayerService::getInterfaceDescriptor());
        remote()->transact(GET_CODEC_LIST, data, &reply);
        return interface_cast<IMediaCodecList>(reply.readStrongBinder());
    }
};

The actual responder is MediaPlayerService::getCodecList(), which internally calls MediaCodecList::getLocalInstance(). Note that although both sides use functions of the same class, the calling process and the responding process are different.

frameworks/av/media/libmediaplayerservice/MediaPlayerService.cpp

sp<IMediaCodecList> MediaPlayerService::getCodecList() const {
    return MediaCodecList::getLocalInstance();
}

MediaCodecList inherits from BnMediaCodecList. The comment says: getLocalInstance() is only used by MediaPlayerService.

  1. Create a MediaCodecList object, passing in the return value of another function, GetBuilders();
  2. Check initCheck(); if initialization succeeded, the new object is cached in and returned as the singleton sCodecList;
  3. isProfilingNeeded(), as its name suggests, relates to codec profiling; it is off the main path of this analysis and will not be covered in detail.

frameworks/av/media/libstagefright/MediaCodecList.cpp

sp<IMediaCodecList> MediaCodecList::getLocalInstance() {
    Mutex::Autolock autoLock(sInitMutex);

    if (sCodecList == nullptr) {
        MediaCodecList *codecList = new MediaCodecList(GetBuilders());
        if (codecList->initCheck() == OK) {
            sCodecList = codecList;

            if (isProfilingNeeded()) {
                ALOGV("Codec profiling needed, will be run in separated thread.");
                pthread_t profiler;
                if (pthread_create(&profiler, nullptr, profilerThreadWrapper, nullptr) != 0) {
                    ALOGW("Failed to create thread for codec profiling.");
                }
            }
        } else {
            // failure to initialize may be temporary. retry on next call.
            delete codecList;
        }
    }

    return sCodecList;
}
The MediaCodecList constructor does the following:

  1. Traverse every MediaCodecListBuilderBase* in the container returned by GetBuilders();
  2. Call each builder's buildMediaCodecList(...) method;
  3. MediaCodecListWriter then writes the global settings into an AMessage and the codec information into the container pointed to by mCodecInfos;
  4. mCodecInfos is a std::vector<sp<MediaCodecInfo>>; its elements are sorted (stably) by rank;
  5. Finally, duplicate entries are removed, provided the debug.stagefright.dedupe-codecs property is enabled (it defaults to true).

frameworks/av/media/libstagefright/MediaCodecList.cpp

MediaCodecList::MediaCodecList(std::vector<MediaCodecListBuilderBase*> builders) {
    mGlobalSettings = new AMessage();
    mCodecInfos.clear();
    MediaCodecListWriter writer;
    for (MediaCodecListBuilderBase *builder : builders) {
        if (builder == nullptr) {
            ALOGD("ignored a null builder");
            continue;
        }
        mInitCheck = builder->buildMediaCodecList(&writer);
        if (mInitCheck != OK) {
            break;
        }
    }
    writer.writeGlobalSettings(mGlobalSettings);
    writer.writeCodecInfos(&mCodecInfos);
    std::stable_sort(
            mCodecInfos.begin(),
            mCodecInfos.end(),
            [](const sp<MediaCodecInfo> &info1, const sp<MediaCodecInfo> &info2) {
                // null is lowest
                return info1 == nullptr
                        || (info2 != nullptr && info1->getRank() < info2->getRank());
            });

    // remove duplicate entries
    bool dedupe = property_get_bool("debug.stagefright.dedupe-codecs", true);
    if (dedupe) {
        std::set<std::string> codecsSeen;
        for (auto it = mCodecInfos.begin(); it != mCodecInfos.end(); ) {
            std::string codecName = (*it)->getCodecName();
            if (codecsSeen.count(codecName) == 0) {
                codecsSeen.emplace(codecName);
                it++;
            } else {
                it = mCodecInfos.erase(it);
            }
        }
    }
}

As the comment in the code explains, if the plugin provides the input surface, OMX video encoders cannot be used; in that case we rely on the plugin to provide the list of usable OMX codecs.

  1. Call StagefrightPluginLoader::GetCCodecInstance() to get the StagefrightPluginLoader object, then call its createInputSurface() method;
  2. Depending on the return value of the previous step, add either sOmxInfoBuilder or sOmxNoSurfaceEncoderInfoBuilder to the container;
  3. Add the builder returned by GetCodec2InfoBuilder() to the container as well.

The GetCodec2InfoBuilder() function internally calls the createBuilder() method of the StagefrightPluginLoader object to return the builder.

frameworks/av/media/libstagefright/MediaCodecList.cpp

OmxInfoBuilder sOmxInfoBuilder{true /* allowSurfaceEncoders */};
OmxInfoBuilder sOmxNoSurfaceEncoderInfoBuilder{false /* allowSurfaceEncoders */};

Mutex sCodec2InfoBuilderMutex;
std::unique_ptr<MediaCodecListBuilderBase> sCodec2InfoBuilder;

MediaCodecListBuilderBase *GetCodec2InfoBuilder() {
    Mutex::Autolock _l(sCodec2InfoBuilderMutex);
    if (!sCodec2InfoBuilder) {
        sCodec2InfoBuilder.reset(
                StagefrightPluginLoader::GetCCodecInstance()->createBuilder());
    }
    return sCodec2InfoBuilder.get();
}

std::vector<MediaCodecListBuilderBase *> GetBuilders() {
    std::vector<MediaCodecListBuilderBase *> builders;
    // if plugin provides the input surface, we cannot use OMX video encoders.
    // In this case, rely on plugin to provide list of OMX codecs that are usable.
    sp<PersistentSurface> surfaceTest =
        StagefrightPluginLoader::GetCCodecInstance()->createInputSurface();
    if (surfaceTest == nullptr) {
        ALOGD("Allowing all OMX codecs");
        builders.push_back(&sOmxInfoBuilder);
    } else {
        ALOGD("Allowing only non-surface-encoder OMX codecs");
        builders.push_back(&sOmxNoSurfaceEncoderInfoBuilder);
    }
    builders.push_back(GetCodec2InfoBuilder());
    return builders;
}

Here the StagefrightPluginLoader object is held through a std::unique_ptr, and the sInstance pointer is reset only once (under sMutex), which gives the effect of the singleton pattern.

Tip

std::unique_ptr is a smart pointer with exclusive ownership of its resource: at any given time, an object can be owned by only one unique_ptr.

frameworks/av/media/libstagefright/StagefrightPluginLoader.cpp

const std::unique_ptr<StagefrightPluginLoader> &StagefrightPluginLoader::GetCCodecInstance() {
    Mutex::Autolock _l(sMutex);
    if (!sInstance) {
        ALOGV("Loading library");
        sInstance.reset(new StagefrightPluginLoader(kCCodecPluginPath));
    }
    return sInstance;
}
The StagefrightPluginLoader constructor does two things:

  1. Call dlopen(...) to open the shared library at the path libsfplugin_ccodec.so;
  2. Call dlsym(...) to resolve the symbols and obtain the addresses of CreateCodec, CreateBuilder and CreateInputSurface.

Tip

RTLD_NOW: all undefined symbols must be resolved before dlopen() returns; if they cannot be, dlopen() returns NULL and the error is: undefined symbol: xxxx.

RTLD_NODELETE: do not unload the library on dlclose(), so that its static variables are not re-initialized if the library is later reopened with dlopen(). This flag is not part of POSIX.1-2001.

frameworks/av/media/libstagefright/StagefrightPluginLoader.cpp

namespace /* unnamed */ {

constexpr const char kCCodecPluginPath[] = "libsfplugin_ccodec.so";

}  // unnamed namespace

StagefrightPluginLoader::StagefrightPluginLoader(const char *libPath) {
    if (android::base::GetIntProperty("debug.stagefright.ccodec", 1) == 0) {
        ALOGD("CCodec is disabled.");
        return;
    }
    mLibHandle = dlopen(libPath, RTLD_NOW | RTLD_NODELETE);
    if (mLibHandle == nullptr) {
        ALOGD("Failed to load library: %s (%s)", libPath, dlerror());
        return;
    }
    mCreateCodec = (CodecBase::CreateCodecFunc)dlsym(mLibHandle, "CreateCodec");
    if (mCreateCodec == nullptr) {
        ALOGD("Failed to find symbol: CreateCodec (%s)", dlerror());
    }
    mCreateBuilder = (MediaCodecListBuilderBase::CreateBuilderFunc)dlsym(
            mLibHandle, "CreateBuilder");
    if (mCreateBuilder == nullptr) {
        ALOGD("Failed to find symbol: CreateBuilder (%s)", dlerror());
    }
    mCreateInputSurface = (CodecBase::CreateInputSurfaceFunc)dlsym(
            mLibHandle, "CreateInputSurface");
    if (mCreateInputSurface == nullptr) {
        ALOGD("Failed to find symbol: CreateInputSurface (%s)", dlerror());
    }
}

From the symbol lookup above, it is not hard to see that StagefrightPluginLoader::createInputSurface() ultimately calls the android::PersistentSurface *CreateInputSurface() function defined in codec2/sfplugin/CCodec.cpp.

frameworks/av/media/libstagefright/StagefrightPluginLoader.cpp

PersistentSurface *StagefrightPluginLoader::createInputSurface() {
    if (mLibHandle == nullptr || mCreateInputSurface == nullptr) {
        ALOGD("Handle or CreateInputSurface symbol is null");
        return nullptr;
    }
    return mCreateInputSurface();
}

From the Android.bp file it is easy to see that libsfplugin_ccodec.so is built from the frameworks/av/media/codec2/sfplugin/ directory; the cpp files it compiles and the libraries it depends on are clear at a glance.

frameworks/av/media/codec2/sfplugin/Android.bp

cc_library_shared {
    name: "libsfplugin_ccodec",

    srcs: [
        "C2OMXNode.cpp",
        "CCodec.cpp",
        "CCodecBufferChannel.cpp",
        "CCodecBuffers.cpp",
        "CCodecConfig.cpp",
        "Codec2Buffer.cpp",
        "Codec2InfoBuilder.cpp",
        "Omx2IGraphicBufferSource.cpp",
        "PipelineWatcher.cpp",
        "ReflectedParamUpdater.cpp",
        "SkipCutBuffer.cpp",
    ],

    cflags: [
        "-Werror",
        "-Wall",
    ],

    header_libs: [
        "libcodec2_internal",
    ],

    shared_libs: [
        "android.hardware.cas.native@1.0",
        "android.hardware.graphics.bufferqueue@1.0",
        "android.hardware.media.c2@1.0",
        "android.hardware.media.omx@1.0",
        "libbase",
        "libbinder",
        "libcodec2",
        "libcodec2_client",
        "libcodec2_vndk",
        "libcutils",
        "libgui",
        "libhidlallocatorutils",
        "libhidlbase",
        "liblog",
        "libmedia",
        "libmedia_omx",
        "libsfplugin_ccodec_utils",
        "libstagefright_bufferqueue_helper",
        "libstagefright_codecbase",
        "libstagefright_foundation",
        "libstagefright_omx",
        "libstagefright_omx_utils",
        "libstagefright_xmlparser",
        "libui",
        "libutils",
    ],

    sanitize: {
        cfi: true,
        misc_undefined: [
            "unsigned-integer-overflow",
            "signed-integer-overflow",
        ],
    },
}

Now let's analyze the concrete implementation of the CreateInputSurface() method in the sfplugin.

  1. Call Codec2Client::CreateInputSurface() to obtain a shared pointer to a Codec2Client::InputSurface;
  2. On the Rockchip (RK) platform the debug.stagefright.c2inputsurface property is not set, so it defaults to 0 and Codec2Client::CreateInputSurface() returns a null pointer directly;
  3. Since the property is also not -1, the OMX fallback branch is skipped and android::PersistentSurface *CreateInputSurface() likewise returns a null pointer.

frameworks/av/media/codec2/sfplugin/CCodec.cpp

extern "C" android::PersistentSurface *CreateInputSurface() {
    using namespace android;
    // Attempt to create a Codec2's input surface.
    std::shared_ptr<Codec2Client::InputSurface> inputSurface =
            Codec2Client::CreateInputSurface();
    if (!inputSurface) {
        if (property_get_int32("debug.stagefright.c2inputsurface", 0) == -1) {
            sp<IGraphicBufferProducer> gbp;
            sp<OmxGraphicBufferSource> gbs = new OmxGraphicBufferSource();
            status_t err = gbs->initCheck();
            if (err != OK) {
                ALOGE("Failed to create persistent input surface: error %d", err);
                return nullptr;
            }
            return new PersistentSurface(
                    gbs->getIGraphicBufferProducer(),
                    sp<IGraphicBufferSource>(
                        new Omx2IGraphicBufferSource(gbs)));
        } else {
            return nullptr;
        }
    }
    return new PersistentSurface(
            inputSurface->getGraphicBufferProducer(),
            static_cast<sp<android::hidl::base::V1_0::IBase>>(
            inputSurface->getHalInterface()));
}

The Codec2Client structure is defined in codec2/hidl/client.h.

frameworks/av/media/codec2/hidl/client/include/codec2/hidl/client.h

struct Codec2Client : public Codec2ConfigurableClient {
    ......
    // Create an input surface.
    static std::shared_ptr<InputSurface> CreateInputSurface(
            char const* serviceName = nullptr);
    ......
}

On the Rockchip (RK) platform the debug.stagefright.c2inputsurface property is not set, so it defaults to 0 and nullptr is returned directly here.

frameworks/av/media/codec2/hidl/client/client.cpp

std::shared_ptr<Codec2Client::InputSurface> Codec2Client::CreateInputSurface(
        char const* serviceName) {
    int32_t inputSurfaceSetting = ::android::base::GetIntProperty(
            "debug.stagefright.c2inputsurface", int32_t(0));
    if (inputSurfaceSetting <= 0) {
        return nullptr;
    }
    ......
}

Now return to the GetBuilders() function in MediaCodecList.cpp. We now know that sOmxInfoBuilder is the builder added to the container; it is an instance of the OmxInfoBuilder class, with surface encoders allowed by default. Besides sOmxInfoBuilder, the builder returned by GetCodec2InfoBuilder() is also added, and that function internally calls the CreateBuilder() function in sfplugin.

You can see that the Codec2InfoBuilder object is returned directly here.

frameworks/av/media/codec2/sfplugin/Codec2InfoBuilder.cpp

extern "C" android::MediaCodecListBuilderBase *CreateBuilder() {
    return new android::Codec2InfoBuilder;
}

The Codec2InfoBuilder constructor does nothing; it is defaulted.

frameworks/av/media/codec2/sfplugin/include/media/stagefright/Codec2InfoBuilder.h

namespace android {

class Codec2InfoBuilder : public MediaCodecListBuilderBase {
public:
    Codec2InfoBuilder() = default;
    ~Codec2InfoBuilder() override = default;
    status_t buildMediaCodecList(MediaCodecListWriter* writer) override;
};

}  // namespace android

In the MediaCodecList constructor, per the analysis above, buildMediaCodecList(...) is called on each builder: one is OmxInfoBuilder and the other is Codec2InfoBuilder. How the codec list itself is built will be analyzed in the next section. Back on the main line: MediaCodecList::findMatchingCodecs(...) in MediaCodecList.cpp calls findCodecByType(...) on the BpMediaCodecList proxy, which is ultimately handled by MediaCodecList::findCodecByType(...) and returns the matching index.

MediaCodecList::findCodecByType(...) works as follows:

  1. Traverse all MediaCodecInfo entries in the container, starting from startIndex;
  2. If MediaCodecInfo::isEncoder() does not match the requested encoder flag (false here, since we want a decoder), skip the entry;
  3. If MediaCodecInfo::getCapabilitiesFor(...) returns nullptr, also skip the entry: it has no capabilities for this MIME type;
  4. Check the "advanced" features (secure playback and tunneled playback); if neither is required, return the current index (a Java-level sketch of enumerating decoders in a similar way follows the code below).

frameworks/av/media/libstagefright/MediaCodecList.cpp

ssize_t MediaCodecList::findCodecByType(
        const char *type, bool encoder, size_t startIndex) const {
    static const char *advancedFeatures[] = {
        "feature-secure-playback",
        "feature-tunneled-playback",
    };

    size_t numCodecInfos = mCodecInfos.size();
    for (; startIndex < numCodecInfos; ++startIndex) {
        const MediaCodecInfo &info = *mCodecInfos[startIndex];

        if (info.isEncoder() != encoder) {
            continue;
        }
        sp<MediaCodecInfo::Capabilities> capabilities = info.getCapabilitiesFor(type);
        if (capabilities == nullptr) {
            continue;
        }
        const sp<AMessage> &details = capabilities->getDetails();

        int32_t required;
        bool isAdvanced = false;
        for (size_t ix = 0; ix < ARRAY_SIZE(advancedFeatures); ix++) {
            if (details->findInt32(advancedFeatures[ix], &required) &&
                    required != 0) {
                isAdvanced = true;
                break;
            }
        }

        if (!isAdvanced) {
            return startIndex;
        }
    }

    return -ENOENT;
}
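
The same kind of matching can be observed from application code through the public android.media.MediaCodecList API, which is backed by this native list. A small sketch (the class name is illustrative) that lists the H.264 decoders visible to an app:

import android.media.MediaCodecInfo;
import android.media.MediaCodecList;
import android.util.Log;

public class AvcDecoderDump {
    public static void dump() {
        MediaCodecList list = new MediaCodecList(MediaCodecList.REGULAR_CODECS);
        for (MediaCodecInfo info : list.getCodecInfos()) {
            if (info.isEncoder()) {
                continue; // we only want decoders, mirroring the encoder check above
            }
            try {
                MediaCodecInfo.CodecCapabilities caps = info.getCapabilitiesForType("video/avc");
                if (caps != null) {
                    Log.d("AvcDecoderDump", "decoder: " + info.getName());
                }
            } catch (IllegalArgumentException e) {
                // this codec does not support video/avc; skip it
            }
        }
    }
}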

Looking again at MediaCodecList::findMatchingCodecs(...) in MediaCodecList.cpp, it also calls the getCodecInfo(...) method of BpMediaCodecList, which is handled by MediaCodecList::getCodecInfo(...): it simply returns the entry at the given index from the container pointed to by mCodecInfos (a std::vector<sp<MediaCodecInfo>>).

frameworks/av/media/libstagefright/include/media/stagefright/MediaCodecList.h

struct MediaCodecList : public BnMediaCodecList {
    ......
    virtual sp<MediaCodecInfo> getCodecInfo(size_t index) const {
        if (index >= mCodecInfos.size()) {
            ALOGE("b/24445127");
            return NULL;
        }
        return mCodecInfos[index];
    }
    ......
}

Now we can finally analyze the native MediaCodec constructor (MediaCodec.cpp) and its init process. The constructor initializes a series of fields.

frameworks/av/media/libstagefright/MediaCodec.cpp

MediaCodec::MediaCodec(const sp<ALooper> &looper, pid_t pid, uid_t uid)
    : mState(UNINITIALIZED),
      mReleasedByResourceManager(false),
      mLooper(looper),
      mCodec(NULL),
      mReplyID(0),
      mFlags(0),
      mStickyError(OK),
      mSoftRenderer(NULL),
      mAnalyticsItem(NULL),
      mResourceManagerClient(new ResourceManagerClient(this)),
      mResourceManagerService(new ResourceManagerServiceProxy(pid)),
      mBatteryStatNotified(false),
      mIsVideo(false),
      mVideoWidth(0),
      mVideoHeight(0),
      mRotationDegrees(0),
      mDequeueInputTimeoutGeneration(0),
      mDequeueInputReplyID(0),
      mDequeueOutputTimeoutGeneration(0),
      mDequeueOutputReplyID(0),
      mHaveInputSurface(false),
      mHavePendingInputBuffers(false),
      mCpuBoostRequested(false),
      mLatencyUnknown(0) {
    if (uid == kNoUid) {
        mUid = IPCThreadState::self()->getCallingUid();
    } else {
        mUid = uid;
    }

    initAnalyticsItem();
}
MediaCodec::init(...) then does the following:

  1. If the name ends with the ".secure" suffix, set secureCodec to true and strip the suffix;
  2. Obtain the corresponding MediaCodecInfo from the name and check whether any of its supported media types is video;
  3. Call GetCodecBase(...) to create the codec, a subclass of CodecBase;
  4. For a video codec, create a dedicated looper (CodecLooper) and register the CodecBase object from the previous step as its handler;
  5. Register the MediaCodec object itself as a handler on mLooper;
  6. Set the CodecBase callback and the BufferChannelBase callback;
  7. Post the kWhatInit message and, if the error indicates a resource shortage, call ResourceManagerService's reclaimResource(...) to reclaim resources and retry.

frameworks/av/media/libstagefright/MediaCodec.cpp

status_t MediaCodec::init(const AString &name) {
    mResourceManagerService->init();

    // save init parameters for reset
    mInitName = name;

    // Current video decoders do not return from OMX_FillThisBuffer
    // quickly, violating the OpenMAX specs, until that is remedied
    // we need to invest in an extra looper to free the main event
    // queue.

    mCodecInfo.clear();

    bool secureCodec = false;
    AString tmp = name;
    if (tmp.endsWith(".secure")) {
        secureCodec = true;
        tmp.erase(tmp.size() - 7, 7);
    }
    const sp<IMediaCodecList> mcl = MediaCodecList::getInstance();
    if (mcl == NULL) {
        mCodec = NULL;  // remove the codec.
        return NO_INIT; // if called from Java should raise IOException
    }
    for (const AString &codecName : { name, tmp }) {
        ssize_t codecIdx = mcl->findCodecByName(codecName.c_str());
        if (codecIdx < 0) {
            continue;
        }
        mCodecInfo = mcl->getCodecInfo(codecIdx);
        Vector<AString> mediaTypes;
        mCodecInfo->getSupportedMediaTypes(&mediaTypes);
        for (size_t i = 0; i < mediaTypes.size(); i++) {
            if (mediaTypes[i].startsWith("video/")) {
                mIsVideo = true;
                break;
            }
        }
        break;
    }
    if (mCodecInfo == nullptr) {
        return NAME_NOT_FOUND;
    }

    mCodec = GetCodecBase(name, mCodecInfo->getOwnerName());
    if (mCodec == NULL) {
        return NAME_NOT_FOUND;
    }

    if (mIsVideo) {
        // video codec needs dedicated looper
        if (mCodecLooper == NULL) {
            mCodecLooper = new ALooper;
            mCodecLooper->setName("CodecLooper");
            mCodecLooper->start(false, false, ANDROID_PRIORITY_AUDIO);
        }

        mCodecLooper->registerHandler(mCodec);
    } else {
        mLooper->registerHandler(mCodec);
    }

    mLooper->registerHandler(this);

    mCodec->setCallback(
            std::unique_ptr<CodecBase::CodecCallback>(
                    new CodecCallback(new AMessage(kWhatCodecNotify, this))));
    mBufferChannel = mCodec->getBufferChannel();
    mBufferChannel->setCallback(
            std::unique_ptr<CodecBase::BufferCallback>(
                    new BufferCallback(new AMessage(kWhatCodecNotify, this))));

    sp<AMessage> msg = new AMessage(kWhatInit, this);
    msg->setObject("codecInfo", mCodecInfo);
    // name may be different from mCodecInfo->getCodecName() if we stripped
    // ".secure"
    msg->setString("name", name);

    if (mAnalyticsItem != NULL) {
        mAnalyticsItem->setCString(kCodecCodec, name.c_str());
        mAnalyticsItem->setCString(kCodecMode, mIsVideo ? kCodecModeVideo : kCodecModeAudio);
    }

    status_t err;
    Vector<MediaResource> resources;
    MediaResource::Type type =
            secureCodec ? MediaResource::kSecureCodec : MediaResource::kNonSecureCodec;
    MediaResource::SubType subtype =
            mIsVideo ? MediaResource::kVideoCodec : MediaResource::kAudioCodec;
    resources.push_back(MediaResource(type, subtype, 1));
    for (int i = 0; i <= kMaxRetry; ++i) {
        if (i > 0) {
            // Don't try to reclaim resource for the first time.
            if (!mResourceManagerService->reclaimResource(resources)) {
                break;
            }
        }

        sp<AMessage> response;
        err = PostAndAwaitResponse(msg, &response);
        if (!isResourceError(err)) {
            break;
        }
    }
    return err;
}

The MediaCodec::GetCodecBase(...) method creates a different object (ACodec, android::CCodec or MediaFilter) depending on the owner and the name.

After adding a log statement, the RK3399 platform prints the following:

MediaCodec: GetCodecBase name=OMX.rk.video_decoder.avc owner=default

So the ACodec structure is actually created.
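
An app can check which component was actually selected, and hence whether the ACodec/OMX or the CCodec/Codec2 path will be used, by querying the codec after creation. A small sketch (the class name is illustrative):

import android.media.MediaCodec;
import android.util.Log;

public class CodecPathCheck {
    public static void log(MediaCodec codec) {
        // e.g. "OMX.rk.video_decoder.avc" -> ACodec, "c2.android.avc.decoder" -> CCodec
        String name = codec.getName();
        Log.d("CodecPathCheck", "selected component: " + name
                + (name.startsWith("c2.") ? " (Codec2/CCodec)" : " (likely OMX/ACodec)"));
    }
}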

frameworks/av/media/libstagefright/MediaCodec.cpp

static CodecBase *CreateCCodec() {
    return StagefrightPluginLoader::GetCCodecInstance()->createCodec();
}

//static
sp<CodecBase> MediaCodec::GetCodecBase(const AString &name, const char *owner) {
    if (owner) {
        if (strcmp(owner, "default") == 0) {
            return new ACodec;
        } else if (strncmp(owner, "codec2", 6) == 0) {
            return CreateCCodec();
        }
    }

    if (name.startsWithIgnoreCase("c2.")) {
        return CreateCCodec();
    } else if (name.startsWithIgnoreCase("omx.")) {
        // at this time only ACodec specifies a mime type.
        return new ACodec;
    } else if (name.startsWithIgnoreCase("android.filter.")) {
        return new MediaFilter;
    } else {
        return NULL;
    }
}

The ACodec constructor assigns initial values to a series of fields, creates its nine state objects, and finally switches to the UninitializedState.

frameworks/av/media/libstagefright/ACodec.cpp

ACodec::ACodec()
    : mSampleRate(0),
      mNodeGeneration(0),
      mUsingNativeWindow(false),
      mNativeWindowUsageBits(0),
      mLastNativeWindowDataSpace(HAL_DATASPACE_UNKNOWN),
      mIsVideo(false),
      mIsImage(false),
      mIsEncoder(false),
      mFatalError(false),
      mShutdownInProgress(false),
      mExplicitShutdown(false),
      mIsLegacyVP9Decoder(false),
      mEncoderDelay(0),
      mEncoderPadding(0),
      mRotationDegrees(0),
      mChannelMaskPresent(false),
      mChannelMask(0),
      mDequeueCounter(0),
      mMetadataBuffersToSubmit(0),
      mNumUndequeuedBuffers(0),
      mRepeatFrameDelayUs(-1LL),
      mMaxPtsGapUs(0LL),
      mMaxFps(-1),
      mFps(-1.0),
      mCaptureFps(-1.0),
      mCreateInputBuffersSuspended(false),
      mTunneled(false),
      mDescribeColorAspectsIndex((OMX_INDEXTYPE)0),
      mDescribeHDRStaticInfoIndex((OMX_INDEXTYPE)0),
      mDescribeHDR10PlusInfoIndex((OMX_INDEXTYPE)0),
      mStateGeneration(0),
      mVendorExtensionsStatus(kExtensionsUnchecked) {
    memset(&mLastHDRStaticInfo, 0, sizeof(mLastHDRStaticInfo));

    mUninitializedState = new UninitializedState(this);
    mLoadedState = new LoadedState(this);
    mLoadedToIdleState = new LoadedToIdleState(this);
    mIdleToExecutingState = new IdleToExecutingState(this);
    mExecutingState = new ExecutingState(this);

    mOutputPortSettingsChangedState =
        new OutputPortSettingsChangedState(this);

    mExecutingToIdleState = new ExecutingToIdleState(this);
    mIdleToLoadedState = new IdleToLoadedState(this);
    mFlushingState = new FlushingState(this);

    mPortEOS[kPortIndexInput] = mPortEOS[kPortIndexOutput] = false;
    mInputEOSResult = OK;

    mPortMode[kPortIndexInput] = IOMX::kPortModePresetByteBuffer;
    mPortMode[kPortIndexOutput] = IOMX::kPortModePresetByteBuffer;

    memset(&mLastNativeWindowCrop, 0, sizeof(mLastNativeWindowCrop));

    changeState(mUninitializedState);
}
