Video streaming process and live blackscreen issues

A recent live-video project required local recording whenever the live broadcast was not smooth or the network was unavailable, while guaranteeing that no video was lost. Previous projects only did live broadcasts without local recording, so the original project needed many modifications and changes to its code flow. During stress testing, after coding was basically finished, we found that the live broadcast would occasionally show a black screen. Our project is based on the open-source project SopCastComponent, and the code analysis below is also based on it. The live broadcast itself is based on the RTMP protocol.

First, let's look at how a live broadcast works over the RTMP protocol. RTMP runs on top of TCP, so of course a TCP connection is established first, after which the RTMP connect information is sent:

private void rtmpConnect() {
        SessionInfo.markSessionTimestampTx();
        Command invoke = new Command("connect", ++transactionIdCounter);

        AmfObject args = new AmfObject();
        args.setProperty("app", connectData.appName);
        args.setProperty("flashVer", "LNX 11,2,202,233"); // Flash player OS: Linux, version: 11.2.202.233
        args.setProperty("swfUrl", connectData.swfUrl);
        args.setProperty("tcUrl", connectData.tcUrl);
        args.setProperty("fpad", false);
        args.setProperty("capabilities", 239);
        args.setProperty("audioCodecs", 3575);
        args.setProperty("videoCodecs", 252);
        args.setProperty("videoFunction", 1);
        args.setProperty("pageUrl", connectData.pageUrl);
        args.setProperty("objectEncoding", 0);
        invoke.addData(args);
        MLog.i(TAG,"rtmpConnect---- queuesize = "+((NormalSendQueue)mSendQueue).getBufferFrameCount());
        Frame<Chunk> frame = new Frame(invoke, RtmpPacker.CONFIGRATION, Frame.FRAME_TYPE_CONFIGURATION);
        mSendQueue.putFrame(frame);
        state = State.CONNECTING;
    }
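Before the connect command can go out, RTMP also requires a simple handshake: the client sends C0 (a single version byte, 0x03) and C1 (1536 bytes of timestamp, zeros, and random data), the server answers with S0/S1/S2, and the client echoes S1 back as C2. Only after the handshake completes is the connect command above sent. A minimal sketch of building those packets (the class and method names here are illustrative, not from SopCastComponent):

```java
import java.nio.ByteBuffer;
import java.util.Random;

public class RtmpHandshake {
    public static final int HANDSHAKE_SIZE = 1536;

    // Build C0 (protocol version 3) followed by C1 (timestamp + 4 zero bytes + random fill).
    public static byte[] buildC0C1() {
        ByteBuffer buf = ByteBuffer.allocate(1 + HANDSHAKE_SIZE);
        buf.put((byte) 0x03);                                   // C0: RTMP version 3
        buf.putInt((int) (System.currentTimeMillis() / 1000));  // C1: timestamp
        buf.putInt(0);                                          // C1: four zero bytes
        byte[] random = new byte[HANDSHAKE_SIZE - 8];
        new Random().nextBytes(random);                         // C1: random payload
        buf.put(random);
        return buf.array();
    }

    // C2 simply echoes the S1 packet received from the server.
    public static byte[] buildC2(byte[] s1) {
        return s1.clone();
    }
}
```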

The connect command carries the app name, the stream URLs (tcUrl, swfUrl, pageUrl), the Flash version, the supported audio and video codecs, and so on. After it is sent, the server replies with command messages in response to the connection information sent by the client:

private void handleRxCommandInvoke(Command command) {
        String commandName = command.getCommandName();
        MLog.i(TAG,"handleRxCommandInvoke ");
        if(commandName.equals("_result")) {
            String method = sessionInfo.takeInvokedCommand(command.getTransactionId());

            MLog.d(TAG, "Got result for invoked method: " + method);
            if ("connect".equals(method)) {
                if(listener != null) {
                    listener.onRtmpConnectSuccess();
                }
                createStream();
            } else if("createStream".equals(method)) {
                currentStreamId = (int) ((AmfNumber) command.getData().get(1)).getValue();
                if(listener != null) {
                    listener.onCreateStreamSuccess();
                }
                fmlePublish();
            }
        } else if(commandName.equals("_error")) {
            String method = sessionInfo.takeInvokedCommand(command.getTransactionId());
            MLog.d(TAG, "Got error for invoked method: " + method);
            if ("connect".equals(method)) {
                stop();
                if(listener != null) {
                    listener.onRtmpConnectFail();
                }
            } else if("createStream".equals(method)) {
                stop();
                if(listener != null) {
                    listener.onCreateStreamFail();
                }
            }
        } else if(commandName.equals("onStatus")) {
            String code = ((AmfString) ((AmfObject) command.getData().get(1)).getProperty("code")).getValue();
            if (code.equals("NetStream.Publish.Start")||code.equals("NetStream.Record.Start")) {
                MLog.d(TAG, "Got publish start success");
                state = State.LIVING;
                if(listener != null) {
                    listener.onPublishSuccess();
                }
                onMetaData();
                // We can now publish AV data
                publishPermitted = true;
            } else {
                MLog.d(TAG, "Got publish start fail");
                stop();
                if(listener != null) {
                    listener.onPublishFail();
                }
            }
        } else {
            MLog.d(TAG, "Got Command result: " + commandName);
        }
    }

These branches handle the replies for a successful RTMP connection, successful stream creation, publish status, and so on. Note that audio and video data may only be sent once the publish command succeeds (NetStream.Publish.Start), at which point publishPermitted is set to true.
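The overall client flow is therefore a small state machine: connect, then createStream, then publish, then live. A minimal sketch of that progression, assuming illustrative names rather than the project's actual classes:

```java
public class RtmpSessionFlow {
    enum State { INIT, CONNECTING, STREAM_CREATED, LIVING }

    private State state = State.INIT;
    private boolean publishPermitted = false;

    public void onConnectSent()        { state = State.CONNECTING; }
    public void onCreateStreamResult() { state = State.STREAM_CREATED; }
    // Only after NetStream.Publish.Start may audio/video frames be sent.
    public void onPublishStart()       { state = State.LIVING; publishPermitted = true; }

    public boolean canSendAv() { return publishPermitted; }
    public State state() { return state; }
}
```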

The video format specified above is FLV. Although the stream has been created successfully, no video information has been sent yet. Next comes the stream data itself, starting with the FLV header:

 public static void writeFlvHeader(ByteBuffer buffer, boolean hasVideo, boolean hasAudio) {
        /**
         *  The FLV header is always 9 bytes in the current version.
         *  Bytes 1-3 are the file signature, always "FLV" (0x46 0x4C 0x56).
         *  Byte 4 is the version, currently 1 (0x01).
         *  The first 5 bits of byte 5 are reserved and must be 0.
         *  Bit 6 of byte 5 indicates whether audio tags are present.
         *  Bit 7 of byte 5 is reserved and must be 0.
         *  Bit 8 of byte 5 indicates whether video tags are present.
         *  Bytes 6-9 are a UI32 giving the number of bytes from File Header to File Body, always 9 in version 1.
         */
        byte[] signature = new byte[] {'F', 'L', 'V'};  /* always "FLV" */
        byte version = (byte) 0x01;     /* should be 1 */
        byte videoFlag = hasVideo ? (byte) 0x01 : 0x00;
        byte audioFlag = hasAudio ? (byte) 0x04 : 0x00;
        byte flags = (byte) (videoFlag | audioFlag);  /* 4, audio; 1, video; 5 audio+video.*/
        byte[] offset = new byte[] {(byte) 0x00, (byte) 0x00, (byte) 0x00, (byte) 0x09};  /* always 9 */

        buffer.put(signature);
        buffer.put(version);
        buffer.put(flags);
        buffer.put(offset);
    }
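As a sanity check, the nine header bytes for a stream with both audio and video should come out as "FLV", version 1, flags 5, and the 32-bit value 9. A self-contained re-implementation of the same layout that can be verified directly (the class name here is illustrative):

```java
import java.nio.ByteBuffer;

public class FlvHeaderCheck {
    // Same layout as writeFlvHeader: "FLV" signature, version 1, flags, 4-byte header size.
    public static byte[] header(boolean hasVideo, boolean hasAudio) {
        ByteBuffer buffer = ByteBuffer.allocate(9);
        buffer.put(new byte[] {'F', 'L', 'V'});                       // signature
        buffer.put((byte) 0x01);                                      // version 1
        byte flags = (byte) ((hasVideo ? 0x01 : 0x00) | (hasAudio ? 0x04 : 0x00));
        buffer.put(flags);                                            // 1 video, 4 audio, 5 both
        buffer.put(new byte[] {0x00, 0x00, 0x00, 0x09});              // header size, always 9 in v1
        return buffer.array();
    }
}
```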

The header contains the FLV signature, flags indicating whether audio and video are present, the version number, and so on. Next comes the metadata, which carries detailed information about the video:

public static byte[] writeFlvMetaData(int width, int height, int fps, int audioRate, int audioSize, boolean isStereo) {
        AmfString metaDataHeader = new AmfString("onMetaData", false);
        AmfMap amfMap = new AmfMap();
        MLog.i("writeFlvMetaData","writeFlvMetaData width = "+width+" height = "+height);
        amfMap.setProperty("width", width);
        amfMap.setProperty("height", height);
        amfMap.setProperty("framerate", fps);
        amfMap.setProperty("videocodecid", FlvVideoCodecID.AVC);
        amfMap.setProperty("audiosamplerate", audioRate);
        amfMap.setProperty("audiosamplesize", audioSize);
        if(isStereo) {
            amfMap.setProperty("stereo", true);
        } else {
            amfMap.setProperty("stereo", false);
        }
        amfMap.setProperty("audiocodecid", FlvAudio.AAC);

        int size = amfMap.getSize() + metaDataHeader.getSize();
        ByteBuffer amfBuffer = ByteBuffer.allocate(size);
        amfBuffer.put(metaDataHeader.getBytes());
        amfBuffer.put(amfMap.getBytes());
        return amfBuffer.array();
    }
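The AmfString and AmfMap classes above serialize values in AMF0 format. For reference, an AMF0 string is a 0x02 type marker followed by a 2-byte big-endian length and the UTF-8 bytes, and an AMF0 number is a 0x00 marker followed by an IEEE-754 double. A minimal sketch (Amf0 here is a hypothetical helper, not a project class):

```java
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

public class Amf0 {
    // AMF0 string: 0x02 marker, 2-byte big-endian length, UTF-8 payload.
    public static byte[] encodeString(String s) {
        byte[] utf8 = s.getBytes(StandardCharsets.UTF_8);
        ByteBuffer buf = ByteBuffer.allocate(3 + utf8.length);
        buf.put((byte) 0x02);
        buf.putShort((short) utf8.length);   // ByteBuffer is big-endian by default
        buf.put(utf8);
        return buf.array();
    }

    // AMF0 number: 0x00 marker followed by an IEEE-754 double (9 bytes total).
    public static byte[] encodeNumber(double d) {
        ByteBuffer buf = ByteBuffer.allocate(9);
        buf.put((byte) 0x00);
        buf.putDouble(d);
        return buf.array();
    }
}
```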

This contains video details such as width and height, frame rate, video codec, audio codec, and so on. Then comes the video data itself. The FLV body is made up of tags and tag-size fields, and each tag consists of a tag header and a tag body. Detailed information can be found online.
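Concretely, each tag starts with an 11-byte tag header (tag type, 3-byte data size, 3-byte timestamp plus one extended-timestamp byte, and a 3-byte stream ID that is always 0), followed by the tag body and a 4-byte PreviousTagSize. A sketch of packing one tag (FlvTag is a hypothetical helper, not a project class):

```java
import java.nio.ByteBuffer;

public class FlvTag {
    public static final byte TAG_AUDIO  = 0x08;
    public static final byte TAG_VIDEO  = 0x09;
    public static final byte TAG_SCRIPT = 0x12;

    // 11-byte FLV tag header, the tag body, then PreviousTagSize (11 + body length).
    public static byte[] tag(byte type, byte[] body, int timestampMs) {
        ByteBuffer buf = ByteBuffer.allocate(11 + body.length + 4);
        buf.put(type);
        buf.put((byte) (body.length >>> 16));          // DataSize, 3 bytes big-endian
        buf.put((byte) (body.length >>> 8));
        buf.put((byte) body.length);
        buf.put((byte) (timestampMs >>> 16));          // Timestamp, lower 24 bits
        buf.put((byte) (timestampMs >>> 8));
        buf.put((byte) timestampMs);
        buf.put((byte) (timestampMs >>> 24));          // TimestampExtended (upper 8 bits)
        buf.put((byte) 0).put((byte) 0).put((byte) 0); // StreamID, always 0
        buf.put(body);
        buf.putInt(11 + body.length);                  // PreviousTagSize
        return buf.array();
    }
}
```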

The basic live-broadcast flow is now complete, but what causes the black screen? We saved the relevant live data and found that the FLV stream had the format header, had the metadata, and the metadata carried the correct video information. Live data was also present in the later frames. So why the black screen? And indeed, playing the saved file in another player showed no picture either.

Comparing normal and abnormal broadcast data over and over again revealed that the abnormal live data was missing the first frame of the video. For some reason, that first frame was written before the live stream had actually started, so it was lost. We can infer that when a player parses the video, it does not rely only on the video information inside the metadata; it also needs the first video frame (the sequence header carrying the SPS and PPS). If that first frame is lost, the video will not play even though all the other frames are present. The code that writes the first frame is as follows:

    // Writes the AVC sequence header (AVCDecoderConfigurationRecord) as the first video tag.
    // (Method signature reconstructed for completeness; the original excerpt omitted it.)
    public static void writeFirstVideoTag(ByteBuffer buffer, byte[] sps, byte[] pps) {
        // Write the FLV video tag header: key frame, AVC codec, sequence-header packet type
        writeVideoHeader(buffer, FlvVideoFrameType.KeyFrame, FlvVideoCodecID.AVC, FlvVideoAVCPacketType.SequenceHeader);

        buffer.put((byte) 0x01);             // configurationVersion
        buffer.put(sps[1]);                  // AVCProfileIndication
        buffer.put(sps[2]);                  // profile_compatibility
        buffer.put(sps[3]);                  // AVCLevelIndication
        buffer.put((byte) 0xff);             // lengthSizeMinusOne: 4-byte NALU lengths

        buffer.put((byte) 0xe1);             // numOfSequenceParameterSets: 1
        buffer.putShort((short) sps.length); // SPS length
        buffer.put(sps);                     // SPS NALU

        buffer.put((byte) 0x01);             // numOfPictureParameterSets: 1
        buffer.putShort((short) pps.length); // PPS length
        buffer.put(pps);                     // PPS NALU
    }
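Given the root cause above, one defensive fix is to cache the sequence header (SPS/PPS) whenever the encoder produces it and replay it as the very first video tag each time the stream actually opens, so it can never be lost to a not-yet-started stream. A minimal sketch of that guard, with illustrative names (this is not the project's actual fix):

```java
import java.util.ArrayList;
import java.util.List;

public class SequenceHeaderGuard {
    private byte[] cachedSeqHeader;            // AVCDecoderConfigurationRecord bytes
    private boolean streamOpen = false;
    private final List<byte[]> sent = new ArrayList<>();

    // Called whenever the encoder emits SPS/PPS; cache it even if the stream is not open yet.
    public void onSequenceHeader(byte[] seqHeader) {
        cachedSeqHeader = seqHeader.clone();
        if (streamOpen) sent.add(cachedSeqHeader);
    }

    // Called when publishing is permitted; replay the cached header as the first tag.
    public void onStreamOpen() {
        streamOpen = true;
        if (cachedSeqHeader != null) sent.add(cachedSeqHeader);
    }

    // Ordinary frames are dropped until the stream is open and the header has gone out.
    public void onVideoFrame(byte[] frame) {
        if (streamOpen && !sent.isEmpty()) sent.add(frame);
    }

    public List<byte[]> sentFrames() { return sent; }
}
```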


Posted on Sat, 14 Mar 2020 20:53:36 -0400 by christophe