Mode 1:
1. The most common approach is a hard switch: when the system player switches streams, it first calls Stop, then creates a new player, waits for onPrepared, and then calls Start.
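A minimal sketch of this hard switch, assuming an existing SurfaceHolder for display and the URL of the new stream; the class and method names are illustrative, not from any particular player:

import android.content.Context;
import android.media.MediaPlayer;
import android.net.Uri;
import android.view.SurfaceHolder;

public class HardSwitchHelper {

    // Tears the old player down and builds a new one; a short black screen is expected.
    public static MediaPlayer switchByRecreate(Context context, MediaPlayer oldPlayer,
                                               SurfaceHolder holder, String newStreamUrl) throws Exception {
        // 1. Stop and release the old player instance.
        oldPlayer.stop();
        oldPlayer.release();

        // 2. Create a new player bound to the same surface and pointed at the new stream.
        MediaPlayer newPlayer = new MediaPlayer();
        newPlayer.setDisplay(holder);
        newPlayer.setDataSource(context, Uri.parse(newStreamUrl));

        // 3. Start playback once onPrepared fires.
        newPlayer.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
            @Override
            public void onPrepared(MediaPlayer mp) {
                mp.start();
            }
        });
        newPlayer.prepareAsync();
        return newPlayer;
    }
}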
Mode 2:
2. There is also a genuinely seamless way to switch streams. With a single-instance player, the user's switch does not destroy the player; decoding is only paused while data from the new URL is requested. For example, once 1-2 TS segments of the new stream have been downloaded, they are fed to the decoder, decoding resumes, and the output is rendered, so the switch appears seamless.
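A rough sketch of this idea, under the assumption that the app drives the decoder itself with MediaExtractor and MediaCodec instead of the system MediaPlayer; the class name and constructor wiring are illustrative. It only stays seamless when the new stream uses the same codec and format as the old one; otherwise the decoder has to be reconfigured:

import android.media.MediaCodec;
import android.media.MediaExtractor;
import java.io.IOException;

public class SingleInstanceSwitcher {

    private final MediaCodec decoder;  // assumed already configured with the output Surface and started
    private MediaExtractor extractor;  // currently demuxing the old stream

    public SingleInstanceSwitcher(MediaCodec decoder, MediaExtractor extractor) {
        this.decoder = decoder;
        this.extractor = extractor;
    }

    // Keep the decoder instance alive across the switch; only the demuxer source changes.
    public void switchStream(String newUrl) throws IOException {
        // 1. Pause feeding and drop any buffers queued in the decoder, but do not release it.
        decoder.flush();

        // 2. Re-point the demuxer at the new stream, e.g. once 1-2 TS segments have been buffered.
        extractor.release();
        extractor = new MediaExtractor();
        extractor.setDataSource(newUrl);
        extractor.selectTrack(0); // assumes track 0 is the video track
        extractor.seekTo(0, MediaExtractor.SEEK_TO_CLOSEST_SYNC);

        // 3. Resume the normal feed/drain loop: readSampleData -> queueInputBuffer,
        //    dequeueOutputBuffer -> releaseOutputBuffer(..., true) to render.
    }
}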
Mode 3:
3. With a multi-instance player, the usual implementation is to use two MediaPlayers and two SurfaceViews.
First, let one MediaPlayer play a video, generally a small one so that it does not take up many resources. When it is time to switch, stop this MediaPlayer where it is and hide its SurfaceView; be careful not to Reset or Release it. Then let the other MediaPlayer play its video. Because nothing is torn down when the video source changes, there is no black screen. Achieving a seamless effect is simple: store the position the outgoing player had reached, and when the other MediaPlayer takes over that content, retrieve the stored position, SeekTo it, and then Start. The switch is just as seamless.
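A small sketch of that hand-off, assuming playerA is the player currently on screen and playerB has already been prepared with the new source; the PlayerHandoff class is illustrative:

import android.media.MediaPlayer;

public class PlayerHandoff {

    private int lastPosition; // ms, position saved from the player being swapped out

    public void handOff(MediaPlayer playerA, MediaPlayer playerB) {
        // Remember where the outgoing player was, and pause it rather than Reset/Release it.
        lastPosition = playerA.getCurrentPosition();
        playerA.pause();

        // If playerB continues the same content, jump to the saved position before starting.
        playerB.seekTo(lastPosition);
        playerB.start();

        // At this point the SurfaceView bound to playerA can be hidden (View.GONE),
        // leaving playerB's surface visible, so no black frame is shown.
    }
}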
Mode 4:
4. I came across a patent on this before. Its idea is as follows (in fact, many implementations work this way):
After receiving a quality-switch request from the user terminal, the video-stream acquisition end keeps the original encoder instance running and starts a new encoder instance. It then synchronizes the frame numbers of the new video stream with those of the original video stream. Next, it selects a key frame in the new video stream and transmits the new video stream to the user terminal starting from that key frame, while the original video stream stops after transmitting the frame preceding that key frame; the gap between the frame number of this key frame and the frame number of the last key frame of the original video stream must be greater than 1/2 of the new video stream's GOP length. Finally, the original encoder instance is closed.
Specific steps:
a. Keep the original encoder instance running and start a new encoder instance for the new specification requested by the user terminal. The original encoder instance is the one serving the specification the user terminal requested previously;
b. Synchronize the frame numbers of the new video stream and the original video stream so that frames with the same content correspond one to one. Here the new video stream is the output of the new encoder instance, and the original video stream is the output of the original encoder instance;
c. Select a key frame in the new video stream and transmit the new video stream to the user terminal starting from that key frame; the original video stream ends after transmitting the frame preceding that key frame. The gap between this key frame's frame number and the frame number of the last key frame of the original video stream must be greater than 1/2 of the new video stream's GOP length (a small sketch of this rule follows these steps);
d. Close the original encoder instance, free its encoder hardware resources, and get ready for the next switch.
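A toy illustration of the rule in step c, assuming frame numbers have already been synchronized; this is not code from the patent, and all names are made up:

public class SwitchPointSelector {

    /**
     * @param newStreamKeyFrames frame numbers of the key frames in the new (synchronized) stream
     * @param lastOldKeyFrame    frame number of the last key frame already sent on the original stream
     * @param newStreamGop       GOP length of the new stream, in frames
     * @return the frame number to switch at, or -1 if no candidate satisfies the rule
     */
    public static long pickSwitchFrame(long[] newStreamKeyFrames, long lastOldKeyFrame, int newStreamGop) {
        for (long keyFrame : newStreamKeyFrames) {
            // Step c: the gap to the original stream's last key frame must exceed GOP / 2.
            if (keyFrame - lastOldKeyFrame > newStreamGop / 2.0) {
                return keyFrame;
            }
        }
        return -1;
    }
}

The original stream then stops after the frame preceding the returned key frame, and the new stream is sent from that key frame onward.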
An example
Taking mode 3 as an example, here is a simple demo that implements seamless stream switching.
The demo switches between two local videos, one an ordinary clip and the other a recording of scenery. The full code:
package com.example.hejunlin.seamlessvideo;

import android.annotation.TargetApi;
import android.app.AlertDialog;
import android.content.DialogInterface;
import android.content.pm.PackageManager;
import android.media.MediaPlayer;
import android.media.MediaPlayer.OnPreparedListener;
import android.net.Uri;
import android.os.Build;
import android.os.Bundle;
import android.os.Environment;
import android.os.Handler;
import android.support.annotation.NonNull;
import android.support.v4.app.ActivityCompat;
import android.support.v4.content.ContextCompat;
import android.support.v7.app.AppCompatActivity;
import android.util.Log;
import android.view.Gravity;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.view.View;
import android.widget.FrameLayout;

import static android.Manifest.permission.READ_EXTERNAL_STORAGE;
import static android.Manifest.permission.WRITE_EXTERNAL_STORAGE;

/**
 * Compatible with Android 8.0 (runtime storage permissions).
 */
public class MainActivity extends AppCompatActivity {

    private SurfaceView mVideoSurface, mNextSurface;
    private FrameLayout mFrame;
    private MediaPlayer mCurrentMediaPlayer, mNextMediaPlayer;
    private Handler mHandler;
    private int mIndex = 0;

    private String path1 = Environment.getExternalStorageDirectory().getAbsolutePath() + "/1.mp4";
    private String path2 = Environment.getExternalStorageDirectory().getAbsolutePath() + "/2.mp4";
    private String[] paths = new String[]{path1, path2};

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // checkPermission() returns true only when the storage permissions are still missing.
        if (checkPermission()) {
            requestPermissionAndContinue();
        } else {
            startAction();
        }
    }

    @Override
    protected void onDestroy() {
        super.onDestroy();
        mHandler.removeCallbacks(mPlayRun);
        if (mCurrentMediaPlayer != null) {
            mCurrentMediaPlayer.release();
            mCurrentMediaPlayer = null;
        }
        if (mNextMediaPlayer != null) {
            mNextMediaPlayer.release();
            mNextMediaPlayer = null;
        }
    }

    Runnable mPlayRun = new Runnable() {
        @Override
        public void run() {
            Log.d(MainActivity.class.getSimpleName(), "run: ");
            // Pause whichever player is running; the current player is never reset or
            // released, which is what keeps the surface from showing a black frame.
            if (mCurrentMediaPlayer.isPlaying()) {
                mCurrentMediaPlayer.pause();
            }
            if (mNextMediaPlayer.isPlaying()) {
                mNextMediaPlayer.pause();
            }
            mNextMediaPlayer.reset();
            try {
                if (mIndex == 0) {
                    // First round: both players start on the first file; once the current
                    // player is ready, its surface is hidden so the next surface shows through.
                    String path = paths[mIndex % paths.length];
                    Log.d(MainActivity.class.getSimpleName(), "path1: " + path);
                    mIndex++;
                    mCurrentMediaPlayer.setDataSource(MainActivity.this, Uri.parse(path));
                    mCurrentMediaPlayer.setOnPreparedListener(new OnPreparedListener() {
                        @Override
                        public void onPrepared(MediaPlayer player) {
                            Log.d(MainActivity.class.getSimpleName(), "start 1");
                            mCurrentMediaPlayer.start();
                            mVideoSurface.setVisibility(View.GONE);
                        }
                    });
                    mCurrentMediaPlayer.prepareAsync();

                    mNextMediaPlayer.setDataSource(MainActivity.this, Uri.parse(path));
                    mNextMediaPlayer.setOnPreparedListener(new OnPreparedListener() {
                        @Override
                        public void onPrepared(MediaPlayer arg0) {
                            mNextMediaPlayer.start();
                        }
                    });
                    mNextMediaPlayer.prepareAsync();
                } else {
                    // Later rounds: only the next player is re-pointed at the next file,
                    // so the source changes without tearing anything down.
                    String path = paths[mIndex % paths.length];
                    mIndex++;
                    Log.d(MainActivity.class.getSimpleName(), "path2: " + path);
                    mNextMediaPlayer.setDataSource(MainActivity.this, Uri.parse(path));
                    mNextMediaPlayer.setOnPreparedListener(new OnPreparedListener() {
                        @Override
                        public void onPrepared(MediaPlayer arg0) {
                            mNextMediaPlayer.start();
                            Log.d(MainActivity.class.getSimpleName(), "start 2");
                        }
                    });
                    mNextMediaPlayer.prepareAsync();
                }
            } catch (Exception e) {
                e.printStackTrace();
            }
            mHandler.postDelayed(mPlayRun, 10000); // switch to the next source every 10 seconds
        }
    };

    class VideoSurfaceHolderCallback implements SurfaceHolder.Callback {
        @Override
        public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
        }

        @Override
        public void surfaceCreated(SurfaceHolder holder) {
            mCurrentMediaPlayer.setDisplay(mVideoSurface.getHolder());
        }

        @Override
        public void surfaceDestroyed(SurfaceHolder holder) {
        }
    }

    class NextVideoSurfaceHolderCallback implements SurfaceHolder.Callback {
        @Override
        public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
        }

        @Override
        public void surfaceCreated(SurfaceHolder holder) {
            mNextMediaPlayer.setDisplay(mNextSurface.getHolder());
        }

        @Override
        public void surfaceDestroyed(SurfaceHolder holder) {
        }
    }

    private static final int PERMISSION_REQUEST_CODE = 200;

    // Returns true only when both storage permissions are still missing.
    private boolean checkPermission() {
        return ContextCompat.checkSelfPermission(this, WRITE_EXTERNAL_STORAGE) != PackageManager.PERMISSION_GRANTED
                && ContextCompat.checkSelfPermission(this, READ_EXTERNAL_STORAGE) != PackageManager.PERMISSION_GRANTED;
    }

    private void requestPermissionAndContinue() {
        if (ContextCompat.checkSelfPermission(this, WRITE_EXTERNAL_STORAGE) != PackageManager.PERMISSION_GRANTED
                && ContextCompat.checkSelfPermission(this, READ_EXTERNAL_STORAGE) != PackageManager.PERMISSION_GRANTED) {
            if (ActivityCompat.shouldShowRequestPermissionRationale(this, WRITE_EXTERNAL_STORAGE)
                    && ActivityCompat.shouldShowRequestPermissionRationale(this, READ_EXTERNAL_STORAGE)) {
                AlertDialog.Builder alertBuilder = new AlertDialog.Builder(this);
                alertBuilder.setCancelable(true);
                alertBuilder.setTitle("Permission request");
                alertBuilder.setMessage("Storage permission is required to read the test videos");
                alertBuilder.setPositiveButton(android.R.string.yes, new DialogInterface.OnClickListener() {
                    @TargetApi(Build.VERSION_CODES.JELLY_BEAN)
                    public void onClick(DialogInterface dialog, int which) {
                        ActivityCompat.requestPermissions(MainActivity.this,
                                new String[]{WRITE_EXTERNAL_STORAGE, READ_EXTERNAL_STORAGE},
                                PERMISSION_REQUEST_CODE);
                    }
                });
                AlertDialog alert = alertBuilder.create();
                alert.show();
                Log.e("MainActivity", "permission denied, show dialog");
            } else {
                ActivityCompat.requestPermissions(MainActivity.this,
                        new String[]{WRITE_EXTERNAL_STORAGE, READ_EXTERNAL_STORAGE},
                        PERMISSION_REQUEST_CODE);
            }
        } else {
            startAction();
        }
    }

    @Override
    public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions,
                                           @NonNull int[] grantResults) {
        if (requestCode == PERMISSION_REQUEST_CODE) {
            if (permissions.length > 0 && grantResults.length > 0) {
                boolean flag = true;
                for (int i = 0; i < grantResults.length; i++) {
                    if (grantResults[i] != PackageManager.PERMISSION_GRANTED) {
                        flag = false;
                    }
                }
                if (flag) {
                    startAction();
                } else {
                    finish();
                }
            } else {
                finish();
            }
        } else {
            super.onRequestPermissionsResult(requestCode, permissions, grantResults);
        }
    }

    private void startAction() {
        mFrame = new FrameLayout(this);
        setContentView(mFrame);
        mHandler = new Handler();
        mCurrentMediaPlayer = new MediaPlayer();
        mNextMediaPlayer = new MediaPlayer();

        mVideoSurface = new SurfaceView(this);
        mVideoSurface.getHolder().addCallback(new VideoSurfaceHolderCallback());
        mNextSurface = new SurfaceView(this);
        mNextSurface.getHolder().addCallback(new NextVideoSurfaceHolderCallback());

        FrameLayout.LayoutParams lp = new FrameLayout.LayoutParams(1080, 1920);
        lp.gravity = Gravity.LEFT | Gravity.TOP;
        mVideoSurface.setLayoutParams(lp);

        lp = new FrameLayout.LayoutParams(1080, 1920);
        lp.gravity = Gravity.LEFT | Gravity.TOP;
        mNextSurface.setLayoutParams(lp);

        // mNextSurface is added first so mVideoSurface sits on top; hiding the top
        // surface later reveals the other player without a black frame.
        mFrame.addView(mNextSurface);
        mFrame.addView(mVideoSurface);

        mHandler.postDelayed(mPlayRun, 0);
    }
}