I previously learned FFmpeg and used it for decoding on Ubuntu, but on arriving at Android I found another codec interface, MediaCodec, so after work I took some time to learn this class and write down some notes for later review. The MediaCodec class can be used to access Android's low-level media codecs; it is part of the low-level multimedia support infrastructure that Android provides (usually used together with MediaExtractor, MediaSync, MediaMuxer, MediaCrypto, MediaDrm, Image, Surface, and AudioTrack). That is the brief introduction the documentation gives for the MediaCodec class. Reading the API document alone left me in a fog, so I designed an experiment and slowly tried out the MediaCodec class through it. The experiment I designed is very simple: get data from the camera, use MediaCodec to compress it to H.264 format, and write the compressed data to a file. Getting data from the camera was already covered in an earlier article, so we build on that and encode the data here. Let's look at the result first, since the H.264 file can be played directly:
Getting the image data from the camera
First, create a TextureView or SurfaceView object; either works, and here I use TextureView. It is declared directly in the layout file and then looked up in the Activity with findViewById. Second, set a listener on it with textureView.setSurfaceTextureListener(this); the Activity implements the listener interface. Then initialize and open the camera in the onSurfaceTextureAvailable callback, like this:
@Override
public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) {
    Log.d("jw_liu", "onSurfaceTextureAvailable");
    mCamera = Camera.open();
    try {
        mCamera.setPreviewTexture(surface);
        mCamera.setPreviewCallback(new Camera.PreviewCallback() {
            @Override
            public void onPreviewFrame(byte[] data, Camera camera) {
                Log.d("jinwei", "l:" + data.length);
                if (MediaCodecManager.frame != null) {
                    // Copy the latest preview frame into the encoder's input array
                    System.arraycopy(data, 0, MediaCodecManager.frame, 0, data.length);
                }
            }
        });
    } catch (IOException e) {
        e.printStackTrace();
    }
    Camera.Parameters parameters = mCamera.getParameters();
    parameters.setPictureFormat(PixelFormat.JPEG);
    parameters.setPreviewFormat(PixelFormat.YCbCr_420_SP);
    parameters.setPreviewSize(480, 320);
    parameters.set("orientation", "portrait");
    parameters.set("rotation", 180);
    mCamera.setDisplayOrientation(180);
    mCamera.setParameters(parameters);
    mCamera.startPreview();
}

@Override
public void onSurfaceTextureSizeChanged(SurfaceTexture surface, int width, int height) {
    // Log.d("jw_liu", "onSurfaceTextureSizeChanged");
}

@Override
public boolean onSurfaceTextureDestroyed(SurfaceTexture surface) {
    // Log.d("jw_liu", "onSurfaceTextureDestroyed");
    return false;
}

@Override
public void onSurfaceTextureUpdated(SurfaceTexture surface) {
    // Log.d("jw_liu", "onSurfaceTextureUpdated");
}
This process is very simple; what we need to pay attention to is the camera's configuration parameters. We configure the camera's preview format as YCbCr_420_SP and the preview size as 480 * 320. This is very important, because the encoder also needs to be told the size and picture format of the images it encodes; if the two sides are consistent, you save yourself the cumbersome conversion work in between.
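Not every camera supports a 480 * 320 preview or the NV21 (YCbCr_420_SP) format, so it can be worth checking before committing to these parameters. Here is a minimal sketch using the same old android.hardware.Camera API as the article; the helper name supportsPreviewConfig is my own:

import android.graphics.ImageFormat;
import android.hardware.Camera;
import android.util.Log;
import java.util.List;

// Verify that the preview size and format we want are actually supported.
private boolean supportsPreviewConfig(Camera camera, int width, int height) {
    Camera.Parameters p = camera.getParameters();
    boolean sizeOk = false;
    for (Camera.Size s : p.getSupportedPreviewSizes()) {
        if (s.width == width && s.height == height) {
            sizeOk = true;
            break;
        }
    }
    // NV21 corresponds to the YCbCr_420_SP preview format used above.
    List<Integer> formats = p.getSupportedPreviewFormats();
    boolean formatOk = formats.contains(ImageFormat.NV21);
    Log.d("jinwei", "sizeOk=" + sizeOk + " formatOk=" + formatOk);
    return sizeOk && formatOk;
}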
We set a preview callback on the Camera, so that every time the camera collects a frame, the callback hands the preview data to us; that is how we obtain the camera data. Once we have the data, we can encode it. I put this part of the logic in a separate class, MediaCodecManager. This class creates a MediaCodec instance and uses it to encode the video frames.
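One practical note: setPreviewCallback hands you a newly allocated byte array on every frame, which can cause GC pressure at 25 frames per second. The same Camera API also offers setPreviewCallbackWithBuffer together with addCallbackBuffer, which recycles a buffer you provide. A minimal sketch; the buffer size follows the NV21 math explained later in this article:

// Recycle one preview buffer instead of allocating a new array per frame.
final byte[] previewBuffer = new byte[480 * 320 * 3 / 2]; // NV21 frame size
mCamera.addCallbackBuffer(previewBuffer);
mCamera.setPreviewCallbackWithBuffer(new Camera.PreviewCallback() {
    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        if (MediaCodecManager.frame != null) {
            System.arraycopy(data, 0, MediaCodecManager.frame, 0, data.length);
        }
        // Hand the buffer back so the camera can fill it again.
        camera.addCallbackBuffer(data);
    }
});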
Encoding process
First, we have to get an encoder. This part is easy; we can get one directly by MIME type: mediaCodec = MediaCodec.createEncoderByType("video/avc"); Second, we need to configure this encoder. The official documentation has a clear list of configuration entries; several of them are mandatory, and if they are not configured, calling MediaCodec.start() will fail. I configured the encoder as follows:
try {
    mediaCodec = MediaCodec.createEncoderByType("video/avc");
} catch (IOException e) {
    e.printStackTrace();
}
MediaFormat mediaFormat = MediaFormat.createVideoFormat("video/avc", 480, 320);
mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, 125000);
mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 25);
mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar);
mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);
mediaCodec.setCallback(new MediaCodec.Callback() {
    @Override
    public void onInputBufferAvailable(MediaCodec codec, int index) {
        Log.d("jinwei", "onInputBufferAvailable:" + index);
        ByteBuffer byteBuffer = codec.getInputBuffer(index);
        byteBuffer.put(frame);
        // Flags should be 0 for a normal frame; BUFFER_FLAG_CODEC_CONFIG is
        // reserved for codec configuration data such as SPS/PPS.
        codec.queueInputBuffer(index, 0, frame.length, 1, 0);
    }

    @Override
    public void onOutputBufferAvailable(MediaCodec codec, int index, MediaCodec.BufferInfo info) {
        Log.d("jinwei", "onOutputBufferAvailable:" + index);
        if (index > -1) {
            ByteBuffer outputBuffer = codec.getOutputBuffer(index);
            byte[] bb = new byte[info.size];
            outputBuffer.get(bb);
            try {
                fileOutputStream.write(bb);
            } catch (IOException e) {
                e.printStackTrace();
            }
            codec.releaseOutputBuffer(index, false);
        }
    }

    @Override
    public void onError(MediaCodec codec, MediaCodec.CodecException e) {
        Log.d("jinwei", "onError");
        codec.reset();
    }

    @Override
    public void onOutputFormatChanged(MediaCodec codec, MediaFormat format) {
        Log.d("jinwei", "onOutputFormatChanged");
    }
});
mediaCodec.configure(mediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
mediaCodec.start();
First, when we create the video format, we specify the size 480 * 320, consistent with the camera preview size: MediaFormat mediaFormat = MediaFormat.createVideoFormat("video/avc", 480, 320); KEY_BIT_RATE is the bit rate, which is mandatory. KEY_FRAME_RATE says how many frames per second are expected, and is also mandatory. The picture format is COLOR_FormatYUV420Planar. Note that the camera's image format is YCbCr_420_SP: the Planar suffix means the Y, U and V components are stored in three separate planes, while the SP (semi-planar) variant stores the two chroma components interleaved in a single plane. Strictly speaking the two layouts differ, but in this experiment the mismatch did not stop the encoder from producing a playable stream. KEY_I_FRAME_INTERVAL is the key frame interval, which touches on the details of the H.264 encoding process; many people configure it to 5, and so did I.
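Which color formats an encoder actually accepts varies by device. If you want to check before configuring, MediaCodecInfo exposes the supported formats. A minimal sketch; the method name checkColorFormats is my own:

import android.media.MediaCodecInfo;
import android.media.MediaCodecList;
import android.util.Log;

// Log the color formats the device's "video/avc" encoders accept.
static void checkColorFormats() {
    int n = MediaCodecList.getCodecCount();
    for (int i = 0; i < n; i++) {
        MediaCodecInfo info = MediaCodecList.getCodecInfoAt(i);
        if (!info.isEncoder()) continue;
        for (String type : info.getSupportedTypes()) {
            if (!type.equalsIgnoreCase("video/avc")) continue;
            MediaCodecInfo.CodecCapabilities caps = info.getCapabilitiesForType(type);
            for (int color : caps.colorFormats) {
                // Compare against e.g. CodecCapabilities.COLOR_FormatYUV420Planar (19)
                // or COLOR_FormatYUV420SemiPlanar (21).
                Log.d("jinwei", info.getName() + " supports color format " + color);
            }
        }
    }
}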
YUV420 format data takes width * height * 1.5 bytes of storage: the Y plane is width * height bytes, and the 2x2-subsampled U and V planes add a quarter of that each. So for one frame of 480 * 320 size we need to allocate 480 * 320 * 1.5 = 230400 bytes. Accordingly, the program allocates a 230400-byte array in MediaCodecManager, keeps writing the collected preview data into this array, and the encoder then takes data from it, encodes it, and writes the encoded data out.
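As mentioned above, the camera delivers YCbCr_420_SP (NV21) while the encoder was configured for planar YUV420, and wrong colors in the output are the usual symptom of that mismatch. If it bothers you, the frame can be rearranged before being queued. A minimal sketch of such a conversion, assuming the standard NV21 layout (Y plane followed by interleaved VU pairs); the method name nv21ToI420 is my own:

// Rearrange an NV21 (semi-planar, VU interleaved) frame into I420 (planar).
static void nv21ToI420(byte[] nv21, byte[] i420, int width, int height) {
    int ySize = width * height;
    int uvCount = ySize / 4;                    // number of U (and V) samples
    System.arraycopy(nv21, 0, i420, 0, ySize);  // Y plane is identical
    for (int i = 0; i < uvCount; i++) {
        i420[ySize + i] = nv21[ySize + 2 * i + 1];        // U plane
        i420[ySize + uvCount + i] = nv21[ySize + 2 * i];  // V plane
    }
}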
Finally, it is worth emphasizing that mediaCodec.setCallback(new MediaCodec.Callback() {...}) sets a callback interface on the encoder, which makes the encoder work in asynchronous mode.
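For completeness: MediaCodec can also be driven synchronously, without a callback, by polling for buffers yourself, which some people find easier to follow at first. A codec that has been given setCallback must not be polled this way; the sketch below replaces the callback approach entirely. The method name encodeOnce, the ptsUs parameter, and the timeout values are my own assumptions:

import android.media.MediaCodec;
import java.io.FileOutputStream;
import java.io.IOException;
import java.nio.ByteBuffer;

// One pass of a synchronous encode loop: feed one raw frame, drain any output.
void encodeOnce(MediaCodec codec, byte[] rawFrame, long ptsUs, FileOutputStream out)
        throws IOException {
    int inIndex = codec.dequeueInputBuffer(10000); // wait up to 10 ms
    if (inIndex >= 0) {
        ByteBuffer in = codec.getInputBuffer(inIndex);
        in.clear();
        in.put(rawFrame);
        codec.queueInputBuffer(inIndex, 0, rawFrame.length, ptsUs, 0);
    }
    MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
    int outIndex = codec.dequeueOutputBuffer(info, 10000);
    while (outIndex >= 0) {
        ByteBuffer outBuf = codec.getOutputBuffer(outIndex);
        byte[] encoded = new byte[info.size];
        outBuf.get(encoded);
        out.write(encoded);
        codec.releaseOutputBuffer(outIndex, false);
        outIndex = codec.dequeueOutputBuffer(info, 0);
    }
}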
If you find this a little hard to follow, you need to understand the workflow of the encoder, which is really a state machine. After we create an encoder it is in the Uninitialized state; we call configure() to configure it and its state changes to Configured; after calling start() it enters the Flushed state. At this point its input buffers are empty, and onInputBufferAvailable is called to tell you that you can write data. We can then write data into an input buffer and submit it. Once the encoder has finished encoding the data we submitted, onOutputBufferAvailable is called. When a file is finished, we need to mark the last frame of data as end-of-stream, so that the encoder stops accepting input; after the currently buffered data has been drained, the encoder can be released. If after reading this you still don't understand how the encoder is used, read the introduction in the API documentation again and then do this simple experiment; you will definitely succeed.
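In code, marking that last frame means passing BUFFER_FLAG_END_OF_STREAM on the final queueInputBuffer call and then waiting for the same flag to come back on the output side. A minimal sketch, reusing the codec, index, and info names from the snippets above:

// Submit an empty input buffer flagged as end-of-stream.
codec.queueInputBuffer(index, 0, 0, 0, MediaCodec.BUFFER_FLAG_END_OF_STREAM);

// On the output side, the flag marks the last buffer; after that the
// codec has drained everything it buffered and can be stopped.
if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
    codec.stop();
    codec.release();
}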
The following is the complete code. It is very small: two Java files and a layout file, for reference only. And of course, don't forget the permissions:
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
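On Android 6.0 and above, declaring these in the manifest is not enough: CAMERA and WRITE_EXTERNAL_STORAGE are dangerous permissions that must also be requested at runtime. A minimal sketch for inside the Activity (API 23+; the request code 1 is arbitrary):

import android.Manifest;
import android.content.pm.PackageManager;

// Ask for the two permissions at runtime before opening the camera.
if (checkSelfPermission(Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED
        || checkSelfPermission(Manifest.permission.WRITE_EXTERNAL_STORAGE)
                != PackageManager.PERMISSION_GRANTED) {
    requestPermissions(new String[]{
            Manifest.permission.CAMERA,
            Manifest.permission.WRITE_EXTERNAL_STORAGE}, 1);
}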
Layout file:
XML Version = "1.0" encoding = "UTF-8"?>
Framelayout>
RelativeLayout> 1234567891011121314151617
MediaCodecManager.java
import android.content.Context;
import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaCodecList;
import android.media.MediaFormat;
import android.os.Environment;
import android.util.Log;
import java.io.File;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;
import java.nio.ByteBuffer;

/**
 * Created by Administrator on 2017/3/31 0031.
 */
public class MediaCodecManager {
    MediaCodec mediaCodec;
    private int count = 0;
    public static byte[] frame = new byte[230400];
    FileOutputStream fileOutputStream;

    public MediaCodecManager(Context context) {
        int numCodecs = MediaCodecList.getCodecCount();
        for (int i = 0; i < numCodecs; i++) {
            // List the available codecs for inspection.
            MediaCodecInfo codecInfo = MediaCodecList.getCodecInfoAt(i);
            Log.d("jinwei", "codec: " + codecInfo.getName());
        }
        File file = new File(Environment.getExternalStorageDirectory(), "test.avc");
        try {
            fileOutputStream = new FileOutputStream(file);
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        }
        try {
            mediaCodec = MediaCodec.createEncoderByType("video/avc");
        } catch (IOException e) {
            e.printStackTrace();
        }
        MediaFormat mediaFormat = MediaFormat.createVideoFormat("video/avc", 480, 320);
        mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, 125000);
        mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 25);
        mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar);
        mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);
        mediaCodec.setCallback(new MediaCodec.Callback() {
            @Override
            public void onInputBufferAvailable(MediaCodec codec, int index) {
                Log.d("jinwei", "onInputBufferAvailable:" + index);
                ByteBuffer byteBuffer = codec.getInputBuffer(index);
                byteBuffer.put(frame);
                // Flags 0 for a normal frame (see the note in the text above).
                codec.queueInputBuffer(index, 0, frame.length, 1, 0);
            }

            @Override
            public void onOutputBufferAvailable(MediaCodec codec, int index, MediaCodec.BufferInfo info) {
                Log.d("jinwei", "onOutputBufferAvailable:" + index);
                if (index > -1) {
                    ByteBuffer outputBuffer = codec.getOutputBuffer(index);
                    byte[] bb = new byte[info.size];
                    outputBuffer.get(bb);
                    try {
                        fileOutputStream.write(bb);
                    } catch (IOException e) {
                        e.printStackTrace();
                    }
                    codec.releaseOutputBuffer(index, false);
                }
            }

            @Override
            public void onError(MediaCodec codec, MediaCodec.CodecException e) {
                Log.d("jinwei", "onError");
                codec.reset();
            }

            @Override
            public void onOutputFormatChanged(MediaCodec codec, MediaFormat format) {
                Log.d("jinwei", "onOutputFormatChanged");
            }
        });
        mediaCodec.configure(mediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        mediaCodec.start();
    }

    public void release() {
        mediaCodec.release();
    }
}

MainActivity.java
import android.app.ActionBar;
import android.app.Activity;
import android.graphics.PixelFormat;
import android.graphics.SurfaceTexture;
import android.hardware.Camera;
import android.os.Handler;
import android.os.Message;
import android.support.v7.app.AppCompatActivity;
import android.os.Bundle;
import android.util.Log;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.view.TextureView;
import android.view.WindowManager;
import java.io.IOException;
import java.util.List;
public class MainActivity extends Activity implements TextureView.SurfaceTextureListener{
TextureView textureView;
Camera mCamera;
boolean isProcess = false;
MediaCodecManager mediaCodecManager ;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
this.getWindow().addFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON);
setContentView(R.layout.activity_main);
textureView = (TextureView) findViewById(R.id.texture_view);
textureView.setSurfaceTextureListener(this);
mediaCodecManager= new MediaCodecManager(this);
}
// Unused helper: a rough sharpness (clarity) metric computed over a
// center crop of the luma plane with a Laplacian-like kernel.
private long clarityCalculator(byte[] data, int width, int height) {
    long rest = 0;
    long restF = 0;
    if (width < 400 || height < 400) return 0;
    int startX = (width - 400) / 2;
    int startY = (height - 400) / 2;
    int startBase = startY * width + startX;
    for (int i = 0; i < 100; i += 4) {
        for (int j = 0; j < 100; j += 4) {
            rest = 10 * data[startBase + j * width + i];
            rest -= 4 * data[startBase + j * width + i + 1];
            rest -= 4 * data[startBase + (j + 1) * width + i];
            rest -= data[startBase + (j - 1) * width + i + 1];
            rest -= data[startBase + (j + 1) * width + i + 1];
            restF += rest * rest;
        }
    }
    return restF / 10000;
}
@Override
public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) {
Log.d("jw_liu","onPreviewFrame");
mCamera = Camera.open();
try {
mCamera.setPreviewTexture(surface);
mCamera.setPreviewCallback(new Camera.PreviewCallback() {
@Override
public void onPreviewFrame(byte[] data, Camera camera) {
Log.d("jinwei","l:"+data.length);
if(MediaCodecManager.frame != null){
System.arraycopy(data,0,MediaCodecManager.frame,0,data.length);
}
}
});
} catch (IOException e) {
e.printStackTrace();
}
Camera.Parameters parameters = mCamera.getParameters();
parameters.setPictureFormat(PixelFormat.JPEG);
parameters.setPreviewFormat(PixelFormat.YCbCr_420_SP);
parameters.setPreviewSize(480, 320);
parameters.set("orientation", "portrait");
parameters.set("rotation", 180);
mCamera.setDisplayOrientation(180);
mCamera.setParameters(parameters);
mCamera.startPreview();
}
@Override
public void onSurfaceTextureSizeChanged(SurfaceTexture surface, int width, int height) {
// Log.d("jw_liu","onSurfaceTextureSizeChanged");
}
@Override
public boolean onSurfaceTextureDestroyed(SurfaceTexture surface) {
// Log.d("jw_liu","onSurfaceTextureDestroyed");
return false;
}
@Override
public void onSurfaceTextureUpdated(SurfaceTexture surface) {
// Log.d("jw_liu","onSurfaceTextureUpdated");
}
@Override
protected void onDestroy() {
super.onDestroy();
mCamera.release();
mediaCodecManager.release();
}
}
Summary
After the program above finishes running, a test.avc file is generated under /sdcard/. Wait, isn't this H.264? Why is it .avc? And wasn't our encoder type earlier also "avc"? I had the same confusion at first; after a quick search, it turns out they are equivalent: AVC is just another name for H.264. Use adb to pull test.avc off the device, and then you can play it. I strongly recommend downloading tools such as ffplay and ffmpeg; playing an H.264 format file takes a single command: ffplay test.avc. If you want to generate a GIF, you can use ffmpeg: ffmpeg -i test.avc -s 480x320 -r 10 -t 10 test.gif. This command produces a GIF with a frame rate of 10 and a size of 480*320, covering 10 seconds of the original video.
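If you would rather have a file that ordinary players open directly, ffmpeg can also wrap the raw stream into an MP4 container without re-encoding; something like ffmpeg -framerate 25 -i test.avc -c copy test.mp4 should work (the -framerate value here is an assumption matching the KEY_FRAME_RATE configured above). Inside an Android app, the MediaMuxer class mentioned at the beginning serves the same purpose.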