Example 1 with FFmpegFrameFilter

Use of org.bytedeco.javacv.FFmpegFrameFilter in the javacv project by bytedeco.

From the class DeinterlacedVideoPlayer, the start method:

public void start() {
    FrameFilter filter = null;
    try {
        startFrameGrabber();
        Frame frame = null;
        while ((frame = grabber.grab()) != null) {
            if (filter == null) {
                filter = new FFmpegFrameFilter(ffmpegString, frame.imageWidth, frame.imageHeight);
                filter.setPixelFormat(PIXEL_FORMAT);
                filter.start();
            }
            filter.push(frame);
            frame = filter.pull();
            // do something with the filtered frame
        }
    } catch (FrameGrabber.Exception | FrameFilter.Exception e) {
        throw new RuntimeException(e.getMessage(), e);
    } finally {
        releaseGrabberAndFilter(this.grabber, filter);
    }
}
Also used : Frame(org.bytedeco.javacv.Frame) FFmpegFrameFilter(org.bytedeco.javacv.FFmpegFrameFilter) FrameFilter(org.bytedeco.javacv.FrameFilter) Exception(org.bytedeco.javacv.FrameGrabber.Exception)
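The example above creates the filter lazily, on the first grabbed frame, because FFmpegFrameFilter's constructor needs the frame's width and height, which are unknown until a frame has been grabbed. The sketch below reproduces just that pattern with stand-in Frame and Filter classes (hypothetical placeholders, not the JavaCV API), so it runs without JavaCV's native dependencies:

```java
import java.util.List;

// JavaCV-free sketch of the lazy-initialization pattern from Example 1.
// Frame and Filter are stand-ins for the JavaCV classes (assumptions).
public class LazyFilterSketch {

    static class Frame {
        final int imageWidth, imageHeight;
        Frame(int w, int h) { imageWidth = w; imageHeight = h; }
    }

    static class Filter {
        final int width, height;
        Filter(int width, int height) { this.width = width; this.height = height; }
        Frame apply(Frame in) { return in; } // identity; a real filter would transform
    }

    static int framesProcessed(Iterable<Frame> frames) {
        Filter filter = null; // cannot construct yet: dimensions unknown
        int count = 0;
        for (Frame frame : frames) {
            if (filter == null) {
                // First frame grabbed: now the dimensions are known.
                filter = new Filter(frame.imageWidth, frame.imageHeight);
            }
            Frame filtered = filter.apply(frame);
            if (filtered != null) count++;
        }
        return count;
    }

    public static void main(String[] args) {
        List<Frame> frames = List.of(new Frame(640, 480), new Frame(640, 480));
        System.out.println(framesProcessed(frames)); // prints 2
    }
}
```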

Example 2 with FFmpegFrameFilter

Use of org.bytedeco.javacv.FFmpegFrameFilter in the javacv project by bytedeco.

From the class RecordActivity, the initRecorder method:

// ---------------------------------------
// initialize ffmpeg_recorder
// ---------------------------------------
private void initRecorder() {
    Log.w(LOG_TAG, "init recorder");
    Log.i(LOG_TAG, "ffmpeg_url: " + ffmpeg_link);
    recorder = new FFmpegFrameRecorder(ffmpeg_link, imageWidth, imageHeight, 1);
    recorder.setFormat("flv");
    recorder.setSampleRate(sampleAudioRateInHz);
    // Set in the surface changed method
    recorder.setFrameRate(frameRate);
    // The filterString can be any FFmpeg filter chain.
    // Here is the link for a list: https://ffmpeg.org/ffmpeg-filters.html
    filterString = "transpose=2,crop=w=200:h=200:x=0:y=0";
    filter = new FFmpegFrameFilter(filterString, imageWidth, imageHeight);
    // default format on android
    filter.setPixelFormat(avutil.AV_PIX_FMT_NV21);
    if (RECORD_LENGTH > 0) {
        imagesIndex = 0;
        images = new Frame[RECORD_LENGTH * frameRate];
        timestamps = new long[images.length];
        for (int i = 0; i < images.length; i++) {
            images[i] = new Frame(imageWidth, imageHeight, Frame.DEPTH_UBYTE, 2);
            timestamps[i] = -1;
        }
    } else if (yuvImage == null) {
        yuvImage = new Frame(imageWidth, imageHeight, Frame.DEPTH_UBYTE, 2);
        Log.i(LOG_TAG, "create yuvImage");
    }
    Log.i(LOG_TAG, "recorder initialize success");
    audioRecordRunnable = new AudioRecordRunnable();
    audioThread = new Thread(audioRecordRunnable);
    runAudioThread = true;
}
Also used : Frame(org.bytedeco.javacv.Frame) FFmpegFrameFilter(org.bytedeco.javacv.FFmpegFrameFilter) FFmpegFrameRecorder(org.bytedeco.javacv.FFmpegFrameRecorder)
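When RECORD_LENGTH > 0, initRecorder sizes the images and timestamps arrays to hold RECORD_LENGTH seconds of frames, which implies a circular buffer: once the arrays fill, new frames overwrite the oldest slots. The sketch below shows that wrap-around indexing in isolation (the modulo write is an assumption inferred from the array sizing, not code shown in the example); a timestamp of -1 marks an unused slot, as in the example:

```java
import java.util.Arrays;

// Minimal sketch of the circular-buffer indexing implied by the
// RECORD_LENGTH path of initRecorder (assumed, not shown in the example).
public class RingBufferSketch {
    final long[] timestamps;
    int index = 0;

    RingBufferSketch(int recordLengthSeconds, int frameRate) {
        // Same sizing as the example: seconds of recording * frames per second.
        timestamps = new long[recordLengthSeconds * frameRate];
        Arrays.fill(timestamps, -1L); // -1 marks an unused slot
    }

    // Store a frame timestamp, overwriting the oldest entry once full.
    void push(long timestamp) {
        int i = index++ % timestamps.length;
        timestamps[i] = timestamp;
    }

    public static void main(String[] args) {
        RingBufferSketch buf = new RingBufferSketch(1, 4); // 4 slots
        for (long t = 0; t < 6; t++) buf.push(t);
        // Slots 0 and 1 were overwritten by timestamps 4 and 5.
        System.out.println(Arrays.toString(buf.timestamps)); // prints [4, 5, 2, 3]
    }
}
```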

Aggregations

FFmpegFrameFilter (org.bytedeco.javacv.FFmpegFrameFilter): 2 uses
Frame (org.bytedeco.javacv.Frame): 2 uses
FFmpegFrameRecorder (org.bytedeco.javacv.FFmpegFrameRecorder): 1 use
FrameFilter (org.bytedeco.javacv.FrameFilter): 1 use
Exception (org.bytedeco.javacv.FrameGrabber.Exception): 1 use