Example 1 with VideoClip

Use of com.helospark.tactview.core.timeline.VideoClip in project tactview by helospark.

In class CreateLowResolutionProxyChainItem, method createMenu:

@Override
public MenuItem createMenu(ClipContextMenuChainItemRequest request) {
    VideoClip videoClip = (VideoClip) request.getPrimaryClip();
    MenuItem menuItem = new MenuItem("Create lowres proxy");
    menuItem.setOnAction(e -> {
        proxyCreationService.createProxy(videoClip, uiProjectRepository.getPreviewWidth(), uiProjectRepository.getPreviewHeight());
    });
    return menuItem;
}
Also used: VideoClip (com.helospark.tactview.core.timeline.VideoClip), MenuItem (javafx.scene.control.MenuItem)

Example 2 with VideoClip

Use of com.helospark.tactview.core.timeline.VideoClip in project tactview by helospark.

In class ClipLoadIT, method testOgvVideo:

@Test
public void testOgvVideo(@DownloadedResourceName("earth.ogv") File testFile) {
    VideoClip videoClip = (VideoClip) fakeUi.dragFileToTimeline(testFile, TimelinePosition.ofZero());
    ReadOnlyClipImage imageFrame = getFrame(timelineManager, videoClip.getMediaMetadata(), 0.1, new TimelinePosition(1.0));
    ClipImage expected = IntegrationTestUtil.loadTestClasspathImage("clipit/earth_ogv_at_1s.png");
    IntegrationTestUtil.assertFrameEquals(imageFrame, expected, "Video frames not equal");
}
Also used: VideoClip (com.helospark.tactview.core.timeline.VideoClip), ReadOnlyClipImage (com.helospark.tactview.core.timeline.image.ReadOnlyClipImage), ClipImage (com.helospark.tactview.core.timeline.image.ClipImage), TimelinePosition (com.helospark.tactview.core.timeline.TimelinePosition), Test (org.junit.jupiter.api.Test)

Example 3 with VideoClip

Use of com.helospark.tactview.core.timeline.VideoClip in project tactview by helospark.

In class ClipLoadIT, method test4kh264Video:

@Test
public void test4kh264Video(@DownloadedResourceName("4k_h264_beaches.mp4") File testFile) {
    VideoClip videoClip = (VideoClip) fakeUi.dragFileToTimeline(testFile, TimelinePosition.ofZero());
    ReadOnlyClipImage videoFrame = getFrame(timelineManager, videoClip.getMediaMetadata(), 0.1, TimelinePosition.ofSeconds(1));
    ClipImage expected = IntegrationTestUtil.loadTestClasspathImage("clipit/beach_at_1s_and_0.1_scale.png");
    IntegrationTestUtil.assertFrameEquals(videoFrame, expected, "Video frames not equal");
}
Also used: VideoClip (com.helospark.tactview.core.timeline.VideoClip), ReadOnlyClipImage (com.helospark.tactview.core.timeline.image.ReadOnlyClipImage), ClipImage (com.helospark.tactview.core.timeline.image.ClipImage), Test (org.junit.jupiter.api.Test)
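Both integration tests above end with IntegrationTestUtil.assertFrameEquals, which compares a decoded frame against a reference PNG. The sketch below illustrates one plausible way such a comparison can work; the real IntegrationTestUtil may use a different pixel format or tolerance, so the helper name and tolerance here are assumptions, not tactview's actual implementation.

```java
// Illustrative sketch of a tolerant frame-equality check, in the spirit of
// the assertFrameEquals calls above. Compares raw byte data (e.g. RGBA)
// with a small per-channel allowance for codec rounding differences.
public class FrameCompareSketch {

    // Returns true when every byte differs by at most `tolerance`.
    static boolean framesEqual(byte[] actual, byte[] expected, int tolerance) {
        if (actual.length != expected.length) {
            return false;
        }
        for (int i = 0; i < actual.length; i++) {
            int diff = Math.abs((actual[i] & 0xFF) - (expected[i] & 0xFF));
            if (diff > tolerance) {
                return false;
            }
        }
        return true;
    }

    public static void main(String[] args) {
        byte[] decoded = { 10, 20, 30, (byte) 255 };
        byte[] reference = { 11, 19, 30, (byte) 255 };
        // Off by one per channel: within tolerance 2, so considered equal.
        System.out.println(framesEqual(decoded, reference, 2));
    }
}
```

A tolerance is useful here because the same video decoded by different FFmpeg builds can produce pixel values that differ by a unit or two.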

Example 4 with VideoClip

Use of com.helospark.tactview.core.timeline.VideoClip in project tactview by helospark.

In class ClipAddedListener, method initializeProjectOnFirstVideoClipAdded:

private void initializeProjectOnFirstVideoClipAdded(TimelineClip clip) {
    if (!projectRepository.isVideoInitialized() && clip instanceof VisualTimelineClip) {
        VisualTimelineClip visualClip = (VisualTimelineClip) clip;
        VisualMediaMetadata metadata = visualClip.getMediaMetadata();
        int width = metadata.getWidth();
        int height = metadata.getHeight();
        if (metadata instanceof VideoMetadata && visualClip instanceof VideoClip) {
            double rotation = ((VideoMetadata) metadata).getRotation();
            if (MathUtil.fuzzyEquals(Math.abs(rotation), 90.0) && ((VideoClip) visualClip).isRotationEnabledAt(TimelinePosition.ofZero())) {
                int tmp = width;
                width = height;
                height = tmp;
            }
        }
        BigDecimal fps = metadata instanceof VideoMetadata ? new BigDecimal(((VideoMetadata) metadata).getFps()) : new BigDecimal("30");
        projectSizeInitializer.initializeProjectSize(width, height, fps);
    }
    if (!projectRepository.isAudioInitialized() && clip instanceof AudibleTimelineClip) {
        AudibleTimelineClip audioClip = (AudibleTimelineClip) clip;
        int sampleRate = audioClip.getMediaMetadata().getSampleRate();
        int bytesPerSample = audioClip.getMediaMetadata().getBytesPerSample();
        int numberOfChannels = audioClip.getMediaMetadata().getChannels();
        projectRepository.initializeAudio(sampleRate, bytesPerSample, numberOfChannels);
    }
}
Also used: VideoClip (com.helospark.tactview.core.timeline.VideoClip), VisualMediaMetadata (com.helospark.tactview.core.decoder.VisualMediaMetadata), VisualTimelineClip (com.helospark.tactview.core.timeline.VisualTimelineClip), VideoMetadata (com.helospark.tactview.core.decoder.VideoMetadata), BigDecimal (java.math.BigDecimal), AudibleTimelineClip (com.helospark.tactview.core.timeline.AudibleTimelineClip)
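The interesting detail in Example 4 is the width/height swap: when the container reports a rotation of roughly ±90 degrees, the project should be initialized with the dimensions transposed. A minimal self-contained sketch of that logic follows; the fuzzyEquals helper is a stand-in for tactview's MathUtil.fuzzyEquals, whose actual tolerance may differ.

```java
// Sketch of the orientation handling in initializeProjectOnFirstVideoClipAdded.
// fuzzyEquals is a hypothetical stand-in for MathUtil.fuzzyEquals.
public class OrientationSketch {
    static final double EPSILON = 0.0001;

    // Approximate equality, since metadata may report rotation as e.g. 89.999999.
    static boolean fuzzyEquals(double a, double b) {
        return Math.abs(a - b) < EPSILON;
    }

    // Returns {width, height}, transposed when the clip is rotated a quarter turn
    // in either direction (+90 or -90 degrees).
    static int[] projectDimensions(int width, int height, double rotation) {
        if (fuzzyEquals(Math.abs(rotation), 90.0)) {
            return new int[] { height, width };
        }
        return new int[] { width, height };
    }

    public static void main(String[] args) {
        // A phone video stored as 1920x1080 with rotation -90 is really portrait.
        int[] portrait = projectDimensions(1920, 1080, -90.0);
        System.out.println(portrait[0] + "x" + portrait[1]);
    }
}
```

Taking Math.abs of the rotation first means a single comparison covers both clockwise and counterclockwise quarter turns, which is why the original code reads fuzzyEquals(Math.abs(rotation), 90.0).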

Example 5 with VideoClip

Use of com.helospark.tactview.core.timeline.VideoClip in project tactview by helospark.

In class LowResolutionProxyCreatorService, method createLowResolutionProxyInternal:

private void createLowResolutionProxyInternal(VideoClip clip, int proxyWidth, int proxyHeight) {
    File proxyFile = new File(path + File.separator + clip.getId() + ".mp4");
    VideoMetadata originalMetadata = (VideoMetadata) clip.getMediaMetadata();
    BigDecimal frameTime = BigDecimal.ONE.divide(new BigDecimal(originalMetadata.getFps()), 10, RoundingMode.HALF_UP);
    FFmpegInitEncoderRequest initNativeRequest = new FFmpegInitEncoderRequest();
    initNativeRequest.fileName = proxyFile.getAbsolutePath();
    // why int?
    initNativeRequest.fps = (int) originalMetadata.getFps();
    int width = proxyWidth;
    if (width % 2 == 1) {
        width++;
    }
    int height = proxyHeight;
    if (height % 2 == 1) {
        height++;
    }
    initNativeRequest.renderWidth = width;
    initNativeRequest.renderHeight = height;
    initNativeRequest.actualWidth = width;
    initNativeRequest.actualHeight = height;
    initNativeRequest.bytesPerSample = 0;
    initNativeRequest.audioChannels = 0;
    initNativeRequest.sampleRate = 0;
    initNativeRequest.audioBitRate = 0;
    initNativeRequest.audioSampleRate = 0;
    initNativeRequest.videoBitRate = 320000;
    initNativeRequest.videoCodec = "default";
    initNativeRequest.audioCodec = "none";
    initNativeRequest.videoPixelFormat = "default";
    initNativeRequest.videoPreset = "ultrafast";
    int encoderIndex = ffmpegBasedMediaEncoder.initEncoder(initNativeRequest);
    if (encoderIndex < 0) {
        throw new RuntimeException("Unable to render, statuscode is " + encoderIndex + " , check logs");
    }
    int allJobs = originalMetadata.getLength().getSeconds().divide(frameTime, 10, RoundingMode.HALF_UP).intValue();
    String jobId = UUID.randomUUID().toString();
    messagingService.sendAsyncMessage(new ProgressInitializeMessage(jobId, allJobs, ProgressType.LOW_RES_PROXY_CREATE));
    TimelinePosition position = TimelinePosition.ofZero();
    int frameIndex = 0;
    while (position.isLessOrEqualToThan(originalMetadata.getLength().toPosition())) {
        RequestFrameParameter frameRequest = RequestFrameParameter.builder()
                .withWidth(width)
                .withHeight(height)
                .withPosition(position)
                .withLowResolutionPreview(false)
                .withUseApproximatePosition(false)
                .build();
        ByteBuffer frame = clip.requestFrame(frameRequest).getBuffer();
        FFmpegEncodeFrameRequest nativeRequest = new FFmpegEncodeFrameRequest();
        nativeRequest.frame = new RenderFFMpegFrame();
        RenderFFMpegFrame[] array = (RenderFFMpegFrame[]) nativeRequest.frame.toArray(1);
        array[0].imageData = frame;
        nativeRequest.encoderIndex = encoderIndex;
        nativeRequest.startFrameIndex = frameIndex;
        int encodeResult = ffmpegBasedMediaEncoder.encodeFrames(nativeRequest);
        if (encodeResult < 0) {
            throw new RuntimeException("Cannot encode frames, error code " + encodeResult);
        }
        GlobalMemoryManagerAccessor.memoryManager.returnBuffer(frame);
        messagingService.sendAsyncMessage(new ProgressAdvancedMessage(jobId, 1));
        ++frameIndex;
        position = position.add(frameTime);
    }
    FFmpegClearEncoderRequest clearRequest = new FFmpegClearEncoderRequest();
    clearRequest.encoderIndex = encoderIndex;
    ffmpegBasedMediaEncoder.clearEncoder(clearRequest);
    VideoMetadata metadata = mediaDecoder.readMetadata(proxyFile);
    VisualMediaSource videoSource = new VisualMediaSource(proxyFile, mediaDecoder);
    VideoClip.LowResolutionProxyData proxyData = new VideoClip.LowResolutionProxyData(videoSource, metadata);
    clip.setLowResolutionProxy(proxyData);
    messagingService.sendAsyncMessage(new ProgressDoneMessage(jobId));
}
Also used: RequestFrameParameter (com.helospark.tactview.core.timeline.RequestFrameParameter), ProgressAdvancedMessage (com.helospark.tactview.core.timeline.message.progress.ProgressAdvancedMessage), VisualMediaSource (com.helospark.tactview.core.timeline.VisualMediaSource), FFmpegInitEncoderRequest (com.helospark.tactview.core.render.ffmpeg.FFmpegInitEncoderRequest), ByteBuffer (java.nio.ByteBuffer), VideoMetadata (com.helospark.tactview.core.decoder.VideoMetadata), BigDecimal (java.math.BigDecimal), FFmpegEncodeFrameRequest (com.helospark.tactview.core.render.ffmpeg.FFmpegEncodeFrameRequest), FFmpegClearEncoderRequest (com.helospark.tactview.core.render.ffmpeg.FFmpegClearEncoderRequest), VideoClip (com.helospark.tactview.core.timeline.VideoClip), ProgressInitializeMessage (com.helospark.tactview.core.timeline.message.progress.ProgressInitializeMessage), TimelinePosition (com.helospark.tactview.core.timeline.TimelinePosition), File (java.io.File), ProgressDoneMessage (com.helospark.tactview.core.timeline.message.progress.ProgressDoneMessage), RenderFFMpegFrame (com.helospark.tactview.core.render.ffmpeg.RenderFFMpegFrame)
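Two numeric details in Example 5 are easy to miss: odd proxy dimensions are rounded up to the next even number (H.264 with common pixel formats such as yuv420p generally requires even width and height), and the total frame count for the progress bar is derived by dividing the clip length by the per-frame duration with BigDecimal. The sketch below isolates both calculations; the method names are illustrative, not tactview API.

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

// Sketch of the arithmetic inside createLowResolutionProxyInternal.
public class ProxyMathSketch {

    // Round an odd dimension up to the next even number, mirroring the
    // width/height adjustment before filling renderWidth/renderHeight.
    static int roundUpToEven(int dimension) {
        return dimension % 2 == 1 ? dimension + 1 : dimension;
    }

    // Total frames to encode: lengthSeconds / frameTime, where
    // frameTime = 1 / fps, both computed at 10 decimal places.
    static int totalFrames(BigDecimal lengthSeconds, double fps) {
        BigDecimal frameTime = BigDecimal.ONE
                .divide(new BigDecimal(fps), 10, RoundingMode.HALF_UP);
        return lengthSeconds.divide(frameTime, 10, RoundingMode.HALF_UP).intValue();
    }

    public static void main(String[] args) {
        // Scaling 16:9 video to 853x480 would give an odd width; bump to 854.
        System.out.println(roundUpToEven(853));
        // A 10 second clip at 30 fps yields 300 frames.
        System.out.println(totalFrames(new BigDecimal("10"), 30.0));
    }
}
```

Using BigDecimal with an explicit scale and rounding mode avoids the drift a double accumulator would pick up when frameTime (here 1/30) has no exact binary representation, which matters because the loop advances position by frameTime once per frame.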

Aggregations

VideoClip (com.helospark.tactview.core.timeline.VideoClip): 13 uses
File (java.io.File): 7 uses
VisualMediaSource (com.helospark.tactview.core.timeline.VisualMediaSource): 6 uses
VideoMetadata (com.helospark.tactview.core.decoder.VideoMetadata): 5 uses
Test (org.junit.jupiter.api.Test): 5 uses
TimelinePosition (com.helospark.tactview.core.timeline.TimelinePosition): 4 uses
ClipImage (com.helospark.tactview.core.timeline.image.ClipImage): 4 uses
ReadOnlyClipImage (com.helospark.tactview.core.timeline.image.ReadOnlyClipImage): 4 uses
BigDecimal (java.math.BigDecimal): 3 uses
GifVideoMetadata (com.helospark.tactview.core.decoder.gif.GifVideoMetadata): 2 uses
ProgressAdvancedMessage (com.helospark.tactview.core.timeline.message.progress.ProgressAdvancedMessage): 2 uses
ProgressDoneMessage (com.helospark.tactview.core.timeline.message.progress.ProgressDoneMessage): 2 uses
ProgressInitializeMessage (com.helospark.tactview.core.timeline.message.progress.ProgressInitializeMessage): 2 uses
Qualifier (com.helospark.lightdi.annotation.Qualifier): 1 use
Service (com.helospark.lightdi.annotation.Service): 1 use
VisualMediaMetadata (com.helospark.tactview.core.decoder.VisualMediaMetadata): 1 use
GlobalMemoryManagerAccessor (com.helospark.tactview.core.decoder.framecache.GlobalMemoryManagerAccessor): 1 use
ImageSequenceDecoderDecorator (com.helospark.tactview.core.decoder.imagesequence.ImageSequenceDecoderDecorator): 1 use
FFmpegClearEncoderRequest (com.helospark.tactview.core.render.ffmpeg.FFmpegClearEncoderRequest): 1 use
FFmpegEncodeFrameRequest (com.helospark.tactview.core.render.ffmpeg.FFmpegEncodeFrameRequest): 1 use