
Example 1 with AudibleTimelineClip

Use of com.helospark.tactview.core.timeline.AudibleTimelineClip in project tactview by helospark.

From class ClipPatternDrawerListener, method updatePatternDelegate:

private Image updatePatternDelegate(TimelineClip clipToUpdate, TimelineInterval interval, double zoom, int pixelWidth) {
    if (patternDrawingEnabled) {
        LOGGER.debug("Generating pattern for clip={} with the local interval={} and zoom={}", clipToUpdate.getId(), interval, zoom);
        double visibleStartPosition = interval.getStartPosition().getSeconds().doubleValue();
        double visibleEndPosition = interval.getEndPosition().getSeconds().doubleValue();
        Image image = null;
        if (clipToUpdate instanceof VisualTimelineClip) {
            VisualTimelineClip videoClip = (VisualTimelineClip) clipToUpdate;
            image = timelineImagePatternService.createTimelinePattern(videoClip, pixelWidth, visibleStartPosition, visibleEndPosition);
        } else if (clipToUpdate instanceof AudibleTimelineClip) {
            AudibleTimelineClip audibleTimelineClip = (AudibleTimelineClip) clipToUpdate;
            image = audioImagePatternService.createAudioImagePattern(audibleTimelineClip, pixelWidth, visibleStartPosition, visibleEndPosition);
        }
        return image;
    }
    return null;
}
Also used: VisualTimelineClip (com.helospark.tactview.core.timeline.VisualTimelineClip), Image (javafx.scene.image.Image), AudibleTimelineClip (com.helospark.tactview.core.timeline.AudibleTimelineClip)
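
A hypothetical call-site sketch, not part of tactview: updatePatternDelegate returns null when pattern drawing is disabled or the clip is neither visual nor audible, so a caller inside ClipPatternDrawerListener would have to guard the result before handing it to JavaFX. The method name redrawPattern and the ImageView parameter clipView are assumptions for illustration; only TimelineClip, TimelineInterval and Image come from the snippet above.

private void redrawPattern(TimelineClip clip, TimelineInterval interval, double zoom, int pixelWidth, ImageView clipView) {
    // updatePatternDelegate may return null; only push a real image to the view
    Image pattern = updatePatternDelegate(clip, interval, zoom, pixelWidth);
    if (pattern != null) {
        // javafx.scene.image.ImageView accepts the generated pattern directly
        clipView.setImage(pattern);
    }
}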

Example 2 with AudibleTimelineClip

Use of com.helospark.tactview.core.timeline.AudibleTimelineClip in project tactview by helospark.

From class ClipAddedListener, method initializeProjectOnFirstVideoClipAdded:

private void initializeProjectOnFirstVideoClipAdded(TimelineClip clip) {
    if (!projectRepository.isVideoInitialized() && clip instanceof VisualTimelineClip) {
        VisualTimelineClip visualClip = (VisualTimelineClip) clip;
        VisualMediaMetadata metadata = visualClip.getMediaMetadata();
        int width = metadata.getWidth();
        int height = metadata.getHeight();
        if (metadata instanceof VideoMetadata && visualClip instanceof VideoClip) {
            double rotation = ((VideoMetadata) metadata).getRotation();
            if (MathUtil.fuzzyEquals(Math.abs(rotation), 90.0) && ((VideoClip) visualClip).isRotationEnabledAt(TimelinePosition.ofZero())) {
                int tmp = width;
                width = height;
                height = tmp;
            }
        }
        BigDecimal fps = metadata instanceof VideoMetadata ? new BigDecimal(((VideoMetadata) metadata).getFps()) : new BigDecimal("30");
        projectSizeInitializer.initializeProjectSize(width, height, fps);
    }
    if (!projectRepository.isAudioInitialized() && clip instanceof AudibleTimelineClip) {
        AudibleTimelineClip audioClip = (AudibleTimelineClip) clip;
        int sampleRate = audioClip.getMediaMetadata().getSampleRate();
        int bytesPerSample = audioClip.getMediaMetadata().getBytesPerSample();
        int numberOfChannels = audioClip.getMediaMetadata().getChannels();
        projectRepository.initializeAudio(sampleRate, bytesPerSample, numberOfChannels);
    }
}
Also used: VideoClip (com.helospark.tactview.core.timeline.VideoClip), VisualMediaMetadata (com.helospark.tactview.core.decoder.VisualMediaMetadata), VisualTimelineClip (com.helospark.tactview.core.timeline.VisualTimelineClip), VideoMetadata (com.helospark.tactview.core.decoder.VideoMetadata), BigDecimal (java.math.BigDecimal), AudibleTimelineClip (com.helospark.tactview.core.timeline.AudibleTimelineClip)
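
The width/height swap above can be read in isolation: a clip whose metadata reports a rotation of +/-90 degrees is stored sideways, so the project is initialized with the dimensions transposed. A minimal, self-contained sketch of that rule follows; it uses plain Java and illustrative names, not tactview's MathUtil or metadata types.

public final class ProjectDimensionSketch {

    private static final double EPSILON = 0.0001;

    // Returns {width, height}; the two are swapped when the clip is rotated by +/-90 degrees,
    // mirroring the MathUtil.fuzzyEquals(Math.abs(rotation), 90.0) branch above.
    public static int[] effectiveSize(int width, int height, double rotationDegrees) {
        boolean quarterTurn = Math.abs(Math.abs(rotationDegrees) - 90.0) < EPSILON;
        return quarterTurn ? new int[] { height, width } : new int[] { width, height };
    }

    public static void main(String[] args) {
        // A 1920x1080 source recorded in portrait orientation (rotation = 90)
        // initializes the project as 1080x1920.
        int[] size = effectiveSize(1920, 1080, 90.0);
        System.out.println(size[0] + "x" + size[1]);
    }
}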

Example 3 with AudibleTimelineClip

Use of com.helospark.tactview.core.timeline.AudibleTimelineClip in project tactview by helospark.

From class LongProcessRequestor, method requestAudioFrames:

public <T extends StatelessAudioEffect & LongProcessAudibleImagePushAware> void requestAudioFrames(T target, LongProcessFrameRequest longProcessFrameRequest) {
    AudibleTimelineClip clip = (AudibleTimelineClip) timelineManagerAccessor.findClipForEffect(target.getId()).get();
    Optional<Integer> effectChannel = clip.getEffectChannelIndex(target.getId());
    LongProcessDescriptor descriptor = new LongProcessDescriptor();
    descriptor.setClipId(clip.getId());
    descriptor.setEffectId(Optional.of(target.getId()));
    descriptor.setJobId(UUID.randomUUID().toString());
    if (longProcessFrameRequest.getDuplaceRequestStrategy().equals(ONLY_KEEP_LATEST_REQUEST)) {
        removeAndStopAllJobsWithEffectId(target.getId());
    }
    Runnable runnable = audioLongProcessRunnableFactory.createAudibleRunnable(target, clip, effectChannel, descriptor);
    descriptor.setRunnable(runnable);
    requestedJobs.add(descriptor);
}
Also used: AudibleTimelineClip (com.helospark.tactview.core.timeline.AudibleTimelineClip)
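
The ONLY_KEEP_LATEST_REQUEST branch is the interesting part of this method: before a new long-running audio job is queued for an effect, any job already queued for the same effect id is dropped. A standalone sketch of that deduplication idea follows; the Job record and the queue here are illustrative stand-ins, not tactview's LongProcessDescriptor or requestedJobs.

import java.util.Optional;
import java.util.Queue;
import java.util.UUID;
import java.util.concurrent.ConcurrentLinkedQueue;

public class KeepLatestRequestSketch {

    // Illustrative stand-in for LongProcessDescriptor
    record Job(String jobId, Optional<String> effectId, Runnable runnable) {}

    private final Queue<Job> requestedJobs = new ConcurrentLinkedQueue<>();

    public void requestJob(String effectId, Runnable work, boolean onlyKeepLatest) {
        if (onlyKeepLatest) {
            // Mirrors removeAndStopAllJobsWithEffectId: drop jobs already queued for this effect
            requestedJobs.removeIf(job -> job.effectId().equals(Optional.of(effectId)));
        }
        requestedJobs.add(new Job(UUID.randomUUID().toString(), Optional.of(effectId), work));
    }
}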

Aggregations

AudibleTimelineClip (com.helospark.tactview.core.timeline.AudibleTimelineClip): 3 usages
VisualTimelineClip (com.helospark.tactview.core.timeline.VisualTimelineClip): 2 usages
VideoMetadata (com.helospark.tactview.core.decoder.VideoMetadata): 1 usage
VisualMediaMetadata (com.helospark.tactview.core.decoder.VisualMediaMetadata): 1 usage
VideoClip (com.helospark.tactview.core.timeline.VideoClip): 1 usage
BigDecimal (java.math.BigDecimal): 1 usage
Image (javafx.scene.image.Image): 1 usage