
Example 16 with Mpeg7Catalog

Use of org.opencastproject.metadata.mpeg7.Mpeg7Catalog in the project opencast by opencast.

From the class VideoSegmenterServiceImpl, the method uniformSegmentation.

/**
 * Creates a uniform segmentation for a given track, consisting of prefNumber segments
 * that all have the same length.
 *
 * @param track the track that is segmented
 * @param segmentsNew list that will be filled with the new segments (pass null if not needed)
 * @param prefNumber the number of segments to generate
 * @return an Mpeg7Catalog that can later be saved in a Catalog as the end result
 */
protected Mpeg7Catalog uniformSegmentation(Track track, LinkedList<Segment> segmentsNew, int prefNumber) {
    if (segmentsNew == null) {
        segmentsNew = new LinkedList<Segment>();
    }
    MediaTime contentTime = new MediaRelTimeImpl(0, track.getDuration());
    MediaLocator contentLocator = new MediaLocatorImpl(track.getURI());
    Mpeg7Catalog mpeg7 = mpeg7CatalogService.newInstance();
    Video videoContent = mpeg7.addVideoContent("videosegment", contentTime, contentLocator);
    long segmentDuration = track.getDuration() / prefNumber;
    long currentSegStart = 0;
    // create "prefNumber"-many segments that all have the same length
    for (int i = 1; i < prefNumber; i++) {
        Segment s = videoContent.getTemporalDecomposition().createSegment("segment-" + i);
        s.setMediaTime(new MediaRelTimeImpl(currentSegStart, segmentDuration));
        segmentsNew.add(s);
        currentSegStart += segmentDuration;
    }
    // add last segment separately to make sure the last segment ends exactly at the end of the track
    Segment s = videoContent.getTemporalDecomposition().createSegment("segment-" + prefNumber);
    s.setMediaTime(new MediaRelTimeImpl(currentSegStart, track.getDuration() - currentSegStart));
    segmentsNew.add(s);
    return mpeg7;
}
Also used: Mpeg7Catalog, MediaLocator, Video, MediaTime, Segment, MediaLocatorImpl, MediaTimePoint, MediaRelTimeImpl (all from org.opencastproject.metadata.mpeg7).
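
As a rough illustration of how this method might be used from within the class (or a test subclass), the following sketch splits a track into five equal segments and reads back the generated boundaries. The variable "track" and the direct call to the protected uniformSegmentation method are assumptions made for this example, not code taken from the Opencast sources.

LinkedList<Segment> segments = new LinkedList<Segment>();
Mpeg7Catalog catalog = uniformSegmentation(track, segments, 5);
for (Segment segment : segments) {
    // the same accessors are used by filterSegmentation below, so they are safe to rely on
    long startMs = segment.getMediaTime().getMediaTimePoint().getTimeInMilliseconds();
    long durationMs = segment.getMediaTime().getMediaDuration().getDurationInMilliseconds();
    System.out.println("segment starts at " + startMs + " ms and lasts " + durationMs + " ms");
}

The returned catalog still has to be serialized and attached to the media package by the caller; this sketch only inspects the in-memory segments.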

Example 17 with Mpeg7Catalog

Use of org.opencastproject.metadata.mpeg7.Mpeg7Catalog in the project opencast by opencast.

From the class VideoSegmenterServiceImpl, the method filterSegmentation.

/**
 * Merges small subsequent segments (with high difference) into bigger ones, so that
 * segments shorter than the merge threshold are absorbed by their neighbors.
 *
 * @param segments list of segments to be filtered
 * @param track the track that is segmented
 * @param segmentsNew list that will be filled with the new segments (pass null if not needed)
 * @param mergeThresh minimum duration for a segment in milliseconds
 * @return an Mpeg7Catalog that can later be saved in a Catalog as the end result
 */
protected Mpeg7Catalog filterSegmentation(LinkedList<Segment> segments, Track track, LinkedList<Segment> segmentsNew, int mergeThresh) {
    if (segmentsNew == null) {
        segmentsNew = new LinkedList<Segment>();
    }
    boolean merging = false;
    MediaTime contentTime = new MediaRelTimeImpl(0, track.getDuration());
    MediaLocator contentLocator = new MediaLocatorImpl(track.getURI());
    Mpeg7Catalog mpeg7 = mpeg7CatalogService.newInstance();
    Video videoContent = mpeg7.addVideoContent("videosegment", contentTime, contentLocator);
    int segmentcount = 1;
    MediaTimePoint currentSegStart = new MediaTimePointImpl();
    for (Segment o : segments) {
        // if the current segment is shorter than the merge threshold, start merging
        if (o.getMediaTime().getMediaDuration().getDurationInMilliseconds() <= mergeThresh) {
            // start merging and save beginning of new segment that will be generated
            if (!merging) {
                currentSegStart = o.getMediaTime().getMediaTimePoint();
                merging = true;
            }
        // current segment is longer than merge threshold
        } else {
            long currentSegDuration = o.getMediaTime().getMediaDuration().getDurationInMilliseconds();
            long currentSegEnd = o.getMediaTime().getMediaTimePoint().getTimeInMilliseconds() + currentSegDuration;
            if (merging) {
                long newDuration = o.getMediaTime().getMediaTimePoint().getTimeInMilliseconds() - currentSegStart.getTimeInMilliseconds();
                // save new segment that merges all previously skipped short segments
                if (newDuration >= mergeThresh) {
                    Segment s = videoContent.getTemporalDecomposition().createSegment("segment-" + segmentcount++);
                    s.setMediaTime(new MediaRelTimeImpl(currentSegStart.getTimeInMilliseconds(), newDuration));
                    segmentsNew.add(s);
                    // copy the following long segment to new list
                    Segment s2 = videoContent.getTemporalDecomposition().createSegment("segment-" + segmentcount++);
                    s2.setMediaTime(o.getMediaTime());
                    segmentsNew.add(s2);
                // if too short split new segment in middle and merge halves to
                // previous and following segments
                } else {
                    long followingStartOld = o.getMediaTime().getMediaTimePoint().getTimeInMilliseconds();
                    long newSplit = (currentSegStart.getTimeInMilliseconds() + followingStartOld) / 2;
                    long followingEnd = followingStartOld + o.getMediaTime().getMediaDuration().getDurationInMilliseconds();
                    long followingDuration = followingEnd - newSplit;
                    // if at beginning, don't split, just merge to first large segment
                    if (segmentsNew.isEmpty()) {
                        Segment s = videoContent.getTemporalDecomposition().createSegment("segment-" + segmentcount++);
                        s.setMediaTime(new MediaRelTimeImpl(0, followingEnd));
                        segmentsNew.add(s);
                    } else {
                        long previousStart = segmentsNew.getLast().getMediaTime().getMediaTimePoint().getTimeInMilliseconds();
                        // adjust end time of previous segment to split time
                        segmentsNew.getLast().setMediaTime(new MediaRelTimeImpl(previousStart, newSplit - previousStart));
                        // create new segment starting at split time
                        Segment s = videoContent.getTemporalDecomposition().createSegment("segment-" + segmentcount++);
                        s.setMediaTime(new MediaRelTimeImpl(newSplit, followingDuration));
                        segmentsNew.add(s);
                    }
                }
                merging = false;
            // copy segments that are long enough to new list (with corrected number)
            } else {
                Segment s = videoContent.getTemporalDecomposition().createSegment("segment-" + segmentcount++);
                s.setMediaTime(o.getMediaTime());
                segmentsNew.add(s);
            }
        }
    }
    // if there is an unfinished merging process after going through all segments
    if (merging && !segmentsNew.isEmpty()) {
        long newDuration = track.getDuration() - currentSegStart.getTimeInMilliseconds();
        // if merged segment is long enough, create new segment
        if (newDuration >= mergeThresh) {
            Segment s = videoContent.getTemporalDecomposition().createSegment("segment-" + segmentcount);
            s.setMediaTime(new MediaRelTimeImpl(currentSegStart.getTimeInMilliseconds(), newDuration));
            segmentsNew.add(s);
        // if not long enough, merge with previous segment
        } else {
            newDuration = track.getDuration() - segmentsNew.getLast().getMediaTime().getMediaTimePoint().getTimeInMilliseconds();
            segmentsNew.getLast().setMediaTime(new MediaRelTimeImpl(segmentsNew.getLast().getMediaTime().getMediaTimePoint().getTimeInMilliseconds(), newDuration));
        }
    }
    // if no segments were created at all, create a single segment spanning the whole video
    if (segmentsNew.isEmpty()) {
        Segment s = videoContent.getTemporalDecomposition().createSegment("segment-" + segmentcount);
        s.setMediaTime(new MediaRelTimeImpl(0, track.getDuration()));
        segmentsNew.add(s);
    }
    return mpeg7;
}
Also used: MediaTimePoint, Segment, Mpeg7Catalog, MediaLocator, MediaTimePointImpl, Video, MediaTime, MediaLocatorImpl, MediaRelTimeImpl (all from org.opencastproject.metadata.mpeg7).
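
A similar hedged sketch for the filtering step: assuming "rawSegments" holds the segments produced by an earlier segmentation pass over "track" (both names are placeholders for this example, not identifiers from the Opencast sources), the call below merges everything shorter than five seconds and reports how many segments remain.

LinkedList<Segment> filtered = new LinkedList<Segment>();
int mergeThreshMs = 5000; // minimum allowed segment duration in milliseconds
Mpeg7Catalog filteredCatalog = filterSegmentation(rawSegments, track, filtered, mergeThreshMs);
System.out.println("segments before filtering: " + rawSegments.size()
        + ", after filtering: " + filtered.size());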

Aggregations

Mpeg7Catalog (org.opencastproject.metadata.mpeg7.Mpeg7Catalog): 14 uses
MediaTime (org.opencastproject.metadata.mpeg7.MediaTime): 10 uses
Catalog (org.opencastproject.mediapackage.Catalog): 9 uses
MediaTimePoint (org.opencastproject.metadata.mpeg7.MediaTimePoint): 9 uses
Video (org.opencastproject.metadata.mpeg7.Video): 9 uses
Segment (org.opencastproject.metadata.mpeg7.Segment): 8 uses
IOException (java.io.IOException): 6 uses
Mpeg7CatalogImpl (org.opencastproject.metadata.mpeg7.Mpeg7CatalogImpl): 6 uses
LinkedList (java.util.LinkedList): 5 uses
Job (org.opencastproject.job.api.Job): 5 uses
Attachment (org.opencastproject.mediapackage.Attachment): 5 uses
MediaRelTimeImpl (org.opencastproject.metadata.mpeg7.MediaRelTimeImpl): 5 uses
Test (org.junit.Test): 4 uses
MediaLocator (org.opencastproject.metadata.mpeg7.MediaLocator): 4 uses
MediaLocatorImpl (org.opencastproject.metadata.mpeg7.MediaLocatorImpl): 4 uses
MediaTimeImpl (org.opencastproject.metadata.mpeg7.MediaTimeImpl): 4 uses
URI (java.net.URI): 3 uses
Iterator (java.util.Iterator): 3 uses
JobBarrier (org.opencastproject.job.api.JobBarrier): 3 uses
MediaPackageException (org.opencastproject.mediapackage.MediaPackageException): 3 uses