
Example 31 with GrayS16

Use of boofcv.struct.image.GrayS16 in the project BoofCV by lessthanoptimal.

The class TestTldRegionTracker, method createAlg:

private TldRegionTracker<GrayU8, GrayS16> createAlg() {
    // Sobel gradient: produces GrayS16 derivative images from the GrayU8 input
    ImageGradient<GrayU8, GrayS16> gradient = FactoryDerivative.sobel(GrayU8.class, GrayS16.class);
    // Pyramidal KLT tracker configured with the default KLT settings
    PyramidKltTracker<GrayU8, GrayS16> tracker = FactoryTrackerAlg.kltPyramid(new KltConfig(), GrayU8.class, GrayS16.class);
    return new TldRegionTracker<>(10, 5, 100, gradient, tracker, GrayU8.class, GrayS16.class);
}
Also used: GrayS16 (boofcv.struct.image.GrayS16), KltConfig (boofcv.alg.tracker.klt.KltConfig), GrayU8 (boofcv.struct.image.GrayU8)
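
For comparison, the same wiring also works for floating-point images. The following is a hedged sketch rather than code from the test suite; it only reuses the factory signatures visible above and assumes they accept GrayF32 for both the image and derivative types:

private TldRegionTracker<GrayF32, GrayF32> createAlgF32() {
    // Same construction as above, but with a float image type and float gradient type
    ImageGradient<GrayF32, GrayF32> gradient = FactoryDerivative.sobel(GrayF32.class, GrayF32.class);
    PyramidKltTracker<GrayF32, GrayF32> tracker = FactoryTrackerAlg.kltPyramid(new KltConfig(), GrayF32.class, GrayF32.class);
    return new TldRegionTracker<>(10, 5, 100, gradient, tracker, GrayF32.class, GrayF32.class);
}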

Example 32 with GrayS16

Use of boofcv.struct.image.GrayS16 in the project BoofCV by lessthanoptimal.

The class ExampleStereoTwoViewsOneCamera, method main:

public static void main(String[] args) {
    // specify location of images and calibration
    String calibDir = UtilIO.pathExample("calibration/mono/Sony_DSC-HX5V_Chess/");
    String imageDir = UtilIO.pathExample("stereo/");
    // Camera parameters
    CameraPinholeRadial intrinsic = CalibrationIO.load(new File(calibDir, "intrinsic.yaml"));
    // Input images from the camera moving left to right
    BufferedImage origLeft = UtilImageIO.loadImage(imageDir, "mono_wall_01.jpg");
    BufferedImage origRight = UtilImageIO.loadImage(imageDir, "mono_wall_02.jpg");
    // Input images with lens distortion
    GrayU8 distortedLeft = ConvertBufferedImage.convertFrom(origLeft, (GrayU8) null);
    GrayU8 distortedRight = ConvertBufferedImage.convertFrom(origRight, (GrayU8) null);
    // matched features between the two images
    List<AssociatedPair> matchedFeatures = ExampleFundamentalMatrix.computeMatches(origLeft, origRight);
    // convert from pixel coordinates into normalized image coordinates
    List<AssociatedPair> matchedCalibrated = convertToNormalizedCoordinates(matchedFeatures, intrinsic);
    // Robustly estimate camera motion
    List<AssociatedPair> inliers = new ArrayList<>();
    Se3_F64 leftToRight = estimateCameraMotion(intrinsic, matchedCalibrated, inliers);
    drawInliers(origLeft, origRight, intrinsic, inliers);
    // Rectify and remove lens distortion for stereo processing
    DMatrixRMaj rectifiedK = new DMatrixRMaj(3, 3);
    GrayU8 rectifiedLeft = distortedLeft.createSameShape();
    GrayU8 rectifiedRight = distortedRight.createSameShape();
    rectifyImages(distortedLeft, distortedRight, leftToRight, intrinsic, rectifiedLeft, rectifiedRight, rectifiedK);
    // compute disparity
    StereoDisparity<GrayS16, GrayF32> disparityAlg = FactoryStereoDisparity.regionSubpixelWta(DisparityAlgorithms.RECT_FIVE, minDisparity, maxDisparity, 5, 5, 20, 1, 0.1, GrayS16.class);
    // Apply the Laplacian across the image to add extra resistance to changes in lighting or camera gain
    GrayS16 derivLeft = new GrayS16(rectifiedLeft.width, rectifiedLeft.height);
    GrayS16 derivRight = new GrayS16(rectifiedLeft.width, rectifiedLeft.height);
    LaplacianEdge.process(rectifiedLeft, derivLeft);
    LaplacianEdge.process(rectifiedRight, derivRight);
    // process and return the results
    disparityAlg.process(derivLeft, derivRight);
    GrayF32 disparity = disparityAlg.getDisparity();
    // show results
    BufferedImage visualized = VisualizeImageData.disparity(disparity, null, minDisparity, maxDisparity, 0);
    BufferedImage outLeft = ConvertBufferedImage.convertTo(rectifiedLeft, null);
    BufferedImage outRight = ConvertBufferedImage.convertTo(rectifiedRight, null);
    ShowImages.showWindow(new RectifiedPairPanel(true, outLeft, outRight), "Rectification");
    ShowImages.showWindow(visualized, "Disparity");
    showPointCloud(disparity, outLeft, leftToRight, rectifiedK, minDisparity, maxDisparity);
    System.out.println("Total found " + matchedCalibrated.size());
    System.out.println("Total Inliers " + inliers.size());
}
Also used: AssociatedPair (boofcv.struct.geo.AssociatedPair), GrayS16 (boofcv.struct.image.GrayS16), ArrayList (java.util.ArrayList), DMatrixRMaj (org.ejml.data.DMatrixRMaj), RectifiedPairPanel (boofcv.gui.stereo.RectifiedPairPanel), BufferedImage (java.awt.image.BufferedImage), ConvertBufferedImage (boofcv.io.image.ConvertBufferedImage), GrayF32 (boofcv.struct.image.GrayF32), CameraPinholeRadial (boofcv.struct.calib.CameraPinholeRadial), GrayU8 (boofcv.struct.image.GrayU8), File (java.io.File), Se3_F64 (georegression.struct.se.Se3_F64)
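
The disparity search range used above bounds the depth range that can be recovered. A quick sanity check uses depth = focalLength * baseline / disparity; the numbers below are placeholders purely for illustration, since the real values come from rectifiedK and the translation stored in leftToRight:

// All values here are assumed for illustration; they are not taken from the example's data.
double focalLengthPixels = 500.0;  // focal length of the rectified camera, in pixels
double baselineMeters = 0.10;      // distance the camera moved between the two views
int minDisparity = 15, maxDisparity = 100;  // hypothetical search range
// Small disparities correspond to far points, large disparities to near points
double farthestDepth = focalLengthPixels * baselineMeters / minDisparity;
double closestDepth = focalLengthPixels * baselineMeters / maxDisparity;
System.out.printf("Recoverable depth: %.2f m to %.2f m%n", closestDepth, farthestDepth);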

Example 33 with GrayS16

Use of boofcv.struct.image.GrayS16 in the project BoofCV by lessthanoptimal.

The class ExampleImageConvert, method convert:

void convert() {
    // Converting between BoofCV image types is easy with ConvertImage. ConvertImage copies
    // the value of a pixel in one image into another image. When doing so you need to take
    // into account the storage capabilities of these different class types.
    // Going from an unsigned 8-bit image to an unsigned 16-bit image is no problem
    GrayU16 imageU16 = new GrayU16(gray.width, gray.height);
    ConvertImage.convert(gray, imageU16);
    // You can convert back into the 8-bit image from the 16-bit image with no problem
    // in this situation because imageU16 does not use the full range of 16-bit values
    ConvertImage.convert(imageU16, gray);
    // Here is an example where you overflow the image after converting.
    // There won't be an exception or any error messages, but the output image will be corrupted
    GrayU8 imageBad = new GrayU8(derivX.width, derivX.height);
    ConvertImage.convert(derivX, imageBad);
    // One way to get around this problem is to rescale and adjust the pixel values so that
    // they will be within a valid range.
    GrayS16 scaledAbs = new GrayS16(derivX.width, derivX.height);
    GPixelMath.abs(derivX, scaledAbs);
    GPixelMath.multiply(scaledAbs, 255.0 / ImageStatistics.max(scaledAbs), scaledAbs);
    // If you just want to see the values of a 16-bit image, there are built-in utility
    // functions for visualizing them too
    BufferedImage colorX = VisualizeImageData.colorizeSign(derivX, null, -1);
    // Let's see what the bad image looks like.
    // ConvertBufferedImage is similar to ConvertImage in that it does a direct conversion without
    // adjusting the pixel's value
    BufferedImage outBad = new BufferedImage(imageBad.width, imageBad.height, BufferedImage.TYPE_INT_RGB);
    BufferedImage outScaled = new BufferedImage(imageBad.width, imageBad.height, BufferedImage.TYPE_INT_RGB);
    ListDisplayPanel panel = new ListDisplayPanel();
    panel.addImage(ConvertBufferedImage.convertTo(scaledAbs, outScaled), "Scaled");
    panel.addImage(colorX, "Visualized");
    panel.addImage(ConvertBufferedImage.convertTo(imageBad, outBad), "Bad");
    ShowImages.showWindow(panel, "Image Convert", true);
}
Also used: ListDisplayPanel (boofcv.gui.ListDisplayPanel), GrayU16 (boofcv.struct.image.GrayU16), GrayS16 (boofcv.struct.image.GrayS16), GrayU8 (boofcv.struct.image.GrayU8), BufferedImage (java.awt.image.BufferedImage), ConvertBufferedImage (boofcv.io.image.ConvertBufferedImage)
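
The rescaling shown above can also be used to bring the 16-bit derivative safely down to 8 bits. A minimal sketch, reusing only calls that already appear in this example; derivX is assumed to be the GrayS16 gradient stored by the surrounding example class:

// Rescale the absolute value of the 16-bit derivative into [0, 255] first
GrayS16 rescaled = new GrayS16(derivX.width, derivX.height);
GPixelMath.abs(derivX, rescaled);
GPixelMath.multiply(rescaled, 255.0 / ImageStatistics.max(rescaled), rescaled);
// Every pixel now lies in [0, 255], so the conversion no longer corrupts the image
GrayU8 imageGood = new GrayU8(derivX.width, derivX.height);
ConvertImage.convert(rescaled, imageGood);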

Example 34 with GrayS16

Use of boofcv.struct.image.GrayS16 in the project BoofCV by lessthanoptimal.

The class FactoryStereoDisparity, method regionSubpixelWta:

/**
 * <p>
 * Returns an algorithm for computing a dense disparity image with sub-pixel disparity accuracy.
 * </p>
 *
 * <p>
 * NOTE: For RECT_FIVE, the specified region size is the size of each sub-region it uses.
 * </p>
 *
 * @param whichAlg Which disparity algorithm to use, RECT or RECT_FIVE.
 * @param minDisparity Minimum disparity that it will check. Must be &ge; 0 and &lt; maxDisparity
 * @param maxDisparity Maximum disparity that it will calculate. Must be &gt; 0
 * @param regionRadiusX Radius of the rectangular region along the x-axis. Try 3.
 * @param regionRadiusY Radius of the rectangular region along the y-axis. Try 3.
 * @param maxPerPixelError Maximum allowed error in a region per pixel. Set to &lt; 0 to disable.
 * @param validateRtoL Tolerance for how different the left-to-right associated values can be. Try 6
 * @param texture Tolerance for how similar the optimal region is to other regions. Disable with a value &le; 0.
 *                Closer to zero is more tolerant. Try 0.1
 * @param imageType Type of input image.
 * @return Rectangular region based WTA disparity algorithm.
 */
public static <T extends ImageGray<T>> StereoDisparity<T, GrayF32> regionSubpixelWta(DisparityAlgorithms whichAlg, int minDisparity, int maxDisparity, int regionRadiusX, int regionRadiusY, double maxPerPixelError, int validateRtoL, double texture, Class<T> imageType) {
    double maxError = (regionRadiusX * 2 + 1) * (regionRadiusY * 2 + 1) * maxPerPixelError;
    // 3 regions are used not just one in this case
    if (whichAlg == DisparityAlgorithms.RECT_FIVE)
        maxError *= 3;
    DisparitySelect select;
    if (imageType == GrayU8.class || imageType == GrayS16.class) {
        select = selectDisparitySubpixel_S32((int) maxError, validateRtoL, texture);
    } else if (imageType == GrayF32.class) {
        select = selectDisparitySubpixel_F32((int) maxError, validateRtoL, texture);
    } else {
        throw new IllegalArgumentException("Unknown image type");
    }
    DisparityScoreRowFormat<T, GrayF32> alg = null;
    switch(whichAlg) {
        case RECT:
            if (imageType == GrayU8.class) {
                alg = FactoryStereoDisparityAlgs.scoreDisparitySadRect_U8(minDisparity, maxDisparity, regionRadiusX, regionRadiusY, select);
            } else if (imageType == GrayS16.class) {
                alg = FactoryStereoDisparityAlgs.scoreDisparitySadRect_S16(minDisparity, maxDisparity, regionRadiusX, regionRadiusY, select);
            } else if (imageType == GrayF32.class) {
                alg = FactoryStereoDisparityAlgs.scoreDisparitySadRect_F32(minDisparity, maxDisparity, regionRadiusX, regionRadiusY, select);
            }
            break;
        case RECT_FIVE:
            if (imageType == GrayU8.class) {
                alg = FactoryStereoDisparityAlgs.scoreDisparitySadRectFive_U8(minDisparity, maxDisparity, regionRadiusX, regionRadiusY, select);
            } else if (imageType == GrayS16.class) {
                alg = FactoryStereoDisparityAlgs.scoreDisparitySadRectFive_S16(minDisparity, maxDisparity, regionRadiusX, regionRadiusY, select);
            } else if (imageType == GrayF32.class) {
                alg = FactoryStereoDisparityAlgs.scoreDisparitySadRectFive_F32(minDisparity, maxDisparity, regionRadiusX, regionRadiusY, select);
            }
            break;
        default:
            throw new IllegalArgumentException("Unknown algorithm " + whichAlg);
    }
    if (alg == null)
        throw new RuntimeException("Image type not supported: " + imageType.getSimpleName());
    return new WrapDisparitySadRect<>(alg);
}
Also used: GrayF32 (boofcv.struct.image.GrayF32), DisparitySelect (boofcv.alg.feature.disparity.DisparitySelect), GrayS16 (boofcv.struct.image.GrayS16), GrayU8 (boofcv.struct.image.GrayU8), WrapDisparitySadRect (boofcv.abst.feature.disparity.WrapDisparitySadRect)
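
A hedged usage sketch of this factory method for an 8-bit input, mirroring the call shown in Example 32; the numeric values are placeholders loosely based on the Javadoc's suggestions, not recommended settings:

// Sub-pixel WTA disparity for rectified 8-bit images
StereoDisparity<GrayU8, GrayF32> disparity =
        FactoryStereoDisparity.regionSubpixelWta(DisparityAlgorithms.RECT,
                0, 60,   // minimum and maximum disparity to search
                3, 3,    // region radius along x and y
                20,      // maximum per-pixel error; a value < 0 disables the check
                6,       // right-to-left validation tolerance
                0.1,     // texture threshold; a value <= 0 disables it
                GrayU8.class);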

Example 35 with GrayS16

Use of boofcv.struct.image.GrayS16 in the project BoofCV by lessthanoptimal.

The class DerivativeHelperFunctions, method processBorderVertical:

public static void processBorderVertical(GrayS16 orig, GrayS16 deriv, Kernel1D_S32 kernel, ImageBorder_S32 borderType) {
    // Convolve along the image border using the specified border handling rule
    borderType.setImage(orig);
    ConvolveJustBorder_General_SB.vertical(kernel, borderType, deriv);
    // Reprocess the two left-most and two right-most columns with the no-border convolution
    GrayS16 origSub;
    GrayS16 derivSub;
    origSub = orig.subimage(0, 0, 2, orig.height, null);
    derivSub = deriv.subimage(0, 0, 2, orig.height, null);
    ConvolveImageNoBorder.vertical(kernel, origSub, derivSub);
    origSub = orig.subimage(orig.width - 2, 0, orig.width, orig.height, null);
    derivSub = deriv.subimage(orig.width - 2, 0, orig.width, orig.height, null);
    ConvolveImageNoBorder.vertical(kernel, origSub, derivSub);
}
Also used: GrayS16 (boofcv.struct.image.GrayS16)
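
A sketch of how this helper might be invoked. It is an assumption-heavy illustration: it presumes Kernel1D_S32 offers a (data, width) constructor and that ImageBorder1D_S32 can be built around BorderIndex1D_Extend, classes that appear in the aggregations below but are not shown here:

// Assumed setup for illustration: a 1-D derivative kernel and an extend-border rule
GrayS16 input = new GrayS16(320, 240);
GrayS16 derivY = input.createSameShape();
Kernel1D_S32 kernel = new Kernel1D_S32(new int[]{-1, 0, 1}, 3);  // assumed constructor
ImageBorder_S32 border = new ImageBorder1D_S32(BorderIndex1D_Extend.class);  // assumed constructor
DerivativeHelperFunctions.processBorderVertical(input, derivY, kernel, border);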

Aggregations

GrayS16 (boofcv.struct.image.GrayS16): 63
GrayU8 (boofcv.struct.image.GrayU8): 45
Test (org.junit.Test): 39
ImageBorder_S32 (boofcv.core.image.border.ImageBorder_S32): 15
GrayF32 (boofcv.struct.image.GrayF32): 13
Random (java.util.Random): 8
CompareDerivativeToConvolution (boofcv.alg.filter.derivative.CompareDerivativeToConvolution): 7
BorderIndex1D_Extend (boofcv.core.image.border.BorderIndex1D_Extend): 4
ImageBorder1D_S32 (boofcv.core.image.border.ImageBorder1D_S32): 4
ImageGray (boofcv.struct.image.ImageGray): 4
ConvertBufferedImage (boofcv.io.image.ConvertBufferedImage): 3
BufferedImage (java.awt.image.BufferedImage): 3
WrapDisparitySadRect (boofcv.abst.feature.disparity.WrapDisparitySadRect): 2
DisparitySelect (boofcv.alg.feature.disparity.DisparitySelect): 2
ImageBorder_F32 (boofcv.core.image.border.ImageBorder_F32): 2
ListDisplayPanel (boofcv.gui.ListDisplayPanel): 2
Kernel1D_S32 (boofcv.struct.convolve.Kernel1D_S32): 2
FDistort (boofcv.abst.distort.FDistort): 1
DetectLineSegmentsGridRansac (boofcv.abst.feature.detect.line.DetectLineSegmentsGridRansac): 1
EdgeContour (boofcv.alg.feature.detect.edge.EdgeContour): 1