Search in sources :

Example 1 with LookUpImageFilesByIndex

Use of boofcv.io.image.LookUpImageFilesByIndex in project BoofCV by lessthanoptimal.

From the class ExampleMultiViewSparseReconstruction, method visualizeSparseCloud.
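
Before the full example, here is a minimal sketch of the lookup API itself. The file names and class name below are hypothetical, and the assumption that loadImage() returns false on an unknown name should be verified against your BoofCV version; the key point is that an image "name" is simply its list index encoded as a String, which is how every example on this page uses it.

import boofcv.io.image.LookUpImageFilesByIndex;
import boofcv.struct.image.GrayU8;
import java.util.List;

public class SketchLookUpImageFilesByIndex {
    public static void main(String[] args) {
        // Hypothetical file list; in the examples below it comes from the frames of a video
        List<String> imageFiles = List.of("frames/00000.png", "frames/00001.png");
        var lookup = new LookUpImageFilesByIndex(imageFiles);
        // The name "0" selects the first file in the list. The output image is reshaped to fit.
        var gray = new GrayU8(1, 1);
        if (!lookup.loadImage("0", gray))
            System.err.println("Failed to load image 0");
        System.out.println("Loaded " + gray.width + "x" + gray.height);
    }
}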

/**
 * To visualize the results we will render a sparse point cloud along with the location of each camera in the
 * scene.
 */
public void visualizeSparseCloud() {
    checkTrue(scene.isHomogenous());
    List<Point3D_F64> cloudXyz = new ArrayList<>();
    Point4D_F64 world = new Point4D_F64();
    // NOTE: By default the colors found below are not used. Look below to see why and how to turn them on.
    // 
    // Colorize the cloud by reprojecting the images. The math is straightforward, but there's a lot of
    // bookkeeping that needs to be done due to the scene data structure. A class is provided to make this easy.
    var imageLookup = new LookUpImageFilesByIndex(imageFiles);
    var colorize = new ColorizeMultiViewStereoResults<>(new LookUpColorRgbFormats.PL_U8(), imageLookup);
    DogArray_I32 rgb = new DogArray_I32();
    rgb.resize(scene.points.size);
    colorize.processScenePoints(scene,
            (viewIdx) -> viewIdx + "", // String encodes the image's index
            (pointIdx, r, g, b) -> rgb.set(pointIdx, (r << 16) | (g << 8) | b)); // Pack the RGB color as 0xRRGGBB
    // Convert the structure from homogeneous coordinates into regular 3D points
    for (int i = 0; i < scene.points.size; i++) {
        scene.points.get(i).get(world);
        // A point at infinity (w == 0) can't be converted into a regular 3D point. Skipping it would put
        // the color array out of sync with the cloud, so just throw it far, far away instead.
        if (world.w == 0.0)
            cloudXyz.add(new Point3D_F64(0, 0, Double.MAX_VALUE));
        else
            cloudXyz.add(new Point3D_F64(world.x / world.w, world.y / world.w, world.z / world.w));
    }
    PointCloudViewer viewer = VisualizeData.createPointCloudViewer();
    viewer.setFog(true);
    // We just did a bunch of work to look up the true color of points, however for sparse data it's easy to see
    // the structure with pseudo color. Comment out the line below to see the true color.
    viewer.setColorizer(new TwoAxisRgbPlane.Z_XY(1.0).fperiod(40));
    viewer.setDotSize(1);
    viewer.setTranslationStep(0.15);
    viewer.addCloud((idx, p) -> p.setTo(cloudXyz.get(idx)), rgb::get, rgb.size);
    viewer.setCameraHFov(UtilAngle.radian(60));
    SwingUtilities.invokeLater(() -> {
        // Show where the cameras are
        BoofSwingUtil.visualizeCameras(scene, viewer);
        // Size the window and show it to the user
        viewer.getComponent().setPreferredSize(new Dimension(600, 600));
        ShowImages.showWindow(viewer.getComponent(), "Refined Scene", true);
        var copy = new DogArray<>(Point3dRgbI_F64::new);
        viewer.copyCloud(copy);
        try (var out = new FileOutputStream("saved_cloud.ply")) {
            PointCloudIO.save3D(PointCloudIO.Format.PLY, PointCloudReader.wrapF64RGB(copy.toList()), true, out);
        } catch (IOException e) {
            e.printStackTrace();
        }
    });
}
Also used : Point3D_F64(georegression.struct.point.Point3D_F64) LookUpColorRgbFormats(boofcv.core.image.LookUpColorRgbFormats) ArrayList(java.util.ArrayList) DogArray_I32(org.ddogleg.struct.DogArray_I32) UncheckedIOException(java.io.UncheckedIOException) IOException(java.io.IOException) DogArray(org.ddogleg.struct.DogArray) ColorizeMultiViewStereoResults(boofcv.alg.mvs.ColorizeMultiViewStereoResults) FileOutputStream(java.io.FileOutputStream) PointCloudViewer(boofcv.visualize.PointCloudViewer) Point3dRgbI_F64(boofcv.struct.Point3dRgbI_F64) Point4D_F64(georegression.struct.point.Point4D_F64) LookUpImageFilesByIndex(boofcv.io.image.LookUpImageFilesByIndex)

Example 2 with LookUpImageFilesByIndex

Use of boofcv.io.image.LookUpImageFilesByIndex in project BoofCV by lessthanoptimal.

From the class ExampleMultiBaselineStereo, method main.

public static void main(String[] args) {
    // Compute a sparse reconstruction. This will give us intrinsic and extrinsic for all views
    var example = new ExampleMultiViewSparseReconstruction();
    // Specifies the "center" frame to use
    int centerViewIdx = 15;
    example.compute("tree_snow_01.mp4", true);
    // example.compute("ditch_02.mp4", true);
    // example.compute("holiday_display_01.mp4"", true);
    // example.compute("log_building_02.mp4"", true);
    // example.compute("drone_park_01.mp4", false);
    // example.compute("stone_sign.mp4", true);
    // We need a way to load images based on their ID. In this particular case the ID encodes the array index.
    var imageLookup = new LookUpImageFilesByIndex(example.imageFiles);
    // Next we tell it which view to use as the "center", which acts as the common view for all disparity images.
    // The process of selecting the best views to use as centers is a problem all its own. To keep things
    // simple we just pick a frame.
    SceneWorkingGraph.View center = example.working.getAllViews().get(centerViewIdx);
    // The final scene refined by bundle adjustment is created by the working graph. However, the 3D relationship
    // between views is contained in the pairwise graph. A view in the working graph has a reference to the view
    // in the pairwise graph. Using that we will find all connected views that have a 3D relationship.
    var pairedViewIdxs = new DogArray_I32();
    var sbaIndexToImageID = new TIntObjectHashMap<String>();
    // This relationship between pairwise and working graphs might seem (and is) a bit convoluted. The Pairwise
    // graph is the initial crude sketch of what might be connected. The working graph is an intermediate
    // data structure for computing the metric scene. SBA is a refinement of the working graph.
    // Iterate through all connected views in the pairwise graph and mark their indexes in the working graph
    center.pview.connections.forEach((m) -> {
        // if there isn't a 3D relationship just skip it
        if (!m.is3D)
            return;
        String connectedID = m.other(center.pview).id;
        SceneWorkingGraph.View connected = example.working.views.get(connectedID);
        // Make sure the pairwise view exists in the working graph too
        if (connected == null)
            return;
        // Add this view to the index to name/ID lookup table
        sbaIndexToImageID.put(connected.index, connectedID);
        // Note that this view is one which acts as the second image in the stereo pair
        pairedViewIdxs.add(connected.index);
    });
    // Add the center camera image to the ID lookup table
    sbaIndexToImageID.put(center.index, center.pview.id);
    // Configure the stereo disparity algorithm which is used
    var configDisparity = new ConfigDisparityBMBest5();
    configDisparity.validateRtoL = 1; // tolerance for the left-to-right consistency check
    configDisparity.texture = 0.5; // reject matches in ambiguous, low-texture regions
    configDisparity.regionRadiusX = configDisparity.regionRadiusY = 4; // block-match region radius
    configDisparity.disparityRange = 120; // number of disparity values considered
    // This is the actual MBS algorithm mentioned previously. It selects the best disparity for each pixel
    // in the original image using a median filter.
    var multiBaseline = new MultiBaselineStereoIndependent<>(imageLookup, ImageType.SB_U8);
    multiBaseline.setStereoDisparity(FactoryStereoDisparity.blockMatchBest5(configDisparity, GrayU8.class, GrayF32.class));
    // Print out verbose debugging and profile information
    multiBaseline.setVerbose(System.out, null);
    multiBaseline.setVerboseProfiling(System.out);
    // Improve stereo by removing small regions, which tends to be noise. Consider adjusting the region size.
    multiBaseline.setDisparitySmoother(FactoryStereoDisparity.removeSpeckle(null, GrayF32.class));
    // Print out debugging information from the smoother
    // Objects.requireNonNull(multiBaseline.getDisparitySmoother()).setVerbose(System.out,null);
    // Creates a list where you can switch between different images/visualizations
    var listDisplay = new ListDisplayPanel();
    listDisplay.setPreferredSize(new Dimension(1000, 300));
    ShowImages.showWindow(listDisplay, "Intermediate Results", true);
    // We will display intermediate results as they come in
    multiBaseline.setListener((leftView, rightView, rectLeft, rectRight, disparity, mask, parameters, rect) -> {
        // Visualize the rectified stereo pair. You can interact with this window and verify
        // that the y-axis is aligned
        var rectified = new RectifiedPairPanel(true);
        rectified.setImages(ConvertBufferedImage.convertTo(rectLeft, null), ConvertBufferedImage.convertTo(rectRight, null));
        // Cleans up the disparity image by zeroing out pixels that are outside the original image bounds
        RectifyImageOps.applyMask(disparity, mask, 0);
        // Display the colorized disparity
        BufferedImage colorized = VisualizeImageData.disparity(disparity, null, parameters.disparityRange, 0);
        SwingUtilities.invokeLater(() -> {
            listDisplay.addItem(rectified, "Rectified " + leftView + " " + rightView);
            listDisplay.addImage(colorized, leftView + " " + rightView);
        });
    });
    // Process the images and compute a single combined disparity image
    if (!multiBaseline.process(example.scene, center.index, pairedViewIdxs, sbaIndexToImageID::get)) {
        throw new RuntimeException("Failed to fuse stereo views");
    }
    // Extract the point cloud from the fused disparity image
    GrayF32 fusedDisparity = multiBaseline.getFusedDisparity();
    DisparityParameters fusedParam = multiBaseline.getFusedParam();
    BufferedImage colorizedDisp = VisualizeImageData.disparity(fusedDisparity, null, fusedParam.disparityRange, 0);
    ShowImages.showWindow(colorizedDisp, "Fused Disparity");
    // Now compute the point cloud it represents and the color of each pixel.
    // The fused image is in the original image coordinates instead of rectified image coordinates,
    // which makes extracting color much easier.
    var cloud = new DogArray<>(Point3D_F64::new);
    var cloudRgb = new DogArray_I32(cloud.size);
    // Load the center image in color
    var colorImage = new InterleavedU8(1, 1, 3);
    imageLookup.loadImage(center.pview.id, colorImage);
    // Since the fused image is in the original (i.e. distorted) pixel coordinates and is not rectified,
    // that needs to be taken into account by undistorting the pixels when creating the point cloud.
    CameraPinholeBrown intrinsic = BundleAdjustmentOps.convert(example.scene.cameras.get(center.cameraIdx).model, colorImage.width, colorImage.height, null);
    Point2Transform2_F64 pixel_to_norm = new LensDistortionBrown(intrinsic).distort_F64(true, false);
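    // For intuition only (a sketch of standard stereo geometry, not BoofCV's exact internals):
    // under rectified stereo a pixel's depth is Z = f*B/d, where f is the focal length in pixels,
    // B is the baseline, and d is the measured disparity plus the minimum-disparity offset
    // stored in DisparityParameters. disparityToCloud applies this relationship per pixel.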
    MultiViewStereoOps.disparityToCloud(fusedDisparity, fusedParam, new PointToPixelTransform_F64(pixel_to_norm), (pixX, pixY, x, y, z) -> {
        cloud.grow().setTo(x, y, z);
        cloudRgb.add(colorImage.get24(pixX, pixY));
    });
    // Configure the point cloud viewer
    PointCloudViewer pcv = VisualizeData.createPointCloudViewer();
    pcv.setCameraHFov(UtilAngle.radian(70));
    pcv.setTranslationStep(0.15);
    pcv.addCloud(cloud.toList(), cloudRgb.data);
    // pcv.setColorizer(new SingleAxisRgb.Z().fperiod(30.0));
    JComponent viewer = pcv.getComponent();
    viewer.setPreferredSize(new Dimension(600, 600));
    ShowImages.showWindow(viewer, "Point Cloud", true);
    System.out.println("Done");
}
Also used : Point3D_F64(georegression.struct.point.Point3D_F64) InterleavedU8(boofcv.struct.image.InterleavedU8) ListDisplayPanel(boofcv.gui.ListDisplayPanel) CameraPinholeBrown(boofcv.struct.calib.CameraPinholeBrown) ConfigDisparityBMBest5(boofcv.factory.disparity.ConfigDisparityBMBest5) LensDistortionBrown(boofcv.alg.distort.brown.LensDistortionBrown) RectifiedPairPanel(boofcv.gui.stereo.RectifiedPairPanel) BufferedImage(java.awt.image.BufferedImage) ConvertBufferedImage(boofcv.io.image.ConvertBufferedImage) PointCloudViewer(boofcv.visualize.PointCloudViewer) PointToPixelTransform_F64(boofcv.struct.distort.PointToPixelTransform_F64) LookUpImageFilesByIndex(boofcv.io.image.LookUpImageFilesByIndex) GrayU8(boofcv.struct.image.GrayU8) SceneWorkingGraph(boofcv.alg.structure.SceneWorkingGraph) MultiBaselineStereoIndependent(boofcv.alg.mvs.MultiBaselineStereoIndependent) Point2Transform2_F64(boofcv.struct.distort.Point2Transform2_F64) DogArray_I32(org.ddogleg.struct.DogArray_I32) DogArray(org.ddogleg.struct.DogArray) GrayF32(boofcv.struct.image.GrayF32) TIntObjectHashMap(gnu.trove.map.hash.TIntObjectHashMap) DisparityParameters(boofcv.alg.mvs.DisparityParameters)

Example 3 with LookUpImageFilesByIndex

Use of boofcv.io.image.LookUpImageFilesByIndex in project BoofCV by lessthanoptimal.

From the class ExampleMultiViewDenseReconstruction, method main.

public static void main(String[] args) {
    var example = new ExampleMultiViewSparseReconstruction();
    example.compute("tree_snow_01.mp4", true);
    // example.compute("ditch_02.mp4", true);
    // example.compute("holiday_display_01.mp4", true);
    // example.compute("log_building_02.mp4", true);
    // example.compute("drone_park_01.mp4", false);
    // example.compute("stone_sign.mp4", true);
    // Looks up images based on their index in the file list
    var imageLookup = new LookUpImageFilesByIndex(example.imageFiles);
    // We will use a high-level algorithm that does almost all the work for us. It is highly configurable
    // and just about every parameter can be tweaked using its Config. Internal algorithms can be accessed
    // and customized directly if needed. Specifics of how it works are beyond this example, but the code
    // is easily accessible.
    // Let's do some custom configuration for this scenario
    var config = new ConfigSparseToDenseCloud();
    config.disparity.approach = ConfigDisparity.Approach.SGM;
    ConfigDisparitySGM configSgm = config.disparity.approachSGM;
    configSgm.validateRtoL = 0; // strictest left-to-right consistency check
    configSgm.texture = 0.75; // texture validation threshold
    configSgm.disparityRange = 250; // number of disparity values considered
    configSgm.paths = ConfigDisparitySGM.Paths.P4; // aggregate along 4 paths; fewer paths is faster
    configSgm.configBlockMatch.radiusX = 3; // block-match region radius
    configSgm.configBlockMatch.radiusY = 3;
    // Create the sparse to dense reconstruction using a factory
    SparseSceneToDenseCloud<GrayU8> sparseToDense = FactorySceneReconstruction.sparseSceneToDenseCloud(config, ImageType.SB_U8);
    // To help make the time go by faster while we wait about 1 to 2 minutes for it to finish, let's print stuff
    sparseToDense.getMultiViewStereo().setVerbose(System.out, BoofMiscOps.hashSet(BoofVerbose.RECURSIVE, BoofVerbose.RUNTIME));
    // To visualize intermediate results we will add a listener. This will show fused disparity images
    sparseToDense.getMultiViewStereo().setListener(new MultiViewStereoFromKnownSceneStructure.Listener<>() {

        @Override
        public void handlePairDisparity(String left, String right, GrayU8 rect0, GrayU8 rect1, GrayF32 disparity, GrayU8 mask, DisparityParameters parameters) {
        // Uncomment to display individual stereo pairs. Commented out by default because it generates
        // a LOT of windows
        // BufferedImage outLeft = ConvertBufferedImage.convertTo(rect0, null);
        // BufferedImage outRight = ConvertBufferedImage.convertTo(rect1, null);
        // 
        // ShowImages.showWindow(new RectifiedPairPanel(true, outLeft, outRight), "Rectification: "+left+" "+right);
        // BufferedImage colorized = VisualizeImageData.disparity(disparity, null, parameters.disparityRange, 0);
        // ShowImages.showWindow(colorized, "Disparity " + left + " " + right);
        }

        @Override
        public void handleFusedDisparity(String name, GrayF32 disparity, GrayU8 mask, DisparityParameters parameters) {
            // You can also do custom filtering of the disparity image in this function. If the line below is
            // uncommented then points which are far away will be marked as invalid
            // PixelMath.operator1(disparity, ( v ) -> v >= 20 ? v : parameters.disparityRange, disparity);
            // Display the disparity for each center view
            BufferedImage colorized = VisualizeImageData.disparity(disparity, null, parameters.disparityRange, 0);
            ShowImages.showWindow(colorized, "Center " + name);
        }
    });
    // It needs a lookup table to go from SBA view index to image name. Images are loaded as needed
    // to perform stereo disparity
    var viewToId = new TIntObjectHashMap<String>();
    BoofMiscOps.forIdx(example.working.listViews, (workIdxI, wv) -> viewToId.put(wv.index, wv.pview.id));
    if (!sparseToDense.process(example.scene, viewToId, imageLookup))
        throw new RuntimeException("Dense reconstruction failed!");
    saveCloudToDisk(sparseToDense);
    // Display the dense cloud
    visualizeInPointCloud(sparseToDense.getCloud(), sparseToDense.getColorRgb(), example.scene);
}
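
The helper saveCloudToDisk() is not shown on this page. Below is a plausible sketch modeled on the PLY-saving code from Example 1; the output file name, the PointCloudReader.wrapF64(points, rgb) wrapper, and the assumption that getColorRgb() returns a DogArray_I32 are all things to verify against your BoofCV version.

private static void saveCloudToDisk(SparseSceneToDenseCloud<GrayU8> sparseToDense) {
    // Pair each 3D point with its packed 0xRRGGBB color and write a PLY file
    try (var out = new FileOutputStream("saved_cloud.ply")) {
        PointCloudIO.save3D(PointCloudIO.Format.PLY,
                PointCloudReader.wrapF64(sparseToDense.getCloud(), sparseToDense.getColorRgb().data),
                true, out);
    } catch (IOException e) {
        e.printStackTrace();
    }
}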
Also used : ConfigDisparitySGM(boofcv.factory.disparity.ConfigDisparitySGM) BufferedImage(java.awt.image.BufferedImage) GrayF32(boofcv.struct.image.GrayF32) TIntObjectHashMap(gnu.trove.map.hash.TIntObjectHashMap) DisparityParameters(boofcv.alg.mvs.DisparityParameters) ConfigSparseToDenseCloud(boofcv.factory.structure.ConfigSparseToDenseCloud) LookUpImageFilesByIndex(boofcv.io.image.LookUpImageFilesByIndex) GrayU8(boofcv.struct.image.GrayU8) MultiViewStereoFromKnownSceneStructure(boofcv.alg.mvs.MultiViewStereoFromKnownSceneStructure)

Aggregations

LookUpImageFilesByIndex (boofcv.io.image.LookUpImageFilesByIndex): 3
DisparityParameters (boofcv.alg.mvs.DisparityParameters): 2
GrayF32 (boofcv.struct.image.GrayF32): 2
GrayU8 (boofcv.struct.image.GrayU8): 2
PointCloudViewer (boofcv.visualize.PointCloudViewer): 2
Point3D_F64 (georegression.struct.point.Point3D_F64): 2
TIntObjectHashMap (gnu.trove.map.hash.TIntObjectHashMap): 2
BufferedImage (java.awt.image.BufferedImage): 2
DogArray (org.ddogleg.struct.DogArray): 2
DogArray_I32 (org.ddogleg.struct.DogArray_I32): 2
LensDistortionBrown (boofcv.alg.distort.brown.LensDistortionBrown): 1
ColorizeMultiViewStereoResults (boofcv.alg.mvs.ColorizeMultiViewStereoResults): 1
MultiBaselineStereoIndependent (boofcv.alg.mvs.MultiBaselineStereoIndependent): 1
MultiViewStereoFromKnownSceneStructure (boofcv.alg.mvs.MultiViewStereoFromKnownSceneStructure): 1
SceneWorkingGraph (boofcv.alg.structure.SceneWorkingGraph): 1
LookUpColorRgbFormats (boofcv.core.image.LookUpColorRgbFormats): 1
ConfigDisparityBMBest5 (boofcv.factory.disparity.ConfigDisparityBMBest5): 1
ConfigDisparitySGM (boofcv.factory.disparity.ConfigDisparitySGM): 1
ConfigSparseToDenseCloud (boofcv.factory.structure.ConfigSparseToDenseCloud): 1
ListDisplayPanel (boofcv.gui.ListDisplayPanel): 1