
Example 1 with Transform2d

Use of edu.wpi.first.math.geometry.Transform2d in project RobotCode2022 by Mechanical-Advantage.

In the class Drive, the method addVisionMeasurement:

/**
 * Adds a new timestamped vision measurement
 */
public void addVisionMeasurement(TimestampedTranslation2d data) {
    Optional<Pose2d> historicalFieldToTarget = poseHistory.get(data.timestamp);
    if (historicalFieldToTarget.isPresent()) {
        // Get camera constants
        CameraPosition cameraPosition = VisionConstants.getCameraPosition(hoodStateSupplier.get());
        if (cameraPosition == null) {
            // Hood is moving, don't process frame
            return;
        }
        // Calculate new robot pose
        Rotation2d robotRotation = historicalFieldToTarget.get().getRotation();
        Rotation2d cameraRotation = robotRotation.rotateBy(cameraPosition.vehicleToCamera.getRotation());
        Transform2d fieldToTargetRotated = new Transform2d(FieldConstants.hubCenter, cameraRotation);
        Transform2d fieldToCamera = fieldToTargetRotated.plus(GeomUtil.transformFromTranslation(data.translation.unaryMinus()));
        Pose2d visionFieldToTarget = GeomUtil.transformToPose(fieldToCamera.plus(cameraPosition.vehicleToCamera.inverse()));
        if (visionFieldToTarget.getX() > FieldConstants.fieldLength || visionFieldToTarget.getX() < 0.0 || visionFieldToTarget.getY() > FieldConstants.fieldWidth || visionFieldToTarget.getY() < 0.0) {
            return;
        }
        // Save vision pose for logging
        noVisionTimer.reset();
        lastVisionPose = visionFieldToTarget;
        // Calculate how much weight to give this vision frame: a per-frame
        // exponential smoothing factor chosen so that visionShiftPerSec of the
        // correction is applied over one second at the nominal framerate,
        // scaled down as the robot's angular velocity grows (vision is less
        // trustworthy while turning quickly)
        double angularErrorScale = Math.abs(inputs.gyroYawVelocityRadPerSec) / visionMaxAngularVelocity;
        angularErrorScale = MathUtil.clamp(angularErrorScale, 0, 1);
        double visionShift = 1 - Math.pow(1 - visionShiftPerSec, 1 / visionNominalFramerate);
        visionShift *= 1 - angularErrorScale;
        // Reset pose
        Pose2d currentFieldToTarget = getPose();
        Translation2d fieldToVisionField = new Translation2d(visionFieldToTarget.getX() - historicalFieldToTarget.get().getX(), visionFieldToTarget.getY() - historicalFieldToTarget.get().getY());
        Pose2d visionLatencyCompFieldToTarget = new Pose2d(currentFieldToTarget.getX() + fieldToVisionField.getX(), currentFieldToTarget.getY() + fieldToVisionField.getY(), currentFieldToTarget.getRotation());
        if (resetOnVision) {
            setPose(new Pose2d(visionFieldToTarget.getX(), visionFieldToTarget.getY(), currentFieldToTarget.getRotation()), true);
            resetOnVision = false;
        } else {
            setPose(new Pose2d(currentFieldToTarget.getX() * (1 - visionShift) + visionLatencyCompFieldToTarget.getX() * visionShift, currentFieldToTarget.getY() * (1 - visionShift) + visionLatencyCompFieldToTarget.getY() * visionShift, currentFieldToTarget.getRotation()), false);
        }
    }
}
Also used: CameraPosition (frc.robot.VisionConstants.CameraPosition), Transform2d (edu.wpi.first.math.geometry.Transform2d), TimestampedTranslation2d (frc.robot.subsystems.vision.Vision.TimestampedTranslation2d), Translation2d (edu.wpi.first.math.geometry.Translation2d), Rotation2d (edu.wpi.first.math.geometry.Rotation2d), Pose2d (edu.wpi.first.math.geometry.Pose2d)
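
The interesting part of this example is the frame chaining: starting from the hub's field position, the code composes Transform2d offsets to recover the camera's field pose and then the robot's field pose. The standalone sketch below expresses the same chain with Pose2d.transformBy instead of the project's GeomUtil helpers; the hub position, camera mounting, and measured camera-to-target vector are hypothetical numbers, not the project's real constants.

import edu.wpi.first.math.geometry.Pose2d;
import edu.wpi.first.math.geometry.Rotation2d;
import edu.wpi.first.math.geometry.Transform2d;
import edu.wpi.first.math.geometry.Translation2d;

public class VisionFrameChainSketch {
    public static void main(String[] args) {
        // Hypothetical stand-ins for FieldConstants.hubCenter and the camera mount.
        Translation2d hubCenter = new Translation2d(8.23, 4.11);
        Transform2d vehicleToCamera =
                new Transform2d(new Translation2d(0.2, 0.0), Rotation2d.fromDegrees(0.0));

        // Camera-measured vector from the camera to the hub, in the camera frame,
        // and the camera's field-relative heading (robot heading plus mount rotation).
        Translation2d cameraToTarget = new Translation2d(3.0, 0.5);
        Rotation2d cameraRotation = Rotation2d.fromDegrees(170.0);

        // field -> camera: start at the hub with the camera's heading, then step
        // backwards along the measured camera-to-target vector.
        Pose2d fieldToTargetRotated = new Pose2d(hubCenter, cameraRotation);
        Pose2d fieldToCamera = fieldToTargetRotated.transformBy(
                new Transform2d(cameraToTarget.unaryMinus(), new Rotation2d()));

        // field -> vehicle: undo the camera's mounting offset on the robot.
        Pose2d fieldToVehicle = fieldToCamera.transformBy(vehicleToCamera.inverse());
        System.out.println(fieldToVehicle);
    }
}

The original method builds the same chain with Transform2d.plus and project-specific conversion helpers; the geometry is identical.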

Example 2 with Transform2d

Use of edu.wpi.first.math.geometry.Transform2d in project 2022-RapidReact by Spartronics4915.

In the class TestObjectFinder, the method interactivePointcloudTest:

@Tag("hardwareDependant")
@Test
public void interactivePointcloudTest() {
    final ObjectFinder finder = new ObjectFinder(0.01);
    final double circleRadiusMeters = Units.inchesToMeters(3.5);
    var pointCloudCanvas = new Canvas() {

        @Override
        public void paint(Graphics g) {
            super.paint(g);
            try {
                synchronized (mPointcloud) {
                    if (mPointcloud.size() <= 0) {
                        return;
                    }
                    mPointcloud.forEach((p) -> drawPoint(p, 255, g));
                    int circleDiameterCentimeters = (int) Math.round(circleRadiusMeters * 100.0 * 2.0);
                    // var centers = finder.findSquares(pointcloud, new Translation2d(), 0.28, mNumVotesNeeded, 3, 0.3);
                    var centers = finder.findCircles(mPointcloud, circleRadiusMeters, mNumVotesNeeded, 3);
                    var center = mTargetTracker.update(centers);
                    if (center == null) {
                        System.out.println("No target found");
                        return;
                    }
                    // for (Translation2d center : centers) {
                    System.out.println(center);
                    g.setColor(Color.BLUE);
                    g.drawOval(toScreenCoordinates(center.getX() - circleRadiusMeters, true), toScreenCoordinates(center.getY() - circleRadiusMeters, false), circleDiameterCentimeters, circleDiameterCentimeters);
                    // break;
                    // }
                    g.drawString("Edge: " + mEdgeDetectorValue + ", Votes: " + mNumVotesNeeded, 10, 10);
                }
            } catch (Exception e) {
                e.printStackTrace();
            }
        }

        // Screen coordinates correspond to centimeters
        private int toScreenCoordinates(double coordMeters, boolean isX) {
            Dimension size = this.getSize();
            coordMeters *= 100;
            coordMeters += (isX ? size.getWidth() : size.getHeight()) / 2;
            return (int) Math.round(coordMeters);
        }

        private void drawPoint(Translation2d point, int quality, Graphics g) {
            g.setColor(new Color(255 - quality, 0, 0));
            g.drawOval(toScreenCoordinates(point.getX(), true), toScreenCoordinates(point.getY(), false), 1, 1);
        }
    };
    pointCloudCanvas.setSize(6000, 6000);
    pointCloudCanvas.setBackground(Color.WHITE);
    var frame = new Frame();
    frame.add(pointCloudCanvas);
    frame.setSize(6000, 6000);
    frame.setVisible(true);
    frame.addKeyListener(new KeyListener() {

        @Override
        public void keyPressed(KeyEvent k) {
        }

        @Override
        public void keyReleased(KeyEvent k) {
            System.out.println(k.getKeyChar());
            if (k.getKeyChar() == '-' || k.getKeyChar() == '+') {
                double toAdd = (k.getKeyChar() == '+' ? 1 : -1);
                if (k.isAltDown()) {
                    mEdgeDetectorValue += toAdd;
                } else if (k.isControlDown()) {
                    mNumVotesNeeded += toAdd;
                }
            } else if (k.getKeyChar() == 'q') {
                System.exit(0);
            }
        }

        @Override
        public void keyTyped(KeyEvent k) {
        }
    });
    RPLidarA1 lidar = new RPLidarA1();
    lidar.setCallback((List<Translation2d> pointcloud) -> {
        synchronized (mPointcloud) {
            if (System.currentTimeMillis() - mLastResetTime > 0) {
                mPointcloud = pointcloud;
                mLastResetTime = System.currentTimeMillis();
            } else {
                mPointcloud.addAll(pointcloud);
            }
        }
    }, new RobotStateMap(), new Transform2d(new Translation2d(Units.inchesToMeters(-5.5), Units.inchesToMeters(-14)), Rotation2d.fromDegrees(180)));
    lidar.start();
    Runtime.getRuntime().addShutdownHook(new Thread(() -> {
        lidar.stop();
        System.out.println("Graceful shutdown complete");
    }));
    while (true) {
        try {
            Thread.sleep(500);
            synchronized (mPointcloud) {
                pointCloudCanvas.repaint();
            }
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
    }
}
Also used: Frame (java.awt.Frame), RobotStateMap (com.spartronics4915.lib.subsystems.estimator.RobotStateMap), Transform2d (edu.wpi.first.math.geometry.Transform2d), Canvas (java.awt.Canvas), Color (java.awt.Color), Dimension (java.awt.Dimension), Graphics (java.awt.Graphics), KeyEvent (java.awt.event.KeyEvent), Translation2d (edu.wpi.first.math.geometry.Translation2d), RPLidarA1 (com.spartronics4915.lib.hardware.sensors.RPLidarA1), KeyListener (java.awt.event.KeyListener), ArrayList (java.util.ArrayList), List (java.util.List), Test (org.junit.jupiter.api.Test), Tag (org.junit.jupiter.api.Tag)
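
The Transform2d passed to setCallback on the last line of the test describes where the lidar sits on the robot. Assuming the standard WPILib convention of +X forward and +Y left, the short sketch below shows what that mounting transform means; the robot pose used here is an arbitrary example, not something from the test.

import edu.wpi.first.math.geometry.Pose2d;
import edu.wpi.first.math.geometry.Rotation2d;
import edu.wpi.first.math.geometry.Transform2d;
import edu.wpi.first.math.geometry.Translation2d;
import edu.wpi.first.math.util.Units;

public class LidarMountSketch {
    public static void main(String[] args) {
        // The sensor sits 5.5 in behind and 14 in to the right of the robot's
        // center and faces backwards (the same values the test passes to setCallback).
        Transform2d vehicleToLidar = new Transform2d(
                new Translation2d(Units.inchesToMeters(-5.5), Units.inchesToMeters(-14)),
                Rotation2d.fromDegrees(180));

        // Applying the mounting transform to any robot pose yields the sensor's
        // pose on the field.
        Pose2d fieldToVehicle = new Pose2d(2.0, 3.0, Rotation2d.fromDegrees(90));
        Pose2d fieldToLidar = fieldToVehicle.transformBy(vehicleToLidar);
        System.out.println(fieldToLidar);
    }
}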

Example 3 with Transform2d

Use of edu.wpi.first.math.geometry.Transform2d in project 2022-RapidReact by Spartronics4915.

In the class DrivetrainEstimatorTest, the method testEstimator:

@Test
public void testEstimator() {
    var stateStdDevs = new MatBuilder<>(Nat.N3(), Nat.N1()).fill(0.02, 0.02, 0.01);
    var measurementStdDevs = new MatBuilder<>(Nat.N6(), Nat.N1()).fill(0.1, 0.1, 0.1, 0.05, 0.05, 0.002);
    var est = new DrivetrainEstimator(stateStdDevs, measurementStdDevs, 3, new Pose2d());
    final double dt = 0.01;
    final double visionUpdateRate = 0.2;
    var traj = TrajectoryGenerator.generateTrajectory(List.of(new Pose2d(), new Pose2d(3, 3, new Rotation2d())), new TrajectoryConfig(Units.inchesToMeters(12), Units.inchesToMeters(12)));
    var kinematics = new DifferentialDriveKinematics(1);
    Pose2d lastPose = null;
    List<Double> trajXs = new ArrayList<>();
    List<Double> trajYs = new ArrayList<>();
    List<Double> observerXs = new ArrayList<>();
    List<Double> observerYs = new ArrayList<>();
    List<Double> slamXs = new ArrayList<>();
    List<Double> slamYs = new ArrayList<>();
    List<Double> visionXs = new ArrayList<>();
    List<Double> visionYs = new ArrayList<>();
    var rand = new Random();
    final double steadyStateErrorX = 1.0;
    final double steadyStateErrorY = 1.0;
    double t = 0.0;
    Pose2d lastVisionUpdate = null;
    double lastVisionUpdateT = Double.NEGATIVE_INFINITY;
    double maxError = Double.NEGATIVE_INFINITY;
    double errorSum = 0;
    while (t <= traj.getTotalTimeSeconds()) {
        t += dt;
        var groundtruthState = traj.sample(t);
        // Angular velocity: ds/dt * dtheta/ds = dtheta/dt
        var input = kinematics.toWheelSpeeds(new ChassisSpeeds(
                groundtruthState.velocityMetersPerSecond, 0.0,
                groundtruthState.velocityMetersPerSecond * groundtruthState.curvatureRadPerMeter));
        Matrix<N3, N1> u = new MatBuilder<>(Nat.N3(), Nat.N1()).fill(input.leftMetersPerSecond * dt, input.rightMetersPerSecond * dt, 0.0);
        if (lastPose != null) {
            u.set(2, 0, groundtruthState.poseMeters.getRotation().getRadians() - lastPose.getRotation().getRadians());
        }
        u = u.plus(StateSpaceUtil.makeWhiteNoiseVector(new MatBuilder<>(Nat.N3(), Nat.N1()).fill(0.002, 0.002, 0.001)));
        lastPose = groundtruthState.poseMeters;
        Pose2d realPose = groundtruthState.poseMeters;
        if (lastVisionUpdateT + visionUpdateRate < t) {
            if (lastVisionUpdate != null) {
                est.addVisionMeasurement(lastVisionUpdate, lastVisionUpdateT);
            }
            lastVisionUpdateT = t;
            lastVisionUpdate = realPose.transformBy(new Transform2d(new Translation2d(rand.nextGaussian() * 0.05, rand.nextGaussian() * 0.05), new Rotation2d(rand.nextGaussian() * 0.002)));
            visionXs.add(lastVisionUpdate.getTranslation().getX());
            visionYs.add(lastVisionUpdate.getTranslation().getY());
        }
        double dist = realPose.getTranslation().getDistance(new Translation2d());
        Pose2d measurementVSlam = realPose.transformBy(new Transform2d(new Translation2d(steadyStateErrorX * (dist / 76.0), steadyStateErrorY * (dist / 76.0)), new Rotation2d())).transformBy(new Transform2d(new Translation2d(rand.nextGaussian() * 0.05, rand.nextGaussian() * 0.05), new Rotation2d(rand.nextGaussian() * 0.001)));
        var xHat = est.update(measurementVSlam, u.get(0, 0), u.get(1, 0), u.get(2, 0), t);
        double error = groundtruthState.poseMeters.getTranslation().getDistance(xHat.getTranslation());
        if (error > maxError) {
            maxError = error;
        }
        errorSum += error;
        trajXs.add(groundtruthState.poseMeters.getTranslation().getX());
        trajYs.add(groundtruthState.poseMeters.getTranslation().getY());
        observerXs.add(xHat.getTranslation().getX());
        observerYs.add(xHat.getTranslation().getY());
        slamXs.add(measurementVSlam.getTranslation().getX());
        slamYs.add(measurementVSlam.getTranslation().getY());
    }
    System.out.println("Mean error (meters): " + errorSum / (traj.getTotalTimeSeconds() / dt));
    System.out.println("Max error (meters):  " + maxError);
    try {
        // Unconditionally throw so the chart code below is skipped; remove this
        // line to actually display the charts.
        if (true)
            throw new HeadlessException();
        var chartBuilder = new XYChartBuilder();
        chartBuilder.title = "The Magic of Sensor Fusion";
        var chart = chartBuilder.build();
        chart.addSeries("vSLAM", slamXs, slamYs);
        chart.addSeries("Vision", visionXs, visionYs);
        chart.addSeries("Trajectory", trajXs, trajYs);
        chart.addSeries("xHat", observerXs, observerYs);
        new SwingWrapper<>(chart).displayChart();
        try {
            Thread.sleep(1000000000);
        } catch (InterruptedException e) {
        }
    } catch (java.awt.HeadlessException ex) {
        System.out.println("skipping charts in headless mode");
    }
}
Also used: Transform2d (edu.wpi.first.math.geometry.Transform2d), DifferentialDriveKinematics (edu.wpi.first.math.kinematics.DifferentialDriveKinematics), ArrayList (java.util.ArrayList), Random (java.util.Random), TrajectoryConfig (edu.wpi.first.math.trajectory.TrajectoryConfig), N1 (edu.wpi.first.math.numbers.N1), N3 (edu.wpi.first.math.numbers.N3), ChassisSpeeds (edu.wpi.first.math.kinematics.ChassisSpeeds), MatBuilder (edu.wpi.first.math.MatBuilder), Translation2d (edu.wpi.first.math.geometry.Translation2d), Rotation2d (edu.wpi.first.math.geometry.Rotation2d), Pose2d (edu.wpi.first.math.geometry.Pose2d), XYChartBuilder (org.knowm.xchart.XYChartBuilder), java.awt (java.awt), Test (org.junit.jupiter.api.Test)
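
In this test, Transform2d is mainly a convenient way to perturb a ground-truth pose with zero-mean Gaussian noise when simulating the vision and vSLAM measurements. A minimal helper capturing that pattern (the class and method names here are hypothetical, not part of the project):

import java.util.Random;

import edu.wpi.first.math.geometry.Pose2d;
import edu.wpi.first.math.geometry.Rotation2d;
import edu.wpi.first.math.geometry.Transform2d;
import edu.wpi.first.math.geometry.Translation2d;

final class PoseNoiseSketch {
    private static final Random rand = new Random();

    // Apply zero-mean Gaussian noise to a pose, expressed as a small Transform2d
    // in the pose's own frame, just as the test does for its simulated sensors.
    static Pose2d addNoise(Pose2d truth, double xyStdDevMeters, double thetaStdDevRad) {
        return truth.transformBy(new Transform2d(
                new Translation2d(rand.nextGaussian() * xyStdDevMeters,
                        rand.nextGaussian() * xyStdDevMeters),
                new Rotation2d(rand.nextGaussian() * thetaStdDevRad)));
    }
}

With a helper like this, the simulated vision measurement in the loop above reduces to addNoise(realPose, 0.05, 0.002).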

Example 4 with Transform2d

Use of edu.wpi.first.math.geometry.Transform2d in project 2022-RapidReact by Spartronics4915.

In the class RPLidarA1, the method setCallback:

/**
 * @param pointcloudConsumer A callback that accepts a field-relative
 *                           pointcloud, with all coordinates in meters.
 * @param robotStateMap      A robot state map to transform the pointcloud with.
 * @param vehicleToLidar     Transformation from the vehicle to the lidar sensor
 *                           (i.e. the sensor's offset from the vehicle's
 *                           center).
 */
public void setCallback(PointcloudConsumer pointcloudConsumer, RobotStateMap robotStateMap, Transform2d vehicleToLidar) {
    setCallback((Measurement m) -> {
        if (m.start && mCurrentPointcloud.size() > 0) {
            pointcloudConsumer.accept(new ArrayList<>(mCurrentPointcloud));
            mCurrentPointcloud.clear();
        }
        Translation2d point = m.getAsPoint();
        point = robotStateMap.getFieldToVehicle(m.timestamp).transformBy(vehicleToLidar).transformBy(new Transform2d(point, new Rotation2d())).getTranslation();
        mCurrentPointcloud.add(point);
    });
}
Also used: Translation2d (edu.wpi.first.math.geometry.Translation2d), Transform2d (edu.wpi.first.math.geometry.Transform2d), Rotation2d (edu.wpi.first.math.geometry.Rotation2d)
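
The single transform chain in this callback walks from the field frame through the robot and the sensor to the measured point: fieldToVehicle, then vehicleToLidar, then the point itself wrapped as a translation-only Transform2d. A worked example with made-up numbers (not taken from the project) shows the effect:

import edu.wpi.first.math.geometry.Pose2d;
import edu.wpi.first.math.geometry.Rotation2d;
import edu.wpi.first.math.geometry.Transform2d;
import edu.wpi.first.math.geometry.Translation2d;

public class PointcloudFrameSketch {
    public static void main(String[] args) {
        // field -> vehicle: robot at (2, 1), facing +Y (90 degrees).
        Pose2d fieldToVehicle = new Pose2d(2.0, 1.0, Rotation2d.fromDegrees(90));
        // vehicle -> lidar: sensor 0.1 m ahead of center, facing backwards.
        Transform2d vehicleToLidar =
                new Transform2d(new Translation2d(0.1, 0.0), Rotation2d.fromDegrees(180));
        // A point 1 m directly ahead of the sensor, in the sensor's own frame.
        Translation2d lidarToPoint = new Translation2d(1.0, 0.0);

        Translation2d fieldToPoint = fieldToVehicle
                .transformBy(vehicleToLidar)
                .transformBy(new Transform2d(lidarToPoint, new Rotation2d()))
                .getTranslation();

        // Because the lidar faces backwards, the point lands behind the robot,
        // at roughly (2.0, 0.1) on the field.
        System.out.println(fieldToPoint);
    }
}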

Example 5 with Transform2d

Use of edu.wpi.first.math.geometry.Transform2d in project 2022-RapidReact by Spartronics4915.

In the class T265Camera, the method consumePoseUpdate:

private synchronized void consumePoseUpdate(float x, float y, float radians, float dx, float dy, float dtheta, int confOrdinal) {
    // First we apply an offset to go from the camera coordinate system to the
    // robot coordinate system with an origin at the center of the robot. This
    // is not a directional transformation.
    // Then we transform the pose our camera is giving us so that it reports
    // the robot's pose, not the camera's. This is a directional transformation.
    final Pose2d currentPose = new Pose2d(x - mRobotOffset.getTranslation().getX(), y - mRobotOffset.getTranslation().getY(), new Rotation2d(radians)).transformBy(mRobotOffset);
    mLastRecievedPose = currentPose;
    if (!mIsStarted)
        return;
    // See
    // https://github.com/IntelRealSense/librealsense/blob/7f2ba0de8769620fd672f7b44101f0758e7e2fb3/include/librealsense2/h/rs_types.h#L115
    // for ordinals
    PoseConfidence confidence;
    switch(confOrdinal) {
        case 0x0:
            confidence = PoseConfidence.Failed;
            break;
        case 0x1:
            confidence = PoseConfidence.Low;
            break;
        case 0x2:
            confidence = PoseConfidence.Medium;
            break;
        case 0x3:
            confidence = PoseConfidence.High;
            break;
        default:
            throw new RuntimeException("Unknown confidence ordinal \"" + confOrdinal + "\" passed from native code");
    }
    final Pose2d transformedPose = mOrigin.transformBy(new Transform2d(currentPose.getTranslation(), currentPose.getRotation()));
    mPoseConsumer.accept(new CameraUpdate(transformedPose, new ChassisSpeeds(dx, dy, dtheta), confidence));
}
Also used: Transform2d (edu.wpi.first.math.geometry.Transform2d), Rotation2d (edu.wpi.first.math.geometry.Rotation2d), Pose2d (edu.wpi.first.math.geometry.Pose2d), ChassisSpeeds (edu.wpi.first.math.kinematics.ChassisSpeeds)
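
The final line re-expresses the camera-relative pose in the field frame by treating it as a Transform2d applied on top of mOrigin, the field pose at which the camera's tracking was zeroed. A short sketch of that re-origin step, with arbitrary numbers standing in for mOrigin and the camera's report:

import edu.wpi.first.math.geometry.Pose2d;
import edu.wpi.first.math.geometry.Rotation2d;
import edu.wpi.first.math.geometry.Transform2d;

public class ReoriginSketch {
    public static void main(String[] args) {
        // Field pose where the camera's odometry was zeroed (stand-in for mOrigin).
        Pose2d origin = new Pose2d(1.5, 3.0, Rotation2d.fromDegrees(45));
        // Pose reported by the camera, relative to that zero point.
        Pose2d cameraRelative = new Pose2d(2.0, 0.0, Rotation2d.fromDegrees(10));

        // Same pattern as consumePoseUpdate: wrap the relative pose as a
        // Transform2d and apply it on top of the origin.
        Pose2d fieldPose = origin.transformBy(
                new Transform2d(cameraRelative.getTranslation(), cameraRelative.getRotation()));
        System.out.println(fieldPose);
    }
}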

Aggregations

Transform2d (edu.wpi.first.math.geometry.Transform2d): 15
Pose2d (edu.wpi.first.math.geometry.Pose2d): 9
Translation2d (edu.wpi.first.math.geometry.Translation2d): 6
ArrayList (java.util.ArrayList): 6
Rotation2d (edu.wpi.first.math.geometry.Rotation2d): 5
Test (org.junit.jupiter.api.Test): 4
ChassisSpeeds (edu.wpi.first.math.kinematics.ChassisSpeeds): 2
RPLidarA1 (com.spartronics4915.lib.hardware.sensors.RPLidarA1): 1
RobotStateMap (com.spartronics4915.lib.subsystems.estimator.RobotStateMap): 1
MatBuilder (edu.wpi.first.math.MatBuilder): 1
Twist2d (edu.wpi.first.math.geometry.Twist2d): 1
DifferentialDriveKinematics (edu.wpi.first.math.kinematics.DifferentialDriveKinematics): 1
N1 (edu.wpi.first.math.numbers.N1): 1
N3 (edu.wpi.first.math.numbers.N3): 1
TrajectoryConfig (edu.wpi.first.math.trajectory.TrajectoryConfig): 1
NetworkTable (edu.wpi.first.networktables.NetworkTable): 1
NetworkTableEntry (edu.wpi.first.networktables.NetworkTableEntry): 1
NetworkTableInstance (edu.wpi.first.networktables.NetworkTableInstance): 1
CameraPosition (frc.robot.VisionConstants.CameraPosition): 1
TimestampedTranslation2d (frc.robot.subsystems.vision.Vision.TimestampedTranslation2d): 1