Example 1 with Complex

use of org.apache.commons.math3.complex.Complex in project gatk by broadinstitute.

the class PCATangentNormalizationUtils method tangentNormalizeSpark.

/**
     * Tangent normalize given the raw PoN data using Spark:  the code here is a little more complex for optimization purposes.
     *
     *  Please see notes in docs/PoN ...
     *
     *  Ahat^T = (C^T P^T) A^T
     *  Therefore, C^T is the RowMatrix
     *
     *  pinv: P
     *  panel: A
     *  projection: Ahat
     *  cases: C
     *  betahat: C^T P^T
     *  tangentNormalizedCounts: C - Ahat
     */
private static PCATangentNormalizationResult tangentNormalizeSpark(final ReadCountCollection targetFactorNormalizedCounts, final RealMatrix reducedPanelCounts, final RealMatrix reducedPanelPInvCounts, final CaseToPoNTargetMapper targetMapper, final RealMatrix tangentNormalizationInputCounts, final JavaSparkContext ctx) {
    // Make the C^T a distributed matrix (RowMatrix)
    final RowMatrix caseTDistMat = SparkConverter.convertRealMatrixToSparkRowMatrix(ctx, tangentNormalizationInputCounts.transpose(), TN_NUM_SLICES_SPARK);
    // Spark local matrices (transposed)
    final Matrix pinvTLocalMat = new DenseMatrix(reducedPanelPInvCounts.getRowDimension(), reducedPanelPInvCounts.getColumnDimension(), Doubles.concat(reducedPanelPInvCounts.getData()), true).transpose();
    final Matrix panelTLocalMat = new DenseMatrix(reducedPanelCounts.getRowDimension(), reducedPanelCounts.getColumnDimension(), Doubles.concat(reducedPanelCounts.getData()), true).transpose();
    // Calculate the projection transpose in a distributed matrix, then convert to Apache Commons matrix (not transposed)
    final RowMatrix betahatDistMat = caseTDistMat.multiply(pinvTLocalMat);
    final RowMatrix projectionTDistMat = betahatDistMat.multiply(panelTLocalMat);
    final RealMatrix projection = SparkConverter.convertSparkRowMatrixToRealMatrix(projectionTDistMat, tangentNormalizationInputCounts.transpose().getRowDimension()).transpose();
    // Subtract the projection from the cases
    final RealMatrix tangentNormalizedCounts = tangentNormalizationInputCounts.subtract(projection);
    // Construct the result object and return it with the correct targets.
    final ReadCountCollection tangentNormalized = targetMapper.fromPoNtoCaseCountCollection(tangentNormalizedCounts, targetFactorNormalizedCounts.columnNames());
    final ReadCountCollection preTangentNormalized = targetMapper.fromPoNtoCaseCountCollection(tangentNormalizationInputCounts, targetFactorNormalizedCounts.columnNames());
    final RealMatrix tangentBetaHats = SparkConverter.convertSparkRowMatrixToRealMatrix(betahatDistMat, tangentNormalizedCounts.getColumnDimension());
    return new PCATangentNormalizationResult(tangentNormalized, preTangentNormalized, tangentBetaHats.transpose(), targetFactorNormalizedCounts);
}
Also used : RowMatrix(org.apache.spark.mllib.linalg.distributed.RowMatrix) DenseMatrix(org.apache.spark.mllib.linalg.DenseMatrix) RealMatrix(org.apache.commons.math3.linear.RealMatrix) Matrix(org.apache.spark.mllib.linalg.Matrix) ReadCountCollection(org.broadinstitute.hellbender.tools.exome.ReadCountCollection)
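
For intuition, the same projection can be written without Spark using plain Commons Math matrices. The sketch below uses small made-up matrices (the names panel, pinv, and cases follow the javadoc above; this is not GATK's actual non-Spark code path). It checks that Ahat = A P C and Ahat^T = (C^T P^T) A^T agree, which is why the Spark code can keep C^T in a distributed RowMatrix and only transpose back at the end.

import org.apache.commons.math3.linear.Array2DRowRealMatrix;
import org.apache.commons.math3.linear.RealMatrix;

public class TangentProjectionSketch {
    public static void main(String[] args) {
        // cases C (targets x samples), panel A (targets x eigensamples), pinv P (eigensamples x targets)
        RealMatrix cases = new Array2DRowRealMatrix(new double[][] {{1, 2}, {3, 4}, {5, 6}});
        RealMatrix panel = new Array2DRowRealMatrix(new double[][] {{1, 0}, {0, 1}, {1, 1}});
        RealMatrix pinv  = new Array2DRowRealMatrix(new double[][] {{0.5, 0, 0.5}, {0, 0.5, 0.5}});
        // Ahat = A (P C); the Spark code computes Ahat^T = (C^T P^T) A^T instead,
        // so that the large C^T can live in a distributed RowMatrix
        RealMatrix projection  = panel.multiply(pinv.multiply(cases));
        RealMatrix projectionT = cases.transpose().multiply(pinv.transpose()).multiply(panel.transpose());
        // the two agree up to transposition (difference norm is ~0)
        System.out.println(projection.subtract(projectionT.transpose()).getNorm());
        // tangent-normalized counts: C - Ahat
        System.out.println(cases.subtract(projection));
    }
}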

Example 2 with Complex

use of org.apache.commons.math3.complex.Complex in project tutorials by eugenp.

the class ComplexUnitTest method whenComplexPow_thenCorrect.

@Test
public void whenComplexPow_thenCorrect() {
    Complex first = new Complex(1.0, 3.0);
    Complex second = new Complex(2.0, 5.0);
    Complex power = first.pow(second);
    Assert.assertEquals(-0.007563724861696302, power.getReal(), 1e-7);
    Assert.assertEquals(0.01786136835085382, power.getImaginary(), 1e-7);
}
Also used : Complex(org.apache.commons.math3.complex.Complex) Test(org.junit.Test)
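
As a quick cross-check of the expected values, Complex.pow(Complex) in Commons Math computes the principal-branch power z^w = exp(w log z). The sketch below spells that identity out; it is illustrative only and not part of the tutorial's test class.

import org.apache.commons.math3.complex.Complex;

public class ComplexPowSketch {
    public static void main(String[] args) {
        Complex z = new Complex(1.0, 3.0);
        Complex w = new Complex(2.0, 5.0);
        Complex viaPow    = z.pow(w);                  // library implementation
        Complex viaExpLog = z.log().multiply(w).exp(); // same definition, spelled out
        System.out.println(viaPow);                    // approx (-0.0075637..., 0.0178613...)
        System.out.println(viaExpLog);                 // matches to floating-point precision
    }
}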

Example 3 with Complex

use of org.apache.commons.math3.complex.Complex in project gatk-protected by broadinstitute.

the class PCATangentNormalizationUtils method tangentNormalizeSpark.

/**
     * Tangent normalize given the raw PoN data using Spark:  the code here is a little more complex for optimization purposes.
     *
     *  Please see notes in docs/PoN ...
     *
     *  Ahat^T = (C^T P^T) A^T
     *  Therefore, C^T is the RowMatrix
     *
     *  pinv: P
     *  panel: A
     *  projection: Ahat
     *  cases: C
     *  betahat: C^T P^T
     *  tangentNormalizedCounts: C - Ahat
     */
private static PCATangentNormalizationResult tangentNormalizeSpark(final ReadCountCollection targetFactorNormalizedCounts, final RealMatrix reducedPanelCounts, final RealMatrix reducedPanelPInvCounts, final CaseToPoNTargetMapper targetMapper, final RealMatrix tangentNormalizationInputCounts, final JavaSparkContext ctx) {
    // Make the C^T a distributed matrix (RowMatrix)
    final RowMatrix caseTDistMat = SparkConverter.convertRealMatrixToSparkRowMatrix(ctx, tangentNormalizationInputCounts.transpose(), TN_NUM_SLICES_SPARK);
    // Spark local matrices (transposed)
    final Matrix pinvTLocalMat = new DenseMatrix(reducedPanelPInvCounts.getRowDimension(), reducedPanelPInvCounts.getColumnDimension(), Doubles.concat(reducedPanelPInvCounts.getData()), true).transpose();
    final Matrix panelTLocalMat = new DenseMatrix(reducedPanelCounts.getRowDimension(), reducedPanelCounts.getColumnDimension(), Doubles.concat(reducedPanelCounts.getData()), true).transpose();
    // Calculate the projection transpose in a distributed matrix, then convert to Apache Commons matrix (not transposed)
    final RowMatrix betahatDistMat = caseTDistMat.multiply(pinvTLocalMat);
    final RowMatrix projectionTDistMat = betahatDistMat.multiply(panelTLocalMat);
    final RealMatrix projection = SparkConverter.convertSparkRowMatrixToRealMatrix(projectionTDistMat, tangentNormalizationInputCounts.transpose().getRowDimension()).transpose();
    // Subtract the projection from the cases
    final RealMatrix tangentNormalizedCounts = tangentNormalizationInputCounts.subtract(projection);
    // Construct the result object and return it with the correct targets.
    final ReadCountCollection tangentNormalized = targetMapper.fromPoNtoCaseCountCollection(tangentNormalizedCounts, targetFactorNormalizedCounts.columnNames());
    final ReadCountCollection preTangentNormalized = targetMapper.fromPoNtoCaseCountCollection(tangentNormalizationInputCounts, targetFactorNormalizedCounts.columnNames());
    final RealMatrix tangentBetaHats = SparkConverter.convertSparkRowMatrixToRealMatrix(betahatDistMat, tangentNormalizedCounts.getColumnDimension());
    return new PCATangentNormalizationResult(tangentNormalized, preTangentNormalized, tangentBetaHats.transpose(), targetFactorNormalizedCounts);
}
Also used : RowMatrix(org.apache.spark.mllib.linalg.distributed.RowMatrix) DenseMatrix(org.apache.spark.mllib.linalg.DenseMatrix) RealMatrix(org.apache.commons.math3.linear.RealMatrix) Matrix(org.apache.spark.mllib.linalg.Matrix) ReadCountCollection(org.broadinstitute.hellbender.tools.exome.ReadCountCollection)
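
This snippet is identical to Example 1; the only project-internal piece is SparkConverter. Below is a rough, hypothetical sketch of what converting a Commons Math RealMatrix into a Spark RowMatrix can look like, assuming a plain row-by-row parallelize; GATK's actual converter may partition and name things differently.

import java.util.ArrayList;
import java.util.List;
import org.apache.commons.math3.linear.RealMatrix;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.mllib.linalg.Vector;
import org.apache.spark.mllib.linalg.Vectors;
import org.apache.spark.mllib.linalg.distributed.RowMatrix;

public final class RowMatrixSketch {
    // Hypothetical stand-in for SparkConverter.convertRealMatrixToSparkRowMatrix
    static RowMatrix toRowMatrix(JavaSparkContext ctx, RealMatrix m, int numSlices) {
        List<Vector> rows = new ArrayList<>();
        for (int i = 0; i < m.getRowDimension(); i++) {
            rows.add(Vectors.dense(m.getRow(i))); // each RealMatrix row becomes a dense mllib Vector
        }
        JavaRDD<Vector> rdd = ctx.parallelize(rows, numSlices);
        return new RowMatrix(rdd.rdd(), m.getRowDimension(), m.getColumnDimension());
    }
}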

Example 4 with Complex

use of org.apache.commons.math3.complex.Complex in project narchy by automenta.

the class MyCMAESOptimizer method updateBD.

/**
 * Update B and D from C.
 *
 * @param negccov Negative covariance factor.
 */
private void updateBD(double negccov) {
    if (ccov1 + ccovmu + negccov > 0 && (iterations % 1.0 / (ccov1 + ccovmu + negccov) / dimension / dimensionDivisorWTF) < 1) {
        // enforce symmetry to prevent complex numbers
        C = triu(C, 0).add(triu(C, 1).transpose());
        // eigen decomposition, B == normalized eigenvectors; guarded above so the
        // decomposition runs only occasionally, keeping the amortized cost O(N^2)
        final EigenDecomposition eig = new EigenDecomposition(C);
        B = eig.getV();
        D = eig.getD();
        diagD = diag(D);
        if (min(diagD) <= 0) {
            for (int i = 0; i < dimension; i++) {
                if (diagD.getEntry(i, 0) < 0) {
                    diagD.setEntry(i, 0, 0);
                }
            }
            final double tfac = max(diagD) / big_magic_number_WTF;
            C = C.add(eye(dimension, dimension).scalarMultiply(tfac));
            diagD = diagD.add(ones(dimension, 1).scalarMultiply(tfac));
        }
        if (max(diagD) > big_magic_number_WTF * min(diagD)) {
            final double tfac = max(diagD) / big_magic_number_WTF - min(diagD);
            C = C.add(eye(dimension, dimension).scalarMultiply(tfac));
            diagD = diagD.add(ones(dimension, 1).scalarMultiply(tfac));
        }
        diagC = diag(C);
        // take square roots so that D holds standard deviations
        diagD = sqrt(diagD);
        // O(n^2)
        BD = times(B, repmat(diagD.transpose(), dimension, 1));
    }
}
Also used : EigenDecomposition(org.apache.commons.math3.linear.EigenDecomposition)
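
The symmetrization matters because an EigenDecomposition of a not-quite-symmetric covariance matrix can produce complex eigenvalues, which is exactly what the comment warns about. Below is a minimal, self-contained sketch of the decomposition and the negative-eigenvalue repair on a tiny 2x2 matrix; the repair constant is an assumed stand-in for big_magic_number_WTF, not the optimizer's real value.

import org.apache.commons.math3.linear.Array2DRowRealMatrix;
import org.apache.commons.math3.linear.EigenDecomposition;
import org.apache.commons.math3.linear.MatrixUtils;
import org.apache.commons.math3.linear.RealMatrix;

public class EigenRepairSketch {
    public static void main(String[] args) {
        // a small symmetric "covariance" matrix with one negative eigenvalue
        RealMatrix C = new Array2DRowRealMatrix(new double[][] {{1.0, 2.0}, {2.0, 1.0}});
        EigenDecomposition eig = new EigenDecomposition(C);
        RealMatrix B = eig.getV();             // eigenvectors
        double[] d = eig.getRealEigenvalues(); // eigenvalues (3 and -1 here)
        double maxD = 0;
        for (double v : d) maxD = Math.max(maxD, v);
        // clamp negative eigenvalues and nudge the spectrum back to positive,
        // mirroring the structure (not the constants) of updateBD above
        double tfac = maxD / 1e14;             // 1e14 is an assumed stand-in constant
        for (int i = 0; i < d.length; i++) {
            if (d[i] < 0) d[i] = 0;
            d[i] = Math.sqrt(d[i] + tfac);     // D now holds standard deviations
        }
        RealMatrix D = MatrixUtils.createRealDiagonalMatrix(d);
        RealMatrix BD = B.multiply(D);         // used to sample new candidate solutions
        System.out.println(BD);
    }
}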

Example 5 with Complex

use of org.apache.commons.math3.complex.Complex in project ma-modules-public by infiniteautomation.

the class PointValueFftCalculator method streamData.

/* (non-Javadoc)
	 * @see com.serotonin.m2m2.web.mvc.rest.v1.model.pointValue.PointValueTimeStream#streamData(java.io.Writer)
	 */
@Override
public void streamData(JsonGenerator jgen) {
    this.setupDates();
    DateTime startTime = new DateTime(from);
    DateTime endTime = new DateTime(to);
    FftGenerator generator = this.calculate(startTime, endTime);
    double[] fftData = generator.getValues();
    double sampleRateHz = 1000d / generator.getAverageSamplePeriodMs();
    double dataLength = (double) fftData.length;
    // Output the real steady-state magnitude
    try {
        jgen.writeStartObject();
        // Amplitude
        jgen.writeNumberField("value", fftData[0]);
        double frequency = 0;
        jgen.writeNumberField("frequency", frequency);
        jgen.writeEndObject();
    } catch (IOException e) {
        // TODO Auto-generated catch block
        e.printStackTrace();
    }
    double realComponent, imaginaryComponent, frequency;
    if (fftData.length % 2 == 0) {
        for (int i = 2; i < fftData.length / 2; i++) {
            try {
                realComponent = fftData[i * 2];
                imaginaryComponent = fftData[2 * i + 1];
                Complex c = new Complex(realComponent, imaginaryComponent);
                jgen.writeStartObject();
                // Amplitude
                jgen.writeNumberField("value", c.abs());
                if (this.returnFrequency)
                    frequency = (double) i * sampleRateHz / dataLength;
                else
                    frequency = 1d / ((double) i * sampleRateHz / dataLength);
                jgen.writeNumberField("frequency", frequency);
                jgen.writeEndObject();
            } catch (Exception e) {
                // TODO Auto-generated catch block
                e.printStackTrace();
            }
        }
    } else {
        for (int i = 2; i < (fftData.length - 1) / 2; i++) {
            try {
                realComponent = fftData[i * 2];
                imaginaryComponent = fftData[2 * i + 1];
                Complex c = new Complex(realComponent, imaginaryComponent);
                jgen.writeStartObject();
                // Amplitude
                jgen.writeNumberField("value", c.abs());
                if (this.returnFrequency)
                    frequency = (double) i * sampleRateHz / dataLength;
                else
                    frequency = 1d / ((double) i * sampleRateHz / dataLength);
                jgen.writeNumberField("frequency", frequency);
                jgen.writeEndObject();
            } catch (Exception e) {
                // TODO Auto-generated catch block
                e.printStackTrace();
            }
        }
        // Write the last value out as it isn't in order in the array
        try {
            realComponent = fftData[fftData.length / 2];
            imaginaryComponent = fftData[1];
            Complex c = new Complex(realComponent, imaginaryComponent);
            jgen.writeStartObject();
            // Amplitude
            jgen.writeNumberField("value", c.abs());
            if (this.returnFrequency)
                frequency = (double) (((fftData.length - 1) / 2) - 1) * sampleRateHz / dataLength;
            else
                frequency = 1d / ((double) (((fftData.length - 1) / 2) - 1) * sampleRateHz / dataLength);
            jgen.writeNumberField("frequency", frequency);
            jgen.writeEndObject();
        } catch (Exception e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        }
    }
}
Also used : FftGenerator(com.serotonin.m2m2.view.quantize2.FftGenerator) IOException(java.io.IOException) DateTime(org.joda.time.DateTime) Complex(org.apache.commons.math3.complex.Complex)
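
For comparison, Commons Math can also return the spectrum directly as Complex[] instead of the interleaved double[] layout that streamData unpacks by hand. A minimal sketch, assuming a power-of-two sample count as FastFourierTransformer requires (FftGenerator's actual output format is project-specific):

import org.apache.commons.math3.complex.Complex;
import org.apache.commons.math3.transform.DftNormalization;
import org.apache.commons.math3.transform.FastFourierTransformer;
import org.apache.commons.math3.transform.TransformType;

public class FftMagnitudeSketch {
    public static void main(String[] args) {
        double sampleRateHz = 8.0;
        double[] samples = {0, 1, 0, -1, 0, 1, 0, -1}; // 2 Hz sine sampled at 8 Hz, power-of-two length
        FastFourierTransformer fft = new FastFourierTransformer(DftNormalization.STANDARD);
        Complex[] spectrum = fft.transform(samples, TransformType.FORWARD);
        // bins 0..N/2 are the non-redundant half of the spectrum
        for (int i = 0; i <= samples.length / 2; i++) {
            double amplitude = spectrum[i].abs();
            double frequency = i * sampleRateHz / samples.length;
            System.out.printf("%.2f Hz -> %.3f%n", frequency, amplitude);
        }
    }
}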

Aggregations

Complex (org.apache.commons.math3.complex.Complex) 3
RealMatrix (org.apache.commons.math3.linear.RealMatrix) 2
DenseMatrix (org.apache.spark.mllib.linalg.DenseMatrix) 2
Matrix (org.apache.spark.mllib.linalg.Matrix) 2
RowMatrix (org.apache.spark.mllib.linalg.distributed.RowMatrix) 2
ReadCountCollection (org.broadinstitute.hellbender.tools.exome.ReadCountCollection) 2
TreeMLData (automenta.vivisect.TreeMLData) 1
FftGenerator (com.serotonin.m2m2.view.quantize2.FftGenerator) 1
IOException (java.io.IOException) 1
EigenDecomposition (org.apache.commons.math3.linear.EigenDecomposition) 1
DateTime (org.joda.time.DateTime) 1
Test (org.junit.Test) 1