
Example 41 with NullValue

Use of org.apache.flink.types.NullValue in the Apache Flink project.

From the class GraphCreationITCase, method testCreateWithoutVertexValues.

@Test
public void testCreateWithoutVertexValues() throws Exception {
    /*
     * Test create() with edge dataset and no vertex values.
     */
    final ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
    Graph<Long, NullValue, Long> graph = Graph.fromDataSet(TestGraphUtils.getLongLongEdgeData(env), env);
    DataSet<Vertex<Long, NullValue>> data = graph.getVertices();
    List<Vertex<Long, NullValue>> result = data.collect();
    expectedResult = "1,(null)\n" + "2,(null)\n" + "3,(null)\n" + "4,(null)\n" + "5,(null)\n";
    compareResultAsTuples(result, expectedResult);
}
Also used: Vertex (org.apache.flink.graph.Vertex), ExecutionEnvironment (org.apache.flink.api.java.ExecutionEnvironment), NullValue (org.apache.flink.types.NullValue), Test (org.junit.Test)
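The same edge set can also be given initial vertex values via the Graph.fromDataSet overload that takes a vertex value initializer. The following sketch is illustrative rather than part of the Flink tests: it reuses env and TestGraphUtils from the test above, and the Double vertex type and id-based initializer are assumptions; it additionally needs org.apache.flink.api.common.functions.MapFunction.

// Sketch: same edge data, but each vertex value is initialized from its id.
Graph<Long, Double, Long> graphWithValues = Graph.fromDataSet(
        TestGraphUtils.getLongLongEdgeData(env),
        new MapFunction<Long, Double>() {
            @Override
            public Double map(Long vertexId) {
                return vertexId.doubleValue(); // illustrative initializer
            }
        },
        env);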

Example 42 with NullValue

Use of org.apache.flink.types.NullValue in the Apache Flink project.

From the class LocalClusteringCoefficientTest, method testRMatGraph.

@Test
public void testRMatGraph() throws Exception {
    DataSet<Result<LongValue>> cc = directedRMatGraph.run(new LocalClusteringCoefficient<LongValue, NullValue, NullValue>());
    Checksum checksum = new org.apache.flink.graph.asm.dataset.ChecksumHashCode<Result<LongValue>>().run(cc).execute();
    assertEquals(902, checksum.getCount());
    assertEquals(0x000001bf83866775L, checksum.getChecksum());
}
Also used: NullValue (org.apache.flink.types.NullValue), Checksum (org.apache.flink.graph.asm.dataset.ChecksumHashCode.Checksum), LongValue (org.apache.flink.types.LongValue), Result (org.apache.flink.graph.library.clustering.directed.LocalClusteringCoefficient.Result), Test (org.junit.Test)
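For a quick sanity check outside the checksum-based assertion, the algorithm can also be run on a small generated graph and the per-vertex results collected directly. This is only a sketch: it assumes the Gelly generator org.apache.flink.graph.generator.CompleteGraph plus imports of Graph, ExecutionEnvironment, and java.util.List; on a complete graph every vertex's local clustering coefficient should come out as 1.0.

ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
// a small complete graph: every pair of vertices is connected in both directions
Graph<LongValue, NullValue, NullValue> complete = new CompleteGraph(env, 4).generate();
List<Result<LongValue>> results = complete
        .run(new LocalClusteringCoefficient<LongValue, NullValue, NullValue>())
        .collect();
for (Result<LongValue> result : results) {
    // prints each vertex's clustering statistics (the Result's toString)
    System.out.println(result);
}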

Example 43 with NullValue

Use of org.apache.flink.types.NullValue in the Apache Flink project.

From the class VertexMetricsTest, method testWithEmptyGraph.

@Test
public void testWithEmptyGraph() throws Exception {
    Result expectedResult;
    expectedResult = new Result(0, 0, 0, 0, 0);
    Result withoutZeroDegreeVertices = new VertexMetrics<LongValue, NullValue, NullValue>().setIncludeZeroDegreeVertices(false).run(emptyGraph).execute();
    assertEquals(expectedResult, withoutZeroDegreeVertices);
    assertEquals(Float.NaN, withoutZeroDegreeVertices.getAverageDegree(), ACCURACY);
    assertEquals(Float.NaN, withoutZeroDegreeVertices.getDensity(), ACCURACY);
    expectedResult = new Result(3, 0, 0, 0, 0);
    Result withZeroDegreeVertices = new VertexMetrics<LongValue, NullValue, NullValue>().setIncludeZeroDegreeVertices(true).run(emptyGraph).execute();
    assertEquals(expectedResult, withZeroDegreeVertices);
    assertEquals(0.0f, withZeroDegreeVertices.getAverageDegree(), ACCURACY);
    assertEquals(0.0f, withZeroDegreeVertices.getDensity(), ACCURACY);
}
Also used: NullValue (org.apache.flink.types.NullValue), LongValue (org.apache.flink.types.LongValue), Result (org.apache.flink.graph.library.metric.undirected.VertexMetrics.Result), Test (org.junit.Test)
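As a counterpart to the empty-graph case, the following sketch (not part of the Flink test) runs the same analytic on a small complete graph generated with org.apache.flink.graph.generator.CompleteGraph; the Graph and ExecutionEnvironment imports are also assumed. For the complete graph on 5 vertices, every vertex has degree 4, so the average degree is 4.0 and the density is 1.0.

ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
Graph<LongValue, NullValue, NullValue> complete = new CompleteGraph(env, 5).generate();
Result metrics = new VertexMetrics<LongValue, NullValue, NullValue>()
        .run(complete)
        .execute();
System.out.println("average degree: " + metrics.getAverageDegree()); // expected 4.0
System.out.println("density: " + metrics.getDensity()); // expected 1.0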

Example 44 with NullValue

Use of org.apache.flink.types.NullValue in the Apache Flink project.

From the class GraphCreationWithCsvITCase, method testCreateWithOnlyEdgesCsvFile.

@Test
public void testCreateWithOnlyEdgesCsvFile() throws Exception {
    /*
     * Test with a single CSV file containing edge data. Also tests the configuration
     * method ignoreFirstLineEdges().
     */
    final ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
    final String fileContent2 = "header\n1,2,ot\n" + "3,2,tt\n" + "3,1,to\n";
    final FileInputSplit split2 = createTempFile(fileContent2);
    Graph<Long, NullValue, String> graph = Graph.fromCsvReader(split2.getPath().toString(), env).ignoreFirstLineEdges().ignoreCommentsVertices("hi").edgeTypes(Long.class, String.class);
    List<Triplet<Long, NullValue, String>> result = graph.getTriplets().collect();
    expectedResult = "1,2,(null),(null),ot\n" + "3,2,(null),(null),tt\n" + "3,1,(null),(null),to\n";
    compareResultAsTuples(result, expectedResult);
}
Also used: FileInputSplit (org.apache.flink.core.fs.FileInputSplit), ExecutionEnvironment (org.apache.flink.api.java.ExecutionEnvironment), NullValue (org.apache.flink.types.NullValue), Triplet (org.apache.flink.graph.Triplet), Test (org.junit.Test)
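When neither vertex nor edge values are needed, the reader can instead be finished with keyType(), which yields NullValue for both value types. A hedged sketch reusing env and the createTempFile helper from the test class above; the id-pair CSV content is an assumption (keyType() expects an edge file with no value column).

// Sketch: an edges-only CSV of plain id pairs, read without vertex or edge values.
final String idPairs = "1,2\n" + "3,2\n" + "3,1\n";
final FileInputSplit idPairSplit = createTempFile(idPairs);
Graph<Long, NullValue, NullValue> plainGraph = Graph
        .fromCsvReader(idPairSplit.getPath().toString(), env)
        .keyType(Long.class);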

Example 45 with NullValue

Use of org.apache.flink.types.NullValue in the Apache Flink project.

From the class GraphCreationWithCsvITCase, method testCsvWithNullEdge.

@Test
public void testCsvWithNullEdge() throws Exception {
    /*
     * Test fromCsvReader with vertex and edge paths and NullValue as the edge value type.
     */
    final ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
    final String vertexFileContent = "1,one\n" + "2,two\n" + "3,three\n";
    final String edgeFileContent = "1,2\n" + "3,2\n" + "3,1\n";
    final FileInputSplit split = createTempFile(vertexFileContent);
    final FileInputSplit edgeSplit = createTempFile(edgeFileContent);
    Graph<Long, String, NullValue> graph = Graph.fromCsvReader(split.getPath().toString(), edgeSplit.getPath().toString(), env).vertexTypes(Long.class, String.class);
    List<Triplet<Long, String, NullValue>> result = graph.getTriplets().collect();
    expectedResult = "1,2,one,two,(null)\n" + "3,2,three,two,(null)\n" + "3,1,three,one,(null)\n";
    compareResultAsTuples(result, expectedResult);
}
Also used: FileInputSplit (org.apache.flink.core.fs.FileInputSplit), ExecutionEnvironment (org.apache.flink.api.java.ExecutionEnvironment), NullValue (org.apache.flink.types.NullValue), Triplet (org.apache.flink.graph.Triplet), Test (org.junit.Test)
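If a downstream algorithm needs numeric edge weights, the NullValue edge values can be replaced afterwards with mapEdges. A sketch assuming the graph from the test above is in scope; the uniform weight of 1.0 is illustrative, and it additionally needs org.apache.flink.graph.Edge and org.apache.flink.api.common.functions.MapFunction.

// Sketch: replace the NullValue edge values with a constant numeric weight.
Graph<Long, String, Double> weighted = graph.mapEdges(
        new MapFunction<Edge<Long, NullValue>, Double>() {
            @Override
            public Double map(Edge<Long, NullValue> edge) {
                return 1.0; // uniform illustrative weight
            }
        });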

Aggregations

NullValue (org.apache.flink.types.NullValue): 49
Test (org.junit.Test): 39
ExecutionEnvironment (org.apache.flink.api.java.ExecutionEnvironment): 33
LongValue (org.apache.flink.types.LongValue): 23
Edge (org.apache.flink.graph.Edge): 18
Vertex (org.apache.flink.graph.Vertex): 18
Tuple2 (org.apache.flink.api.java.tuple.Tuple2): 15
Checksum (org.apache.flink.graph.asm.dataset.ChecksumHashCode.Checksum): 13
Graph (org.apache.flink.graph.Graph): 12
DataSet (org.apache.flink.api.java.DataSet): 11
ChecksumHashCode (org.apache.flink.graph.asm.dataset.ChecksumHashCode): 11
DiscardingOutputFormat (org.apache.flink.api.java.io.DiscardingOutputFormat): 7
JDKRandomGeneratorFactory (org.apache.flink.graph.generator.random.JDKRandomGeneratorFactory): 7
NumberFormat (java.text.NumberFormat): 6
JobExecutionResult (org.apache.flink.api.common.JobExecutionResult): 6
Plan (org.apache.flink.api.common.Plan): 6
MapFunction (org.apache.flink.api.common.functions.MapFunction): 6
FieldList (org.apache.flink.api.common.operators.util.FieldList): 6
ParameterTool (org.apache.flink.api.java.utils.ParameterTool): 6
ProgramParametrizationException (org.apache.flink.client.program.ProgramParametrizationException): 6