Example 1 with CreateHiveTableAsSelectCommand

Use of org.apache.spark.sql.hive.execution.CreateHiveTableAsSelectCommand in project OpenLineage by OpenLineage.

From the class CreateHiveTableAsSelectCommandVisitor, method apply:

@Override
public List<OpenLineage.OutputDataset> apply(LogicalPlan x) {
    CreateHiveTableAsSelectCommand command = (CreateHiveTableAsSelectCommand) x;
    CatalogTable table = command.tableDesc();
    DatasetIdentifier di = PathUtils.fromCatalogTable(table);
    StructType schema = outputSchema(ScalaConversionUtils.fromSeq(command.outputColumns()));
    return Collections.singletonList(
        outputDataset().getDataset(
            di, schema, OpenLineage.LifecycleStateChangeDatasetFacet.LifecycleStateChange.CREATE));
}
Also used: StructType (org.apache.spark.sql.types.StructType), DatasetIdentifier (io.openlineage.spark.agent.util.DatasetIdentifier), CatalogTable (org.apache.spark.sql.catalyst.catalog.CatalogTable), CreateHiveTableAsSelectCommand (org.apache.spark.sql.hive.execution.CreateHiveTableAsSelectCommand)
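The visitor above follows Scala's PartialFunction contract: isDefinedAt filters which plan nodes the visitor handles, and apply is only invoked on matching nodes. A minimal, dependency-free sketch of that pattern is below; all class names here are illustrative stand-ins, not OpenLineage's or Spark's actual types.

```java
import java.util.Collections;
import java.util.List;

// Stand-in for Spark's LogicalPlan node hierarchy (illustrative only).
interface PlanNode {}

// Stand-in for CreateHiveTableAsSelectCommand.
class CreateTableNode implements PlanNode {
    final String tableName;
    CreateTableNode(String tableName) { this.tableName = tableName; }
}

// Some other plan node the visitor should ignore.
class OtherNode implements PlanNode {}

// Partial-function-style visitor: isDefinedAt guards the cast in apply.
class CreateTableVisitor {
    boolean isDefinedAt(PlanNode node) {
        return node instanceof CreateTableNode;
    }

    List<String> apply(PlanNode node) {
        CreateTableNode command = (CreateTableNode) node;
        // The real visitor builds an OutputDataset from the catalog table;
        // here we return the table name to show the single-element result shape.
        return Collections.singletonList(command.tableName);
    }
}
```

Callers are expected to check isDefinedAt before apply, which is why the unchecked cast inside apply is safe in the real visitor as well.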

Example 2 with CreateHiveTableAsSelectCommand

Use of org.apache.spark.sql.hive.execution.CreateHiveTableAsSelectCommand in project OpenLineage by OpenLineage.

From the class CreateHiveTableAsSelectCommandVisitorTest, method testCreateHiveTableAsSelectCommand:

@Test
void testCreateHiveTableAsSelectCommand() {
    CreateHiveTableAsSelectCommandVisitor visitor =
        new CreateHiveTableAsSelectCommandVisitor(
            OpenLineageContext.builder()
                .sparkSession(Optional.of(session))
                .sparkContext(session.sparkContext())
                .openLineage(new OpenLineage(OpenLineageClient.OPEN_LINEAGE_CLIENT_URI))
                .build());
    CreateHiveTableAsSelectCommand command =
        new CreateHiveTableAsSelectCommand(
            SparkUtils.catalogTable(
                TableIdentifier$.MODULE$.apply("tablename", Option.apply("db")),
                CatalogTableType.EXTERNAL(),
                CatalogStorageFormat$.MODULE$.apply(
                    Option.apply(URI.create("s3://bucket/directory")),
                    null, null, null, false,
                    Map$.MODULE$.empty()),
                new StructType(
                    new StructField[] {
                        new StructField("key", IntegerType$.MODULE$, false, new Metadata(new HashMap<>())),
                        new StructField("value", StringType$.MODULE$, false, new Metadata(new HashMap<>()))
                    })),
            new LogicalRelation(
                new JDBCRelation(
                    new StructType(
                        new StructField[] {
                            new StructField("key", IntegerType$.MODULE$, false, null),
                            new StructField("value", StringType$.MODULE$, false, null)
                        }),
                    new Partition[] {},
                    new JDBCOptions(
                        "", "temp",
                        scala.collection.immutable.Map$.MODULE$
                            .newBuilder()
                            .$plus$eq(Tuple2.apply("driver", Driver.class.getName()))
                            .result()),
                    session),
                Seq$.MODULE$.<AttributeReference>newBuilder()
                    .$plus$eq(new AttributeReference(
                        "key", IntegerType$.MODULE$, false, null, ExprId.apply(1L), Seq$.MODULE$.<String>empty()))
                    .$plus$eq(new AttributeReference(
                        "value", StringType$.MODULE$, false, null, ExprId.apply(2L), Seq$.MODULE$.<String>empty()))
                    .result(),
                Option.empty(),
                false),
            ScalaConversionUtils.fromList(Arrays.asList("key", "value")),
            SaveMode.Overwrite);
    assertThat(visitor.isDefinedAt(command)).isTrue();
    List<OpenLineage.OutputDataset> datasets = visitor.apply(command);
    assertEquals(1, datasets.size());
    OpenLineage.OutputDataset outputDataset = datasets.get(0);
    assertEquals(OpenLineage.LifecycleStateChangeDatasetFacet.LifecycleStateChange.CREATE, outputDataset.getFacets().getLifecycleStateChange().getLifecycleStateChange());
    assertEquals("directory", outputDataset.getName());
    assertEquals("s3://bucket", outputDataset.getNamespace());
}
Also used: StructType (org.apache.spark.sql.types.StructType), AttributeReference (org.apache.spark.sql.catalyst.expressions.AttributeReference), Metadata (org.apache.spark.sql.types.Metadata), JDBCRelation (org.apache.spark.sql.execution.datasources.jdbc.JDBCRelation), Driver (org.postgresql.Driver), CreateHiveTableAsSelectCommand (org.apache.spark.sql.hive.execution.CreateHiveTableAsSelectCommand), LogicalRelation (org.apache.spark.sql.execution.datasources.LogicalRelation), StructField (org.apache.spark.sql.types.StructField), JDBCOptions (org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions), CreateHiveTableAsSelectCommandVisitor (io.openlineage.spark.agent.lifecycle.plan.CreateHiveTableAsSelectCommandVisitor), OpenLineage (io.openlineage.client.OpenLineage), Test (org.junit.jupiter.api.Test)
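The last two assertions in the test show the dataset identifier being derived from the table's S3 location: the namespace keeps the scheme and bucket ("s3://bucket"), and the name is the path ("directory"). A hedged sketch of that split is below; the helper only mirrors the behavior asserted by the test, not the actual implementation of PathUtils.fromCatalogTable.

```java
import java.net.URI;

// Illustrative helper: split a storage location URI into the
// (namespace, name) pair the test asserts on. Not OpenLineage code.
class DatasetIdSketch {
    static String[] split(String location) {
        URI uri = URI.create(location);
        // Namespace = scheme + authority, e.g. "s3://bucket".
        String namespace = uri.getScheme() + "://" + uri.getAuthority();
        // Name = path with the leading slash stripped, e.g. "directory".
        String name = uri.getPath().replaceAll("^/", "");
        return new String[] { namespace, name };
    }
}
```

For "s3://bucket/directory" this yields namespace "s3://bucket" and name "directory", matching the test's expectations.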

Aggregations

CreateHiveTableAsSelectCommand (org.apache.spark.sql.hive.execution.CreateHiveTableAsSelectCommand): 2 uses
StructType (org.apache.spark.sql.types.StructType): 2 uses
OpenLineage (io.openlineage.client.OpenLineage): 1 use
CreateHiveTableAsSelectCommandVisitor (io.openlineage.spark.agent.lifecycle.plan.CreateHiveTableAsSelectCommandVisitor): 1 use
DatasetIdentifier (io.openlineage.spark.agent.util.DatasetIdentifier): 1 use
CatalogTable (org.apache.spark.sql.catalyst.catalog.CatalogTable): 1 use
AttributeReference (org.apache.spark.sql.catalyst.expressions.AttributeReference): 1 use
LogicalRelation (org.apache.spark.sql.execution.datasources.LogicalRelation): 1 use
JDBCOptions (org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions): 1 use
JDBCRelation (org.apache.spark.sql.execution.datasources.jdbc.JDBCRelation): 1 use
Metadata (org.apache.spark.sql.types.Metadata): 1 use
StructField (org.apache.spark.sql.types.StructField): 1 use
Test (org.junit.jupiter.api.Test): 1 use
Driver (org.postgresql.Driver): 1 use