Example 6 with OrcReaderConfig

Use of io.trino.plugin.hive.orc.OrcReaderConfig in project trino by trinodb.

From the class TestHiveFileFormats, method testOrcUseColumnNames.

@Test(dataProvider = "rowCount")
public void testOrcUseColumnNames(int rowCount) throws Exception {
    ConnectorSession session = getHiveSession(new HiveConfig(), new OrcReaderConfig().setUseColumnNames(true));
    // Hive binary writers are broken for timestamps
    List<TestColumn> testColumns = TEST_COLUMNS.stream()
            .filter(TestHiveFileFormats::withoutTimestamps)
            .collect(toImmutableList());
    assertThatFileFormat(ORC)
            .withWriteColumns(testColumns)
            .withRowsCount(rowCount)
            .withReadColumns(Lists.reverse(testColumns))
            .withSession(session)
            .isReadableByPageSource(new OrcPageSourceFactory(new OrcReaderOptions(), HDFS_ENVIRONMENT, STATS, UTC));
}
Also used: OrcReaderConfig (io.trino.plugin.hive.orc.OrcReaderConfig), OrcReaderOptions (io.trino.orc.OrcReaderOptions), ConnectorSession (io.trino.spi.connector.ConnectorSession), TestingConnectorSession (io.trino.testing.TestingConnectorSession), OrcPageSourceFactory (io.trino.plugin.hive.orc.OrcPageSourceFactory), Test (org.testng.annotations.Test)
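For context, a hedged note on configuration: the option these tests enable programmatically via new OrcReaderConfig().setUseColumnNames(true) corresponds, to the best of my knowledge, to the Hive connector's hive.orc.use-column-names catalog property. A minimal sketch of a catalog file, with an illustrative metastore URI (not taken from the example above):

```properties
# etc/catalog/hive.properties (illustrative path)
connector.name=hive
# hypothetical metastore endpoint
hive.metastore.uri=thrift://example.net:9083
# match ORC file columns to table columns by name rather than by ordinal position
hive.orc.use-column-names=true
```

With this property set, renaming a table column does not break reads of existing ORC files whose schema uses the old ordinal layout, which is the behavior the tests above exercise by reversing and projecting the read columns.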

Example 7 with OrcReaderConfig

Use of io.trino.plugin.hive.orc.OrcReaderConfig in project trino by trinodb.

From the class TestHiveFileFormats, method testORCProjectedColumns.

@Test(dataProvider = "rowCount")
public void testORCProjectedColumns(int rowCount) throws Exception {
    List<TestColumn> supportedColumns = TEST_COLUMNS;
    List<TestColumn> regularColumns = getRegularColumns(supportedColumns);
    List<TestColumn> partitionColumns = getPartitionColumns(supportedColumns);
    // Create projected columns for all regular supported columns
    ImmutableList.Builder<TestColumn> writeColumnsBuilder = ImmutableList.builder();
    ImmutableList.Builder<TestColumn> readColumnsBuilder = ImmutableList.builder();
    generateProjectedColumns(regularColumns, writeColumnsBuilder, readColumnsBuilder);
    List<TestColumn> writeColumns = writeColumnsBuilder.addAll(partitionColumns).build();
    List<TestColumn> readColumns = readColumnsBuilder.addAll(partitionColumns).build();
    ConnectorSession session = getHiveSession(new HiveConfig(), new OrcReaderConfig().setUseColumnNames(true));
    // Read with column-name matching enabled in the session
    assertThatFileFormat(ORC)
            .withWriteColumns(writeColumns)
            .withReadColumns(readColumns)
            .withRowsCount(rowCount)
            .withSession(session)
            .isReadableByPageSource(new OrcPageSourceFactory(new OrcReaderOptions(), HDFS_ENVIRONMENT, STATS, UTC));
    // Read again with the default session (ordinal-based column matching)
    assertThatFileFormat(ORC)
            .withWriteColumns(writeColumns)
            .withReadColumns(readColumns)
            .withRowsCount(rowCount)
            .isReadableByPageSource(new OrcPageSourceFactory(new OrcReaderOptions(), HDFS_ENVIRONMENT, STATS, UTC));
}
Also used: OrcReaderConfig (io.trino.plugin.hive.orc.OrcReaderConfig), OrcReaderOptions (io.trino.orc.OrcReaderOptions), ImmutableList.toImmutableList (com.google.common.collect.ImmutableList.toImmutableList), ImmutableList (com.google.common.collect.ImmutableList), ConnectorSession (io.trino.spi.connector.ConnectorSession), TestingConnectorSession (io.trino.testing.TestingConnectorSession), OrcPageSourceFactory (io.trino.plugin.hive.orc.OrcPageSourceFactory), Test (org.testng.annotations.Test)

Aggregations

OrcReaderConfig (io.trino.plugin.hive.orc.OrcReaderConfig): 7 usages
ConnectorSession (io.trino.spi.connector.ConnectorSession): 6 usages
TestingConnectorSession (io.trino.testing.TestingConnectorSession): 6 usages
Test (org.testng.annotations.Test): 5 usages
OrcReaderOptions (io.trino.orc.OrcReaderOptions): 4 usages
OrcPageSourceFactory (io.trino.plugin.hive.orc.OrcPageSourceFactory): 4 usages
OrcWriterConfig (io.trino.plugin.hive.orc.OrcWriterConfig): 4 usages
ImmutableList (com.google.common.collect.ImmutableList): 3 usages
ImmutableList.toImmutableList (com.google.common.collect.ImmutableList.toImmutableList): 3 usages
ParquetReaderConfig (io.trino.plugin.hive.parquet.ParquetReaderConfig): 3 usages
ConnectorPageSource (io.trino.spi.connector.ConnectorPageSource): 3 usages
OrcWriterOptions (io.trino.orc.OrcWriterOptions): 2 usages
HiveConfig (io.trino.plugin.hive.HiveConfig): 2 usages
OrcFileWriterFactory (io.trino.plugin.hive.orc.OrcFileWriterFactory): 2 usages
ParquetWriterConfig (io.trino.plugin.hive.parquet.ParquetWriterConfig): 2 usages
Preconditions.checkState (com.google.common.base.Preconditions.checkState): 1 usage
AbstractIterator (com.google.common.collect.AbstractIterator): 1 usage
ImmutableSet (com.google.common.collect.ImmutableSet): 1 usage
Lists (com.google.common.collect.Lists): 1 usage
LzoCodec (io.airlift.compress.lzo.LzoCodec): 1 usage