
Example 1 with HiveSessionProperties

Use of io.trino.plugin.hive.HiveSessionProperties in project trino by trinodb.

From class TestCheckpointWriter, method setUp:

@BeforeClass
public void setUp() {
    checkpointSchemaManager = new CheckpointSchemaManager(typeManager);
    // Build an HDFS environment from a default config, with no authentication
    HdfsConfig hdfsConfig = new HdfsConfig();
    HdfsConfiguration hdfsConfiguration = new HiveHdfsConfiguration(new HdfsConfigurationInitializer(hdfsConfig), Set.of());
    hdfsEnvironment = new HdfsEnvironment(hdfsConfiguration, hdfsConfig, new NoHdfsAuthentication());
    // Derive session properties from a default HiveConfig and register them on a testing session
    HiveSessionProperties hiveSessionProperties = getHiveSessionProperties(new HiveConfig());
    session = TestingConnectorSession.builder()
            .setPropertyMetadata(hiveSessionProperties.getSessionProperties())
            .build();
}
Also used:
HdfsConfigurationInitializer (io.trino.plugin.hive.HdfsConfigurationInitializer)
HiveHdfsConfiguration (io.trino.plugin.hive.HiveHdfsConfiguration)
HdfsConfig (io.trino.plugin.hive.HdfsConfig)
HdfsConfiguration (io.trino.plugin.hive.HdfsConfiguration)
NoHdfsAuthentication (io.trino.plugin.hive.authentication.NoHdfsAuthentication)
HiveSessionProperties (io.trino.plugin.hive.HiveSessionProperties)
HiveTestUtils.getHiveSessionProperties (io.trino.plugin.hive.HiveTestUtils.getHiveSessionProperties)
HdfsEnvironment (io.trino.plugin.hive.HdfsEnvironment)
HiveConfig (io.trino.plugin.hive.HiveConfig)
BeforeClass (org.testng.annotations.BeforeClass)
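The pattern in example 1 (typed, named session properties registered once, then resolved through a session) can be sketched in a self-contained form. The class and method names below (`SessionPropertySketch`, `define`, `set`) are hypothetical stand-ins for Trino's `PropertyMetadata` and `ConnectorSession.getProperty`, not Trino API.

```java
import java.util.HashMap;
import java.util.Map;

// Minimal sketch of a session-properties registry: a connector defines
// typed properties with defaults, and a session resolves values against
// that metadata. Illustrative only; not Trino code.
public class SessionPropertySketch {
    // property name -> default value (stands in for Trino's PropertyMetadata)
    private final Map<String, Object> defaults = new HashMap<>();
    private final Map<String, Object> overrides = new HashMap<>();

    SessionPropertySketch define(String name, Object defaultValue) {
        defaults.put(name, defaultValue);
        return this;
    }

    SessionPropertySketch set(String name, Object value) {
        // Unknown properties are rejected, as a real session would do
        if (!defaults.containsKey(name)) {
            throw new IllegalArgumentException("unknown property: " + name);
        }
        overrides.put(name, value);
        return this;
    }

    // Mirrors the shape of ConnectorSession.getProperty(String, Class<T>)
    <T> T getProperty(String name, Class<T> type) {
        Object value = overrides.getOrDefault(name, defaults.get(name));
        return type.cast(value);
    }

    public static void main(String[] args) {
        SessionPropertySketch session = new SessionPropertySketch()
                .define("parquet_max_read_block_size", "16MB")
                .define("parquet_use_column_names", Boolean.FALSE)
                .set("parquet_max_read_block_size", "8MB");
        System.out.println(session.getProperty("parquet_max_read_block_size", String.class)); // prints 8MB
    }
}
```

This is why both examples pass `hiveSessionProperties.getSessionProperties()` to the session builder: the session itself only resolves values, while the metadata (names, types, defaults) comes from the config objects.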

Example 2 with HiveSessionProperties

Use of io.trino.plugin.hive.HiveSessionProperties in project trino by trinodb.

From class ParquetTester, method assertMaxReadBytes:

void assertMaxReadBytes(List<ObjectInspector> objectInspectors, Iterable<?>[] writeValues, Iterable<?>[] readValues, List<String> columnNames, List<Type> columnTypes, Optional<MessageType> parquetSchema, DataSize maxReadBlockSize)
        throws Exception {
    CompressionCodecName compressionCodecName = UNCOMPRESSED;
    // Configure Parquet as the storage format and cap the reader's block size
    HiveSessionProperties hiveSessionProperties = new HiveSessionProperties(
            new HiveConfig()
                    .setHiveStorageFormat(HiveStorageFormat.PARQUET)
                    .setUseParquetColumnNames(false),
            new OrcReaderConfig(),
            new OrcWriterConfig(),
            new ParquetReaderConfig().setMaxReadBlockSize(maxReadBlockSize),
            new ParquetWriterConfig());
    ConnectorSession session = TestingConnectorSession.builder()
            .setPropertyMetadata(hiveSessionProperties.getSessionProperties())
            .build();
    try (TempFile tempFile = new TempFile("test", "parquet")) {
        JobConf jobConf = new JobConf();
        jobConf.setEnum(COMPRESSION, compressionCodecName);
        jobConf.setBoolean(ENABLE_DICTIONARY, true);
        jobConf.setEnum(WRITER_VERSION, PARQUET_1_0);
        // Write the test data, then read it back and check that pages respect the size limit
        writeParquetColumn(jobConf, tempFile.getFile(), compressionCodecName, createTableProperties(columnNames, objectInspectors), getStandardStructObjectInspector(columnNames, objectInspectors), getIterators(writeValues), parquetSchema, false);
        Iterator<?>[] expectedValues = getIterators(readValues);
        try (ConnectorPageSource pageSource = fileFormat.createFileFormatReader(session, HDFS_ENVIRONMENT, tempFile.getFile(), columnNames, columnTypes)) {
            assertPageSource(columnTypes, expectedValues, pageSource, Optional.of(getParquetMaxReadBlockSize(session).toBytes()));
            assertFalse(stream(expectedValues).allMatch(Iterator::hasNext));
        }
    }
}
Also used:
OrcWriterConfig (io.trino.plugin.hive.orc.OrcWriterConfig)
ConnectorPageSource (io.trino.spi.connector.ConnectorPageSource)
HiveSessionProperties (io.trino.plugin.hive.HiveSessionProperties)
HiveConfig (io.trino.plugin.hive.HiveConfig)
OrcReaderConfig (io.trino.plugin.hive.orc.OrcReaderConfig)
CompressionCodecName (org.apache.parquet.hadoop.metadata.CompressionCodecName)
AbstractIterator (com.google.common.collect.AbstractIterator)
Iterator (java.util.Iterator)
ConnectorSession (io.trino.spi.connector.ConnectorSession)
TestingConnectorSession (io.trino.testing.TestingConnectorSession)
JobConf (org.apache.hadoop.mapred.JobConf)
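Example 2 verifies an invariant rather than a value: a reader configured with a max read block size must never emit a batch larger than that limit. A self-contained sketch of that invariant, with hypothetical names (`MaxReadBlockSketch`, `planBatches`) and sizes in plain integers rather than Trino's DataSize:

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch (not Trino code) of the property assertMaxReadBytes
// checks: input of any size is consumed in batches that never exceed the
// configured maximum block size.
public class MaxReadBlockSketch {
    // Split `totalBytes` of input into batch sizes of at most `maxBlockBytes`.
    static List<Integer> planBatches(int totalBytes, int maxBlockBytes) {
        List<Integer> batches = new ArrayList<>();
        int remaining = totalBytes;
        while (remaining > 0) {
            int batch = Math.min(remaining, maxBlockBytes);
            batches.add(batch);
            remaining -= batch;
        }
        return batches;
    }

    public static void main(String[] args) {
        // 10 units of data with a 3-unit cap: batches of 3, 3, 3, 1
        System.out.println(planBatches(10, 3)); // prints [3, 3, 3, 1]
    }
}
```

The real test does the analogous check through `assertPageSource`, passing `getParquetMaxReadBlockSize(session).toBytes()` as the bound each page is compared against.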

Aggregations

HiveConfig (io.trino.plugin.hive.HiveConfig): 2
HiveSessionProperties (io.trino.plugin.hive.HiveSessionProperties): 2
AbstractIterator (com.google.common.collect.AbstractIterator): 1
HdfsConfig (io.trino.plugin.hive.HdfsConfig): 1
HdfsConfiguration (io.trino.plugin.hive.HdfsConfiguration): 1
HdfsConfigurationInitializer (io.trino.plugin.hive.HdfsConfigurationInitializer): 1
HdfsEnvironment (io.trino.plugin.hive.HdfsEnvironment): 1
HiveHdfsConfiguration (io.trino.plugin.hive.HiveHdfsConfiguration): 1
HiveTestUtils.getHiveSessionProperties (io.trino.plugin.hive.HiveTestUtils.getHiveSessionProperties): 1
NoHdfsAuthentication (io.trino.plugin.hive.authentication.NoHdfsAuthentication): 1
OrcReaderConfig (io.trino.plugin.hive.orc.OrcReaderConfig): 1
OrcWriterConfig (io.trino.plugin.hive.orc.OrcWriterConfig): 1
ConnectorPageSource (io.trino.spi.connector.ConnectorPageSource): 1
ConnectorSession (io.trino.spi.connector.ConnectorSession): 1
TestingConnectorSession (io.trino.testing.TestingConnectorSession): 1
Iterator (java.util.Iterator): 1
JobConf (org.apache.hadoop.mapred.JobConf): 1
CompressionCodecName (org.apache.parquet.hadoop.metadata.CompressionCodecName): 1
BeforeClass (org.testng.annotations.BeforeClass): 1