Search in sources :

Example 1 with QueryResult

Use of com.amazonaws.services.timestreamquery.model.QueryResult in project aws-athena-query-federation by awslabs.

From class TestUtils, method makeMockQueryResult:

public static QueryResult makeMockQueryResult(Schema schemaForRead, int numRows) {
    QueryResult mockResult = mock(QueryResult.class);
    final AtomicLong nextToken = new AtomicLong(0);
    // Every call to getRows() fabricates a fresh page of 100 random rows shaped by the supplied schema.
    when(mockResult.getRows()).thenAnswer((Answer<List<Row>>) invocationOnMock -> {
        List<Row> rows = new ArrayList<>();
        for (int i = 0; i < 100; i++) {
            nextToken.incrementAndGet();
            List<Datum> columnData = new ArrayList<>();
            for (Field nextField : schemaForRead.getFields()) {
                columnData.add(makeValue(nextField));
            }
            Row row = new Row();
            row.setData(columnData);
            rows.add(row);
        }
        return rows;
    });
    // getNextToken() keeps returning a token until numRows rows have been handed out, then returns null to stop paging.
    when(mockResult.getNextToken()).thenAnswer((Answer<String>) invocationOnMock -> {
        if (nextToken.get() < numRows) {
            return String.valueOf(nextToken.get());
        }
        return null;
    });
    return mockResult;
}
Also used : QueryResult(com.amazonaws.services.timestreamquery.model.QueryResult) Schema(org.apache.arrow.vector.types.pojo.Schema) FLOAT8(org.apache.arrow.vector.types.Types.MinorType.FLOAT8) Date(java.util.Date) Types(org.apache.arrow.vector.types.Types) SimpleDateFormat(java.text.SimpleDateFormat) HashMap(java.util.HashMap) Random(java.util.Random) TimeSeriesDataPoint(com.amazonaws.services.timestreamquery.model.TimeSeriesDataPoint) ArrayList(java.util.ArrayList) Answer(org.mockito.stubbing.Answer) Map(java.util.Map) GeneratedRowWriter(com.amazonaws.athena.connector.lambda.data.writers.GeneratedRowWriter) FieldResolver(com.amazonaws.athena.connector.lambda.data.FieldResolver) ConstraintProjector(com.amazonaws.athena.connector.lambda.domain.predicate.ConstraintProjector) FieldVector(org.apache.arrow.vector.FieldVector) Datum(com.amazonaws.services.timestreamquery.model.Datum) Field(org.apache.arrow.vector.types.pojo.Field) Mockito.when(org.mockito.Mockito.when) Row(com.amazonaws.services.timestreamquery.model.Row) AtomicLong(java.util.concurrent.atomic.AtomicLong) List(java.util.List) BlockUtils(com.amazonaws.athena.connector.lambda.data.BlockUtils) Extractor(com.amazonaws.athena.connector.lambda.data.writers.extractors.Extractor) Assert.assertEquals(org.junit.Assert.assertEquals) Mockito.mock(org.mockito.Mockito.mock)
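For reference, here is a minimal sketch (not part of the project) of how makeMockQueryResult might be wired to a mocked Timestream client and drained page by page; the mockClient and schemaForRead names, the query string, and the usual Mockito/JUnit static imports are assumptions for illustration.

// Hypothetical usage sketch: stub a mocked client with the generated result and page through it.
AmazonTimestreamQuery mockClient = mock(AmazonTimestreamQuery.class);
QueryResult mockResult = makeMockQueryResult(schemaForRead, 1_000);
when(mockClient.query(any(QueryRequest.class))).thenReturn(mockResult);

QueryRequest request = new QueryRequest().withQueryString("SELECT * FROM \"my_schema\".\"my_table\"");
int rowCount = 0;
do {
    QueryResult result = mockClient.query(request);
    rowCount += result.getRows().size();
    // The mock's getNextToken() returns null once numRows rows have been produced, which ends the loop.
    request = new QueryRequest().withNextToken(result.getNextToken());
} while (request.getNextToken() != null);
assertEquals(1_000, rowCount);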

Example 2 with QueryResult

Use of com.amazonaws.services.timestreamquery.model.QueryResult in project aws-athena-query-federation by awslabs.

From class TimestreamMetadataHandlerTest, method doGetTable:

@Test
public void doGetTable() throws Exception {
    logger.info("doGetTable - enter");
    when(mockGlue.getTable(any(com.amazonaws.services.glue.model.GetTableRequest.class))).thenReturn(mock(GetTableResult.class));
    when(mockTsQuery.query(any(QueryRequest.class))).thenAnswer((InvocationOnMock invocation) -> {
        QueryRequest request = invocation.getArgumentAt(0, QueryRequest.class);
        assertEquals("DESCRIBE \"default\".\"table1\"", request.getQueryString());
        List<Row> rows = new ArrayList<>();
        // TODO: Add types here
        rows.add(new Row().withData(new Datum().withScalarValue("availability_zone"), new Datum().withScalarValue("varchar"), new Datum().withScalarValue("dimension")));
        rows.add(new Row().withData(new Datum().withScalarValue("measure_value"), new Datum().withScalarValue("double"), new Datum().withScalarValue("measure_value")));
        rows.add(new Row().withData(new Datum().withScalarValue("measure_name"), new Datum().withScalarValue("varchar"), new Datum().withScalarValue("measure_name")));
        rows.add(new Row().withData(new Datum().withScalarValue("time"), new Datum().withScalarValue("timestamp"), new Datum().withScalarValue("timestamp")));
        return new QueryResult().withRows(rows);
    });
    GetTableRequest req = new GetTableRequest(identity, "query-id", "default", new TableName(defaultSchema, "table1"));
    GetTableResponse res = handler.doGetTable(allocator, req);
    logger.info("doGetTable - {}", res);
    assertEquals(4, res.getSchema().getFields().size());
    Field measureName = res.getSchema().findField("measure_name");
    assertEquals(Types.MinorType.VARCHAR, Types.getMinorTypeForArrowType(measureName.getType()));
    Field measureValue = res.getSchema().findField("measure_value");
    assertEquals(Types.MinorType.FLOAT8, Types.getMinorTypeForArrowType(measureValue.getType()));
    Field availabilityZone = res.getSchema().findField("availability_zone");
    assertEquals(Types.MinorType.VARCHAR, Types.getMinorTypeForArrowType(availabilityZone.getType()));
    Field time = res.getSchema().findField("time");
    assertEquals(Types.MinorType.DATEMILLI, Types.getMinorTypeForArrowType(time.getType()));
    logger.info("doGetTable - exit");
}
Also used : Datum(com.amazonaws.services.timestreamquery.model.Datum) QueryRequest(com.amazonaws.services.timestreamquery.model.QueryRequest) ArrayList(java.util.ArrayList) GetTableRequest(com.amazonaws.athena.connector.lambda.metadata.GetTableRequest) TableName(com.amazonaws.athena.connector.lambda.domain.TableName) Field(org.apache.arrow.vector.types.pojo.Field) QueryResult(com.amazonaws.services.timestreamquery.model.QueryResult) GetTableResponse(com.amazonaws.athena.connector.lambda.metadata.GetTableResponse) InvocationOnMock(org.mockito.invocation.InvocationOnMock) Row(com.amazonaws.services.timestreamquery.model.Row) GetTableResult(com.amazonaws.services.glue.model.GetTableResult) Test(org.junit.Test)
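The stubbed DESCRIBE output above follows Timestream's three-column layout: column name, Timestream type, and column kind (dimension, measure_value, measure_name, or timestamp). If more columns need to be stubbed, a small helper along these lines could cut down the repetition; it is a hypothetical test utility, not part of the project.

// Hypothetical helper: builds one DESCRIBE row (name, type, kind), matching the stub above.
private static Row describeRow(String columnName, String timestreamType, String columnKind) {
    return new Row().withData(
            new Datum().withScalarValue(columnName),
            new Datum().withScalarValue(timestreamType),
            new Datum().withScalarValue(columnKind));
}

// Usage inside the stubbed answer:
// rows.add(describeRow("availability_zone", "varchar", "dimension"));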

Example 3 with QueryResult

Use of com.amazonaws.services.timestreamquery.model.QueryResult in project aws-athena-query-federation by awslabs.

From class TimestreamRecordHandlerTest, method doReadRecordsNoSpill:

@Test
public void doReadRecordsNoSpill() throws Exception {
    int numRowsGenerated = 1_000;
    String expectedQuery = "SELECT measure_name, measure_value::double, az, time, hostname, region FROM \"my_schema\".\"my_table\" WHERE (\"az\" IN ('us-east-1a','us-east-1b'))";
    QueryResult mockResult = makeMockQueryResult(schemaForRead, numRowsGenerated);
    when(mockClient.query(any(QueryRequest.class))).thenAnswer((Answer<QueryResult>) invocationOnMock -> {
        QueryRequest request = (QueryRequest) invocationOnMock.getArguments()[0];
        assertEquals(expectedQuery, request.getQueryString().replace("\n", ""));
        return mockResult;
    });
    Map<String, ValueSet> constraintsMap = new HashMap<>();
    constraintsMap.put("az", EquatableValueSet.newBuilder(allocator, Types.MinorType.VARCHAR.getType(), true, true).add("us-east-1a").add("us-east-1b").build());
    S3SpillLocation splitLoc = S3SpillLocation.newBuilder().withBucket(UUID.randomUUID().toString()).withSplitId(UUID.randomUUID().toString()).withQueryId(UUID.randomUUID().toString()).withIsDirectory(true).build();
    Split.Builder splitBuilder = Split.newBuilder(splitLoc, keyFactory.create());
    ReadRecordsRequest request = new ReadRecordsRequest(IDENTITY, DEFAULT_CATALOG, "queryId-" + System.currentTimeMillis(), new TableName(DEFAULT_SCHEMA, TEST_TABLE), schemaForRead, splitBuilder.build(), new Constraints(constraintsMap), // 100GB don't expect this to spill
    100_000_000_000L, 100_000_000_000L);
    RecordResponse rawResponse = handler.doReadRecords(allocator, request);
    assertTrue(rawResponse instanceof ReadRecordsResponse);
    ReadRecordsResponse response = (ReadRecordsResponse) rawResponse;
    logger.info("doReadRecordsNoSpill: rows[{}]", response.getRecordCount());
    assertTrue(response.getRecords().getRowCount() > 0);
    // ensure we actually filtered something out
    assertTrue(response.getRecords().getRowCount() < numRowsGenerated);
    logger.info("doReadRecordsNoSpill: {}", BlockUtils.rowToString(response.getRecords(), 0));
}
Also used : QueryResult(com.amazonaws.services.timestreamquery.model.QueryResult) Schema(org.apache.arrow.vector.types.pojo.Schema) Types(org.apache.arrow.vector.types.Types) LoggerFactory(org.slf4j.LoggerFactory) BlockAllocator(com.amazonaws.athena.connector.lambda.data.BlockAllocator) SpillLocation(com.amazonaws.athena.connector.lambda.domain.spill.SpillLocation) Block(com.amazonaws.athena.connector.lambda.data.Block) ByteArrayInputStream(java.io.ByteArrayInputStream) After(org.junit.After) Map(java.util.Map) ValueSet(com.amazonaws.athena.connector.lambda.domain.predicate.ValueSet) AmazonTimestreamQuery(com.amazonaws.services.timestreamquery.AmazonTimestreamQuery) BlockAllocatorImpl(com.amazonaws.athena.connector.lambda.data.BlockAllocatorImpl) Split(com.amazonaws.athena.connector.lambda.domain.Split) ReadRecordsResponse(com.amazonaws.athena.connector.lambda.records.ReadRecordsResponse) UUID(java.util.UUID) TableName(com.amazonaws.athena.connector.lambda.domain.TableName) RecordResponse(com.amazonaws.athena.connector.lambda.records.RecordResponse) Matchers.any(org.mockito.Matchers.any) List(java.util.List) ByteStreams(com.google.common.io.ByteStreams) BlockUtils(com.amazonaws.athena.connector.lambda.data.BlockUtils) S3ObjectInputStream(com.amazonaws.services.s3.model.S3ObjectInputStream) EncryptionKeyFactory(com.amazonaws.athena.connector.lambda.security.EncryptionKeyFactory) Mockito.mock(org.mockito.Mockito.mock) Mock(org.mockito.Mock) EquatableValueSet(com.amazonaws.athena.connector.lambda.domain.predicate.EquatableValueSet) RunWith(org.junit.runner.RunWith) HashMap(java.util.HashMap) Matchers.anyString(org.mockito.Matchers.anyString) ArrayList(java.util.ArrayList) RemoteReadRecordsResponse(com.amazonaws.athena.connector.lambda.records.RemoteReadRecordsResponse) Answer(org.mockito.stubbing.Answer) InvocationOnMock(org.mockito.invocation.InvocationOnMock) S3Object(com.amazonaws.services.s3.model.S3Object) SchemaBuilder(com.amazonaws.athena.connector.lambda.data.SchemaBuilder) TestName(org.junit.rules.TestName) LocalKeyFactory(com.amazonaws.athena.connector.lambda.security.LocalKeyFactory) Matchers.anyObject(org.mockito.Matchers.anyObject) AmazonS3(com.amazonaws.services.s3.AmazonS3) FederatedIdentity(com.amazonaws.athena.connector.lambda.security.FederatedIdentity) PutObjectResult(com.amazonaws.services.s3.model.PutObjectResult) S3BlockSpillReader(com.amazonaws.athena.connector.lambda.data.S3BlockSpillReader) Before(org.junit.Before) Logger(org.slf4j.Logger) AmazonAthena(com.amazonaws.services.athena.AmazonAthena) Assert.assertNotNull(org.junit.Assert.assertNotNull) ReadRecordsRequest(com.amazonaws.athena.connector.lambda.records.ReadRecordsRequest) Assert.assertTrue(org.junit.Assert.assertTrue) AWSSecretsManager(com.amazonaws.services.secretsmanager.AWSSecretsManager) Test(org.junit.Test) IOException(java.io.IOException) Mockito.when(org.mockito.Mockito.when) FieldBuilder(com.amazonaws.athena.connector.lambda.data.FieldBuilder) Constraints(com.amazonaws.athena.connector.lambda.domain.predicate.Constraints) S3SpillLocation(com.amazonaws.athena.connector.lambda.domain.spill.S3SpillLocation) Rule(org.junit.Rule) MockitoJUnitRunner(org.mockito.runners.MockitoJUnitRunner) QueryRequest(com.amazonaws.services.timestreamquery.model.QueryRequest) TestUtils.makeMockQueryResult(com.amazonaws.athena.connectors.timestream.TestUtils.makeMockQueryResult) VIEW_METADATA_FIELD(com.amazonaws.athena.connector.lambda.handlers.GlueMetadataHandler.VIEW_METADATA_FIELD) Collections(java.util.Collections) 
Assert.assertEquals(org.junit.Assert.assertEquals) InputStream(java.io.InputStream)
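When a test only needs a handful of deterministic rows rather than the generated data from makeMockQueryResult, a QueryResult can also be built by hand. The sketch below is illustrative; it assumes the same column order as expectedQuery (measure_name, measure_value::double, az, time, hostname, region) and uses made-up values.

// Hypothetical: a tiny, fully deterministic result with no pagination (getNextToken() stays null).
List<Row> rows = new ArrayList<>();
rows.add(new Row().withData(
        new Datum().withScalarValue("cpu_utilization"),                 // measure_name
        new Datum().withScalarValue("0.5"),                             // measure_value::double
        new Datum().withScalarValue("us-east-1a"),                      // az
        new Datum().withScalarValue("2019-12-31 23:59:59.000000000"),   // time
        new Datum().withScalarValue("host-1"),                          // hostname
        new Datum().withScalarValue("us-east-1")));                     // region
QueryResult handRolledResult = new QueryResult().withRows(rows);
when(mockClient.query(any(QueryRequest.class))).thenReturn(handRolledResult);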

Example 4 with QueryResult

Use of com.amazonaws.services.timestreamquery.model.QueryResult in project aws-athena-query-federation by awslabs.

From class TimestreamRecordHandlerTest, method readRecordsTimeSeriesView:

@Test
public void readRecordsTimeSeriesView() throws Exception {
    logger.info("readRecordsTimeSeriesView - enter");
    Schema schemaForReadView = SchemaBuilder.newBuilder()
            .addField("region", Types.MinorType.VARCHAR.getType())
            .addField("az", Types.MinorType.VARCHAR.getType())
            .addField("hostname", Types.MinorType.VARCHAR.getType())
            .addField(FieldBuilder.newBuilder("cpu_utilization", Types.MinorType.LIST.getType())
                    .addField(FieldBuilder.newBuilder("cpu_utilization", Types.MinorType.STRUCT.getType())
                            .addDateMilliField("time")
                            .addFloat8Field("measure_value::double")
                            .build())
                    .build())
            .addMetadata(VIEW_METADATA_FIELD, "select az, hostname, region,  CREATE_TIME_SERIES(time, measure_value::double) as cpu_utilization from \"" + DEFAULT_SCHEMA + "\".\"" + TEST_TABLE + "\" WHERE measure_name = 'cpu_utilization' GROUP BY measure_name, az, hostname, region")
            .build();
    String expectedQuery = "WITH t1 AS ( select az, hostname, region,  CREATE_TIME_SERIES(time, measure_value::double) as cpu_utilization from \"my_schema\".\"my_table\" WHERE measure_name = 'cpu_utilization' GROUP BY measure_name, az, hostname, region )  SELECT region, az, hostname, cpu_utilization FROM t1 WHERE (\"az\" IN ('us-east-1a','us-east-1b'))";
    QueryResult mockResult = makeMockQueryResult(schemaForReadView, 1_000);
    when(mockClient.query(any(QueryRequest.class))).thenAnswer((Answer<QueryResult>) invocationOnMock -> {
        QueryRequest request = (QueryRequest) invocationOnMock.getArguments()[0];
        assertEquals("actual: " + request.getQueryString(), expectedQuery, request.getQueryString().replace("\n", ""));
        return mockResult;
    });
    S3SpillLocation splitLoc = S3SpillLocation.newBuilder().withBucket(UUID.randomUUID().toString()).withSplitId(UUID.randomUUID().toString()).withQueryId(UUID.randomUUID().toString()).withIsDirectory(true).build();
    Split split = Split.newBuilder(splitLoc, null).build();
    Map<String, ValueSet> constraintsMap = new HashMap<>();
    constraintsMap.put("az", EquatableValueSet.newBuilder(allocator, Types.MinorType.VARCHAR.getType(), true, true).add("us-east-1a").add("us-east-1b").build());
    ReadRecordsRequest request = new ReadRecordsRequest(IDENTITY, "default", "queryId-" + System.currentTimeMillis(), new TableName(DEFAULT_SCHEMA, TEST_TABLE), schemaForReadView, split, new Constraints(constraintsMap), // 100GB don't expect this to spill
    100_000_000_000L, 100_000_000_000L);
    RecordResponse rawResponse = handler.doReadRecords(allocator, request);
    ReadRecordsResponse response = (ReadRecordsResponse) rawResponse;
    logger.info("readRecordsTimeSeriesView: rows[{}]", response.getRecordCount());
    for (int i = 0; i < response.getRecordCount() && i < 10; i++) {
        logger.info("readRecordsTimeSeriesView: {}", BlockUtils.rowToString(response.getRecords(), i));
    }
    logger.info("readRecordsTimeSeriesView - exit");
}
Also used : QueryResult(com.amazonaws.services.timestreamquery.model.QueryResult) Schema(org.apache.arrow.vector.types.pojo.Schema) Types(org.apache.arrow.vector.types.Types) LoggerFactory(org.slf4j.LoggerFactory) BlockAllocator(com.amazonaws.athena.connector.lambda.data.BlockAllocator) SpillLocation(com.amazonaws.athena.connector.lambda.domain.spill.SpillLocation) Block(com.amazonaws.athena.connector.lambda.data.Block) ByteArrayInputStream(java.io.ByteArrayInputStream) After(org.junit.After) Map(java.util.Map) ValueSet(com.amazonaws.athena.connector.lambda.domain.predicate.ValueSet) AmazonTimestreamQuery(com.amazonaws.services.timestreamquery.AmazonTimestreamQuery) BlockAllocatorImpl(com.amazonaws.athena.connector.lambda.data.BlockAllocatorImpl) Split(com.amazonaws.athena.connector.lambda.domain.Split) ReadRecordsResponse(com.amazonaws.athena.connector.lambda.records.ReadRecordsResponse) UUID(java.util.UUID) TableName(com.amazonaws.athena.connector.lambda.domain.TableName) RecordResponse(com.amazonaws.athena.connector.lambda.records.RecordResponse) Matchers.any(org.mockito.Matchers.any) List(java.util.List) ByteStreams(com.google.common.io.ByteStreams) BlockUtils(com.amazonaws.athena.connector.lambda.data.BlockUtils) S3ObjectInputStream(com.amazonaws.services.s3.model.S3ObjectInputStream) EncryptionKeyFactory(com.amazonaws.athena.connector.lambda.security.EncryptionKeyFactory) Mockito.mock(org.mockito.Mockito.mock) Mock(org.mockito.Mock) EquatableValueSet(com.amazonaws.athena.connector.lambda.domain.predicate.EquatableValueSet) RunWith(org.junit.runner.RunWith) HashMap(java.util.HashMap) Matchers.anyString(org.mockito.Matchers.anyString) ArrayList(java.util.ArrayList) RemoteReadRecordsResponse(com.amazonaws.athena.connector.lambda.records.RemoteReadRecordsResponse) Answer(org.mockito.stubbing.Answer) InvocationOnMock(org.mockito.invocation.InvocationOnMock) S3Object(com.amazonaws.services.s3.model.S3Object) SchemaBuilder(com.amazonaws.athena.connector.lambda.data.SchemaBuilder) TestName(org.junit.rules.TestName) LocalKeyFactory(com.amazonaws.athena.connector.lambda.security.LocalKeyFactory) Matchers.anyObject(org.mockito.Matchers.anyObject) AmazonS3(com.amazonaws.services.s3.AmazonS3) FederatedIdentity(com.amazonaws.athena.connector.lambda.security.FederatedIdentity) PutObjectResult(com.amazonaws.services.s3.model.PutObjectResult) S3BlockSpillReader(com.amazonaws.athena.connector.lambda.data.S3BlockSpillReader) Before(org.junit.Before) Logger(org.slf4j.Logger) AmazonAthena(com.amazonaws.services.athena.AmazonAthena) Assert.assertNotNull(org.junit.Assert.assertNotNull) ReadRecordsRequest(com.amazonaws.athena.connector.lambda.records.ReadRecordsRequest) Assert.assertTrue(org.junit.Assert.assertTrue) AWSSecretsManager(com.amazonaws.services.secretsmanager.AWSSecretsManager) Test(org.junit.Test) IOException(java.io.IOException) Mockito.when(org.mockito.Mockito.when) FieldBuilder(com.amazonaws.athena.connector.lambda.data.FieldBuilder) Constraints(com.amazonaws.athena.connector.lambda.domain.predicate.Constraints) S3SpillLocation(com.amazonaws.athena.connector.lambda.domain.spill.S3SpillLocation) Rule(org.junit.Rule) MockitoJUnitRunner(org.mockito.runners.MockitoJUnitRunner) QueryRequest(com.amazonaws.services.timestreamquery.model.QueryRequest) TestUtils.makeMockQueryResult(com.amazonaws.athena.connectors.timestream.TestUtils.makeMockQueryResult) VIEW_METADATA_FIELD(com.amazonaws.athena.connector.lambda.handlers.GlueMetadataHandler.VIEW_METADATA_FIELD) Collections(java.util.Collections) 
Assert.assertEquals(org.junit.Assert.assertEquals) InputStream(java.io.InputStream)
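In this view test the cpu_utilization column comes back from makeMockQueryResult as a time series rather than a scalar. For a hand-rolled stub, such a datum could be built roughly as follows; the timestamps and values are illustrative assumptions.

// Hypothetical: one row whose last column is a list of (time, value) points, matching the view schema
// (region, az, hostname, cpu_utilization).
List<TimeSeriesDataPoint> series = new ArrayList<>();
series.add(new TimeSeriesDataPoint()
        .withTime("2019-12-31 23:59:59.000000000")
        .withValue(new Datum().withScalarValue("0.42")));
series.add(new TimeSeriesDataPoint()
        .withTime("2020-01-01 00:00:59.000000000")
        .withValue(new Datum().withScalarValue("0.57")));
Row viewRow = new Row().withData(
        new Datum().withScalarValue("us-east-1"),    // region
        new Datum().withScalarValue("us-east-1a"),   // az
        new Datum().withScalarValue("host-1"),       // hostname
        new Datum().withTimeSeriesValue(series));    // cpu_utilization
QueryResult viewResult = new QueryResult().withRows(viewRow);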

Example 5 with QueryResult

Use of com.amazonaws.services.timestreamquery.model.QueryResult in project aws-athena-query-federation by awslabs.

From class TimestreamMetadataHandler, method doGetTable:

@Override
public GetTableResponse doGetTable(BlockAllocator blockAllocator, GetTableRequest request) throws Exception {
    logger.info("doGetTable: enter", request.getTableName());
    Schema schema = null;
    try {
        if (glue != null) {
            schema = super.doGetTable(blockAllocator, request, TABLE_FILTER).getSchema();
            logger.info("doGetTable: Retrieved schema for table[{}] from AWS Glue.", request.getTableName());
        }
    } catch (RuntimeException ex) {
        logger.warn("doGetTable: Unable to retrieve table[{}:{}] from AWS Glue.", request.getTableName().getSchemaName(), request.getTableName().getTableName(), ex);
    }
    if (schema == null) {
        TableName tableName = request.getTableName();
        String describeQuery = queryFactory.createDescribeTableQueryBuilder().withTablename(tableName.getTableName()).withDatabaseName(tableName.getSchemaName()).build();
        logger.info("doGetTable: Retrieving schema for table[{}] from TimeStream using describeQuery[{}].", request.getTableName(), describeQuery);
        QueryRequest queryRequest = new QueryRequest().withQueryString(describeQuery);
        SchemaBuilder schemaBuilder = SchemaBuilder.newBuilder();
        do {
            QueryResult queryResult = tsQuery.query(queryRequest);
            // Each DESCRIBE row carries three scalar values: column name, Timestream type, and column kind.
            for (Row next : queryResult.getRows()) {
                List<Datum> datum = next.getData();
                if (datum.size() != 3) {
                    throw new RuntimeException("Unexpected datum size " + datum.size() + " while getting schema from datum[" + datum.toString() + "]");
                }
                Field nextField = TimestreamSchemaUtils.makeField(datum.get(0).getScalarValue(), datum.get(1).getScalarValue());
                schemaBuilder.addField(nextField);
            }
            queryRequest = new QueryRequest().withNextToken(queryResult.getNextToken());
        } while (queryRequest.getNextToken() != null);
        schema = schemaBuilder.build();
    }
    return new GetTableResponse(request.getCatalogName(), request.getTableName(), schema);
}
Also used : TableName(com.amazonaws.athena.connector.lambda.domain.TableName) Field(org.apache.arrow.vector.types.pojo.Field) QueryResult(com.amazonaws.services.timestreamquery.model.QueryResult) Datum(com.amazonaws.services.timestreamquery.model.Datum) QueryRequest(com.amazonaws.services.timestreamquery.model.QueryRequest) GetTableResponse(com.amazonaws.athena.connector.lambda.metadata.GetTableResponse) Schema(org.apache.arrow.vector.types.pojo.Schema) SchemaBuilder(com.amazonaws.athena.connector.lambda.data.SchemaBuilder) Row(com.amazonaws.services.timestreamquery.model.Row)
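TimestreamSchemaUtils.makeField is the project's own helper and its implementation is not shown here. Judging by the assertions in Example 2 (varchar maps to VARCHAR, double to FLOAT8, timestamp to DATEMILLI), a hypothetical equivalent could look like the sketch below; the real helper's type coverage may differ.

// Hypothetical sketch of a Timestream-type-to-Arrow-field mapping consistent with Example 2's assertions.
private static Field makeFieldSketch(String columnName, String timestreamType) {
    switch (timestreamType.toLowerCase()) {
        case "varchar":
            return FieldBuilder.newBuilder(columnName, Types.MinorType.VARCHAR.getType()).build();
        case "double":
            return FieldBuilder.newBuilder(columnName, Types.MinorType.FLOAT8.getType()).build();
        case "timestamp":
            return FieldBuilder.newBuilder(columnName, Types.MinorType.DATEMILLI.getType()).build();
        default:
            throw new IllegalArgumentException("Unsupported Timestream type: " + timestreamType);
    }
}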

Aggregations

QueryResult (com.amazonaws.services.timestreamquery.model.QueryResult) 9
QueryRequest (com.amazonaws.services.timestreamquery.model.QueryRequest) 8
TableName (com.amazonaws.athena.connector.lambda.domain.TableName) 7
ArrayList (java.util.ArrayList) 6
Block (com.amazonaws.athena.connector.lambda.data.Block) 5
BlockUtils (com.amazonaws.athena.connector.lambda.data.BlockUtils) 5
SchemaBuilder (com.amazonaws.athena.connector.lambda.data.SchemaBuilder) 5
HashMap (java.util.HashMap) 5
List (java.util.List) 5
BlockAllocator (com.amazonaws.athena.connector.lambda.data.BlockAllocator) 4
BlockAllocatorImpl (com.amazonaws.athena.connector.lambda.data.BlockAllocatorImpl) 4
FieldBuilder (com.amazonaws.athena.connector.lambda.data.FieldBuilder) 4
S3BlockSpillReader (com.amazonaws.athena.connector.lambda.data.S3BlockSpillReader) 4
Split (com.amazonaws.athena.connector.lambda.domain.Split) 4
Constraints (com.amazonaws.athena.connector.lambda.domain.predicate.Constraints) 4
EquatableValueSet (com.amazonaws.athena.connector.lambda.domain.predicate.EquatableValueSet) 4
ValueSet (com.amazonaws.athena.connector.lambda.domain.predicate.ValueSet) 4
S3SpillLocation (com.amazonaws.athena.connector.lambda.domain.spill.S3SpillLocation) 4
SpillLocation (com.amazonaws.athena.connector.lambda.domain.spill.SpillLocation) 4
VIEW_METADATA_FIELD (com.amazonaws.athena.connector.lambda.handlers.GlueMetadataHandler.VIEW_METADATA_FIELD) 4