
Example 31 with VisibleForTesting

Use of org.apache.flink.annotation.VisibleForTesting in the Apache Flink project.

From class HiveCatalog, method getHiveTable.

@VisibleForTesting
public Table getHiveTable(ObjectPath tablePath) throws TableNotExistException {
    try {
        Table table = client.getTable(tablePath.getDatabaseName(), tablePath.getObjectName());
        boolean isHiveTable;
        if (table.getParameters().containsKey(CatalogPropertiesUtil.IS_GENERIC)) {
            // Newer tables carry an explicit is_generic flag: strip it and invert it.
            isHiveTable = !Boolean.parseBoolean(table.getParameters().remove(CatalogPropertiesUtil.IS_GENERIC));
        } else {
            // Legacy tables: treat the table as a native Hive table unless it
            // carries Flink connector properties.
            isHiveTable = !table.getParameters().containsKey(FLINK_PROPERTY_PREFIX + CONNECTOR.key())
                && !table.getParameters().containsKey(FLINK_PROPERTY_PREFIX + CONNECTOR_TYPE);
        }
        // For a Hive table, add the connector property so the planner picks the Hive connector.
        if (isHiveTable) {
            table.getParameters().put(CONNECTOR.key(), IDENTIFIER);
        }
        return table;
    } catch (NoSuchObjectException e) {
        throw new TableNotExistException(getName(), tablePath);
    } catch (TException e) {
        throw new CatalogException(String.format("Failed to get table %s from Hive metastore", tablePath.getFullName()), e);
    }
}
Also used:
org.apache.thrift.TException
org.apache.flink.table.catalog.CatalogTable
org.apache.flink.sql.parser.hive.ddl.SqlCreateHiveTable
org.apache.hadoop.hive.metastore.api.Table
org.apache.flink.table.catalog.CatalogBaseTable
org.apache.flink.table.catalog.exceptions.TableNotExistException
org.apache.flink.table.catalog.exceptions.CatalogException
org.apache.hadoop.hive.metastore.api.NoSuchObjectException
org.apache.flink.annotation.VisibleForTesting
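To see the table-type detection in isolation, here is a minimal, self-contained sketch of the same decision applied to a bare parameter map. The constant values are assumptions based on the Flink and Hive sources, not imported from them:

import java.util.HashMap;
import java.util.Map;

public class HiveTableDetectionSketch {

    // Constants as used in HiveCatalog (values assumed from the Flink/Hive sources).
    static final String IS_GENERIC = "is_generic";
    static final String FLINK_PROPERTY_PREFIX = "flink.";
    static final String CONNECTOR_KEY = "connector";
    static final String CONNECTOR_TYPE = "connector.type";

    // Same decision logic as getHiveTable, applied to a plain parameter map.
    static boolean isHiveTable(Map<String, String> parameters) {
        if (parameters.containsKey(IS_GENERIC)) {
            // Explicit flag present: generic means "not a Hive table".
            return !Boolean.parseBoolean(parameters.remove(IS_GENERIC));
        }
        // No flag: fall back to checking for Flink connector markers.
        return !parameters.containsKey(FLINK_PROPERTY_PREFIX + CONNECTOR_KEY)
            && !parameters.containsKey(FLINK_PROPERTY_PREFIX + CONNECTOR_TYPE);
    }

    public static void main(String[] args) {
        Map<String, String> generic = new HashMap<>();
        generic.put(IS_GENERIC, "true");
        System.out.println(isHiveTable(generic)); // false: a generic Flink table

        Map<String, String> hive = new HashMap<>();
        System.out.println(isHiveTable(hive)); // true: no Flink markers, a native Hive table
    }
}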

Example 32 with VisibleForTesting

Use of org.apache.flink.annotation.VisibleForTesting in the Apache Flink project.

From class StandaloneApplicationClusterEntryPoint, method loadConfigurationFromClusterConfig.

@VisibleForTesting
static Configuration loadConfigurationFromClusterConfig(StandaloneApplicationClusterConfiguration clusterConfiguration) {
    // Start from the configuration derived from the entry point's base settings.
    Configuration configuration = loadConfiguration(clusterConfiguration);
    // Pin a deterministic JobID for the application cluster.
    setStaticJobId(clusterConfiguration, configuration);
    // Copy any savepoint restore settings into the configuration.
    SavepointRestoreSettings.toConfiguration(clusterConfiguration.getSavepointRestoreSettings(), configuration);
    return configuration;
}
Also used:
org.apache.flink.configuration.Configuration
org.apache.flink.annotation.VisibleForTesting
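The SavepointRestoreSettings.toConfiguration call is the piece a test would typically assert on. A minimal, self-contained sketch of that round trip (requires flink-runtime on the classpath; the savepoint path is illustrative):

import org.apache.flink.configuration.Configuration;
import org.apache.flink.runtime.jobgraph.SavepointRestoreSettings;

public class SavepointSettingsRoundTripSketch {
    public static void main(String[] args) {
        // Restore from a savepoint path, allowing unmatched state to be skipped.
        SavepointRestoreSettings settings =
            SavepointRestoreSettings.forPath("file:///tmp/savepoint-abc", true);

        // The same call the entry point makes: write the settings into a Configuration.
        Configuration configuration = new Configuration();
        SavepointRestoreSettings.toConfiguration(settings, configuration);

        // Read them back to verify the round trip.
        SavepointRestoreSettings restored = SavepointRestoreSettings.fromConfiguration(configuration);
        System.out.println(settings.equals(restored)); // expected: true
    }
}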

Example 33 with VisibleForTesting

Use of org.apache.flink.annotation.VisibleForTesting in the Apache Flink project.

From class FlinkConfMountDecorator, method getFlinkConfData.

@VisibleForTesting
String getFlinkConfData(Map<String, String> propertiesMap) throws IOException {
    try (StringWriter sw = new StringWriter();
        PrintWriter out = new PrintWriter(sw)) {
        // Render each entry in flink-conf.yaml's "key: value" format.
        propertiesMap.forEach((k, v) -> {
            out.print(k);
            out.print(": ");
            out.println(v);
        });
        return sw.toString();
    }
}
Also used:
java.io.StringWriter
java.io.PrintWriter
org.apache.flink.annotation.VisibleForTesting
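To show the produced format concretely, here is a minimal standalone sketch of the same formatting logic (not the Flink class itself; the property keys and values are illustrative):

import java.io.IOException;
import java.io.PrintWriter;
import java.io.StringWriter;
import java.util.LinkedHashMap;
import java.util.Map;

public class FlinkConfDataSketch {

    // Same formatting logic as getFlinkConfData: one "key: value" line per entry.
    static String render(Map<String, String> propertiesMap) throws IOException {
        try (StringWriter sw = new StringWriter();
            PrintWriter out = new PrintWriter(sw)) {
            propertiesMap.forEach((k, v) -> {
                out.print(k);
                out.print(": ");
                out.println(v);
            });
            return sw.toString();
        }
    }

    public static void main(String[] args) throws IOException {
        Map<String, String> props = new LinkedHashMap<>();
        props.put("jobmanager.rpc.address", "flink-jobmanager");
        props.put("taskmanager.numberOfTaskSlots", "4");
        // Prints:
        // jobmanager.rpc.address: flink-jobmanager
        // taskmanager.numberOfTaskSlots: 4
        System.out.print(render(props));
    }
}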

Example 34 with VisibleForTesting

Use of org.apache.flink.annotation.VisibleForTesting in the Apache Flink project.

From class KafkaSource, method createReader.

@VisibleForTesting
SourceReader<OUT, KafkaPartitionSplit> createReader(SourceReaderContext readerContext, Consumer<Collection<String>> splitFinishedHook) throws Exception {
    // Queue that hands fetched records from the fetcher threads to the reader.
    FutureCompletingBlockingQueue<RecordsWithSplitIds<ConsumerRecord<byte[], byte[]>>> elementsQueue = new FutureCompletingBlockingQueue<>();
    // Open the deserialization schema with access to metrics and the user code class loader.
    deserializationSchema.open(new DeserializationSchema.InitializationContext() {

        @Override
        public MetricGroup getMetricGroup() {
            return readerContext.metricGroup().addGroup("deserializer");
        }

        @Override
        public UserCodeClassLoader getUserCodeClassLoader() {
            return readerContext.getUserCodeClassLoader();
        }
    });
    final KafkaSourceReaderMetrics kafkaSourceReaderMetrics = new KafkaSourceReaderMetrics(readerContext.metricGroup());
    // Each split reader owns a Kafka consumer; the supplier lets the fetcher manager create them on demand.
    Supplier<KafkaPartitionSplitReader> splitReaderSupplier = () -> new KafkaPartitionSplitReader(props, readerContext, kafkaSourceReaderMetrics);
    KafkaRecordEmitter<OUT> recordEmitter = new KafkaRecordEmitter<>(deserializationSchema);
    // splitFinishedHook lets tests observe which splits the reader has finished.
    return new KafkaSourceReader<>(elementsQueue, new KafkaSourceFetcherManager(elementsQueue, splitReaderSupplier::get, splitFinishedHook), recordEmitter, toConfiguration(props), readerContext, kafkaSourceReaderMetrics);
}
Also used:
org.apache.flink.connector.kafka.source.reader.fetcher.KafkaSourceFetcherManager
org.apache.flink.metrics.MetricGroup
org.apache.flink.connector.kafka.source.metrics.KafkaSourceReaderMetrics
org.apache.flink.connector.kafka.source.reader.KafkaSourceReader
org.apache.flink.connector.base.source.reader.RecordsWithSplitIds
org.apache.flink.connector.kafka.source.reader.deserializer.KafkaRecordDeserializationSchema
org.apache.flink.api.common.serialization.DeserializationSchema
org.apache.flink.util.UserCodeClassLoader
org.apache.flink.connector.base.source.reader.synchronization.FutureCompletingBlockingQueue
org.apache.flink.connector.kafka.source.reader.KafkaRecordEmitter
org.apache.flink.connector.kafka.source.reader.KafkaPartitionSplitReader
org.apache.flink.annotation.VisibleForTesting
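The splitFinishedHook parameter exists so tests can observe which splits have finished. A minimal sketch of such a hook in isolation (wiring it into an actual KafkaSource and SourceReaderContext is beyond this sketch; the split ids are illustrative):

import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
import java.util.List;
import java.util.function.Consumer;

public class SplitFinishedHookSketch {
    public static void main(String[] args) {
        // Collect finished split ids so a test can assert on them later.
        List<String> finishedSplits = new ArrayList<>();
        Consumer<Collection<String>> splitFinishedHook = finishedSplits::addAll;

        // In a real test this hook would be passed to the @VisibleForTesting
        // createReader(readerContext, splitFinishedHook) overload shown above.
        splitFinishedHook.accept(Arrays.asList("topic-0", "topic-1")); // simulate the reader's callback
        System.out.println(finishedSplits); // [topic-0, topic-1]
    }
}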

Example 35 with VisibleForTesting

Use of org.apache.flink.annotation.VisibleForTesting in the Apache Flink project.

From class LocalRecoverableWriter, method generateStagingTempFilePath.

@VisibleForTesting
public static File generateStagingTempFilePath(File targetFile) {
    checkArgument(!targetFile.isDirectory(), "targetFile must not be a directory");
    final File parent = targetFile.getParentFile();
    final String name = targetFile.getName();
    checkArgument(parent != null, "targetFile must not be the root directory");
    // Probe hidden ".<name>.inprogress.<uuid>" siblings until an unused path is found.
    while (true) {
        File candidate = new File(parent, "." + name + ".inprogress." + UUID.randomUUID().toString());
        if (!candidate.exists()) {
            return candidate;
        }
    }
}
Also used:
java.io.File
org.apache.flink.annotation.VisibleForTesting
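Since the method is public static, it can be called directly. A minimal usage sketch (the target path is illustrative; assumes flink-core on the classpath and the class's usual package org.apache.flink.core.fs.local):

import java.io.File;

import org.apache.flink.core.fs.local.LocalRecoverableWriter;

public class StagingPathSketch {
    public static void main(String[] args) {
        File target = new File("/tmp/output/part-0"); // illustrative target file
        File staging = LocalRecoverableWriter.generateStagingTempFilePath(target);
        // A hidden sibling of the target, e.g. /tmp/output/.part-0.inprogress.<random-uuid>
        System.out.println(staging);
    }
}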

Aggregations

Imports co-occurring with VisibleForTesting across the project (usage counts):
org.apache.flink.annotation.VisibleForTesting: 64
java.util.HashMap: 11
java.io.IOException: 8
java.util.ArrayList: 7
org.apache.flink.configuration.Configuration: 7
java.util.Map: 6
java.io.File: 5
java.net.URI: 4
java.util.List: 4
org.apache.flink.api.java.tuple.Tuple2: 4
java.lang.reflect.Field: 3
java.util.Set: 3
javax.annotation.Nullable: 3
java.io.ByteArrayOutputStream: 2
java.io.InputStream: 2
java.nio.file.Path: 2
java.util.concurrent.ConcurrentHashMap: 2
java.util.regex.Matcher: 2
org.apache.flink.metrics.MetricGroup: 2
org.apache.flink.runtime.executiongraph.ExecutionJobVertex: 2