
Example 36 with DataSourceOptions

Use of org.apache.spark.sql.sources.v2.DataSourceOptions in the project spark-bigquery-connector by GoogleCloudDataproc.

From the class SparkBigQueryProxyAndHttpConfigTest, method testWhenProxyIsSetAndUserNameIsNull:

@Test
public void testWhenProxyIsSetAndUserNameIsNull() {
    // proxyPassword is set, but proxyUsername is deliberately left unset.
    ImmutableMap<String, String> optionsMap =
        ImmutableMap.<String, String>builder()
            .put("proxyAddress", "http://bq-connector-host:1234")
            .put("proxyPassword", "bq-connector-password")
            .build();
    Configuration emptyHadoopConfiguration = new Configuration();
    DataSourceOptions options = new DataSourceOptions(optionsMap);
    IllegalArgumentException exception =
        assertThrows(
            IllegalArgumentException.class,
            () -> SparkBigQueryProxyAndHttpConfig.from(
                options.asMap(), ImmutableMap.of(), emptyHadoopConfiguration));
    assertThat(exception)
        .hasMessageThat()
        .contains("Both proxyUsername and proxyPassword should be defined or not defined together");
}
Also used: Configuration (org.apache.hadoop.conf.Configuration), DataSourceOptions (org.apache.spark.sql.sources.v2.DataSourceOptions), Test (org.junit.Test)
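
The message asserted above suggests a simple pairing rule: the username and password must be supplied together or not at all. A minimal sketch of that rule using Guava's Preconditions; the actual SparkBigQueryProxyAndHttpConfig implementation may differ:

import com.google.common.base.Preconditions;
import java.util.Optional;

// Hypothetical helper illustrating the pairing rule; not the connector's real code.
static void checkProxyCredentials(Optional<String> proxyUsername, Optional<String> proxyPassword) {
    Preconditions.checkArgument(
        proxyUsername.isPresent() == proxyPassword.isPresent(),
        "Both proxyUsername and proxyPassword should be defined or not defined together");
}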

Example 37 with DataSourceOptions

Use of org.apache.spark.sql.sources.v2.DataSourceOptions in the project spark-bigquery-connector by GoogleCloudDataproc.

From the class SparkBigQueryProxyAndHttpConfigTest, method testWhenProxyIsNotSetAndUserNamePasswordAreNotNull:

@Test
public void testWhenProxyIsNotSetAndUserNamePasswordAreNotNull() {
    // Both credentials are set, but proxyAddress is missing.
    ImmutableMap<String, String> optionsMap =
        ImmutableMap.<String, String>builder()
            .put("proxyUsername", "bq-connector-user")
            .put("proxyPassword", "bq-connector-password")
            .build();
    Configuration emptyHadoopConfiguration = new Configuration();
    DataSourceOptions options = new DataSourceOptions(optionsMap);
    IllegalArgumentException exception =
        assertThrows(
            IllegalArgumentException.class,
            () -> SparkBigQueryProxyAndHttpConfig.from(
                options.asMap(), ImmutableMap.of(), emptyHadoopConfiguration));
    assertThat(exception)
        .hasMessageThat()
        .contains(
            "Please set proxyAddress in order to use a proxy. "
                + "Setting proxyUsername or proxyPassword is not enough");
}
Also used: Configuration (org.apache.hadoop.conf.Configuration), DataSourceOptions (org.apache.spark.sql.sources.v2.DataSourceOptions), Test (org.junit.Test)
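
For contrast, a sketch of an options map that should pass both of the checks exercised above, assuming the same from(...) entry point and option keys used in these tests (the test name itself is hypothetical):

@Test
public void testWhenProxyAndBothCredentialsAreSet() {
    ImmutableMap<String, String> optionsMap =
        ImmutableMap.<String, String>builder()
            .put("proxyAddress", "http://bq-connector-host:1234")
            .put("proxyUsername", "bq-connector-user")
            .put("proxyPassword", "bq-connector-password")
            .build();
    DataSourceOptions options = new DataSourceOptions(optionsMap);
    // Address plus both credentials: from(...) is expected to succeed.
    SparkBigQueryProxyAndHttpConfig config =
        SparkBigQueryProxyAndHttpConfig.from(
            options.asMap(), ImmutableMap.of(), new Configuration());
    assertThat(config.getProxyUsername()).isEqualTo(Optional.of("bq-connector-user"));
}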

Example 38 with DataSourceOptions

Use of org.apache.spark.sql.sources.v2.DataSourceOptions in the project spark-bigquery-connector by GoogleCloudDataproc.

From the class SparkBigQueryProxyAndHttpConfigTest, method testConfigViaSparkBigQueryConfigWithGlobalOptionsAndHadoopConfiguration:

@Test
public void testConfigViaSparkBigQueryConfigWithGlobalOptionsAndHadoopConfiguration() throws URISyntaxException {
    HashMap<String, String> sparkConfigOptions = new HashMap<>();
    sparkConfigOptions.put("table", "dataset.table");
    ImmutableMap<String, String> globalOptions = SparkBigQueryConfig.normalizeConf(defaultGlobalOptions);
    DataSourceOptions options = new DataSourceOptions(sparkConfigOptions);
    SparkBigQueryConfig sparkConfig =
        SparkBigQueryConfig.from(
            options.asMap(), // contains only one key, "table"
            globalOptions,
            defaultHadoopConfiguration,
            10,
            new SQLConf(),
            "2.4.0",
            Optional.empty());
    SparkBigQueryProxyAndHttpConfig config =
        (SparkBigQueryProxyAndHttpConfig) sparkConfig.getBigQueryProxyConfig();
    assertThat(config.getProxyUri()).isEqualTo(Optional.of(getURI("http", "bq-connector-host-global", 1234)));
    assertThat(config.getProxyUsername()).isEqualTo(Optional.of("bq-connector-user-global"));
    assertThat(config.getProxyPassword()).isEqualTo(Optional.of("bq-connector-password-global"));
    assertThat(config.getHttpMaxRetry()).isEqualTo(Optional.of(20));
    assertThat(config.getHttpConnectTimeout()).isEqualTo(Optional.of(20000));
    assertThat(config.getHttpReadTimeout()).isEqualTo(Optional.of(30000));
}
Also used: DataSourceOptions (org.apache.spark.sql.sources.v2.DataSourceOptions), SQLConf (org.apache.spark.sql.internal.SQLConf), Test (org.junit.Test)
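
The test relies on a defaultGlobalOptions fixture that is not shown here. A hypothetical reconstruction consistent with the asserted values follows; the exact option keys (and any prefix handling done by normalizeConf) are assumptions:

// Hypothetical fixture inferred from the assertions above; keys are assumed.
ImmutableMap<String, String> defaultGlobalOptions =
    ImmutableMap.<String, String>builder()
        .put("proxyAddress", "http://bq-connector-host-global:1234") // asserted proxy URI
        .put("proxyUsername", "bq-connector-user-global")
        .put("proxyPassword", "bq-connector-password-global")
        .put("httpMaxRetry", "20")          // assumed key behind getHttpMaxRetry()
        .put("httpConnectTimeout", "20000") // assumed key behind getHttpConnectTimeout()
        .put("httpReadTimeout", "30000")    // assumed key behind getHttpReadTimeout()
        .build();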

Example 39 with DataSourceOptions

Use of org.apache.spark.sql.sources.v2.DataSourceOptions in the project java-pubsublite-spark by googleapis.

From the class PslReadDataSourceOptionsTest, method testInvalidSubPath:

@Test
public void testInvalidSubPath() {
    DataSourceOptions options = new DataSourceOptions(ImmutableMap.of(Constants.SUBSCRIPTION_CONFIG_KEY, "invalid/path"));
    assertThrows(IllegalArgumentException.class, () -> PslReadDataSourceOptions.fromSparkDataSourceOptions(options));
}
Also used: DataSourceOptions (org.apache.spark.sql.sources.v2.DataSourceOptions), Test (org.junit.Test)
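
The value "invalid/path" fails because Pub/Sub Lite expects a full subscription resource path. A sketch with a well-formed path; the project number, location, and subscription name are placeholders:

DataSourceOptions options =
    new DataSourceOptions(
        ImmutableMap.of(
            Constants.SUBSCRIPTION_CONFIG_KEY,
            "projects/123456789/locations/us-central1-a/subscriptions/test-subscription"));
// With a valid resource path, parsing is expected to succeed rather than throw.
PslReadDataSourceOptions readOptions = PslReadDataSourceOptions.fromSparkDataSourceOptions(options);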

Example 40 with DataSourceOptions

Use of org.apache.spark.sql.sources.v2.DataSourceOptions in the project java-pubsublite-spark by googleapis.

From the class PslWriteDataSourceOptionsTest, method testInvalidTopicPath:

@Test
public void testInvalidTopicPath() {
    DataSourceOptions options = new DataSourceOptions(ImmutableMap.of(Constants.TOPIC_CONFIG_KEY, "invalid/path"));
    assertThrows(IllegalArgumentException.class, () -> PslWriteDataSourceOptions.fromSparkDataSourceOptions(options));
}
Also used: DataSourceOptions (org.apache.spark.sql.sources.v2.DataSourceOptions), Test (org.junit.Test)
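
The write-side options follow the same pattern, with a topic rather than a subscription path. A sketch with placeholder names again:

DataSourceOptions options =
    new DataSourceOptions(
        ImmutableMap.of(
            Constants.TOPIC_CONFIG_KEY,
            "projects/123456789/locations/us-central1-a/topics/test-topic"));
// A full topic resource path is expected to parse without throwing.
PslWriteDataSourceOptions writeOptions = PslWriteDataSourceOptions.fromSparkDataSourceOptions(options);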

Aggregations

DataSourceOptions (org.apache.spark.sql.sources.v2.DataSourceOptions): 38
Test (org.junit.Test): 33
HashMap (java.util.HashMap): 13
Configuration (org.apache.hadoop.conf.Configuration): 13
SQLConf (org.apache.spark.sql.internal.SQLConf): 10
ArrayList (java.util.ArrayList): 4
HoodieWriteConfig (org.apache.hudi.config.HoodieWriteConfig): 4
Row (org.apache.spark.sql.Row): 4
InternalRow (org.apache.spark.sql.catalyst.InternalRow): 4
ParameterizedTest (org.junit.jupiter.params.ParameterizedTest): 4
MethodSource (org.junit.jupiter.params.provider.MethodSource): 4
List (java.util.List): 3
DataSourceReader (org.apache.spark.sql.sources.v2.reader.DataSourceReader): 3
Layout (io.tiledb.java.api.Layout): 2
ByteArrayOutputStream (java.io.ByteArrayOutputStream): 2
File (java.io.File): 2
ObjectOutputStream (java.io.ObjectOutputStream): 2
URI (java.net.URI): 2
DataFile (org.apache.iceberg.DataFile): 2
Table (org.apache.iceberg.Table): 2