
Example 36 with TableException

Use of org.apache.flink.table.api.TableException in the Apache Flink project.

From the class TableEnvironmentImpl, method compilePlanAndWrite.

private CompiledPlan compilePlanAndWrite(String filePath, boolean ifNotExists, Operation operation) {
    File file = Paths.get(filePath).toFile();
    if (file.exists()) {
        if (ifNotExists) {
            return loadPlan(PlanReference.fromFile(filePath));
        }
        if (!tableConfig.getConfiguration().get(TableConfigOptions.PLAN_FORCE_RECOMPILE)) {
            throw new TableException(String.format("Cannot overwrite the plan file '%s'. " + "Either manually remove the file or, " + "if you're debugging your job, " + "set the option '%s' to true.", filePath, TableConfigOptions.PLAN_FORCE_RECOMPILE.key()));
        }
    }
    CompiledPlan compiledPlan;
    if (operation instanceof StatementSetOperation) {
        compiledPlan = compilePlan(((StatementSetOperation) operation).getOperations());
    } else if (operation instanceof ModifyOperation) {
        compiledPlan = compilePlan(Collections.singletonList((ModifyOperation) operation));
    } else {
        throw new TableException("Unsupported operation to compile: " + operation.getClass() + ". This is a bug, please file an issue.");
    }
    compiledPlan.writeToFile(file, false);
    return compiledPlan;
}
Also used: CompiledPlan (org.apache.flink.table.api.CompiledPlan), TableException (org.apache.flink.table.api.TableException), ModifyOperation (org.apache.flink.table.operations.ModifyOperation), SinkModifyOperation (org.apache.flink.table.operations.SinkModifyOperation), CollectModifyOperation (org.apache.flink.table.operations.CollectModifyOperation), File (java.io.File), StatementSetOperation (org.apache.flink.table.operations.StatementSetOperation)
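
For context, a minimal sketch of how this guard typically surfaces to a user: compiling a plan to the same file twice via the COMPILE PLAN statement fails with the TableException above unless table.plan.force-recompile is enabled. The table DDL, file path, and exact SQL below are illustrative assumptions, not taken from the example.

import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;
import org.apache.flink.table.api.config.TableConfigOptions;

public class CompilePlanOverwriteSketch {
    public static void main(String[] args) {
        TableEnvironment tEnv =
                TableEnvironment.create(EnvironmentSettings.newInstance().inStreamingMode().build());

        // Placeholder tables for illustration; datagen/blackhole are bundled test connectors.
        tEnv.executeSql("CREATE TABLE src (x INT) WITH ('connector' = 'datagen')");
        tEnv.executeSql("CREATE TABLE snk (x INT) WITH ('connector' = 'blackhole')");

        // First compilation writes the JSON plan to the (hypothetical) path.
        tEnv.executeSql("COMPILE PLAN 'file:///tmp/plan.json' FOR INSERT INTO snk SELECT x FROM src");

        // Re-running the same statement reaches the guard in compilePlanAndWrite and throws a
        // TableException, because the file already exists. Enabling the option allows overwriting.
        tEnv.getConfig().getConfiguration().set(TableConfigOptions.PLAN_FORCE_RECOMPILE, true);
        tEnv.executeSql("COMPILE PLAN 'file:///tmp/plan.json' FOR INSERT INTO snk SELECT x FROM src");
    }
}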

Example 37 with TableException

Use of org.apache.flink.table.api.TableException in the Apache Flink project.

From the class TableEnvironmentImpl, method alterCatalogFunction.

private TableResultInternal alterCatalogFunction(AlterCatalogFunctionOperation alterCatalogFunctionOperation) {
    String exMsg = getDDLOpExecuteErrorMsg(alterCatalogFunctionOperation.asSummaryString());
    try {
        CatalogFunction function = alterCatalogFunctionOperation.getCatalogFunction();
        if (alterCatalogFunctionOperation.isTemporary()) {
            throw new ValidationException("Alter temporary catalog function is not supported");
        } else {
            Catalog catalog = getCatalogOrThrowException(alterCatalogFunctionOperation.getFunctionIdentifier().getCatalogName());
            catalog.alterFunction(alterCatalogFunctionOperation.getFunctionIdentifier().toObjectPath(), function, alterCatalogFunctionOperation.isIfExists());
        }
        return TableResultImpl.TABLE_RESULT_OK;
    } catch (ValidationException e) {
        throw e;
    } catch (FunctionNotExistException e) {
        throw new ValidationException(e.getMessage(), e);
    } catch (Exception e) {
        throw new TableException(exMsg, e);
    }
}
Also used: FunctionNotExistException (org.apache.flink.table.catalog.exceptions.FunctionNotExistException), TableException (org.apache.flink.table.api.TableException), ValidationException (org.apache.flink.table.api.ValidationException), CatalogFunction (org.apache.flink.table.catalog.CatalogFunction), Catalog (org.apache.flink.table.catalog.Catalog), GenericInMemoryCatalog (org.apache.flink.table.catalog.GenericInMemoryCatalog), FunctionCatalog (org.apache.flink.table.catalog.FunctionCatalog), FunctionAlreadyExistException (org.apache.flink.table.catalog.exceptions.FunctionAlreadyExistException), DatabaseNotExistException (org.apache.flink.table.catalog.exceptions.DatabaseNotExistException), TableAlreadyExistException (org.apache.flink.table.catalog.exceptions.TableAlreadyExistException), IOException (java.io.IOException), ExecutionException (java.util.concurrent.ExecutionException), CatalogException (org.apache.flink.table.catalog.exceptions.CatalogException), DatabaseNotEmptyException (org.apache.flink.table.catalog.exceptions.DatabaseNotEmptyException), DatabaseAlreadyExistException (org.apache.flink.table.catalog.exceptions.DatabaseAlreadyExistException), SqlParserException (org.apache.flink.table.api.SqlParserException), TableNotExistException (org.apache.flink.table.catalog.exceptions.TableNotExistException)
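
A hedged sketch of how alterCatalogFunction is reached from SQL: ALTER FUNCTION on a persistent catalog function delegates to Catalog#alterFunction, while the temporary variant hits the ValidationException branch above. The function names and UDF class names below are placeholders and would need to exist on the classpath in a real job.

import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class AlterFunctionSketch {
    public static void main(String[] args) {
        TableEnvironment tEnv =
                TableEnvironment.create(EnvironmentSettings.newInstance().inStreamingMode().build());

        // Altering a persistent catalog function ends up in Catalog#alterFunction.
        // The 'com.example.*' class names are placeholders for real UDF implementations.
        tEnv.executeSql("CREATE FUNCTION my_func AS 'com.example.MyScalarFunction'");
        tEnv.executeSql("ALTER FUNCTION my_func AS 'com.example.MyOtherScalarFunction'");

        // The temporary variant is rejected with the ValidationException shown above:
        // tEnv.executeSql("ALTER TEMPORARY FUNCTION my_temp_func AS 'com.example.MyScalarFunction'");
    }
}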

Example 38 with TableException

Use of org.apache.flink.table.api.TableException in the Apache Flink project.

From the class TableEnvironmentImpl, method createTemporaryView.

private void createTemporaryView(UnresolvedIdentifier identifier, Table view) {
    if (((TableImpl) view).getTableEnvironment() != this) {
        throw new TableException("Only table API objects that belong to this TableEnvironment can be registered.");
    }
    ObjectIdentifier tableIdentifier = catalogManager.qualifyIdentifier(identifier);
    QueryOperation queryOperation = qualifyQueryOperation(tableIdentifier, view.getQueryOperation());
    CatalogBaseTable tableTable = new QueryOperationCatalogView(queryOperation);
    catalogManager.createTemporaryTable(tableTable, tableIdentifier, false);
}
Also used: TableException (org.apache.flink.table.api.TableException), CatalogBaseTable (org.apache.flink.table.catalog.CatalogBaseTable), QueryOperationCatalogView (org.apache.flink.table.catalog.QueryOperationCatalogView), ObjectIdentifier (org.apache.flink.table.catalog.ObjectIdentifier), QueryOperation (org.apache.flink.table.operations.QueryOperation), TableSourceQueryOperation (org.apache.flink.table.operations.TableSourceQueryOperation), SourceQueryOperation (org.apache.flink.table.operations.SourceQueryOperation)
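
A minimal sketch of the ownership check above: a Table can only be registered as a temporary view in the TableEnvironment that produced it. The table and view names are made up for illustration.

import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.TableEnvironment;

public class CrossEnvironmentViewSketch {
    public static void main(String[] args) {
        TableEnvironment envA =
                TableEnvironment.create(EnvironmentSettings.newInstance().inStreamingMode().build());
        TableEnvironment envB =
                TableEnvironment.create(EnvironmentSettings.newInstance().inStreamingMode().build());

        envA.executeSql("CREATE TABLE src (x INT) WITH ('connector' = 'datagen')");
        Table fromA = envA.from("src");

        // Registering the Table in the environment that created it works as expected.
        envA.createTemporaryView("view_a", fromA);

        // Registering it in a different TableEnvironment fails the ownership check above
        // and throws a TableException:
        // envB.createTemporaryView("view_b", fromA);
    }
}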

Example 39 with TableException

Use of org.apache.flink.table.api.TableException in the Apache Flink project.

From the class FunctionCatalog, method dropCatalogFunction.

/**
 * Drops a catalog function by also considering temporary catalog functions. Returns true if a
 * function was dropped.
 */
public boolean dropCatalogFunction(UnresolvedIdentifier unresolvedIdentifier, boolean ignoreIfNotExist) {
    final ObjectIdentifier identifier = catalogManager.qualifyIdentifier(unresolvedIdentifier);
    final ObjectIdentifier normalizedIdentifier = FunctionIdentifier.normalizeObjectIdentifier(identifier);
    final Catalog catalog = catalogManager.getCatalog(normalizedIdentifier.getCatalogName()).orElseThrow(IllegalStateException::new);
    final ObjectPath path = identifier.toObjectPath();
    // we force users to deal with temporary catalog functions first
    if (tempCatalogFunctions.containsKey(normalizedIdentifier)) {
        throw new ValidationException(String.format("Could not drop catalog function. A temporary function '%s' does already exist. " + "Please drop the temporary function first.", identifier.asSummaryString()));
    }
    if (!catalog.functionExists(path)) {
        if (ignoreIfNotExist) {
            return false;
        }
        throw new ValidationException(String.format("Could not drop catalog function. A function '%s' doesn't exist.", identifier.asSummaryString()));
    }
    try {
        catalog.dropFunction(path, ignoreIfNotExist);
    } catch (Throwable t) {
        throw new TableException(String.format("Could not drop catalog function '%s'.", identifier.asSummaryString()), t);
    }
    return true;
}
Also used: TableException (org.apache.flink.table.api.TableException), ValidationException (org.apache.flink.table.api.ValidationException)
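
A hedged sketch of the "drop the temporary function first" rule above: when a temporary catalog function shadows a catalog function under the same identifier, DROP FUNCTION is rejected until the temporary one is removed. Function and class names below are placeholders; the UDF class would need to be on the classpath in a real job.

import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class DropFunctionSketch {
    public static void main(String[] args) {
        TableEnvironment tEnv =
                TableEnvironment.create(EnvironmentSettings.newInstance().inStreamingMode().build());

        // A temporary catalog function shadows the catalog one under the same identifier.
        tEnv.executeSql("CREATE FUNCTION my_func AS 'com.example.MyScalarFunction'");
        tEnv.executeSql("CREATE TEMPORARY FUNCTION my_func AS 'com.example.MyScalarFunction'");

        // DROP FUNCTION targets the catalog function, but a temporary one with the same
        // identifier still exists, so the guard above throws a ValidationException:
        // tEnv.executeSql("DROP FUNCTION my_func");

        // Dropping the temporary function first lets the catalog drop succeed.
        tEnv.executeSql("DROP TEMPORARY FUNCTION my_func");
        tEnv.executeSql("DROP FUNCTION my_func");
    }
}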

Example 40 with TableException

Use of org.apache.flink.table.api.TableException in the Apache Flink project.

From the class FunctionCatalog, method registerCatalogFunction.

/**
 * Registers a catalog function by also considering temporary catalog functions.
 */
public void registerCatalogFunction(UnresolvedIdentifier unresolvedIdentifier, Class<? extends UserDefinedFunction> functionClass, boolean ignoreIfExists) {
    final ObjectIdentifier identifier = catalogManager.qualifyIdentifier(unresolvedIdentifier);
    final ObjectIdentifier normalizedIdentifier = FunctionIdentifier.normalizeObjectIdentifier(identifier);
    try {
        UserDefinedFunctionHelper.validateClass(functionClass);
    } catch (Throwable t) {
        throw new ValidationException(String.format("Could not register catalog function '%s' due to implementation errors.", identifier.asSummaryString()), t);
    }
    final Catalog catalog = catalogManager.getCatalog(normalizedIdentifier.getCatalogName()).orElseThrow(IllegalStateException::new);
    final ObjectPath path = identifier.toObjectPath();
    // we force users to deal with temporary catalog functions first
    if (tempCatalogFunctions.containsKey(normalizedIdentifier)) {
        if (ignoreIfExists) {
            return;
        }
        throw new ValidationException(String.format("Could not register catalog function. A temporary function '%s' does already exist. " + "Please drop the temporary function first.", identifier.asSummaryString()));
    }
    if (catalog.functionExists(path)) {
        if (ignoreIfExists) {
            return;
        }
        throw new ValidationException(String.format("Could not register catalog function. A function '%s' does already exist.", identifier.asSummaryString()));
    }
    final CatalogFunction catalogFunction = new CatalogFunctionImpl(functionClass.getName(), FunctionLanguage.JAVA);
    try {
        catalog.createFunction(path, catalogFunction, ignoreIfExists);
    } catch (Throwable t) {
        throw new TableException(String.format("Could not register catalog function '%s'.", identifier.asSummaryString()), t);
    }
}
Also used: TableException (org.apache.flink.table.api.TableException), ValidationException (org.apache.flink.table.api.ValidationException)
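
A minimal sketch of one path into registerCatalogFunction: TableEnvironment#createFunction registers a class-based catalog function and, with ignoreIfExists set, silently skips an already existing one. The function name and UDF below are illustrative.

import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;
import org.apache.flink.table.functions.ScalarFunction;

public class RegisterCatalogFunctionSketch {

    // A trivial UDF used only for illustration.
    public static class UpperCaseFunction extends ScalarFunction {
        public String eval(String s) {
            return s == null ? null : s.toUpperCase();
        }
    }

    public static void main(String[] args) {
        TableEnvironment tEnv =
                TableEnvironment.create(EnvironmentSettings.newInstance().inStreamingMode().build());

        // Registers a catalog (non-temporary) function; this call ends up in
        // FunctionCatalog#registerCatalogFunction shown above.
        tEnv.createFunction("upper_case", UpperCaseFunction.class);

        // Registering the same name again without ignoreIfExists triggers the
        // "does already exist" ValidationException; with ignoreIfExists=true it is a no-op.
        tEnv.createFunction("upper_case", UpperCaseFunction.class, true);
    }
}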

Aggregations

TableException (org.apache.flink.table.api.TableException): 163
RowData (org.apache.flink.table.data.RowData): 35
RowType (org.apache.flink.table.types.logical.RowType): 35
Transformation (org.apache.flink.api.dag.Transformation): 28
ArrayList (java.util.ArrayList): 27
ExecEdge (org.apache.flink.table.planner.plan.nodes.exec.ExecEdge): 24
LogicalType (org.apache.flink.table.types.logical.LogicalType): 24
List (java.util.List): 22
DataType (org.apache.flink.table.types.DataType): 19
OneInputTransformation (org.apache.flink.streaming.api.transformations.OneInputTransformation): 18
ValidationException (org.apache.flink.table.api.ValidationException): 17
IOException (java.io.IOException): 13
AggregateCall (org.apache.calcite.rel.core.AggregateCall): 13
ValueLiteralExpression (org.apache.flink.table.expressions.ValueLiteralExpression): 13
RowDataKeySelector (org.apache.flink.table.runtime.keyselector.RowDataKeySelector): 13
Optional (java.util.Optional): 11
Configuration (org.apache.flink.configuration.Configuration): 11
StreamExecutionEnvironment (org.apache.flink.streaming.api.environment.StreamExecutionEnvironment): 11
Constructor (java.lang.reflect.Constructor): 10
Arrays (java.util.Arrays): 9