
Example 21 with ValidationException

use of org.apache.calcite.tools.ValidationException in project druid by apache.

the class DruidPlanner method plan.

/**
 * Plan an SQL query for execution, returning a {@link PlannerResult} which can be used to actually execute the query.
 *
 * Ideally, the query is planned into a native Druid query using {@link #planWithDruidConvention}, but it will
 * fall back to {@link #planWithBindableConvention} if this is not possible.
 *
 * In some future this could perhaps re-use some of the work done by {@link #validate(boolean)}
 * instead of repeating it, but that day is not today.
 */
public PlannerResult plan() throws SqlParseException, ValidationException, RelConversionException {
    resetPlanner();
    final ParsedNodes parsed = ParsedNodes.create(planner.parse(plannerContext.getSql()), plannerContext.getTimeZone());
    try {
        if (parsed.getIngestionGranularity() != null) {
            plannerContext.getQueryContext().addSystemParam(DruidSqlInsert.SQL_INSERT_SEGMENT_GRANULARITY, plannerContext.getJsonMapper().writeValueAsString(parsed.getIngestionGranularity()));
        }
    } catch (JsonProcessingException e) {
        throw new ValidationException("Unable to serialize partition granularity.");
    }
    if (parsed.getReplaceIntervals() != null) {
        plannerContext.getQueryContext().addSystemParam(DruidSqlReplace.SQL_REPLACE_TIME_CHUNKS, String.join(",", parsed.getReplaceIntervals()));
    }
    // the planner's type factory is not available until after parsing
    this.rexBuilder = new RexBuilder(planner.getTypeFactory());
    final SqlNode parameterizedQueryNode = rewriteDynamicParameters(parsed.getQueryNode());
    final SqlNode validatedQueryNode = planner.validate(parameterizedQueryNode);
    final RelRoot rootQueryRel = planner.rel(validatedQueryNode);
    try {
        return planWithDruidConvention(rootQueryRel, parsed.getExplainNode(), parsed.getInsertOrReplace());
    } catch (Exception e) {
        Throwable cannotPlanException = Throwables.getCauseOfType(e, RelOptPlanner.CannotPlanException.class);
        if (null == cannotPlanException) {
            // Not a CannotPlanException, rethrow without trying the bindable convention
            throw e;
        }
        // If there is no ingestion clause, retry with the BINDABLE convention and return without
        // error if the query is plannable that way.
        if (parsed.getInsertOrReplace() == null) {
            // Try again with BINDABLE convention. Used for querying Values and metadata tables.
            try {
                return planWithBindableConvention(rootQueryRel, parsed.getExplainNode());
            } catch (Exception e2) {
                e.addSuppressed(e2);
            }
        }
        Logger logger = log;
        if (!plannerContext.getQueryContext().isDebug()) {
            logger = log.noStackTrace();
        }
        String errorMessage = buildSQLPlanningErrorMessage(cannotPlanException);
        logger.warn(e, errorMessage);
        throw new UnsupportedSQLQueryException(errorMessage);
    }
}
Also used : ValidationException(org.apache.calcite.tools.ValidationException) RexBuilder(org.apache.calcite.rex.RexBuilder) RelRoot(org.apache.calcite.rel.RelRoot) Logger(org.apache.druid.java.util.common.logger.Logger) EmittingLogger(org.apache.druid.java.util.emitter.EmittingLogger) JsonProcessingException(com.fasterxml.jackson.core.JsonProcessingException) SqlParseException(org.apache.calcite.sql.parser.SqlParseException) RelConversionException(org.apache.calcite.tools.RelConversionException) SqlNode(org.apache.calcite.sql.SqlNode)
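
The fallback above hinges on finding a RelOptPlanner.CannotPlanException anywhere in the cause chain of the thrown exception (the Throwables.getCauseOfType call is Druid's utility for that). As a rough, self-contained illustration of that cause-chain search in plain Java, here is a minimal sketch; the class and method names (CauseChains, findCauseOfType) are hypothetical and not part of Druid or Calcite.

// Minimal sketch of walking an exception's cause chain for a given type, similar in
// spirit to the Throwables.getCauseOfType call above. Names here are hypothetical.
public final class CauseChains {

    private CauseChains() {
    }

    public static <T extends Throwable> T findCauseOfType(Throwable root, Class<T> type) {
        for (Throwable t = root; t != null; t = t.getCause()) {
            if (type.isInstance(t)) {
                return type.cast(t);
            }
        }
        // Nothing of the requested type anywhere in the chain
        return null;
    }

    public static void main(String[] args) {
        Exception wrapped = new RuntimeException("outer", new IllegalStateException("inner"));
        // Prints the inner IllegalStateException, found one level down the chain
        System.out.println(findCauseOfType(wrapped, IllegalStateException.class));
    }
}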

Example 22 with ValidationException

use of org.apache.calcite.tools.ValidationException in project druid by apache.

the class PlannerFactory method createPlannerForTesting.

/**
 * Not just visible for, but only for testing. Create a planner pre-loaded with an escalated authentication result
 * and a ready-to-go authorization result.
 */
@VisibleForTesting
public DruidPlanner createPlannerForTesting(final Map<String, Object> queryContext, String query) {
    final DruidPlanner thePlanner = createPlanner(query, new QueryContext(queryContext));
    thePlanner.getPlannerContext().setAuthenticationResult(NoopEscalator.getInstance().createEscalatedAuthenticationResult());
    try {
        thePlanner.validate(false);
    } catch (SqlParseException | ValidationException e) {
        throw new RuntimeException(e);
    }
    thePlanner.getPlannerContext().setAuthorizationResult(Access.OK);
    return thePlanner;
}
Also used : ValidationException(org.apache.calcite.tools.ValidationException) SqlParseException(org.apache.calcite.sql.parser.SqlParseException) QueryContext(org.apache.druid.query.QueryContext) VisibleForTesting(com.google.common.annotations.VisibleForTesting)
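
The try/catch here wraps the checked SqlParseException and ValidationException in an unchecked RuntimeException so that test code calling the helper needs no throws clauses. A self-contained sketch of that same pattern, with a hypothetical checked SetupException standing in for the Calcite exceptions (none of these names are Druid APIs):

// Sketch of the checked-to-unchecked wrapping used by the test helper above.
public final class WrapCheckedExample {

    // Stand-in for a checked exception such as ValidationException
    static class SetupException extends Exception {
        SetupException(String message) {
            super(message);
        }
    }

    static void validate(boolean strict) throws SetupException {
        if (strict) {
            throw new SetupException("validation failed");
        }
    }

    static void createForTesting() {
        try {
            validate(false);
        } catch (SetupException e) {
            // Rethrow unchecked so callers in tests stay free of throws clauses
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        createForTesting();
        System.out.println("helper created; checked exceptions do not leak to the caller");
    }
}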

Example 23 with ValidationException

use of org.apache.calcite.tools.ValidationException in project druid by druid-io.

the class DruidSqlParserUtils method validateQueryAndConvertToIntervals.

/**
 * This method validates and converts a {@link SqlNode} representing a query into an optimized list of intervals to
 * be used in creating an ingestion spec. If the SqlNode is a SqlLiteral of {@link #ALL}, it returns a singleton list
 * of "ALL". Otherwise, it converts and optimizes the query using {@link MoveTimeFiltersToIntervals} into a list of
 * intervals which contain all valid values of time as per the query.
 *
 * The following validations are performed
 * 1. Only __time column and timestamp literals are present in the query
 * 2. The interval after optimization is not empty
 * 3. The operands in the expression are supported
 * 4. The intervals after adjusting for timezone are aligned with the granularity parameter
 *
 * @param replaceTimeQuery Sql node representing the query
 * @param granularity granularity of the query for validation
 * @param dateTimeZone timezone
 * @return List of string representation of intervals
 * @throws ValidationException if the SqlNode cannot be converted to a list of intervals
 */
public static List<String> validateQueryAndConvertToIntervals(SqlNode replaceTimeQuery, Granularity granularity, DateTimeZone dateTimeZone) throws ValidationException {
    if (replaceTimeQuery instanceof SqlLiteral && ALL.equalsIgnoreCase(((SqlLiteral) replaceTimeQuery).toValue())) {
        return ImmutableList.of(ALL);
    }
    DimFilter dimFilter = convertQueryToDimFilter(replaceTimeQuery, dateTimeZone);
    Filtration filtration = Filtration.create(dimFilter);
    filtration = MoveTimeFiltersToIntervals.instance().apply(filtration);
    List<Interval> intervals = filtration.getIntervals();
    if (filtration.getDimFilter() != null) {
        throw new ValidationException("Only " + ColumnHolder.TIME_COLUMN_NAME + " column is supported in OVERWRITE WHERE clause");
    }
    if (intervals.isEmpty()) {
        throw new ValidationException("Intervals for replace are empty");
    }
    for (Interval interval : intervals) {
        DateTime intervalStart = interval.getStart();
        DateTime intervalEnd = interval.getEnd();
        if (!granularity.bucketStart(intervalStart).equals(intervalStart) || !granularity.bucketStart(intervalEnd).equals(intervalEnd)) {
            throw new ValidationException("OVERWRITE WHERE clause contains an interval " + intervals + " which is not aligned with PARTITIONED BY granularity " + granularity);
        }
    }
    return intervals.stream().map(AbstractInterval::toString).collect(Collectors.toList());
}
Also used : Filtration(org.apache.druid.sql.calcite.filtration.Filtration) ValidationException(org.apache.calcite.tools.ValidationException) SqlLiteral(org.apache.calcite.sql.SqlLiteral) BoundDimFilter(org.apache.druid.query.filter.BoundDimFilter) AndDimFilter(org.apache.druid.query.filter.AndDimFilter) NotDimFilter(org.apache.druid.query.filter.NotDimFilter) DimFilter(org.apache.druid.query.filter.DimFilter) OrDimFilter(org.apache.druid.query.filter.OrDimFilter) ZonedDateTime(java.time.ZonedDateTime) DateTime(org.joda.time.DateTime) AbstractInterval(org.joda.time.base.AbstractInterval) Interval(org.joda.time.Interval)
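
The alignment check above rejects any interval whose endpoints are not on bucket boundaries of the PARTITIONED BY granularity (bucketStart(t) must return t itself). A self-contained sketch of the same idea using only java.time, with truncation to DAYS standing in for a DAY granularity's bucketStart; this is an illustration of the check, not Druid's Granularity implementation.

// Hedged sketch of the interval-alignment check using JDK types only.
import java.time.ZoneOffset;
import java.time.ZonedDateTime;
import java.time.temporal.ChronoUnit;

public final class AlignmentCheck {

    // An endpoint is aligned when truncating it to the bucket start changes nothing.
    static boolean alignedToDay(ZonedDateTime start, ZonedDateTime end) {
        return start.truncatedTo(ChronoUnit.DAYS).equals(start)
                && end.truncatedTo(ChronoUnit.DAYS).equals(end);
    }

    public static void main(String[] args) {
        ZonedDateTime jan1 = ZonedDateTime.of(2022, 1, 1, 0, 0, 0, 0, ZoneOffset.UTC);
        System.out.println(alignedToDay(jan1, jan1.plusDays(1)));   // true: both on day boundaries
        System.out.println(alignedToDay(jan1, jan1.plusHours(3)));  // false: end is 03:00, not a day boundary
    }
}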

Example 24 with ValidationException

use of org.apache.calcite.tools.ValidationException in project druid by druid-io.

the class DruidSqlParserUtils method convertQueryToDimFilter.

/**
 * This method converts a {@link SqlNode} representing a query into a {@link DimFilter} for the same query.
 * It takes the timezone as a separate parameter, as SQL timestamps don't contain that information. The supported
 * operators in the SQL query are AND, OR, NOT, >, <, >=, <=, and BETWEEN.
 *
 * @param replaceTimeQuery Sql node representing the query
 * @param dateTimeZone timezone
 * @return DimFilter for the query
 * @throws ValidationException if the SqlNode cannot be converted to a DimFilter
 */
public static DimFilter convertQueryToDimFilter(SqlNode replaceTimeQuery, DateTimeZone dateTimeZone) throws ValidationException {
    if (!(replaceTimeQuery instanceof SqlBasicCall)) {
        log.error("Expected SqlBasicCall during parsing, but found " + replaceTimeQuery.getClass().getName());
        throw new ValidationException("Invalid OVERWRITE WHERE clause");
    }
    String columnName;
    SqlBasicCall sqlBasicCall = (SqlBasicCall) replaceTimeQuery;
    List<SqlNode> operandList = sqlBasicCall.getOperandList();
    switch(sqlBasicCall.getOperator().getKind()) {
        case AND:
            List<DimFilter> dimFilters = new ArrayList<>();
            for (SqlNode sqlNode : sqlBasicCall.getOperandList()) {
                dimFilters.add(convertQueryToDimFilter(sqlNode, dateTimeZone));
            }
            return new AndDimFilter(dimFilters);
        case OR:
            dimFilters = new ArrayList<>();
            for (SqlNode sqlNode : sqlBasicCall.getOperandList()) {
                dimFilters.add(convertQueryToDimFilter(sqlNode, dateTimeZone));
            }
            return new OrDimFilter(dimFilters);
        case NOT:
            return new NotDimFilter(convertQueryToDimFilter(sqlBasicCall.getOperandList().get(0), dateTimeZone));
        case GREATER_THAN_OR_EQUAL:
            // __time >= t: inclusive lower bound
            columnName = parseColumnName(operandList.get(0));
            return new BoundDimFilter(columnName, parseTimeStampWithTimeZone(operandList.get(1), dateTimeZone), null, false, null, null, null, StringComparators.NUMERIC);
        case LESS_THAN_OR_EQUAL:
            // __time <= t: inclusive upper bound
            columnName = parseColumnName(operandList.get(0));
            return new BoundDimFilter(columnName, null, parseTimeStampWithTimeZone(operandList.get(1), dateTimeZone), null, false, null, null, StringComparators.NUMERIC);
        case GREATER_THAN:
            // __time > t: strict (exclusive) lower bound
            columnName = parseColumnName(operandList.get(0));
            return new BoundDimFilter(columnName, parseTimeStampWithTimeZone(operandList.get(1), dateTimeZone), null, true, null, null, null, StringComparators.NUMERIC);
        case LESS_THAN:
            // __time < t: strict (exclusive) upper bound
            columnName = parseColumnName(operandList.get(0));
            return new BoundDimFilter(columnName, null, parseTimeStampWithTimeZone(operandList.get(1), dateTimeZone), null, true, null, null, StringComparators.NUMERIC);
        case BETWEEN:
            // __time BETWEEN a AND b: inclusive bounds on both ends
            columnName = parseColumnName(operandList.get(0));
            return new BoundDimFilter(columnName, parseTimeStampWithTimeZone(operandList.get(1), dateTimeZone), parseTimeStampWithTimeZone(operandList.get(2), dateTimeZone), false, false, null, null, StringComparators.NUMERIC);
        default:
            throw new ValidationException("Unsupported operation in OVERWRITE WHERE clause: " + sqlBasicCall.getOperator().getName());
    }
}
Also used : ValidationException(org.apache.calcite.tools.ValidationException) NotDimFilter(org.apache.druid.query.filter.NotDimFilter) BoundDimFilter(org.apache.druid.query.filter.BoundDimFilter) AndDimFilter(org.apache.druid.query.filter.AndDimFilter) ArrayList(java.util.ArrayList) SqlBasicCall(org.apache.calcite.sql.SqlBasicCall) OrDimFilter(org.apache.druid.query.filter.OrDimFilter) DimFilter(org.apache.druid.query.filter.DimFilter) SqlNode(org.apache.calcite.sql.SqlNode)
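
For illustration, this is the filter tree the conversion above would produce for an OVERWRITE WHERE clause like __time >= TIMESTAMP '2022-01-01 00:00:00' AND __time < TIMESTAMP '2022-02-01 00:00:00', assuming a UTC session timezone. This is a hedged sketch rather than a test from the project; the epoch-millis strings are the UTC values, and the classes and constructor shapes are the same ones used in the snippet (Druid's filter classes must be on the classpath).

// Hedged sketch of the expected output filter for the clause above.
// __time >= 2022-01-01T00:00:00Z -> inclusive lower bound
DimFilter lowerBound = new BoundDimFilter("__time", "1640995200000", null, false, null, null, null, StringComparators.NUMERIC);
// __time < 2022-02-01T00:00:00Z -> strict (exclusive) upper bound
DimFilter upperBound = new BoundDimFilter("__time", null, "1643673600000", null, true, null, null, StringComparators.NUMERIC);
// AND of the two bounds; java.util.Arrays.asList builds the List<DimFilter> the constructor expects
DimFilter expected = new AndDimFilter(Arrays.asList(lowerBound, upperBound));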

Example 25 with ValidationException

use of org.apache.calcite.tools.ValidationException in project druid by druid-io.

the class DruidSqlParserUtils method parseTimeStampWithTimeZone.

/**
 * Converts a {@link SqlNode} into a timestamp, taking into account the timezone
 *
 * @param sqlNode the sql node
 * @param timeZone timezone
 * @return the timestamp as a string of milliseconds since the epoch
 * @throws ValidationException if the sql node is not a SqlTimestampLiteral
 */
public static String parseTimeStampWithTimeZone(SqlNode sqlNode, DateTimeZone timeZone) throws ValidationException {
    if (!(sqlNode instanceof SqlTimestampLiteral)) {
        throw new ValidationException("Expressions must be of the form __time <operator> TIMESTAMP");
    }
    Timestamp sqlTimestamp = Timestamp.valueOf(((SqlTimestampLiteral) sqlNode).toFormattedString());
    ZonedDateTime zonedTimestamp = sqlTimestamp.toLocalDateTime().atZone(timeZone.toTimeZone().toZoneId());
    return String.valueOf(zonedTimestamp.toInstant().toEpochMilli());
}
Also used : ValidationException(org.apache.calcite.tools.ValidationException) ZonedDateTime(java.time.ZonedDateTime) Timestamp(java.sql.Timestamp) SqlTimestampLiteral(org.apache.calcite.sql.SqlTimestampLiteral)
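
The conversion above only needs JDK types, so the behaviour is easy to reproduce in isolation. A self-contained sketch, with "2022-01-01 00:00:00" playing the role of the formatted SqlTimestampLiteral and Asia/Kolkata as an arbitrary example of the session timezone:

// Sketch of the timestamp-to-epoch-millis conversion using only java.sql / java.time.
import java.sql.Timestamp;
import java.time.ZoneId;
import java.time.ZonedDateTime;

public final class TimestampToMillis {

    public static void main(String[] args) {
        Timestamp sqlTimestamp = Timestamp.valueOf("2022-01-01 00:00:00");
        // Interpret the wall-clock value in the session timezone, then take epoch millis
        ZonedDateTime zoned = sqlTimestamp.toLocalDateTime().atZone(ZoneId.of("Asia/Kolkata"));
        // 2022-01-01T00:00:00+05:30 -> prints 1640975400000
        System.out.println(zoned.toInstant().toEpochMilli());
    }
}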

Aggregations

ValidationException (org.apache.calcite.tools.ValidationException): 35
SqlNode (org.apache.calcite.sql.SqlNode): 16
SqlParseException (org.apache.calcite.sql.parser.SqlParseException): 16
RelConversionException (org.apache.calcite.tools.RelConversionException): 12
RelRoot (org.apache.calcite.rel.RelRoot): 7
Planner (org.apache.calcite.tools.Planner): 6
FrameworkConfig (org.apache.calcite.tools.FrameworkConfig): 5
PreparedStatement (java.sql.PreparedStatement): 4
SQLException (java.sql.SQLException): 4
ZonedDateTime (java.time.ZonedDateTime): 4
SqlPrettyWriter (org.apache.calcite.sql.pretty.SqlPrettyWriter): 4
AndDimFilter (org.apache.druid.query.filter.AndDimFilter): 4
BoundDimFilter (org.apache.druid.query.filter.BoundDimFilter): 4
DimFilter (org.apache.druid.query.filter.DimFilter): 4
NotDimFilter (org.apache.druid.query.filter.NotDimFilter): 4
OrDimFilter (org.apache.druid.query.filter.OrDimFilter): 4
SQLPlannedOperationStatement (herddb.model.commands.SQLPlannedOperationStatement): 3
ScanStatement (herddb.model.commands.ScanStatement): 3
ArrayList (java.util.ArrayList): 3
HashSet (java.util.HashSet): 3