
Example 6 with LiteralParseNode

Use of org.apache.phoenix.parse.LiteralParseNode in project phoenix by apache.

From the class UpsertCompiler, method getNodeForRowTimestampColumn:

private static LiteralParseNode getNodeForRowTimestampColumn(PColumn col) {
    PDataType type = col.getDataType();
    // Placeholder value; the actual row timestamp is supplied at upsert time.
    long dummyValue = 0L;
    if (type.isCoercibleTo(PTimestamp.INSTANCE)) {
        return new LiteralParseNode(new Timestamp(dummyValue), PTimestamp.INSTANCE);
    } else if (type == PLong.INSTANCE || type == PUnsignedLong.INSTANCE) {
        return new LiteralParseNode(dummyValue, PLong.INSTANCE);
    }
    // A ROW_TIMESTAMP column must be timestamp- or long-typed.
    throw new IllegalArgumentException();
}
Also used: PDataType (org.apache.phoenix.schema.types.PDataType), Timestamp (java.sql.Timestamp), PTimestamp (org.apache.phoenix.schema.types.PTimestamp), LiteralParseNode (org.apache.phoenix.parse.LiteralParseNode)
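The method above dispatches on the column's data type to build a placeholder literal. As a minimal, Phoenix-free sketch of the same idea: pick a dummy value whose Java type matches the column's SQL type, and reject anything else. The names `RowTimestampLiteral`, `ColumnType`, and `dummyLiteralFor` are hypothetical, not Phoenix API.

```java
import java.sql.Timestamp;

// Hypothetical stand-in for the Phoenix type dispatch in
// getNodeForRowTimestampColumn; ColumnType models PDataType coarsely.
public class RowTimestampLiteral {

    enum ColumnType { TIMESTAMP, LONG, UNSIGNED_LONG, VARCHAR }

    static Object dummyLiteralFor(ColumnType type) {
        long dummyValue = 0L;
        switch (type) {
            case TIMESTAMP:
                // Timestamp-coercible columns get a java.sql.Timestamp literal.
                return new Timestamp(dummyValue);
            case LONG:
            case UNSIGNED_LONG:
                // Long-typed columns get a plain long literal.
                return dummyValue;
            default:
                // Mirrors the original: other types are invalid row timestamps.
                throw new IllegalArgumentException("not a row-timestamp type: " + type);
        }
    }

    public static void main(String[] args) {
        System.out.println(dummyLiteralFor(ColumnType.TIMESTAMP).getClass().getSimpleName());
        System.out.println(dummyLiteralFor(ColumnType.LONG));
    }
}
```

The real method returns a `LiteralParseNode` wrapping the value together with its `PDataType`; the sketch returns the bare value only to keep the dispatch logic visible.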

Example 7 with LiteralParseNode

Use of org.apache.phoenix.parse.LiteralParseNode in project phoenix by apache.

From the class ListJarsQueryPlan, method iterator:

@Override
public ResultIterator iterator(ParallelScanGrouper scanGrouper) throws SQLException {
    return new ResultIterator() {

        private RemoteIterator<LocatedFileStatus> listFiles = null;

        @Override
        public void close() throws SQLException {
        }

        @Override
        public Tuple next() throws SQLException {
            try {
                if (first) {
                    String dynamicJarsDir = stmt.getConnection().getQueryServices().getProps().get(QueryServices.DYNAMIC_JARS_DIR_KEY);
                    if (dynamicJarsDir == null) {
                        throw new SQLException(QueryServices.DYNAMIC_JARS_DIR_KEY + " is not configured for listing the jars.");
                    }
                    dynamicJarsDir = dynamicJarsDir.endsWith("/") ? dynamicJarsDir : dynamicJarsDir + '/';
                    Configuration conf = HBaseFactoryProvider.getConfigurationFactory().getConfiguration();
                    Path dynamicJarsDirPath = new Path(dynamicJarsDir);
                    FileSystem fs = dynamicJarsDirPath.getFileSystem(conf);
                    listFiles = fs.listFiles(dynamicJarsDirPath, true);
                    first = false;
                }
                if (listFiles == null || !listFiles.hasNext())
                    return null;
                ImmutableBytesWritable ptr = new ImmutableBytesWritable();
                ParseNodeFactory factory = new ParseNodeFactory();
                LiteralParseNode literal = factory.literal(listFiles.next().getPath().toString());
                LiteralExpression expression = LiteralExpression.newConstant(literal.getValue(), PVarchar.INSTANCE, Determinism.ALWAYS);
                expression.evaluate(null, ptr);
                byte[] rowKey = ByteUtil.copyKeyBytesIfNecessary(ptr);
                Cell cell = CellUtil.createCell(rowKey, HConstants.EMPTY_BYTE_ARRAY, HConstants.EMPTY_BYTE_ARRAY, EnvironmentEdgeManager.currentTimeMillis(), Type.Put.getCode(), HConstants.EMPTY_BYTE_ARRAY);
                List<Cell> cells = new ArrayList<Cell>(1);
                cells.add(cell);
                return new ResultTuple(Result.create(cells));
            } catch (IOException e) {
                throw new SQLException(e);
            }
        }

        @Override
        public void explain(List<String> planSteps) {
        }
    };
}
Also used: Path (org.apache.hadoop.fs.Path), ImmutableBytesWritable (org.apache.hadoop.hbase.io.ImmutableBytesWritable), Configuration (org.apache.hadoop.conf.Configuration), SQLException (java.sql.SQLException), LiteralExpression (org.apache.phoenix.expression.LiteralExpression), ResultTuple (org.apache.phoenix.schema.tuple.ResultTuple), ResultIterator (org.apache.phoenix.iterate.ResultIterator), ArrayList (java.util.ArrayList), IOException (java.io.IOException), LiteralParseNode (org.apache.phoenix.parse.LiteralParseNode), RemoteIterator (org.apache.hadoop.fs.RemoteIterator), FileSystem (org.apache.hadoop.fs.FileSystem), List (java.util.List), Cell (org.apache.hadoop.hbase.Cell), ParseNodeFactory (org.apache.phoenix.parse.ParseNodeFactory)
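The core pattern in this iterator is lazy initialization: the filesystem is not touched until the first `next()` call, and after that each call yields one file path until the listing is exhausted, signalled by returning null. A Phoenix-free sketch of just that pattern, using `java.nio.file` in place of the Hadoop `FileSystem`/`RemoteIterator` APIs (`JarListingIterator` is a hypothetical name):

```java
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Iterator;
import java.util.stream.Stream;

// Sketch of the ListJarsQueryPlan iterator pattern: list a directory lazily
// on the first next() call, then return one entry per call, null when done.
public class JarListingIterator {

    private final Path dir;
    private boolean first = true;      // plays the role of the 'first' flag above
    private Stream<Path> stream;
    private Iterator<Path> files;      // plays the role of the listFiles field

    public JarListingIterator(Path dir) {
        this.dir = dir;
    }

    /** Returns the next file path as a string, or null when exhausted. */
    public String next() {
        try {
            if (first) {
                // Lazy init: nothing touches the filesystem until the
                // first row is requested, exactly as in the query plan.
                stream = Files.walk(dir).filter(Files::isRegularFile);
                files = stream.iterator();
                first = false;
            }
            if (files == null || !files.hasNext()) {
                return null;
            }
            return files.next().toString();
        } catch (IOException e) {
            // The original wraps IOException in SQLException instead.
            throw new UncheckedIOException(e);
        }
    }

    public static void main(String[] args) throws IOException {
        Path tmp = Files.createTempDirectory("jars");
        Files.createFile(tmp.resolve("example.jar"));
        JarListingIterator it = new JarListingIterator(tmp);
        for (String p; (p = it.next()) != null; ) {
            System.out.println(p);
        }
    }
}
```

The real plan goes one step further per row: it wraps each path string in a `LiteralParseNode`, evaluates it as a `LiteralExpression`, and packages the bytes into a single-cell `ResultTuple` so the paths come back through the normal query result machinery.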

Aggregations

LiteralParseNode (org.apache.phoenix.parse.LiteralParseNode): 7 usages
Cell (org.apache.hadoop.hbase.Cell): 3 usages
PColumn (org.apache.phoenix.schema.PColumn): 3 usages
ArrayList (java.util.ArrayList): 2 usages
List (java.util.List): 2 usages
ImmutableBytesWritable (org.apache.hadoop.hbase.io.ImmutableBytesWritable): 2 usages
LiteralExpression (org.apache.phoenix.expression.LiteralExpression): 2 usages
ResultIterator (org.apache.phoenix.iterate.ResultIterator): 2 usages
ParseNode (org.apache.phoenix.parse.ParseNode): 2 usages
ParseNodeFactory (org.apache.phoenix.parse.ParseNodeFactory): 2 usages
ResultTuple (org.apache.phoenix.schema.tuple.ResultTuple): 2 usages
PDataType (org.apache.phoenix.schema.types.PDataType): 2 usages
ByteString (com.google.protobuf.ByteString): 1 usage
IOException (java.io.IOException): 1 usage
SQLException (java.sql.SQLException): 1 usage
Timestamp (java.sql.Timestamp): 1 usage
Configuration (org.apache.hadoop.conf.Configuration): 1 usage
FileSystem (org.apache.hadoop.fs.FileSystem): 1 usage
Path (org.apache.hadoop.fs.Path): 1 usage
RemoteIterator (org.apache.hadoop.fs.RemoteIterator): 1 usage