Search in sources:

Example 1 with RecoveryLogsIterator

Use of org.apache.accumulo.tserver.log.RecoveryLogsIterator in project accumulo by apache.

The execute method of the LogReader class (the accumulo wal-info utility), which prints the events stored in write-ahead log files:

@SuppressFBWarnings(value = "DM_EXIT", justification = "System.exit is fine here because it's a utility class executed by a main()")
@Override
public void execute(String[] args) throws Exception {
    Opts opts = new Opts();
    opts.parseArgs("accumulo wal-info", args);
    if (opts.files.isEmpty()) {
        System.err.println("No WAL files were given");
        System.exit(1);
    }
    var siteConfig = SiteConfiguration.auto();
    ServerContext context = new ServerContext(siteConfig);
    try (VolumeManager fs = context.getVolumeManager()) {
        Matcher rowMatcher = null;
        KeyExtent ke = null;
        Text row = null;
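        // Optional filters from the command line: an exact row, a tablet extent
        // given as tableId;endRow;prevEndRow, or a regular expression matched against rows.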
        if (opts.row != null) {
            row = new Text(opts.row);
        }
        if (opts.extent != null) {
            String[] sa = opts.extent.split(";");
            ke = new KeyExtent(TableId.of(sa[0]), new Text(sa[1]), new Text(sa[2]));
        }
        if (opts.regexp != null) {
            Pattern pattern = Pattern.compile(opts.regexp);
            rowMatcher = pattern.matcher("");
        }
        Set<Integer> tabletIds = new HashSet<>();
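        // Shared across all files; printLogEvent uses it to track the tablet ids encountered
        // while scanning so later events can be related to tablets defined earlier in the log.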
        for (String file : opts.files) {
            Path path = new Path(file);
            LogFileKey key = new LogFileKey();
            LogFileValue value = new LogFileValue();
            // ensure it's a regular non-sorted WAL file, and not a single sorted WAL in RFile format
            if (fs.getFileStatus(path).isFile()) {
                if (file.endsWith(".rf")) {
                    log.error("Unable to read from a single RFile. A non-sorted WAL file was expected. " + "To read sorted WALs, please pass in a directory containing the sorted recovery logs.");
                    continue;
                }
                try (final FSDataInputStream fsinput = fs.open(path);
                    DataInputStream input = DfsLogger.getDecryptingStream(fsinput, siteConfig)) {
                    while (true) {
                        try {
                            key.readFields(input);
                            value.readFields(input);
                        } catch (EOFException ex) {
                            break;
                        }
                        printLogEvent(key, value, row, rowMatcher, ke, tabletIds, opts.maxMutations);
                    }
                } catch (LogHeaderIncompleteException e) {
                    log.warn("Could not read header for {} . Ignoring...", path);
                    continue;
                }
            } else {
                // otherwise the path is a directory of sorted recovery logs; read it with RecoveryLogsIterator
                try (var rli = new RecoveryLogsIterator(context, Collections.singletonList(path), null, null, false)) {
                    while (rli.hasNext()) {
                        Entry<LogFileKey, LogFileValue> entry = rli.next();
                        printLogEvent(entry.getKey(), entry.getValue(), row, rowMatcher, ke, tabletIds, opts.maxMutations);
                    }
                }
            }
        }
    }
}
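
For reference, here is a minimal, self-contained sketch that isolates just the RecoveryLogsIterator usage from the else branch above: it opens one directory of sorted recovery logs and prints every log event it contains. The class name SortedWalDump and the bare System.out printing are illustrative only; the constructor arguments (null key bounds and a final false flag) simply mirror the call in LogReader.execute, and LogFileKey/LogFileValue are assumed to live in org.apache.accumulo.tserver.logger alongside LogReader.

import java.util.Collections;
import java.util.Map.Entry;

import org.apache.accumulo.core.conf.SiteConfiguration;
import org.apache.accumulo.server.ServerContext;
import org.apache.accumulo.tserver.log.RecoveryLogsIterator;
import org.apache.accumulo.tserver.logger.LogFileKey;
import org.apache.accumulo.tserver.logger.LogFileValue;
import org.apache.hadoop.fs.Path;

public class SortedWalDump {
    public static void main(String[] args) throws Exception {
        // args[0]: a directory of sorted recovery logs, as produced by WAL recovery
        Path sortedWalDir = new Path(args[0]);
        ServerContext context = new ServerContext(SiteConfiguration.auto());
        // null start/end keys and false for the final flag, mirroring the call in LogReader.execute
        try (RecoveryLogsIterator rli = new RecoveryLogsIterator(context,
                Collections.singletonList(sortedWalDir), null, null, false)) {
            while (rli.hasNext()) {
                Entry<LogFileKey, LogFileValue> entry = rli.next();
                System.out.println(entry.getKey() + " " + entry.getValue());
            }
        }
    }
}

As the error message in LogReader notes, single RFiles are not readable this way; RecoveryLogsIterator expects directories containing sorted recovery logs.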
Also used :
Path (org.apache.hadoop.fs.Path)
VolumeManager (org.apache.accumulo.server.fs.VolumeManager)
Pattern (java.util.regex.Pattern)
Matcher (java.util.regex.Matcher)
Text (org.apache.hadoop.io.Text)
DataInputStream (java.io.DataInputStream)
FSDataInputStream (org.apache.hadoop.fs.FSDataInputStream)
KeyExtent (org.apache.accumulo.core.dataImpl.KeyExtent)
LogHeaderIncompleteException (org.apache.accumulo.tserver.log.DfsLogger.LogHeaderIncompleteException)
ServerContext (org.apache.accumulo.server.ServerContext)
RecoveryLogsIterator (org.apache.accumulo.tserver.log.RecoveryLogsIterator)
EOFException (java.io.EOFException)
HashSet (java.util.HashSet)
SuppressFBWarnings (edu.umd.cs.findbugs.annotations.SuppressFBWarnings)
