
Example 11 with RuntimeInterruptedException

Use of edu.stanford.nlp.util.RuntimeInterruptedException in the CoreNLP project by stanfordnlp.

From the class BZip2PipedOutputStream, method close().

public void close() throws IOException {
    process.getOutputStream().close();
    try {
        // Wait for the output/error gobbler threads to finish draining the
        // external process, then wait for the process itself to exit.
        outGobbler.join();
        errGobbler.join();
        outGobbler.getOutputStream().close();
        process.waitFor();
    } catch (InterruptedException ex) {
        // Rethrow as unchecked so close() keeps its IOException-only signature.
        throw new RuntimeInterruptedException(ex);
    }
    // log.info("getBZip2PipedOutputStream: Closed. ");
}
Also used : RuntimeInterruptedException(edu.stanford.nlp.util.RuntimeInterruptedException)
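
For reference, the same idiom can be isolated in a minimal, self-contained sketch outside CoreNLP: a helper (the name waitQuietly and the class WaitQuietly are hypothetical) that converts the checked InterruptedException from Process.waitFor() into the unchecked RuntimeInterruptedException so callers need not declare it.

import edu.stanford.nlp.util.RuntimeInterruptedException;

public class WaitQuietly {
    /**
     * Waits for an external process to exit, rethrowing interruption as an
     * unchecked RuntimeInterruptedException (illustrative helper, not CoreNLP source).
     */
    public static int waitQuietly(Process process) {
        try {
            return process.waitFor();
        } catch (InterruptedException ex) {
            // Preserve the interrupt flag before rethrowing so callers can still observe it.
            Thread.currentThread().interrupt();
            throw new RuntimeInterruptedException(ex);
        }
    }
}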

Example 12 with RuntimeInterruptedException

Use of edu.stanford.nlp.util.RuntimeInterruptedException in the CoreNLP project by stanfordnlp.

From the class FastNeuralCorefAlgorithm, method runCoref().

@Override
public void runCoref(Document document) {
    Map<Integer, List<Integer>> mentionToCandidateAntecedents = CorefUtils.heuristicFilter(
            CorefUtils.getSortedMentions(document), maxMentionDistance, maxMentionDistanceWithStringMatch);
    Map<Pair<Integer, Integer>, Boolean> mentionPairs = new HashMap<>();
    for (Map.Entry<Integer, List<Integer>> e : mentionToCandidateAntecedents.entrySet()) {
        for (int m1 : e.getValue()) {
            mentionPairs.put(new Pair<>(m1, e.getKey()), true);
        }
    }
    Compressor<String> compressor = new Compressor<>();
    DocumentExamples examples = featureExtractor.extract(0, document, mentionPairs, compressor);
    Counter<Pair<Integer, Integer>> pairwiseScores = new ClassicCounter<>();
    // We cache representations for mentions so we compute them O(n) rather than O(n^2) times
    Map<Integer, SimpleMatrix> antecedentCache = new HashMap<>();
    Map<Integer, SimpleMatrix> anaphorCache = new HashMap<>();
    // Score all mention pairs on how likely they are to be coreferent
    for (Example mentionPair : examples.examples) {
        if (Thread.interrupted()) {
            // Allow interrupting
            throw new RuntimeInterruptedException();
        }
        pairwiseScores.incrementCount(
                new Pair<>(mentionPair.mentionId1, mentionPair.mentionId2),
                model.score(
                        document.predictedMentionsByID.get(mentionPair.mentionId1),
                        document.predictedMentionsByID.get(mentionPair.mentionId2),
                        compressor.uncompress(examples.mentionFeatures.get(mentionPair.mentionId1)),
                        compressor.uncompress(examples.mentionFeatures.get(mentionPair.mentionId2)),
                        compressor.uncompress(mentionPair.pairwiseFeatures),
                        antecedentCache, anaphorCache));
    }
    // Score each mention for anaphoricity
    for (int anaphorId : mentionToCandidateAntecedents.keySet()) {
        if (Thread.interrupted()) {
            // Allow interrupting
            throw new RuntimeInterruptedException();
        }
        pairwiseScores.incrementCount(
                new Pair<>(-1, anaphorId),
                model.score(null, document.predictedMentionsByID.get(anaphorId), null,
                        compressor.uncompress(examples.mentionFeatures.get(anaphorId)),
                        null, antecedentCache, anaphorCache));
    }
    // Link each mention to the highest-scoring candidate antecedent
    for (Map.Entry<Integer, List<Integer>> e : mentionToCandidateAntecedents.entrySet()) {
        int antecedent = -1;
        int anaphor = e.getKey();
        double bestScore = pairwiseScores.getCount(new Pair<>(-1, anaphor)) - 50 * (greedyness - 0.5);
        for (int ca : e.getValue()) {
            double score = pairwiseScores.getCount(new Pair<>(ca, anaphor));
            if (score > bestScore) {
                bestScore = score;
                antecedent = ca;
            }
        }
        if (antecedent > 0) {
            CorefUtils.mergeCoreferenceClusters(new Pair<>(antecedent, anaphor), document);
        }
    }
}
Also used : HashMap(java.util.HashMap) RuntimeInterruptedException(edu.stanford.nlp.util.RuntimeInterruptedException) Compressor(edu.stanford.nlp.coref.statistical.Compressor) DocumentExamples(edu.stanford.nlp.coref.statistical.DocumentExamples) SimpleMatrix(org.ejml.simple.SimpleMatrix) Example(edu.stanford.nlp.coref.statistical.Example) ClassicCounter(edu.stanford.nlp.stats.ClassicCounter) List(java.util.List) Map(java.util.Map) Pair(edu.stanford.nlp.util.Pair)
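
The cooperative-cancellation check used twice in runCoref above (test Thread.interrupted() on every iteration, throw RuntimeInterruptedException if set) can be factored out as follows. This is an illustrative sketch, not CoreNLP source; the class InterruptibleLoops, the method processAll, and the Consumer-based work parameter are all assumptions made for the example.

import java.util.List;
import java.util.function.Consumer;
import edu.stanford.nlp.util.RuntimeInterruptedException;

public class InterruptibleLoops {
    /** Applies work to every item, aborting promptly if the executing thread is interrupted. */
    public static <T> void processAll(List<T> items, Consumer<T> work) {
        for (T item : items) {
            if (Thread.interrupted()) {
                // A timeout or cancellation wrapper set the interrupt flag; stop the loop now.
                throw new RuntimeInterruptedException();
            }
            work.accept(item);
        }
    }
}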

Example 13 with RuntimeInterruptedException

Use of edu.stanford.nlp.util.RuntimeInterruptedException in the CoreNLP project by stanfordnlp.

From the class SentenceAnnotator, method annotate().

@Override
public void annotate(Annotation annotation) {
    if (annotation.containsKey(CoreAnnotations.SentencesAnnotation.class)) {
        if (nThreads() != 1 || maxTime() > 0) {
            InterruptibleMulticoreWrapper<CoreMap, CoreMap> wrapper = buildWrapper(annotation);
            for (CoreMap sentence : annotation.get(CoreAnnotations.SentencesAnnotation.class)) {
                boolean success = false;
                // If the sentence fails a second time we give up.
                for (int attempt = 0; attempt < 2; ++attempt) {
                    try {
                        wrapper.put(sentence);
                        success = true;
                        break;
                    } catch (RejectedExecutionException e) {
                        // If we time out, for now, we just throw away all jobs which were running at the time.
                        // Note that in order for this to be useful, the underlying job needs to handle Thread.interrupted()
                        List<CoreMap> failedSentences = wrapper.joinWithTimeout();
                        if (failedSentences != null) {
                            for (CoreMap failed : failedSentences) {
                                doOneFailedSentence(annotation, failed);
                            }
                        }
                        // We don't wait for termination here, and perhaps this
                        // is a mistake.  If the processor used does not respect
                        // interruption, we could easily create many threads
                        // which are all doing useless work.  However, there is
                        // no clean way to interrupt the thread and then
                        // guarantee it finishes without running the risk of
                        // waiting forever for the thread to finish, which is
                        // exactly what we don't want with the timeout.
                        wrapper = buildWrapper(annotation);
                    }
                }
                if (!success) {
                    doOneFailedSentence(annotation, sentence);
                }
                while (wrapper.peek()) {
                    wrapper.poll();
                }
            }
            List<CoreMap> failedSentences = wrapper.joinWithTimeout();
            while (wrapper.peek()) {
                wrapper.poll();
            }
            if (failedSentences != null) {
                for (CoreMap failed : failedSentences) {
                    doOneFailedSentence(annotation, failed);
                }
            }
        } else {
            for (CoreMap sentence : annotation.get(CoreAnnotations.SentencesAnnotation.class)) {
                if (Thread.interrupted()) {
                    throw new RuntimeInterruptedException();
                }
                doOneSentence(annotation, sentence);
            }
        }
    } else {
        throw new IllegalArgumentException("unable to find sentences in: " + annotation);
    }
}
Also used : RuntimeInterruptedException(edu.stanford.nlp.util.RuntimeInterruptedException) CoreAnnotations(edu.stanford.nlp.ling.CoreAnnotations) List(java.util.List) CoreMap(edu.stanford.nlp.util.CoreMap) RejectedExecutionException(java.util.concurrent.RejectedExecutionException)
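
As the comments in annotate point out, the timeout machinery only works if the per-sentence job notices Thread.interrupted(). A caller that runs the annotator on a worker thread can catch the resulting RuntimeInterruptedException and restore the interrupt flag; the wrapper below (CancellableAnnotationJob) is a hypothetical sketch and not part of CoreNLP.

import edu.stanford.nlp.util.RuntimeInterruptedException;

public class CancellableAnnotationJob implements Runnable {
    private final Runnable annotateCall;  // e.g. () -> annotator.annotate(annotation)

    public CancellableAnnotationJob(Runnable annotateCall) {
        this.annotateCall = annotateCall;
    }

    @Override
    public void run() {
        try {
            annotateCall.run();
        } catch (RuntimeInterruptedException e) {
            // The annotator observed Thread.interrupted() and aborted early;
            // restore the flag so the owning executor also sees the cancellation.
            Thread.currentThread().interrupt();
        }
    }
}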

Example 14 with RuntimeInterruptedException

Use of edu.stanford.nlp.util.RuntimeInterruptedException in the CoreNLP project by stanfordnlp.

From the class DistributionPackage, method make().

/**
 * Create the distribution and name the file according to the specified parameter.
 *
 * @param distribName The name of the distribution
 * @return True if the distribution was built, false otherwise.
 */
public boolean make(String distribName) {
    boolean createdDir = (new File(distribName)).mkdir();
    if (createdDir) {
        String currentFile = "";
        try {
            for (String filename : distFiles) {
                currentFile = filename;
                File destFile = new File(filename);
                String relativePath = distribName + "/" + destFile.getName();
                destFile = new File(relativePath);
                FileSystem.copyFile(new File(filename), destFile);
            }
            String tarFileName = String.format("%s.tar", distribName);
            Runtime r = Runtime.getRuntime();
            Process p = r.exec(String.format("tar -cf %s %s/", tarFileName, distribName));
            if (p.waitFor() == 0) {
                File tarFile = new File(tarFileName);
                FileSystem.gzipFile(tarFile, new File(tarFileName + ".gz"));
                tarFile.delete();
                FileSystem.deleteDir(new File(distribName));
                lastCreatedDistribution = distribName;
                return true;
            } else {
                System.err.printf("%s: Unable to create tar file %s\n", this.getClass().getName(), tarFileName);
            }
        } catch (IOException e) {
            System.err.printf("%s: Unable to add file %s to distribution %s\n", this.getClass().getName(), currentFile, distribName);
        } catch (InterruptedException e) {
            System.err.printf("%s: tar did not return from building %s.tar\n", this.getClass().getName(), distribName);
            throw new RuntimeInterruptedException(e);
        }
    } else {
        System.err.printf("%s: Unable to create temp directory %s\n", this.getClass().getName(), distribName);
    }
    return false;
}
Also used : RuntimeInterruptedException(edu.stanford.nlp.util.RuntimeInterruptedException)
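
The same wrap-and-rethrow idiom around an external command can also be written with ProcessBuilder. The helper below (class TarHelper, method tarDirectory) mirrors the tar invocation in make above but is only a sketch under those assumptions, not CoreNLP source.

import java.io.IOException;
import edu.stanford.nlp.util.RuntimeInterruptedException;

public class TarHelper {
    /** Runs "tar -cf <distribName>.tar <distribName>/" and returns true on exit code 0. */
    public static boolean tarDirectory(String distribName) throws IOException {
        ProcessBuilder pb = new ProcessBuilder("tar", "-cf", distribName + ".tar", distribName + "/");
        // Forward tar's stdout/stderr to this JVM so errors are visible.
        pb.inheritIO();
        try {
            return pb.start().waitFor() == 0;
        } catch (InterruptedException e) {
            // Convert to the unchecked form, preserving the interrupt flag.
            Thread.currentThread().interrupt();
            throw new RuntimeInterruptedException(e);
        }
    }
}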

Aggregations

RuntimeInterruptedException (edu.stanford.nlp.util.RuntimeInterruptedException): 14 uses
List (java.util.List): 3 uses
CountDownLatch (java.util.concurrent.CountDownLatch): 3 uses
CoreAnnotations (edu.stanford.nlp.ling.CoreAnnotations): 2 uses
CoreLabel (edu.stanford.nlp.ling.CoreLabel): 2 uses
ParserConstraint (edu.stanford.nlp.parser.common.ParserConstraint): 2 uses
ClassicCounter (edu.stanford.nlp.stats.ClassicCounter): 2 uses
Pair (edu.stanford.nlp.util.Pair): 2 uses
ArrayList (java.util.ArrayList): 2 uses
HashMap (java.util.HashMap): 2 uses
Map (java.util.Map): 2 uses
MentionType (edu.stanford.nlp.coref.data.Dictionaries.MentionType): 1 use
Compressor (edu.stanford.nlp.coref.statistical.Compressor): 1 use
DocumentExamples (edu.stanford.nlp.coref.statistical.DocumentExamples): 1 use
Example (edu.stanford.nlp.coref.statistical.Example): 1 use
HasTag (edu.stanford.nlp.ling.HasTag): 1 use
HasWord (edu.stanford.nlp.ling.HasWord): 1 use
Label (edu.stanford.nlp.ling.Label): 1 use
TaggedWord (edu.stanford.nlp.ling.TaggedWord): 1 use
Word (edu.stanford.nlp.ling.Word): 1 use