At eidos/src/test/scala/org/clulab/wm/eidos/document/TestSentenceClassifier.scala, lines 138 to 145 in 6d19122:

        // If the sentence could not be annotated, automatically mark it as invalid.
        // It turns out that there are four sentences that eidos failed to annotate.
        preds.append(0)
        invalidSentCount += 1
      }
      val label = sentenceClassifierEvaluationData(i)._2

Here the test assigns 0 to the prediction and takes whatever label there is to make the calculation, even when there is no sentence. I don't think this should happen. Both numbers should be thrown out as if it never happened. In the four cases, the sentences are run-ons from tables or captions that we are skipping.
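
For reference, a minimal, self-contained sketch of the behavior being asked for, i.e. dropping the example entirely rather than scoring it with a default prediction, could look like the following. SentenceExample and annotateSentence are hypothetical stand-ins for the test's actual data and Eidos annotation call; only the preds/label accumulation and invalidSentCount mirror the code above.

    import scala.collection.mutable.ArrayBuffer

    object SkipInvalidSentencesSketch {
      // Hypothetical container for one evaluation example: raw text plus its gold label.
      case class SentenceExample(text: String, label: Int)

      // Hypothetical stand-in for the Eidos annotation step: None means annotation failed.
      def annotateSentence(text: String): Option[String] =
        if (text.trim.nonEmpty && !text.contains("\t")) Some(text.trim) else None

      def main(args: Array[String]): Unit = {
        val data = Seq(
          SentenceExample("Rainfall decreased sharply in 2017.", 1),
          SentenceExample("Table 3\tcol1\tcol2\tcol3", 0) // run-on text from a table
        )

        val preds = ArrayBuffer.empty[Int]
        val labels = ArrayBuffer.empty[Int]
        var invalidSentCount = 0

        data.foreach { example =>
          annotateSentence(example.text) match {
            case Some(_) =>
              // A real classifier prediction would go here; 1 is just a placeholder.
              preds.append(1)
              labels.append(example.label)
            case None =>
              // Neither a prediction nor a label is recorded, so the skipped
              // example cannot distort accuracy or F1; it is only counted.
              invalidSentCount += 1
          }
        }

        println(s"scored=${preds.length}, skipped=$invalidSentCount")
      }
    }

Run as-is, this prints scored=1, skipped=1; the same pattern applied in the test would keep the four table/caption sentences out of the score computation entirely.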