BEAL is a deep active learning method that uses Bayesian deep learning with dropout to infer the model’s posterior predictive distribution and introduces an expected confidence-based acquisition function to select uncertain samples. Experiments show that BEAL outperforms other active learning methods, requiring fewer labeled samples for efficient training.
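To make the idea concrete, here is a minimal sketch of dropout-based uncertainty sampling with an expected-confidence style acquisition score. It assumes a PyTorch classifier containing dropout layers; the function names, the number of stochastic passes, and the query budget are illustrative assumptions, not BEAL's exact formulation.

```python
import torch
import torch.nn as nn

def mc_dropout_predictions(model, inputs, n_passes=20):
    """Run several stochastic forward passes with dropout active to
    approximate the posterior predictive distribution."""
    model.train()  # keep dropout layers active at inference time
    with torch.no_grad():
        probs = torch.stack(
            [torch.softmax(model(inputs), dim=-1) for _ in range(n_passes)]
        )  # shape: (n_passes, batch, n_classes)
    return probs

def expected_confidence(probs):
    """Average the per-pass maximum class probability; a low value means
    the model is uncertain about that sample."""
    return probs.max(dim=-1).values.mean(dim=0)  # shape: (batch,)

def select_uncertain(model, pool_inputs, budget=32, n_passes=20):
    """Pick the `budget` unlabeled samples with the lowest expected confidence."""
    probs = mc_dropout_predictions(model, pool_inputs, n_passes)
    conf = expected_confidence(probs)
    return torch.topk(-conf, k=budget).indices

# Toy usage: a small dropout classifier scoring a random unlabeled pool.
if __name__ == "__main__":
    model = nn.Sequential(
        nn.Linear(16, 64), nn.ReLU(), nn.Dropout(0.5), nn.Linear(64, 3)
    )
    pool = torch.randn(200, 16)
    picked = select_uncertain(model, pool, budget=10)
    print("indices queried for labels:", picked.tolist())
```

The selected indices would then be sent for labeling and added to the training set before the next active learning round.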
Each run produces slightly different results. After 5 runs, I can conclude that SBERT achieves a slightly better best F1 score, while Data2vec uses far less memory; the average F1 scores of the two models are very close.