Document Type

Poster

Publication Title

Northrop Grumman Engineering & Science Student Design Showcase

Abstract

We investigate the use of an entropy-based loss function to improve BERT's sentiment analysis on the Stanford Sentiment Treebank (SST-2) dataset. By studying entropy trends in a fine-tuned BERT model, we crafted a custom loss that stabilizes entropy in the early layers (1–9) and penalizes entropy rises in the later layers (10–12) using a mean entropy threshold. Our approach achieved 92.09% accuracy and a 92.31% F1 score, surpassing a cross-entropy baseline by 1.95%. These results highlight the potential of entropy-guided optimization for transformer models.
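
As a rough illustration only, the sketch below shows one way such an entropy-guided loss could be structured in PyTorch. The function names, the regularization weights, and the choice of attention entropy as the per-layer entropy signal are assumptions for illustration, not the authors' implementation from the poster.

import torch
import torch.nn.functional as F

def attention_entropy(attn_probs, eps=1e-9):
    # Mean Shannon entropy of one layer's attention distributions.
    # attn_probs: (batch, heads, seq_len, seq_len), softmax-normalized.
    ent = -(attn_probs * (attn_probs + eps).log()).sum(dim=-1)
    return ent.mean()

def entropy_guided_loss(logits, labels, layer_entropies,
                        early_layers=range(0, 9),   # layers 1-9 (0-indexed)
                        late_layers=range(9, 12),   # layers 10-12 (0-indexed)
                        lambda_early=0.1, lambda_late=0.1):
    # Cross-entropy plus two entropy regularizers (weights are illustrative):
    # early layers are pulled toward the mean entropy (stabilization), and
    # late layers are penalized only when their entropy rises above that
    # mean-entropy threshold.
    # layer_entropies: list of 12 scalar tensors, one mean entropy per layer.
    ce = F.cross_entropy(logits, labels)
    ents = torch.stack(list(layer_entropies))
    mean_ent = ents.mean().detach()                 # mean entropy threshold
    early = ents[list(early_layers)]
    late = ents[list(late_layers)]
    stability = ((early - mean_ent) ** 2).mean()    # keep early layers near the mean
    rise_penalty = F.relu(late - mean_ent).mean()   # penalize entropy rises late
    return ce + lambda_early * stability + lambda_late * rise_penalty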

Advisor

Ryan White

Publication Date

4-25-2025
