
A Multi-Channel Text Sentiment Analysis Model Integrating Pre-training Mechanism

Date Issued: 2023

Author(s): Liang, Shengbin (Institute for Data Engineering and Science); Jin, Jiangyong; Du, George (Institute for Data Engineering and Science); Qu, Shenming

DOI: 10.5755/j01.itc.52.2.31803
Abstract
The number of tourist-attraction reviews, travel notes and other texts has grown exponentially in the Internet age. Effectively mining users' potential opinions and emotions about tourist attractions helps provide users with better recommendation services and is of great practical significance. This paper proposes a multi-channel neural network model, called Pre-BiLSTM, combined with a pre-training mechanism. The model uses a combination of coarse- and fine-granularity strategies to extract features from text such as reviews and travel notes, improving the performance of text sentiment analysis. First, we construct three channels and use the improved BERT and skip-gram with negative sampling methods to vectorize the word-level and vocabulary-level text, respectively, so as to obtain richer textual information. Second, we use the pre-training mechanism of BERT to generate deep bidirectional language representations. Third, the vectors of the three channels are fed into the BiLSTM network in parallel to extract global and local features. Finally, the model fuses the text features of the three channels and classifies them using a SoftMax classifier. Numerical experiments demonstrate that Pre-BiLSTM outperforms the baselines by 6.27%, 12.83% and 18.12% on average in terms of accuracy, precision and F1-score, respectively.
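
The abstract describes a three-channel pipeline: per-channel embeddings (improved BERT and skip-gram with negative sampling), parallel BiLSTMs, feature fusion, and a SoftMax classifier. Below is a minimal PyTorch sketch of that three-channel idea, not the authors' implementation; all module names, dimensions, and the use of randomly initialized embeddings in place of the improved-BERT and skip-gram vectors are illustrative assumptions.

    # Hypothetical sketch of the three-channel Pre-BiLSTM architecture
    # as summarized in the abstract; not the authors' code.
    import torch
    import torch.nn as nn

    class PreBiLSTMSketch(nn.Module):
        def __init__(self, vocab_size, embed_dim=128, bert_dim=768,
                     hidden_dim=128, num_classes=2):
            super().__init__()
            # Channel 1: word-level vectors (stand-in for the improved
            # BERT word representations; here a trainable embedding).
            self.word_embed = nn.Embedding(vocab_size, embed_dim)
            # Channel 2: vocabulary-level vectors (stand-in for skip-gram
            # with negative sampling; in practice these would be
            # pre-trained and loaded, not learned from scratch).
            self.vocab_embed = nn.Embedding(vocab_size, embed_dim)
            # One BiLSTM per channel extracts global and local features.
            self.lstm_word = nn.LSTM(embed_dim, hidden_dim,
                                     batch_first=True, bidirectional=True)
            self.lstm_vocab = nn.LSTM(embed_dim, hidden_dim,
                                      batch_first=True, bidirectional=True)
            self.lstm_bert = nn.LSTM(bert_dim, hidden_dim,
                                     batch_first=True, bidirectional=True)
            # Fusion by concatenation of the three channels; SoftMax is
            # applied implicitly by CrossEntropyLoss during training.
            self.classifier = nn.Linear(6 * hidden_dim, num_classes)

        def forward(self, word_ids, vocab_ids, bert_states):
            # Run each channel in parallel and keep the final forward
            # and backward hidden states of its BiLSTM.
            def last_states(lstm, x):
                _, (h, _) = lstm(x)                      # h: (2, batch, hidden)
                return torch.cat([h[0], h[1]], dim=-1)   # (batch, 2*hidden)
            f1 = last_states(self.lstm_word, self.word_embed(word_ids))
            f2 = last_states(self.lstm_vocab, self.vocab_embed(vocab_ids))
            f3 = last_states(self.lstm_bert, bert_states)
            fused = torch.cat([f1, f2, f3], dim=-1)      # fuse the channels
            return self.classifier(fused)                # class logits

    # Smoke test with random inputs; bert_states would come from a
    # pre-trained BERT encoder in the real pipeline.
    model = PreBiLSTMSketch(vocab_size=10000)
    logits = model(torch.randint(0, 10000, (4, 32)),
                   torch.randint(0, 10000, (4, 32)),
                   torch.randn(4, 32, 768))
    print(logits.shape)  # torch.Size([4, 2])
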
Subjects: Pre-training mechanis...
File(s)

Name: Waiting for Repository Version.pdf
Size: 37.66 KB
Format: Adobe PDF
Checksum (MD5): 70439f9ac5a8bde2f366653765cefe3c
