Current Issue

Issue 43, Vol. 11, Summer 2023

    Published Issues

    • Vol. 11
      • ✓ Issue 43 - Summer 2023
      • ✓ Issue 42 - Spring 2023
      • ✓ Issue 41 - Winter 2023
    • Vol. 10
      • ✓ Issue 40 - Autumn 2022
      • ✓ Issue 39 - Summer 2022
      • ✓ Issue 38 - Spring 2022
      • ✓ Issue 37 - Winter 2022
    • Vol. 9
      • ✓ Issue 36 - Autumn 2021
      • ✓ Issue 35 - Summer 2021
      • ✓ Issue 34 - Spring 2021
      • ✓ Issue 33 - Winter 2021
      • ✓ SPECIAL ISSUE
    • Vol. 8
      • ✓ Issue 32 - Autumn 2020
      • ✓ Issue 31 - Summer 2020
      • ✓ Issue 30 - Spring 2020
      • ✓ Issue 29 - Winter 2020
    • Vol. 7
      • ✓ Issue 28 - Autumn 2019
      • ✓ Issue 27 - Summer 2019
      • ✓ Issue 26 - Spring 2019
      • ✓ Issue 25 - Winter 2019
    • Vol. 6
      • ✓ Issue 24 - Autumn 2018
      • ✓ Issue 23 - Summer 2018
      • ✓ Issue 22 - Spring 2018
      • ✓ Issue 21 - Winter 2018
    • Vol. 5
      • ✓ Issue 20 - Autumn 2017
      • ✓ Issue 19 - Summer 2017
      • ✓ Issue 18 - Spring 2017
      • ✓ Issue 17 - Winter 2017
    • Vol. 4
      • ✓ Issue 16 - Autumn 2016
      • ✓ Issue 15 - Summer 2016
      • ✓ Issue 14 - Spring 2016
      • ✓ Issue 13 - Winter 2016
    • Vol. 3
      • ✓ Issue 12 - Autumn 2015
      • ✓ Issue 11 - Summer 2015
      • ✓ Issue 10 - Spring 2015
      • ✓ Issue 9 - Winter 2015
    • Vol. 2
      • ✓ Issue 8 - Autumn 2014
      • ✓ Issue 7 - Summer 2014
      • ✓ Issue 6 - Spring 2014
      • ✓ Issue 5 - Winter 2014
    • Vol. 1
      • ✓ Issue 4 - Autumn 2013
      • ✓ Issue 3 - Summer 2013
      • ✓ Issue 2 - Spring 2013
      • ✓ Issue 1 - Winter 2013

List of Articles

      • Open Access Article

        1 - Deep Transformer-based Representation for Text Chunking
Parsa Kavehzadeh, Mohammad Mahdi Abdollah Pour, Saeedeh Momtazi
DOI: 10.61186/jist.19894.11.43.176
DOR: 20.1001.1.23221437.2023.11.43.2.1
Text chunking is one of the basic tasks in natural language processing. Most models proposed in recent years were applied to chunking and other sequence labeling tasks simultaneously, and they were mostly based on Recurrent Neural Networks (RNN) and Conditional Random Fields (CRF). In this article, we use state-of-the-art transformer-based models in combination with CRF, Long Short-Term Memory (LSTM)-CRF, as well as a simple dense layer, to study the impact of different pre-trained models on overall text chunking performance. To this aim, we evaluate BERT, RoBERTa, Funnel Transformer, XLM, XLM-RoBERTa, BART, and GPT2 as candidate contextualized models. Our experiments show that all transformer-based models except GPT2 achieved close, high scores on text chunking. Due to its unidirectional architecture, GPT2 performs relatively poorly on text chunking compared to the other, bidirectional transformer-based architectures. Our experiments also revealed that adding an LSTM layer to transformer-based models does not significantly improve the results, since the LSTM contributes no additional features that would help the model extract more information from the input beyond what the deep contextualized models already provide.
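The abstract treats chunking as a sequence labeling task: a model assigns a tag to each token, and contiguous tags are then decoded into phrase-level chunks. The paper itself gives no code; the sketch below illustrates only the final decoding step, assuming the common BIO tagging scheme (B- begins a chunk, I- continues it, O is outside any chunk). The example tokens and tags are illustrative, not taken from the paper's data.

```python
def bio_to_chunks(tags):
    """Convert a BIO tag sequence (e.g. predictions from a transformer
    token classifier) into (chunk_type, start, end) spans, end exclusive."""
    chunks, start, ctype = [], None, None
    for i, tag in enumerate(tags):
        # A chunk ends at any O tag, any new B- tag, or an I- tag
        # whose type differs from the chunk currently being built.
        if tag == "O" or tag.startswith("B-") or (
            tag.startswith("I-") and tag[2:] != ctype
        ):
            if ctype is not None:
                chunks.append((ctype, start, i))
                ctype = None
            # A new chunk starts at B- (or a dangling I-, treated as B-).
            if tag.startswith(("B-", "I-")):
                start, ctype = i, tag[2:]
    if ctype is not None:  # close a chunk that runs to the end
        chunks.append((ctype, start, len(tags)))
    return chunks

# "He | reckons | the current account deficit" -> NP, VP, NP
tags = ["B-NP", "B-VP", "B-NP", "I-NP", "I-NP", "I-NP"]
print(bio_to_chunks(tags))
# [('NP', 0, 1), ('VP', 1, 2), ('NP', 2, 6)]
```

Whichever encoder produces the tags (BERT with a dense layer, or a CRF/LSTM-CRF head), this decoding step is the same, which is why the paper can compare the pre-trained models on an equal footing.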
The rights to this website are owned by the Raimag Press Management System.
Copyright © 2017-2023
