Comparison of 2D convolutions and dense neural networks for natural language processing models with multi-sentence input
DOI: https://doi.org/10.17721/1812-5409.2023/2.3
Keywords: Natural Language Processing, language models, convolutions, dense neural networks
Abstract
This paper is devoted to the analysis of court cases based on multiple sentences that represent the plaintiff's claim, the claim's motivation, and the defendant's response. Based on these inputs, we classify a given case into one of seven categories designed for our task and then predict its decision in the court of first instance. We use a fine-tuned XLM-RoBERTa model for this task. Two approaches for building the fine-tuned model's head were compared. The first stacks the numerical representations of the sentences into a matrix and applies 2D convolutions; the second concatenates the sentences and applies dense neural networks. The latter demonstrates slightly better performance in our experiments, while the former exhibits a simpler training process.
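The two heads contrasted in the abstract can be illustrated with a minimal PyTorch sketch. It assumes each case is already encoded by XLM-RoBERTa into three 768-dimensional sentence embeddings (claim, motivation, response); the layer sizes, kernel shape, and class count of seven are illustrative assumptions, not the authors' exact configuration.

import torch
import torch.nn as nn

class Conv2DHead(nn.Module):
    # First approach: stack the sentence embeddings into a (sentences x hidden) matrix
    # and apply 2D convolutions over it.
    def __init__(self, hidden_size=768, num_classes=7):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=(2, 5), padding=(1, 2)),  # slides across both the sentence and feature axes
            nn.ReLU(),
            nn.AdaptiveMaxPool2d((1, 32)),
        )
        self.fc = nn.Linear(16 * 32, num_classes)

    def forward(self, sent_embeddings):           # (batch, num_sentences, hidden_size)
        x = sent_embeddings.unsqueeze(1)          # add a channel dimension: (batch, 1, sents, hidden)
        x = self.conv(x).flatten(1)
        return self.fc(x)

class DenseHead(nn.Module):
    # Second approach: concatenate the sentence embeddings into one vector
    # and apply dense (fully connected) layers.
    def __init__(self, hidden_size=768, num_sentences=3, num_classes=7):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(hidden_size * num_sentences, 256),
            nn.ReLU(),
            nn.Dropout(0.1),
            nn.Linear(256, num_classes),
        )

    def forward(self, sent_embeddings):           # (batch, num_sentences, hidden_size)
        return self.mlp(sent_embeddings.flatten(1))

# Toy usage: a batch of four cases, three encoded sentences each, seven categories.
emb = torch.randn(4, 3, 768)
print(Conv2DHead()(emb).shape, DenseHead()(emb).shape)  # torch.Size([4, 7]) for both heads

In this sketch the convolutional head treats the stacked embeddings as a one-channel image, while the dense head flattens them into a single vector, mirroring the two approaches compared in the paper.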
Pages of the article in the issue: 20 - 29
Language of the article: English
License
Copyright (c) 2023 Tetyana Yanevych, Vitaliy Golomoziy, Yuliya Mishura, Iryna Izarova

This work is licensed under a Creative Commons Attribution 4.0 International License.