Academic Journal of Computing & Information Science, 2024, 7(4); doi: 10.25236/AJCIS.2024.070405.

Importance of matrix operations in artificial neural networks

Author(s)

Jinghan Kang1, Junxi Li2, Nana Guo3, Xin Guo1

Corresponding Author:
Xin Guo
Affiliation(s)

1Peter the Great St. Petersburg Polytechnic University, St. Petersburg, Russian Federation

2Herzen University, St. Petersburg, Russian Federation

3Gansu Agricultural University, Gansu, China

Abstract

In this paper, the importance of matrix operations in artificial neural networks is studied. The argument is developed through three representative architectures: the traditional convolutional neural network, the attention-based network, and the generative adversarial network. The results show that mathematical operations on matrices, together with matrix transformations, are central to how artificial neural networks compute.
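As an illustration of this claim (a minimal sketch, not taken from the paper itself), the attention mechanism named above reduces entirely to matrix operations. The NumPy example below computes scaled dot-product attention, softmax(QKᵀ/√d_k)V; the matrices Q, K, V and their dimensions are chosen purely for demonstration.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V -- pure matrix operations."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                      # matrix product, then scaling
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)       # row-wise softmax
    return weights @ V                                   # weighted sum as a matrix product

# Toy example: 4 query tokens, 6 key/value tokens, feature dimension 8
rng = np.random.default_rng(0)
Q = rng.standard_normal((4, 8))
K = rng.standard_normal((6, 8))
V = rng.standard_normal((6, 8))
print(scaled_dot_product_attention(Q, K, V).shape)       # (4, 8)
```

Every step here, from the similarity scores to the final weighted combination, is a matrix multiplication or an element-wise matrix transformation, which is the pattern the abstract attributes to all three architectures.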

Keywords

Linear Programming; Matrix Operations; Artificial Neural Networks

Cite This Paper

Jinghan Kang, Junxi Li, Nana Guo, Xin Guo. Importance of matrix operations in artificial neural networks. Academic Journal of Computing & Information Science (2024), Vol. 7, Issue 4: 29-35. https://doi.org/10.25236/AJCIS.2024.070405.
