Current Computer Science

Author(s): Jabar H. Yousif* and Mohammed J. Yousif

DOI: 10.2174/0129503779282967240315040931

Evolutionary Perspectives on Neural Network Generations: A Critical Examination of Models and Design Strategies

Article ID: e050424228693 Pages: 16


Abstract

In recent years, neural networks have become common across many domains because of their ability to learn intricate patterns and produce precise predictions. Nonetheless, building an efficient neural network model remains difficult, demanding careful consideration of multiple factors such as the architecture, the optimization method, and the regularization technique. This paper provides a comprehensive overview of the generations of artificial neural networks (ANNs) and highlights key challenges and opportunities in machine learning applications. It offers a critical analysis of current neural network design methodologies, focusing on the strengths and weaknesses of different approaches, and surveys the use of deep neural networks (DNNs) in image recognition, natural language processing, and time series analysis. In addition, it examines the benefits of selecting optimal values for the main components of an ANN: the sizes of the input and output layers, the number of hidden layers, the choice of activation function, the number of training epochs, and the model type. Setting these components to suitable values can improve the model's overall performance and generalization. The paper also identifies common pitfalls and limitations of existing design methodologies, such as overfitting, lack of interpretability, and computational complexity. Finally, it proposes directions for future research, including more efficient and interpretable architectures, more scalable training algorithms, and new paradigms such as spiking neural networks, quantum neural networks, and neuromorphic computing.
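To make the hyperparameter discussion concrete, the sketch below trains a small multilayer perceptron from scratch on the XOR toy problem. It is a minimal illustration, not the paper's method: the hidden-layer width, activation function, epoch count, and learning rate are hypothetical values of exactly the kind of components the abstract says must be tuned.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR toy dataset
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Illustrative design choices (assumed, not from the paper):
# hidden-layer size, activation function, epochs, learning rate.
n_hidden = 4
epochs = 5000
lr = 1.0

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Weight initialization for one hidden layer and one output unit
W1 = rng.normal(0.0, 1.0, (2, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(0.0, 1.0, (n_hidden, 1))
b2 = np.zeros(1)

for _ in range(epochs):
    # Forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass for mean-squared-error loss
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # Full-batch gradient-descent update
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

pred = (sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) > 0.5).astype(int)
print(pred.ravel())  # with these settings the network usually recovers XOR
```

Changing `n_hidden`, the activation, or `epochs` and rerunning makes the trade-offs the abstract describes (underfitting, overfitting, training cost) directly observable on even this tiny problem.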

Keywords: Neural network generations, machine learning, convolutional neural networks, deep neural networks, model performance, quantum neural networks.
