Top Research Papers on Neural Networks
Explore our curated list of top research papers on Neural Networks. Delve into cutting-edge innovations and advancements that are shaping the future of artificial intelligence. Perfect for researchers, students, and enthusiasts who want to stay updated with the latest trends and findings in this dynamic field.
How Neural Networks Extrapolate: From Feedforward to Graph Neural Networks
107 Citations · 2020 · Keyulu Xu, Mozhi Zhang, Jingling Li + 3 more
arXiv (Cornell University)
A hypothesis, supported by theoretical and empirical evidence, is proposed: the success of GNNs in extrapolating algorithmic tasks to new data relies on encoding task-specific non-linearities in the architecture or features.
Optimal Conversion of Conventional Artificial Neural Networks to Spiking Neural Networks
100 Citations · 2021 · Shikuang Deng, Shi Gu
arXiv (Cornell University)
Spiking neural networks (SNNs) are biology-inspired artificial neural networks (ANNs) that comprise spiking neurons to process asynchronous discrete signals. While more efficient in power consumption and inference speed on neuromorphic hardware, SNNs are usually difficult to train directly from scratch with spikes due to their discreteness. As an alternative, many efforts have been devoted to converting conventional ANNs into SNNs by copying the weights from ANNs and adjusting the spiking threshold potential of neurons in SNNs. Researchers have designed new SNN architectures and conversio...
Graph Neural Networks in Network Neuroscience
279 Citations · 2022 · Alaa Bessadok, Mohamed Ali Mahjoub, Islem Rekik
IEEE Transactions on Pattern Analysis and Machine Intelligence
Current GNN-based methods are reviewed, highlighting the ways they have been used in applications related to brain graphs, such as missing brain graph synthesis and disease classification, and charting a path toward better application of GNN models in network neuroscience for neurological disorder diagnosis and population graph integration.
This textbook offers a compact but comprehensive treatment, presenting analytical and design steps for building recurrent neural networks from scratch.
Operational neural networks
112 Citations · 2022 · Serkan Kıranyaz, Türker İnce, Alexandros Iosifidis + 1 more
Qatar University QSpace (Qatar University)
This study proposes a novel network model, called operational neural networks (ONNs), which can be heterogeneous and encapsulate neurons with any set of operators to boost diversity and to learn highly complex and multi-modal functions or spaces with minimal network complexity and training data.
Graph neural networks
210 Citations · 2024 · Gabriele Corso, H. Stärk, Stefanie Jegelka + 2 more
Nature Reviews Methods Primers
This Primer provides a practical and accessible introduction to GNNs, describing their properties and exploring how they are applied across the life and physical sciences.
A biomimetic neural encoder for spiking neural network
171 Citations · 2021 · Shiva Subbulakshmi Radhakrishnan, Amritanand Sebastian, Aaryan Oberoi + 2 more
Nature Communications
A biomimetic device based on a dual-gated MoS2 field-effect transistor, capable of encoding analog signals into stochastic spike trains at an energy cost of 1–5 pJ/spike, is demonstrated.
Graphs, Convolutions, and Neural Networks: From Graph Filters to Graph Neural Networks
162 Citations · 2020 · Fernando Gama, Elvin Isufi, Geert Leus + 1 more
IEEE Signal Processing Magazine
The role of graph convolutional filters in GNNs is discussed and it is shown that any architecture built with such filters has the fundamental properties of permutation equivariance and stability to changes in the topology.
Neural Bursting and Synchronization Emulated by Neural Networks and Circuits
140 Citations · 2021 · Hairong Lin, Chunhua Wang, Chengjie Chen + 4 more
IEEE Transactions on Circuits and Systems I Regular Papers
In this paper, neural bursting and synchronization are emulated by two neural network models based on a four-neuron Hopfield neural network, each realizing neural bursting firings.
Distributionally Robust Neural Networks
110 Citations · 2020 · Shiori Sagawa, Pang Wei Koh, Tatsunori Hashimoto + 1 more
International Conference on Learning Representations
The results suggest that regularization is critical for worst-group performance in the overparameterized regime, even if it is not needed for average performance, and introduce and provide convergence guarantees for a stochastic optimizer for this group DRO setting.
A High-Efficient Hybrid Physics-Informed Neural Networks Based on Convolutional Neural Network
157 Citations · 2021 · Z. Fang
IEEE Transactions on Neural Networks and Learning Systems
This is the first work in which a machine-learning PDE solver exhibits a convergence rate, as in numerical methods; the approach can also be applied to inverse problems and surface PDEs, although without proof.
Neural Bellman-Ford Networks: A General Graph Neural Network Framework for Link Prediction
112 Citations · 2021 · Zhaocheng Zhu, Zuobai Zhang, Louis-Pascal Xhonneux + 1 more
arXiv (Cornell University)
Link prediction is a fundamental task on graphs. Inspired by traditional path-based methods, in this paper we propose a general and flexible representation learning framework based on paths for link prediction. Specifically, we define the representation of a pair of nodes as the generalized sum of all path representations, with each path representation as the generalized product of the edge representations in the path. Motivated by the Bellman-Ford algorithm for solving the shortest path problem, we show that the proposed path formulation can be efficiently solved by the generalized Bellm...
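The path formulation in this abstract can be made concrete with a toy sketch. The version below uses ordinary elementwise sum and product as the "generalized" operators and a hand-built edge dictionary (both assumptions for illustration; NBFNet learns these operators):

```python
import numpy as np

def bellman_ford_representation(edge_repr, source, num_nodes, num_iters):
    """Toy sketch of the generalized Bellman-Ford recursion: each node's
    representation aggregates (generalized sum) over incoming edges the
    product of the neighbor's representation and the edge representation
    (generalized product). Here sum/product are plain arithmetic."""
    dim = next(iter(edge_repr.values())).shape[0]
    h = np.zeros((num_nodes, dim))
    h[source] = np.ones(dim)  # boundary condition: indicator of the source node
    for _ in range(num_iters):
        new_h = np.zeros_like(h)
        new_h[source] = np.ones(dim)
        for (u, v), w in edge_repr.items():
            new_h[v] += h[u] * w  # generalized sum of generalized products
        h = new_h
    return h

# On the path 0 -> 1 -> 2, node 2 accumulates the product of edge weights:
edges = {(0, 1): np.array([2.0]), (1, 2): np.array([3.0])}
print(bellman_ford_representation(edges, source=0, num_nodes=3, num_iters=2))
```

With arithmetic operators this recursion counts weighted paths; swapping in (min, +) would recover classical Bellman-Ford shortest paths.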
Brain Tumor Detection and Classification Using Convolutional Neural Network and Deep Neural Network
168 Citations · 2020 · Chirodip Lodh Choudhury, Chandrakanta Mahanty, Raghvendra Kumar + 1 more
2020 International Conference on Computer Science, Engineering and Applications (ICCSEA)
The proposed work takes a deep neural network approach and incorporates a CNN-based model to classify MRI scans as "Tumour Detected" or "Tumour Not Detected", achieving a mean accuracy of 96.08% with an F-score of 97.3.
Binary neural networks: A survey
518 Citations · 2020 · Haotong Qin, Ruihao Gong, Xianglong Liu + 3 more
Pattern Recognition
A comprehensive survey of algorithms proposed for binary neural networks is presented, mainly categorized into native solutions that directly conduct binarization and optimized ones using techniques like minimizing the quantization error, improving the network loss function, and reducing the gradient error.
A Review of Optical Neural Networks
211 Citations · 2020 · Xiubao Sui, Qiuhao Wu, Jia Liu + 2 more
IEEE Access
This review covers the progress of optical neural networks based on the principles of artificial neural networks, the essence of the optical matrix multiplier for linear operations, and nonlinearity in optical neural networks.
Streaming Graph Neural Networks
189 Citations · 2020 · Yao Ma, Ziyi Guo, Zhaocun Ren + 2 more
journal unavailable
DyGNN, a Dynamic Graph Neural Network model that captures dynamic information as the graph evolves, is proposed; it keeps updating node information by coherently capturing the sequential information of edges (interactions), the time intervals between edges, and information propagation.
Review of Convolutional Neural Network
164 Citations · 2024 · Zhenyuan Du
Science and Technology of Engineering Chemistry and Environmental Protection
This paper scrutinizes the application of CNNs in various fields, including image classification, facial recognition, audio retrieval, electrocardiogram analysis, and object detection, and posits the amalgamation of CNNs with recurrent neural networks as a potential alternative for training datasets.
A Review of Convolutional Neural Networks
410 Citations · 2020 · Arohan Ajit, Koustav Acharya, Abhishek Samanta
2020 International Conference on Emerging Trends in Information Technology and Engineering (ic-ETITE)
Convolutional neural networks employ a defined sequence of steps, including methods such as backpropagation, convolutional layers, feature formation, and pooling.
Artificial Neural Networks in Agriculture
153 Citations · 2021 · Sebastian Kujawa, Gniewko Niedbała
Agriculture
The purpose of this Special Issue was to publish high-quality research and review papers that cover the application of various types of artificial neural networks in solving relevant tasks and problems of widely defined agriculture.
An Introduction to Convolutional Neural Networks
281 Citations · 2022 · Aarush Saxena
International Journal for Research in Applied Science and Engineering Technology
CNNs are primarily used to solve difficult image-driven pattern recognition tasks and, with their precise yet simple architecture, offer a simplified way of getting started with ANNs.
This textbook covers both classical and modern models in deep learning and includes examples and exercises throughout the chapters.
Attention Spiking Neural Networks
189 Citations · 2023 · Man Yao, Guangshe Zhao, Hengyu Zhang + 5 more
IEEE Transactions on Pattern Analysis and Machine Intelligence
This work highlights the potential of SNNs as a general backbone to support various applications in SNN research, striking a strong balance between effectiveness and energy efficiency.
Spiking Neural Networks: A Survey
151 Citations · 2022 · João D. Nunes, Marcelo Carvalho, Diogo Carneiro + 1 more
IEEE Access
This survey covers the main ideas behind SNNs and reviews recent trends in learning rules and network architectures, with a particular focus on biologically inspired strategies.
Graph Neural Networks with Heterophily
189 Citations · 2021 · Jiong Zhu, Ryan A. Rossi, Anup Rao + 4 more
Proceedings of the AAAI Conference on Artificial Intelligence
The proposed framework incorporates an interpretable compatibility matrix for modeling the heterophily or homophily level in the graph, which can be learned in an end-to-end fashion, enabling it to go beyond the assumption of strong homophily.
Activation Functions in Neural Networks
1157 Citations · 2020 · Siddharth Sharma, Simone Sharma, Anidhya Athaiya
International Journal of Engineering Applied Sciences and Technology
In an Artificial Neural Network, activation functions are very important as they help in learning and making sense of non-linear and complicated mappings between the inputs and corresponding outputs.
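To make the role of these functions concrete, here is a minimal sketch of three activations commonly covered in such surveys (the function names and test values are illustrative, not from the paper):

```python
import numpy as np

# Each activation injects the non-linearity that lets a network model
# complicated input-output mappings; a stack of purely linear layers
# would collapse into a single linear map.
def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))  # squashes inputs into (0, 1)

def relu(x):
    return np.maximum(0.0, x)        # clips negative inputs to zero

def tanh(x):
    return np.tanh(x)                # squashes inputs into (-1, 1)

x = np.array([-2.0, 0.0, 2.0])
print(relu(x))  # negative inputs are clipped to zero
```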
Benchmarking Graph Neural Networks
233 Citations · 2020 · Vijay Prakash Dwivedi, Chaitanya K. Joshi, Thomas Laurent + 3 more
arXiv (Cornell University)
A reproducible GNN benchmarking framework is introduced, with the facility for researchers to add new models conveniently for arbitrary datasets, and a principled investigation into the recent Weisfeiler-Lehman GNNs (WL-GNNs) compared to message passing-based graph convolutional networks (GCNs).
Orthogonal Convolutional Neural Networks
172 Citations · 2020 · Jiayun Wang, Yubei Chen, Rudrasis Chakraborty + 1 more
journal unavailable
The proposed orthogonal convolution requires no additional parameters and little computational overhead and consistently outperforms the kernel orthogonality alternative on a wide range of tasks such as image classification and inpainting under supervised, semi-supervised and unsupervised settings.
Dynamic Neural Networks: A Survey
672 Citations · 2021 · Yizeng Han, Gao Huang, Shiji Song + 3 more
IEEE Transactions on Pattern Analysis and Machine Intelligence
This survey comprehensively reviews this rapidly developing area by dividing dynamic networks into three main categories: sample-wise dynamic models that process each sample with data-dependent architectures or parameters; spatial-wise dynamic networks that conduct adaptive computation with respect to different spatial locations of image data; and temporal-wise dynamic networks that perform adaptive inference along the temporal dimension for sequential data.
Image Classification Using Convolutional Neural Network (CNN) and Recurrent Neural Network (RNN): A Review
124 Citations · 2020 · Patel Dhruv, Subham Naskar
Advances in intelligent systems and computing
This paper concentrates on the use of RNNs and CNNs for image feature extraction, the associated challenges, and a brief literature review of these networks.
An Overview on the Application of Graph Neural Networks in Wireless Networks
115 Citations · 2021 · Shiwen He, Shaowen Xiong, Yeyu Ou + 4 more
IEEE Open Journal of the Communications Society
An overview of methods for constructing wireless communication graphs for various wireless networks is given, the progress of several classical paradigms of graph neural networks is introduced, and several applications of GNNs in wireless networks, such as resource allocation and several emerging fields, are discussed.
An optical neural chip for implementing complex-valued neural network
611 Citations · 2021 · Hui Zhang, Mile Gu, Xudong Jiang + 15 more
Nature Communications
This article implements complex-valued operations in an optical neural chip that integrates input preparation, weight multiplication and output generation within a single device.
Pattern Recognition with Neural Networks in C++
235 Citations · 2021 · Abhijit S. Pandya, Robert B. Macy
journal unavailable
Pattern Recognition with Neural Networks in C++ covers pattern classification and neural network approaches within the same framework and provides an intuitive explanation of each method for each network paradigm.
Solar radiation prediction using recurrent neural network and artificial neural network: A case study with comparisons
304 Citations · 2020 · Zhihong Pang, Fuxin Niu, Zheng O’Neill
Renewable Energy
Compared with the ANN model, solar radiation prediction using the RNN model has higher accuracy, with a 47% improvement in Normalized Mean Bias Error (NMBE) and a 26% improvement in Root-Mean-Squared Error (RMSE).
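The two error metrics compared in that study can be sketched minimally as follows (note that NMBE normalization conventions vary; dividing by the mean of the observations is one common choice and is an assumption here, not taken from the paper):

```python
import numpy as np

def rmse(y_true, y_pred):
    """Root-mean-squared error: penalizes large deviations quadratically."""
    return np.sqrt(np.mean((y_pred - y_true) ** 2))

def nmbe(y_true, y_pred):
    """Normalized mean bias error: mean signed error divided by the mean
    of the observations, so it reveals systematic over/under-prediction."""
    return np.mean(y_pred - y_true) / np.mean(y_true)

y_true = np.array([1.0, 2.0, 3.0])
y_pred = np.array([2.0, 3.0, 4.0])   # uniformly over-predicts by 1
print(rmse(y_true, y_pred), nmbe(y_true, y_pred))
```

Because NMBE keeps the sign of the error, a model can score near-zero NMBE while still having large RMSE when positive and negative errors cancel.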
Graph Neural Networks in IoT: A Survey
140 Citations · 2022 · Guimin Dong, Mingyue Tang, Zhiyuan Wang + 7 more
ACM Transactions on Sensor Networks
A comprehensive review of recent advances in the application of graph neural networks to the IoT field is presented, including a deep dive analysis of GNN design in various IoT sensing environments, an overarching list of public data and source codes from the collected publications, and future research directions.
Stability Properties of Graph Neural Networks
203 Citations · 2020 · Fernando Gama, Joan Bruna, Alejandro Ribeiro
IEEE Transactions on Signal Processing
This work proves that graph convolutions with integral Lipschitz filters, in combination with the frequency mixing effect of the corresponding nonlinearities, yields an architecture that is both stable to small changes in the underlying topology, and discriminative of information located at high frequencies.
Understanding Pooling in Graph Neural Networks
102 Citations · 2022 · Daniele Grattarola, Daniele Zambon, Filippo Maria Bianchi + 1 more
IEEE Transactions on Neural Networks and Learning Systems
An operational framework is proposed to unify this vast and diverse literature by describing pooling operators as the combination of three functions: selection, reduction, and connection (SRC).
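The SRC decomposition can be illustrated with a toy sketch. Here the selection step is a given hard cluster assignment, reduction is a per-cluster mean, and connection links clusters that share any edge; these specific choices are illustrative assumptions (real pooling operators typically learn or compute the assignment):

```python
import numpy as np

def src_pool(x, adj, assign):
    """Sketch of the select-reduce-connect (SRC) view of graph pooling.
    x: node features (n, d); adj: adjacency (n, n);
    assign: hard cluster id per node (n,)."""
    k = assign.max() + 1
    s = np.eye(k)[assign]                       # selection: one-hot matrix, n x k
    counts = s.sum(axis=0, keepdims=True).T     # nodes per cluster
    x_pool = (s.T @ x) / counts                 # reduction: mean feature per cluster
    a_pool = ((s.T @ adj @ s) > 0).astype(int)  # connection: any inter-cluster edge
    np.fill_diagonal(a_pool, 0)                 # drop self-loops on clusters
    return x_pool, a_pool

# Path graph 0-1-2-3 pooled into two clusters {0,1} and {2,3}:
x = np.array([[1.0], [3.0], [5.0], [7.0]])
adj = np.array([[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]])
x_pool, a_pool = src_pool(x, adj, np.array([0, 0, 1, 1]))
```

Swapping the reduction (e.g. sum or max instead of mean) or the connection rule yields different pooling operators within the same framework, which is the unifying point of the survey.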
HGNN+: General Hypergraph Neural Networks
394 Citations · 2022 · Yue Gao, Yifan Feng, Shuyi Ji + 1 more
IEEE Transactions on Pattern Analysis and Machine Intelligence
Comprehensive evaluations indicate that the proposed HGNN framework consistently outperforms existing methods by a significant margin, especially when modeling implicit data correlations.
Artificial neural networks in microgrids: A review
135 Citations · 2020 · Tania B. López-García, Alberto Coronado‐Mendoza, José A. Domínguez‐Navarro
Engineering Applications of Artificial Intelligence
Attention is drawn to the promising applicability of artificial neural networks to the control of microgrid distributed generation sources, as well as to scheduling, power sharing, supervisory control, and optimization.
A survey of uncertainty in deep neural networks
1006 Citations · 2023 · Jakob Gawlikowski, Cedrique Rovile Njieutcheu Tassi, Mohsin Ali + 11 more
Artificial Intelligence Review
This work gives a comprehensive overview of uncertainty estimation in neural networks, including an introduction to the most crucial sources of uncertainty, reviews recent advances in the field, highlights current challenges, and identifies potential research opportunities.
A survey of methods for pruning deep neural networks is presented, categorising over 150 studies based on the underlying approach and focusing on three categories: methods that use magnitude-based pruning, methods that utilise clustering to identify redundancy, and methods that use sensitivity analysis to assess the effect of pruning.