Date of Award
5-2021
Degree Type
Capstone
Degree Name
Master of Science in Information Technology (MSIT)
Department
School of Professional Studies
Chief Instructor
Richard Aroian
Abstract
This thesis examines neural networks and how their underlying algorithms work. Neural networks are well suited to helping people tackle complex, real-world problems: they can model nonlinear and complicated relationships between inputs and outputs, make inferences, uncover hidden links and patterns, generate predictions, and model highly volatile data to forecast uncommon events. In doing so, they have the potential to help people make better decisions. Natural language processing (NLP) is a set of techniques for analyzing, interpreting, and comprehending large amounts of text. Because text data now arrives in massive volumes and is highly unstructured, it can no longer be evaluated with traditional approaches, which is where NLP comes in. This study therefore focuses on what a neural network is and how different types of neural networks are used in natural language processing. BERT in particular has received a great deal of attention due to its exceptional success on numerous NLP tasks. Google's Bidirectional Encoder Representations from Transformers (BERT) is a Transformer-based machine learning technique for pre-training in natural language processing.
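To make the BERT description above concrete, the following minimal sketch (not part of the thesis itself) shows how a pre-trained BERT model can be loaded and used to encode a sentence into contextual token representations; it assumes the Hugging Face transformers library and PyTorch are installed.

# Illustrative sketch only: load a pre-trained BERT encoder and run one sentence through it.
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")

# Tokenize a sentence and return PyTorch tensors.
inputs = tokenizer("Neural networks help analyze unstructured text.", return_tensors="pt")
outputs = model(**inputs)

# outputs.last_hidden_state holds one contextual vector per token,
# produced by BERT's bidirectional Transformer encoder.
print(outputs.last_hidden_state.shape)

These contextual vectors are what downstream NLP tasks (classification, question answering, and so on) typically build on when fine-tuning BERT.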
Recommended Citation
Sharma, Harshita and Jain, Tinkle, "Neural Network With Nlp" (2021). School of Professional Studies. 88.
https://commons.clarku.edu/sps_masters_papers/88