dc.rights.license: In Copyright (en_US)
dc.creator: Gazin, Jackson Mark
dc.date.accessioned: 2022-05-13T14:32:16Z
dc.date.available: 2022-05-13T14:32:16Z
dc.date.created: 2022
dc.identifier: WLURG38_Gazin_MATH_2022
dc.identifier.uri: http://hdl.handle.net/11021/35853
dc.description: Thesis; [FULL-TEXT FREELY AVAILABLE ONLINE] (en_US)
dc.description: Jackson Mark Gazin is a member of the Class of 2022 of Washington and Lee University. (en_US)
dc.description.abstract: This thesis is about some of the methods and concepts of linear algebra that are particularly helpful for data analysis. After a brief review of linear algebra concepts in Chapter 1, Chapter 2 centers on the singular value decomposition (SVD), which expresses any matrix A as a product of an orthogonal matrix, a diagonal matrix, and another orthogonal matrix. Understanding the SVD requires understanding the properties of symmetric matrices, which are explained first. Chapter 3 focuses on applications of the SVD: it begins with low-rank approximation, and then explores how the SVD is applied in principal component analysis (PCA). Chapter 4 introduces neural networks, a machine learning architecture useful for image recognition, among other applications. It presents the structure of a neural network in linear-algebraic notation; one of the main goals of the chapter is to reinforce the idea that neural networks can be seen as compositions of matrix transformations with non-linear activation functions. We then introduce how the parameters of a neural network are optimized. Chapter 5 deals with convolutional neural networks, focusing heavily on circulant matrices and the relationship between convolution and circulant matrices. Understanding the properties of circulant matrices is instrumental in understanding the benefits of convolution. We finish the chapter by showing how PCA can be used as a data pre-processing tool before running the data through a convolutional neural network. (en_US) [See the numerical sketch following this record.]
dc.description.statementofresponsibility: Jackson Gazin
dc.format.extent: 53 pages (en_US)
dc.language.iso: en_US (en_US)
dc.rights: This material is made available for use in research, teaching, and private study, pursuant to U.S. Copyright law. The user assumes full responsibility for any use of the materials, including but not limited to, infringement of copyright and publication rights of reproduced materials. Any materials used should be fully credited with the source. (en_US)
dc.rights.uri: http://rightsstatements.org/vocab/InC/1.0/ (en_US)
dc.subject.other: Washington and Lee University -- Honors in Mathematics (en_US)
dc.title: Linear Algebraic Methods in Data Science and Neural Networks (thesis) (en_US)
dc.type: Text (en_US)
dcterms.isPartOf: RG38 - Student Papers
dc.rights.holder: Gazin, Jackson Mark
dc.subject.fast: Algebras, Linear (en_US)
dc.subject.fast: Singular value decomposition (en_US)
dc.subject.fast: Neural networks (Computer science) -- Mathematics (en_US)
local.department: Mathematics (en_US)
local.scholarshiptype: Honors Thesis (en_US)
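
The abstract above names several constructions that are easy to illustrate numerically: the SVD factorization, low-rank approximation, PCA, a neural network as a composition of matrix maps with non-linear activations, and the circulant-matrix view of convolution. The sketch below is not drawn from the thesis itself; it is a minimal NumPy illustration, and the variable names, dimensions, and choice of library are all assumptions made for the example.

import numpy as np

rng = np.random.default_rng(0)

# SVD: factor A into U @ diag(s) @ Vt, where U has orthonormal columns,
# Vt has orthonormal rows, and s holds the singular values.
A = rng.standard_normal((6, 4))
U, s, Vt = np.linalg.svd(A, full_matrices=False)
assert np.allclose(A, U @ np.diag(s) @ Vt)

# Low-rank approximation: keep only the k largest singular values.
# (Eckart-Young: this is the best rank-k approximation in Frobenius norm.)
k = 2
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# PCA via the SVD of the centered data matrix: the rows of Vt are the
# principal directions, and projecting onto them gives the component scores.
X = rng.standard_normal((100, 5))
Xc = X - X.mean(axis=0)
_, _, Vt_pca = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt_pca[:2].T          # coordinates on the top two components

# A small feed-forward network as a composition of matrix transformations
# with a non-linear activation between them.
def relu(z):
    return np.maximum(z, 0.0)

W1, b1 = rng.standard_normal((8, 5)), rng.standard_normal(8)
W2, b2 = rng.standard_normal((3, 8)), rng.standard_normal(3)
x = rng.standard_normal(5)
y = W2 @ relu(W1 @ x + b1) + b2     # network output for input x

# Circular convolution with a filter h is multiplication by the circulant
# matrix whose first column is h; the DFT diagonalizes every such matrix.
h = np.array([1.0, 2.0, 3.0, 4.0])
n = len(h)
C = np.column_stack([np.roll(h, j) for j in range(n)])   # circulant of h
v = rng.standard_normal(n)
conv = np.real(np.fft.ifft(np.fft.fft(h) * np.fft.fft(v)))
assert np.allclose(C @ v, conv)

The final assertion checks the property the abstract alludes to: circular convolution with h is the same linear map as multiplication by the circulant matrix built from h, and because the DFT diagonalizes that matrix, convolution is both cheap to compute and easy to analyze.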

