A Graph Theoretical Approach to Pruning Deep Neural Networks
by David Hoffmann



Overview

Bachelor Thesis from the year 2024 in the subject Computer Science - Artificial Intelligence, grade: 100/100, Baden-Wuerttemberg Cooperative State University (DHBW) (Economics), course: Business Information Systems - Data Science, language: English.

Abstract: Imagine a world where deep learning models, despite their immense power, are no longer constrained by computational limitations. This vision fuels the research presented here: a journey into efficient deep neural networks through the lens of graph theory. The work introduces MLP-Rank, a method for network pruning that leverages the principles of weighted PageRank to identify and strategically remove redundant connections within multilayer perceptrons (MLPs). By representing the neural network architecture as a graph, the algorithm assigns importance scores to each connection, allowing the targeted elimination of less crucial pathways and drastically reducing computational overhead without sacrificing accuracy.

The core of the research covers the algorithm's theoretical underpinnings, exploring its structural adaptations and modifications to standard PageRank that optimize performance within neural network topologies. Experiments across diverse datasets, including MNIST, Fashion-MNIST, and CIFAR-10, and across various MLP architectures validate the efficacy of the MLP-Rank method, demonstrating significant improvements in inference speed and model compression. The work also critically compares the theoretical assumptions against empirical results, bridging the gap between predicted and observed performance and paving the way for future advancements in deep learning optimization.

Discover how the synergy of graph theory and network pruning unlocks a new era of efficient, streamlined deep learning models, poised to revolutionize applications in resource-constrained environments, making AI more accessible and practical than ever.
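The overview describes the MLP-Rank recipe only in prose: represent the network as a weighted graph, compute PageRank-style importance scores, and prune the lowest-scoring elements. The following Python toy is a minimal sketch of that general idea, not the book's actual MLP-Rank algorithm: it builds a graph from the absolute weights of a hypothetical two-layer MLP, runs a basic weighted PageRank by power iteration, and zeroes out the two lowest-ranked hidden units. The layer sizes, damping factor, and pruning count are all arbitrary choices made for the example.

```python
import numpy as np

def weighted_pagerank(adj, damping=0.85, iters=100, tol=1e-9):
    """Power iteration on a weighted adjacency matrix.

    adj[t, s] is the weight of the edge s -> t; each column is
    normalized so rank mass flows along outgoing edges.
    """
    n = adj.shape[0]
    col_sums = adj.sum(axis=0)
    # Dangling nodes (no outgoing edges) spread their rank uniformly.
    stoch = np.where(col_sums > 0,
                     adj / np.where(col_sums == 0, 1.0, col_sums),
                     1.0 / n)
    rank = np.full(n, 1.0 / n)
    for _ in range(iters):
        new = (1 - damping) / n + damping * stoch @ rank
        delta = np.abs(new - rank).sum()
        rank = new
        if delta < tol:
            break
    return rank

# Toy MLP: 4 inputs -> 6 hidden -> 3 outputs, with random stand-in weights.
rng = np.random.default_rng(0)
w1 = rng.normal(size=(6, 4))   # hidden x input
w2 = rng.normal(size=(3, 6))   # output x hidden

# Every unit is a graph node; edge weight = |connection weight|.
n = 4 + 6 + 3
adj = np.zeros((n, n))
adj[4:10, 0:4] = np.abs(w1)    # input -> hidden edges
adj[10:13, 4:10] = np.abs(w2)  # hidden -> output edges

scores = weighted_pagerank(adj)
hidden_scores = scores[4:10]

# Prune the two lowest-ranked hidden units by zeroing their connections.
prune = np.argsort(hidden_scores)[:2]
w1[prune, :] = 0.0
w2[:, prune] = 0.0
```

Zeroing rows and columns keeps the dense matrix shapes, which stands in for the structural removal of units; a real implementation would shrink the weight matrices to realize the inference speedup the overview describes.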

This item is Non-Returnable

Details

  • ISBN-13: 9783389086681
  • ISBN-10: 3389086684
  • Publisher: Grin Verlag
  • Publish Date: November 2024
  • Dimensions: 8.27 x 5.83 x 0.19 inches
  • Shipping Weight: 0.25 pounds
  • Page Count: 80
