Using Upper Layer Weights to Efficiently Construct and Train Feedforward Neural Networks Executing Backpropagation
by Harmon J. A. Gage


Overview

Feed-forward neural networks trained with backpropagation are a common tool for regression and pattern recognition problems. These networks can adjust themselves to data without any prior knowledge of the input data, and a feed-forward network with a hidden layer can approximate any continuous function to arbitrary accuracy. In this research, the upper layer weights of the neural network structure are used to determine an effective middle layer structure and when to terminate training. By combining these two techniques with signal-to-noise ratio feature selection, a process is created to construct an efficient neural network structure. The results of this research show that, for the data sets tested thus far, these methods yield efficient neural network structures in minimal training time. Data sets used include an XOR data set, Fisher's Iris problem, and a financial-industry data set, among others.
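The description does not spell out how the upper layer weights drive the middle-layer sizing, but one common reading is that hidden units whose outgoing (hidden-to-output) weights are small in magnitude contribute little to the output and are candidates for removal. The sketch below illustrates that idea only; the function name, scoring rule, and toy weight matrix are assumptions for illustration, not taken from the book.

```python
def rank_hidden_units(upper_weights):
    """upper_weights[j][k] = weight from hidden unit j to output unit k.

    Scores each hidden unit by the sum of the absolute values of its
    outgoing (upper-layer) weights and returns the unit indices sorted
    from least to most influential, so pruning candidates come first.
    """
    scores = [sum(abs(w) for w in row) for row in upper_weights]
    return sorted(range(len(scores)), key=lambda j: scores[j])

# Toy upper-layer weight matrix: 4 hidden units feeding 2 output units.
W = [
    [0.90, -1.20],  # hidden unit 0: strong contribution
    [0.05,  0.02],  # hidden unit 1: nearly inert -> pruning candidate
    [-0.70, 0.40],  # hidden unit 2: moderate contribution
    [0.01, -0.03],  # hidden unit 3: nearly inert -> pruning candidate
]

print(rank_hidden_units(W))  # least influential hidden units first
```

A training loop could periodically apply such a ranking, dropping units whose scores fall below a threshold, and similarly watch the upper-layer weights for convergence as a stopping signal.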


Details

  • ISBN-13: 9781249613466
  • ISBN-10: 1249613469
  • Publisher: Biblioscholar
  • Publish Date: October 2012
  • Dimensions: 9.69 x 7.44 x 0.22 inches
  • Shipping Weight: 0.45 pounds
  • Page Count: 106
