A comparison of some error estimates for neural network models



Published by University of Toronto, Dept. of Statistics in [Toronto].
Written in English


  • Bootstrap (Statistics),
  • Error analysis (Mathematics),
  • Estimation theory

Book details:

Edition Notes

Statement: Robert Tibshirani.
Series: Technical report series / University of Toronto, Dept. of Statistics -- no. 9410; Technical report (University of Toronto. Dept. of Statistics) -- no. 9410

The Physical Object

Pagination: 11 p.
Number of Pages: 11

ID Numbers

Open Library: OL16922789M



by Robert Tibshirani. Citations: 41 (0 self). The use of resampling techniques in neural network models is increasing (Refenes and Zapranis; White and Racine; Giordano et al.; La Rocca and Perna). CiteSeerX - Document Details (Isaac Councill, Lee Giles, Pradeep Teregowda).

The training set for the neural network models and the estimation set for the regression analysis included 32 cases in the first experiment and 60 in the second. Forecasts of the dependent variable, EFH, for the regression and neural network models were computed on test data covering seven cases.
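To make the resampling idea concrete, here is a minimal sketch of a bootstrap estimate of prediction error. The function names (`bootstrap_error`, `fit`, `predict`) and the least-squares stand-in model are illustrative assumptions, not the report's actual procedure; a neural network fit would slot into the same interface.

```python
import numpy as np

def bootstrap_error(X, y, fit, predict, B=50, rng=None):
    """Simple bootstrap estimate of prediction error.

    Refits the model on B bootstrap resamples and evaluates each
    fitted model on the full original sample; more refined variants
    correct the optimism of this estimate.
    """
    rng = np.random.default_rng(rng)
    n = len(y)
    errs = []
    for _ in range(B):
        idx = rng.integers(0, n, n)           # sample n cases with replacement
        model = fit(X[idx], y[idx])           # refit on the resample
        errs.append(np.mean((y - predict(model, X)) ** 2))
    return float(np.mean(errs))

# Stand-in "model": least-squares fit (a neural network would plug in the same way)
fit = lambda X, y: np.linalg.lstsq(X, y, rcond=None)[0]
predict = lambda w, X: X @ w

gen = np.random.default_rng(0)
X = np.c_[np.ones(40), gen.normal(size=40)]
y = X @ np.array([1.0, 2.0]) + gen.normal(scale=0.5, size=40)
err = bootstrap_error(X, y, fit, predict, rng=1)
print(err)
```

Cross-validation, another estimate the report compares, would replace the resampling loop with a partition of the cases into held-out folds.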

The work at hand proposes Artificial Neural Network (ANN) models for path-loss (PL) prediction in urban environments. The ANN technique has been introduced in several articles in the literature.

Artificial neural network and decision tree models have also been compared for estimating the spatial distribution of snow depth (SD) in a semi-arid region of Iran; the decision-tree approach yields rule-based models, which can be produced by Cubist, and initial slope and analytical hill shading were detected as influential parameters on SD by both methods.

As David MacKay explains in his information theory book, logistic regression is a simple neural network with N inputs, one output, and no hidden layers (he called it "classification with one neuron" rather than logistic regression). With appropriate link functions, neural networks can be used as generalized linear models.

Finally, it's time for neural networks. The network will have (n+1) inputs, n for prices and one for a dividend indicator, and one output. We still need to determine n. For this, we will write a function that creates a neural network with a specified number of inputs, using the input_shape=(n+1,) expression to include the dividend indicator.
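MacKay's observation can be demonstrated directly: a single sigmoid neuron trained by gradient descent on the cross-entropy loss is exactly maximum-likelihood logistic regression. The helper names and toy data below are illustrative assumptions, not from the source.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_one_neuron(X, y, lr=0.5, epochs=2000):
    """Train a single sigmoid neuron (bias + weights) by gradient descent.

    With the logistic activation and cross-entropy loss, this is
    identical to fitting a logistic regression.
    """
    Xb = np.c_[np.ones(len(X)), X]         # prepend a bias column
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        p = sigmoid(Xb @ w)                # predicted probabilities
        w -= lr * Xb.T @ (p - y) / len(y)  # gradient of the cross-entropy loss
    return w

# Toy data: the class depends on the sign of the single feature
X = np.array([[-2.0], [-1.0], [-0.5], [0.5], [1.0], [2.0]])
y = np.array([0, 0, 0, 1, 1, 1])
w = train_one_neuron(X, y)
preds = (sigmoid(np.c_[np.ones(len(X)), X] @ w) > 0.5).astype(int)
print(preds.tolist())  # -> [0, 0, 0, 1, 1, 1]
```

Swapping the sigmoid for other inverse link functions gives the other generalized linear models mentioned above (e.g. an identity activation recovers ordinary linear regression).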

However, as mentioned above, machine learning models, especially neural network models, have gained a lot of popularity in this field [18]. As discussed in the Background and Related Work section (Section 2), several types of neural network models have been used.

Deep feedforward networks, also called feedforward neural networks or multilayer perceptrons (MLPs), are the most typical deep learning models. The goal of an MLP is to approximate some function.

In this chapter, an artificial neural networks (ANN) inverse model is applied to estimate the thermal performance of a parabolic trough concentrator (PTC). A recurrent neural network architecture is trained using Kalman filter learning on an experimental database obtained from PTC operation, with inputs including the rim angle (φr), inlet (Tin) and outlet (Tout) fluid temperatures, ambient temperature (Ta), and water [...].

A neural network is an algorithm inspired by the neurons in our brain. It is designed to recognize patterns in complex data, and often performs best when recognizing patterns in audio, images, or video. A neural network simply consists of neurons (also called nodes), connected in some way.
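As a rough sketch of what an MLP computes, here is a one-hidden-layer forward pass in NumPy. The layer sizes, parameter names, and `mlp_forward` helper are hypothetical, chosen only to illustrate the structure (nonlinear hidden layer followed by a linear output layer).

```python
import numpy as np

def mlp_forward(x, params):
    """Forward pass of a one-hidden-layer MLP: x -> tanh(W1 x + b1) -> W2 h + b2."""
    W1, b1, W2, b2 = params
    h = np.tanh(W1 @ x + b1)   # hidden layer with tanh activation
    return W2 @ h + b2         # linear output layer

rng = np.random.default_rng(0)
# Hypothetical sizes: 3 inputs, 5 hidden units, 1 output
params = (rng.normal(size=(5, 3)), np.zeros(5),
          rng.normal(size=(1, 5)), np.zeros(1))
y_out = mlp_forward(np.array([0.1, -0.2, 0.3]), params)
print(y_out.shape)  # -> (1,)
```

Training would adjust `params` to minimize a loss over examples; with suitable width, such networks can approximate a broad class of functions.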