Test Run - Deep Neural Network Training

  • General discussion

  • James McCaffrey explains how to train a DNN using the back-propagation algorithm and describes the associated "vanishing gradient" problem. You'll get code to experiment with, and a better understanding of what goes on behind the scenes when you use a neural network library such as Microsoft CNTK or Google TensorFlow.

    Read this article in the September 2017 issue of MSDN Magazine

    Friday, September 1, 2017 5:50 PM
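The "vanishing gradient" problem mentioned in the article summary is easy to see numerically. During back-propagation, the error signal is multiplied by the activation function's derivative at every layer, and the logistic sigmoid's derivative is at most 0.25, so the gradient shrinks geometrically with depth. The article's demo code is C#; the following is a minimal illustrative sketch in Python, not the article's actual code:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# The sigmoid derivative sigmoid(x) * (1 - sigmoid(x)) peaks at 0.25
# when x = 0. Even in this best case, chaining 10 layers multiplies
# the gradient by 0.25 ten times, shrinking it below one-millionth.
grad = 1.0
for layer in range(10):
    x = 0.0  # best case: the sigmoid's steepest point
    grad *= sigmoid(x) * (1.0 - sigmoid(x))  # factor of 0.25
    print(f"after layer {layer + 1}: gradient factor = {grad:.10f}")
```

This is one reason deep networks often use activations such as ReLU, whose derivative is 1 over half its domain, instead of the sigmoid.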

All replies

  • Much appreciated. This was very useful, especially for programmers who would like to build something from scratch rather than use a predefined library without understanding what's going on.

    I would like to suggest that Dr. James McCaffrey tackle the topic of parallel computation in deep learning, that is, training on a GPU.

    Wednesday, November 22, 2017 8:29 AM
  • I read the article and tinkered with the C# code. However, I am having trouble using the trained ANN to get a set of outputs for inputs I give it. I can't seem to find a method, or a hint on how to do it; the demo stops after training the ANN.

    Can somebody help me here?

    Saturday, November 3, 2018 11:28 AM
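The missing step the question above describes is the feed-forward pass: once training has produced final weights and biases, getting outputs for new inputs just means running those inputs forward through the network. The article's demo is C#; here is a minimal Python sketch of that computation, assuming a single tanh hidden layer and softmax outputs (the function name, parameter names, and weights below are illustrative, not the article's actual API):

```python
import math

def compute_outputs(x, ih_weights, h_biases, ho_weights, o_biases):
    """Feed inputs x forward through a trained 1-hidden-layer network."""
    # hidden layer: weighted sum of inputs plus bias, then tanh activation
    hidden = []
    for j in range(len(h_biases)):
        s = sum(x[i] * ih_weights[i][j] for i in range(len(x))) + h_biases[j]
        hidden.append(math.tanh(s))
    # output layer: weighted sum of hidden values plus bias
    raw = []
    for k in range(len(o_biases)):
        s = sum(hidden[j] * ho_weights[j][k] for j in range(len(hidden))) + o_biases[k]
        raw.append(s)
    # softmax turns raw output sums into probabilities that sum to 1
    m = max(raw)  # subtract max for numeric stability
    exps = [math.exp(r - m) for r in raw]
    total = sum(exps)
    return [e / total for e in exps]

# usage with tiny made-up weights: 2 inputs, 2 hidden nodes, 2 outputs
outs = compute_outputs(
    [1.0, 2.0],
    [[0.1, 0.2], [0.3, 0.4]],  # input-to-hidden weights
    [0.1, 0.1],                # hidden biases
    [[0.5, 0.6], [0.7, 0.8]],  # hidden-to-output weights
    [0.2, 0.2],                # output biases
)
print(outs)  # two probabilities; the larger one is the predicted class
```

The same logic already runs inside the training loop to compute the error; exposing it as a separate method (McCaffrey's demos typically call it ComputeOutputs) is what lets you make predictions after training.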
  • Where can I find the code?

    Thanks.

    Monday, November 25, 2019 6:31 PM