Backpropagation Visuals: Neural Networks Don’t Have to Be a Black Box

AKASH YADAV · 4 min read

Fig 1: Visualization of weights and activations between layers

Introduction

In this post, we will focus on backpropagation visualization to better understand how each layer applies its weights, how activations highlight certain parts of an image, and how more weight is given to those pixels. All demonstrations in this post are generated with VGG16 and can be reproduced with any other network.

The post is organized as follows:

- Weights and activations visualization
- Saliency map
- Guided backpropagation
- Conclusion

The complete code can be found in this GitHub repo.

Weights and Activations Visualization

Weights are parameters in the network that are adjusted during training to minimize the gap between the expected output and […]
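As a rough illustration of the weights-and-activations visualization described above, here is a minimal sketch. It assumes PyTorch and torchvision (the post's own repository may use a different framework) and uses a random placeholder tensor in place of a real preprocessed image; the filter grid and activation grid correspond to the kind of picture shown in Fig 1.

```python
# Minimal sketch, assuming PyTorch + torchvision >= 0.13 and matplotlib.
import torch
import matplotlib.pyplot as plt
from torchvision.models import vgg16

model = vgg16(weights="IMAGENET1K_V1").eval()

# 1) First-layer weights: each of the 64 filters in conv1 is a 3x3x3 kernel
#    that can be rendered directly as a tiny RGB image.
first_conv = model.features[0]                # Conv2d(3, 64, kernel_size=3)
filters = first_conv.weight.detach().clone()  # shape (64, 3, 3, 3)
filters = (filters - filters.min()) / (filters.max() - filters.min())  # scale to [0, 1]

fig, axes = plt.subplots(8, 8, figsize=(8, 8))
for ax, f in zip(axes.flat, filters):
    ax.imshow(f.permute(1, 2, 0).numpy())     # CHW -> HWC for plotting
    ax.axis("off")
plt.suptitle("VGG16 conv1 filters")
plt.show()

# 2) Activations: a forward hook captures the feature maps produced by a layer,
#    showing which parts of the input that layer responds to most strongly.
activations = {}

def save_activation(name):
    def hook(module, inputs, output):
        activations[name] = output.detach()
    return hook

model.features[0].register_forward_hook(save_activation("conv1"))

img = torch.randn(1, 3, 224, 224)             # placeholder; use a real ImageNet-normalized image
with torch.no_grad():
    model(img)

fmaps = activations["conv1"][0]               # shape (64, 224, 224)
fig, axes = plt.subplots(8, 8, figsize=(8, 8))
for ax, fmap in zip(axes.flat, fmaps):
    ax.imshow(fmap.numpy(), cmap="viridis")
    ax.axis("off")
plt.suptitle("VGG16 conv1 activations")
plt.show()
```

The same hook-based approach extends to deeper layers: registering the hook on a later block of `model.features` shows how the highlighted regions become coarser and more semantic as depth increases.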
