Commit

softmax
rasbt committed Nov 9, 2015
1 parent 46c4456 commit 99c401d
Showing 6 changed files with 17 additions and 0 deletions.
1 change: 1 addition & 0 deletions README.md
@@ -91,6 +91,7 @@ Excerpts from the [Foreword](./docs/foreword_ro.pdf) and [Preface](./docs/prefac
- [What is the probabilistic interpretation of regularized logistic regression?](./faq/probablistic-logistic-regression.md)
- [Can you give a visual explanation for the back propagation algorithm for neural networks?](./faq/visual-backpropagation.md)
- [How do I evaluate a model?](./faq/evaluate-a-model.md)
- [What exactly is the "softmax and the multinomial logistic loss" in the context of machine learning?](./faq/softmax.md)
- [Why do we re-use parameters from the training set to standardize the test set and new data?](./faq/standardize-param-reuse.md)
- [What are some of the issues with clustering?](./faq/issues-with-clustering.md)
- [What is the difference between deep learning and usual machine learning?](./faq/difference-deep-and-normal-learning.md)
1 change: 1 addition & 0 deletions faq/README.md
@@ -34,6 +34,7 @@ Sebastian
- [What is the probabilistic interpretation of regularized logistic regression?](./probablistic-logistic-regression.md)
- [Can you give a visual explanation for the back propagation algorithm for neural networks?](./visual-backpropagation.md)
- [How do I evaluate a model?](./evaluate-a-model.md)
- [What exactly is the "softmax and the multinomial logistic loss" in the context of machine learning?](./softmax.md)
- [Why do we re-use parameters from the training set to standardize the test set and new data?](./standardize-param-reuse.md)
- [What are some of the issues with clustering?](./issues-with-clustering.md)
- [What is the difference between deep learning and usual machine learning?](./difference-deep-and-normal-learning.md)
15 changes: 15 additions & 0 deletions faq/softmax.md
@@ -0,0 +1,15 @@
# What exactly is the "softmax and the multinomial logistic loss" in the context of machine learning?

The softmax function is a generalization of the logistic function that allows us to compute meaningful class probabilities in multi-class settings (multinomial logistic regression). In softmax, we compute the probability that a particular sample (with net input *z*) belongs to the *i*th class using a normalization term in the denominator that sums the exponentiated net inputs of all *M* classes:

![](./softmax/softmax_1.png)
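As a concrete illustration, here is a minimal NumPy sketch of this formula (the function name `softmax` and the example values are assumptions for illustration, not code from the book):

```python
import numpy as np

def softmax(z):
    """P(y = i | z) = exp(z_i) / sum_m exp(z_m), for m = 1, ..., M classes."""
    e = np.exp(z - np.max(z))  # subtracting the max improves numerical stability
    return e / e.sum()

z = np.array([2.0, 1.0, 0.1])  # net inputs for M = 3 classes
print(softmax(z))              # approx. [0.659, 0.242, 0.099]; the probabilities sum to 1
```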

In contrast, the logistic function, which covers the binary case, is defined as:

![](./softmax/logistic.png)

For completeness, we define the net input *z* as

![](./softmax/net_input.png)

where the model's weight coefficients are stored in the vector *w*, and *x* is the feature vector of the sample.
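To tie the formulas together, here is a small sketch (illustrative only; the helper names and example values are assumptions) that computes the net input and passes it through the logistic function for the binary case:

```python
import numpy as np

def net_input(w, x):
    """Net input z = w^T x (a bias unit can be folded into w and x)."""
    return np.dot(w, x)

def logistic(z):
    """Logistic (sigmoid) function 1 / (1 + exp(-z)); maps z into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

w = np.array([0.4, -0.2, 0.1])  # example weight vector
x = np.array([1.0, 2.0, 3.0])   # example feature vector
z = net_input(w, x)             # 0.4*1.0 - 0.2*2.0 + 0.1*3.0 = 0.3
print(logistic(z))              # approx. 0.574, read as P(y = 1 | x)
```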
Binary file added faq/softmax/logistic.png
Binary file added faq/softmax/net_input.png
Binary file added faq/softmax/softmax_1.png
