
Commit c430df9

Merge pull request #2 from akutzer/readme-updates
Readme updates
2 parents f2fbcb7 + e836ac2 commit c430df9

File tree

1 file changed (+16 / -7 lines)


README.md

Lines changed: 16 additions & 7 deletions
@@ -24,19 +24,21 @@ For the test_CNN script you will also need PyTorch, because I confirmed my resul
 ```
 pip install torch===1.4.0
 ```
-or use [the PyTorch website](https://pytorch.org/)
+Alternatively, use [the PyTorch website](https://pytorch.org/).
 
 
 
-### Testing
+### Testing and Example
 
+**Testing:**
 The ```test_CNN.py``` script runs the forward and backward pass of all layers, activations and losses with randomly shaped inputs
 and checks the results against the PyTorch autograd engine.
 
+**Example:**
 I also wrote a small network in the ```FashionNet.py``` file, which trains a small model on the FashionMNIST dataset.
 The model was trained for only one epoch and returned decent results. They aren't the best, but my test with the same model in PyTorch got a similar result, so it must be due to the bad architecture and the short training of only one epoch.
 ![Plot of Loss and Accuracy](FashionMNIST_model_graph.png)
-*Note: the Testing Loss and Accuracy is more stable because the testing batch was four times the size of the training batch*
+*NOTE: The testing loss and accuracy are more stable because the testing batch was four times the size of the training batch.*
 
 
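The testing hunk above is easier to picture with a concrete example. Below is a minimal, hypothetical sketch of a forward/backward comparison against PyTorch autograd in the spirit of ```test_CNN.py```; the ```MyReLU``` class, the shapes and the tolerances are illustrative assumptions, not the repository's actual API.

```python
import numpy as np
import torch

class MyReLU:
    """Toy hand-written layer, used only for this illustration."""
    def forward(self, x):
        self.mask = x > 0
        return x * self.mask

    def backward(self, grad_out):
        # Gradient of ReLU: pass the upstream gradient through where x > 0.
        return grad_out * self.mask

# Randomly shaped input and upstream gradient
x = np.random.randn(4, 3, 8, 8).astype(np.float32)
grad_out = np.random.randn(*x.shape).astype(np.float32)

# Hand-written forward and backward pass
layer = MyReLU()
out = layer.forward(x)
grad_in = layer.backward(grad_out)

# Reference result from the PyTorch autograd engine
xt = torch.tensor(x, requires_grad=True)
torch.relu(xt).backward(torch.tensor(grad_out))

assert np.allclose(out, np.maximum(x, 0), atol=1e-6)
assert np.allclose(grad_in, xt.grad.numpy(), atol=1e-6)
print("forward and backward pass match PyTorch")
```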

@@ -69,13 +71,20 @@ The Model was trained for only one epoch and returned some descend results. They
 
 
 
+### Future Updates
+
+- update BinaryCrossEntropyLoss for more numerical stability
+- add BatchNorm and Dropout layers
+- add CELU and ELU activations
+
+
+
 ### Acknowledgments
 For the Softmax, LogSoftmax and CrossEntropyLoss modules I used
 the numerically more stable functions implemented in the PyTorch library!
-You should definetly check [this amazing Library out](https://pytorch.org/) ;) luv u :*
-
-Also a great source for Convolutions and Optimizer were [the CS231n course notes](http://cs231n.github.io/)
-
+You should definitely check out [this amazing library](https://pytorch.org/)! ;) luv u :*
 
+Also a great source for convolutions and optimizers was [the CS231n course notes](http://cs231n.github.io/).
 
+To learn more about transposed convolutions: [Paper](https://arxiv.org/pdf/1603.07285.pdf) and [Animations](https://github.com/vdumoulin/conv_arithmetic).
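As background for the numerical-stability items in the diff above (the planned BinaryCrossEntropyLoss update and the PyTorch-style stable Softmax/LogSoftmax/CrossEntropyLoss), here is a small NumPy sketch of the usual tricks; it is an illustration under my own assumptions, not code taken from this repository.

```python
import numpy as np

def log_softmax(x, axis=-1):
    # Log-sum-exp trick: subtracting the max before exponentiating prevents
    # overflow in exp() without changing the result.
    shifted = x - np.max(x, axis=axis, keepdims=True)
    return shifted - np.log(np.sum(np.exp(shifted), axis=axis, keepdims=True))

def bce_with_logits(logits, targets):
    # Stable form of -[y*log(sigmoid(x)) + (1-y)*log(1-sigmoid(x))]:
    # max(x, 0) - x*y + log(1 + exp(-|x|)), the same formulation PyTorch uses
    # in binary_cross_entropy_with_logits.
    return np.mean(np.maximum(logits, 0) - logits * targets
                   + np.log1p(np.exp(-np.abs(logits))))

logits = np.array([1000.0, -1000.0, 0.0])  # values that would overflow a naive exp()
print(log_softmax(logits))                                  # [0., -2000., -1000.]
print(bce_with_logits(logits, np.array([1.0, 0.0, 1.0])))   # ~0.231
```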
