For the test_CNN script you will also need PyTorch, because I confirmed my results with it:
```
pip install torch==1.4.0
```
Alternatively, use [the PyTorch website](https://pytorch.org/).
### Testing and Example

**Testing:**
The ```test_CNN.py``` script runs the forward and backward pass of all Layers, Activations and Losses with randomly shaped inputs and checks the results against the PyTorch autograd engine.
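The same kind of backward-pass check can also be sketched without PyTorch, by comparing an analytic gradient against central finite differences. This is an illustrative example with a hypothetical ReLU layer, not the repository's actual API:

```python
import numpy as np

def relu_forward(x):
    return np.maximum(x, 0.0)

def relu_backward(x, grad_out):
    # Gradient of ReLU: pass the upstream gradient through where the input was positive
    return grad_out * (x > 0.0)

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 3))
grad_out = rng.standard_normal((4, 3))
eps = 1e-6

analytic = relu_backward(x, grad_out)
numeric = np.empty_like(x)
for idx in np.ndindex(x.shape):
    xp, xm = x.copy(), x.copy()
    xp[idx] += eps
    xm[idx] -= eps
    # Central difference of sum(relu(x) * grad_out) w.r.t. x[idx]
    numeric[idx] = ((relu_forward(xp) - relu_forward(xm)) * grad_out).sum() / (2 * eps)

assert np.allclose(analytic, numeric, atol=1e-5)
```

Checking against PyTorch autograd (as ```test_CNN.py``` does) replaces the slow per-element loop with a single trusted reference implementation.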
**Example:**
I also wrote a small Network in the ```FashionNet.py``` file, which trains a small Model on the FashionMNIST dataset.
The Model was trained for only one epoch and returned decent results. They aren't the best, but the same Model in PyTorch got a similar result, so the limiting factors are the simple architecture and the short training of only one epoch.

*NOTE: The Testing Loss and Accuracy are more stable because the testing batch was four times the size of the training batch.*
### Future Updates
- update BinaryCrossEntropyLoss for more numerical stability
- add BatchNorm and Dropout layers
- add CELU and ELU activations
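The first item above, a more numerically stable BinaryCrossEntropyLoss, usually means computing the loss directly from logits with the log-sum-exp trick instead of from sigmoid outputs. A minimal sketch of that idea (illustrative, not this repository's implementation):

```python
import numpy as np

def stable_bce_with_logits(logits, targets):
    # Numerically stable binary cross-entropy computed from raw logits:
    #   loss = max(z, 0) - z*t + log(1 + exp(-|z|))
    # Naively computing -t*log(sigmoid(z)) - (1-t)*log(1-sigmoid(z))
    # overflows/underflows for large |z|.
    z = np.asarray(logits, dtype=float)
    t = np.asarray(targets, dtype=float)
    return np.mean(np.maximum(z, 0) - z * t + np.log1p(np.exp(-np.abs(z))))

# Extreme logits stay finite instead of producing inf/nan
print(stable_bce_with_logits(np.array([1000.0, -1000.0]), np.array([1.0, 0.0])))  # prints 0.0
```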
### Acknowledgments
For the Softmax, LogSoftmax and CrossEntropyLoss modules I used the numerically more stable formulations implemented in the PyTorch Library!
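The usual stability trick behind these formulations is to subtract the row maximum before exponentiating, which leaves the result mathematically unchanged but prevents overflow. A minimal log-softmax sketch of that idea (illustrative, not this repository's code):

```python
import numpy as np

def log_softmax(x, axis=-1):
    # Subtracting the max keeps exp() from overflowing; the result is unchanged
    # because softmax is invariant to adding a constant to all inputs.
    shifted = x - np.max(x, axis=axis, keepdims=True)
    return shifted - np.log(np.sum(np.exp(shifted), axis=axis, keepdims=True))

# Huge logits would overflow a naive log(exp(x) / sum(exp(x)))
out = log_softmax(np.array([1000.0, 1001.0, 1002.0]))
print(out)  # finite values; exp(out) sums to 1
```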
You should definitely check [this amazing Library out](https://pytorch.org/)! ;) luv u :*
[The CS231n course notes](http://cs231n.github.io/) were also a great source for Convolutions and Optimizers.
To learn more about Transposed Convolutions: [Paper](https://arxiv.org/pdf/1603.07285.pdf) and [Animations](https://github.com/vdumoulin/conv_arithmetic).
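The core intuition from those references is that a transposed convolution scatters a scaled copy of the kernel into the output for each input element, instead of gathering as a normal convolution does. A tiny 1-D sketch of that idea (illustrative only, using hypothetical names):

```python
import numpy as np

def conv_transpose_1d(x, kernel, stride=1):
    # Each input element x[i] adds x[i] * kernel into the output,
    # starting at position i * stride (no padding handled here).
    k = len(kernel)
    out = np.zeros(stride * (len(x) - 1) + k)
    for i, v in enumerate(x):
        out[i * stride : i * stride + k] += v * kernel
    return out

print(conv_transpose_1d(np.array([1.0, 2.0]), np.array([1.0, 1.0, 1.0]), stride=2))
# prints [1. 1. 3. 2. 2.]
```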