Commit af35b1a8 authored by Cirilli Simon

I have no idea what I'm doing here.

parent f5049954
File added
"Sepal.Length" "Sepal.Width" "Petal.Length" "Petal.Width" "Species" "Sepal.Area" "Petal.Area" "Sepal.Ratio" "Petal.Ratio"
5.1 3.5 1.4 0.2 "setosa" 17.85 0.28 1.45714285714286 7
4.9 3 1.4 0.2 "setosa" 14.7 0.28 1.63333333333333 7
4.7 3.2 1.3 0.2 "setosa" 15.04 0.26 1.46875 6.5
4.6 3.1 1.5 0.2 "setosa" 14.26 0.3 1.48387096774194 7.5
5 3.6 1.4 0.2 "setosa" 18 0.28 1.38888888888889 7
5.4 3.9 1.7 0.4 "setosa" 21.06 0.68 1.38461538461538 4.25
4.6 3.4 1.4 0.3 "setosa" 15.64 0.42 1.35294117647059 4.66666666666667
5 3.4 1.5 0.2 "setosa" 17 0.3 1.47058823529412 7.5
4.4 2.9 1.4 0.2 "setosa" 12.76 0.28 1.51724137931034 7
4.9 3.1 1.5 0.1 "setosa" 15.19 0.15 1.58064516129032 15
5.4 3.7 1.5 0.2 "setosa" 19.98 0.3 1.45945945945946 7.5
4.8 3.4 1.6 0.2 "setosa" 16.32 0.32 1.41176470588235 8
4.8 3 1.4 0.1 "setosa" 14.4 0.14 1.6 14
4.3 3 1.1 0.1 "setosa" 12.9 0.11 1.43333333333333 11
5.8 4 1.2 0.2 "setosa" 23.2 0.24 1.45 6
5.7 4.4 1.5 0.4 "setosa" 25.08 0.6 1.29545454545455 3.75
5.4 3.9 1.3 0.4 "setosa" 21.06 0.52 1.38461538461538 3.25
5.1 3.5 1.4 0.3 "setosa" 17.85 0.42 1.45714285714286 4.66666666666667
5.7 3.8 1.7 0.3 "setosa" 21.66 0.51 1.5 5.66666666666667
5.1 3.8 1.5 0.3 "setosa" 19.38 0.45 1.34210526315789 5
5.4 3.4 1.7 0.2 "setosa" 18.36 0.34 1.58823529411765 8.5
5.1 3.7 1.5 0.4 "setosa" 18.87 0.6 1.37837837837838 3.75
4.6 3.6 1 0.2 "setosa" 16.56 0.2 1.27777777777778 5
5.1 3.3 1.7 0.5 "setosa" 16.83 0.85 1.54545454545455 3.4
4.8 3.4 1.9 0.2 "setosa" 16.32 0.38 1.41176470588235 9.5
5 3 1.6 0.2 "setosa" 15 0.32 1.66666666666667 8
5 3.4 1.6 0.4 "setosa" 17 0.64 1.47058823529412 4
5.2 3.5 1.5 0.2 "setosa" 18.2 0.3 1.48571428571429 7.5
5.2 3.4 1.4 0.2 "setosa" 17.68 0.28 1.52941176470588 7
4.7 3.2 1.6 0.2 "setosa" 15.04 0.32 1.46875 8
4.8 3.1 1.6 0.2 "setosa" 14.88 0.32 1.54838709677419 8
5.4 3.4 1.5 0.4 "setosa" 18.36 0.6 1.58823529411765 3.75
5.2 4.1 1.5 0.1 "setosa" 21.32 0.15 1.26829268292683 15
5.5 4.2 1.4 0.2 "setosa" 23.1 0.28 1.30952380952381 7
4.9 3.1 1.5 0.2 "setosa" 15.19 0.3 1.58064516129032 7.5
5 3.2 1.2 0.2 "setosa" 16 0.24 1.5625 6
5.5 3.5 1.3 0.2 "setosa" 19.25 0.26 1.57142857142857 6.5
4.9 3.6 1.4 0.1 "setosa" 17.64 0.14 1.36111111111111 14
4.4 3 1.3 0.2 "setosa" 13.2 0.26 1.46666666666667 6.5
5.1 3.4 1.5 0.2 "setosa" 17.34 0.3 1.5 7.5
5 3.5 1.3 0.3 "setosa" 17.5 0.39 1.42857142857143 4.33333333333333
4.5 2.3 1.3 0.3 "setosa" 10.35 0.39 1.95652173913044 4.33333333333333
4.4 3.2 1.3 0.2 "setosa" 14.08 0.26 1.375 6.5
5 3.5 1.6 0.6 "setosa" 17.5 0.96 1.42857142857143 2.66666666666667
5.1 3.8 1.9 0.4 "setosa" 19.38 0.76 1.34210526315789 4.75
4.8 3 1.4 0.3 "setosa" 14.4 0.42 1.6 4.66666666666667
5.1 3.8 1.6 0.2 "setosa" 19.38 0.32 1.34210526315789 8
4.6 3.2 1.4 0.2 "setosa" 14.72 0.28 1.4375 7
5.3 3.7 1.5 0.2 "setosa" 19.61 0.3 1.43243243243243 7.5
5 3.3 1.4 0.2 "setosa" 16.5 0.28 1.51515151515152 7
7 3.2 4.7 1.4 "versicolor" 22.4 6.58 2.1875 3.35714285714286
6.4 3.2 4.5 1.5 "versicolor" 20.48 6.75 2 3
6.9 3.1 4.9 1.5 "versicolor" 21.39 7.35 2.22580645161290 3.26666666666667
5.5 2.3 4 1.3 "versicolor" 12.65 5.2 2.39130434782609 3.07692307692308
6.5 2.8 4.6 1.5 "versicolor" 18.2 6.9 2.32142857142857 3.06666666666667
5.7 2.8 4.5 1.3 "versicolor" 15.96 5.85 2.03571428571429 3.46153846153846
6.3 3.3 4.7 1.6 "versicolor" 20.79 7.52 1.90909090909091 2.9375
4.9 2.4 3.3 1 "versicolor" 11.76 3.3 2.04166666666667 3.3
6.6 2.9 4.6 1.3 "versicolor" 19.14 5.98 2.27586206896552 3.53846153846154
5.2 2.7 3.9 1.4 "versicolor" 14.04 5.46 1.92592592592593 2.78571428571429
5 2 3.5 1 "versicolor" 10 3.5 2.5 3.5
5.9 3 4.2 1.5 "versicolor" 17.7 6.3 1.96666666666667 2.8
6 2.2 4 1 "versicolor" 13.2 4 2.72727272727273 4
6.1 2.9 4.7 1.4 "versicolor" 17.69 6.58 2.10344827586207 3.35714285714286
5.6 2.9 3.6 1.3 "versicolor" 16.24 4.68 1.93103448275862 2.76923076923077
6.7 3.1 4.4 1.4 "versicolor" 20.77 6.16 2.16129032258065 3.14285714285714
5.6 3 4.5 1.5 "versicolor" 16.8 6.75 1.86666666666667 3
5.8 2.7 4.1 1 "versicolor" 15.66 4.1 2.14814814814815 4.1
6.2 2.2 4.5 1.5 "versicolor" 13.64 6.75 2.81818181818182 3
5.6 2.5 3.9 1.1 "versicolor" 14 4.29 2.24 3.54545454545454
5.9 3.2 4.8 1.8 "versicolor" 18.88 8.64 1.84375 2.66666666666667
6.1 2.8 4 1.3 "versicolor" 17.08 5.2 2.17857142857143 3.07692307692308
6.3 2.5 4.9 1.5 "versicolor" 15.75 7.35 2.52 3.26666666666667
6.1 2.8 4.7 1.2 "versicolor" 17.08 5.64 2.17857142857143 3.91666666666667
6.4 2.9 4.3 1.3 "versicolor" 18.56 5.59 2.20689655172414 3.30769230769231
6.6 3 4.4 1.4 "versicolor" 19.8 6.16 2.2 3.14285714285714
6.8 2.8 4.8 1.4 "versicolor" 19.04 6.72 2.42857142857143 3.42857142857143
6.7 3 5 1.7 "versicolor" 20.1 8.5 2.23333333333333 2.94117647058824
6 2.9 4.5 1.5 "versicolor" 17.4 6.75 2.06896551724138 3
5.7 2.6 3.5 1 "versicolor" 14.82 3.5 2.19230769230769 3.5
5.5 2.4 3.8 1.1 "versicolor" 13.2 4.18 2.29166666666667 3.45454545454545
5.5 2.4 3.7 1 "versicolor" 13.2 3.7 2.29166666666667 3.7
5.8 2.7 3.9 1.2 "versicolor" 15.66 4.68 2.14814814814815 3.25
6 2.7 5.1 1.6 "versicolor" 16.2 8.16 2.22222222222222 3.1875
5.4 3 4.5 1.5 "versicolor" 16.2 6.75 1.8 3
6 3.4 4.5 1.6 "versicolor" 20.4 7.2 1.76470588235294 2.8125
6.7 3.1 4.7 1.5 "versicolor" 20.77 7.05 2.16129032258065 3.13333333333333
6.3 2.3 4.4 1.3 "versicolor" 14.49 5.72 2.73913043478261 3.38461538461538
5.6 3 4.1 1.3 "versicolor" 16.8 5.33 1.86666666666667 3.15384615384615
5.5 2.5 4 1.3 "versicolor" 13.75 5.2 2.2 3.07692307692308
5.5 2.6 4.4 1.2 "versicolor" 14.3 5.28 2.11538461538462 3.66666666666667
6.1 3 4.6 1.4 "versicolor" 18.3 6.44 2.03333333333333 3.28571428571429
5.8 2.6 4 1.2 "versicolor" 15.08 4.8 2.23076923076923 3.33333333333333
5 2.3 3.3 1 "versicolor" 11.5 3.3 2.17391304347826 3.3
5.6 2.7 4.2 1.3 "versicolor" 15.12 5.46 2.07407407407407 3.23076923076923
5.7 3 4.2 1.2 "versicolor" 17.1 5.04 1.9 3.5
5.7 2.9 4.2 1.3 "versicolor" 16.53 5.46 1.96551724137931 3.23076923076923
6.2 2.9 4.3 1.3 "versicolor" 17.98 5.59 2.13793103448276 3.30769230769231
5.1 2.5 3 1.1 "versicolor" 12.75 3.3 2.04 2.72727272727273
5.7 2.8 4.1 1.3 "versicolor" 15.96 5.33 2.03571428571429 3.15384615384615
6.3 3.3 6 2.5 "virginica" 20.79 15 1.90909090909091 2.4
5.8 2.7 5.1 1.9 "virginica" 15.66 9.69 2.14814814814815 2.68421052631579
7.1 3 5.9 2.1 "virginica" 21.3 12.39 2.36666666666667 2.80952380952381
6.3 2.9 5.6 1.8 "virginica" 18.27 10.08 2.17241379310345 3.11111111111111
6.5 3 5.8 2.2 "virginica" 19.5 12.76 2.16666666666667 2.63636363636364
7.6 3 6.6 2.1 "virginica" 22.8 13.86 2.53333333333333 3.14285714285714
4.9 2.5 4.5 1.7 "virginica" 12.25 7.65 1.96 2.64705882352941
7.3 2.9 6.3 1.8 "virginica" 21.17 11.34 2.51724137931034 3.5
6.7 2.5 5.8 1.8 "virginica" 16.75 10.44 2.68 3.22222222222222
7.2 3.6 6.1 2.5 "virginica" 25.92 15.25 2 2.44
6.5 3.2 5.1 2 "virginica" 20.8 10.2 2.03125 2.55
6.4 2.7 5.3 1.9 "virginica" 17.28 10.07 2.37037037037037 2.78947368421053
6.8 3 5.5 2.1 "virginica" 20.4 11.55 2.26666666666667 2.61904761904762
5.7 2.5 5 2 "virginica" 14.25 10 2.28 2.5
5.8 2.8 5.1 2.4 "virginica" 16.24 12.24 2.07142857142857 2.125
6.4 3.2 5.3 2.3 "virginica" 20.48 12.19 2 2.30434782608696
6.5 3 5.5 1.8 "virginica" 19.5 9.9 2.16666666666667 3.05555555555556
7.7 3.8 6.7 2.2 "virginica" 29.26 14.74 2.02631578947368 3.04545454545455
7.7 2.6 6.9 2.3 "virginica" 20.02 15.87 2.96153846153846 3
6 2.2 5 1.5 "virginica" 13.2 7.5 2.72727272727273 3.33333333333333
6.9 3.2 5.7 2.3 "virginica" 22.08 13.11 2.15625 2.47826086956522
5.6 2.8 4.9 2 "virginica" 15.68 9.8 2 2.45
7.7 2.8 6.7 2 "virginica" 21.56 13.4 2.75 3.35
6.3 2.7 4.9 1.8 "virginica" 17.01 8.82 2.33333333333333 2.72222222222222
6.7 3.3 5.7 2.1 "virginica" 22.11 11.97 2.03030303030303 2.71428571428571
7.2 3.2 6 1.8 "virginica" 23.04 10.8 2.25 3.33333333333333
6.2 2.8 4.8 1.8 "virginica" 17.36 8.64 2.21428571428571 2.66666666666667
6.1 3 4.9 1.8 "virginica" 18.3 8.82 2.03333333333333 2.72222222222222
6.4 2.8 5.6 2.1 "virginica" 17.92 11.76 2.28571428571429 2.66666666666667
7.2 3 5.8 1.6 "virginica" 21.6 9.28 2.4 3.625
7.4 2.8 6.1 1.9 "virginica" 20.72 11.59 2.64285714285714 3.21052631578947
7.9 3.8 6.4 2 "virginica" 30.02 12.8 2.07894736842105 3.2
6.4 2.8 5.6 2.2 "virginica" 17.92 12.32 2.28571428571429 2.54545454545454
6.3 2.8 5.1 1.5 "virginica" 17.64 7.65 2.25 3.4
6.1 2.6 5.6 1.4 "virginica" 15.86 7.84 2.34615384615385 4
7.7 3 6.1 2.3 "virginica" 23.1 14.03 2.56666666666667 2.65217391304348
6.3 3.4 5.6 2.4 "virginica" 21.42 13.44 1.85294117647059 2.33333333333333
6.4 3.1 5.5 1.8 "virginica" 19.84 9.9 2.06451612903226 3.05555555555556
6 3 4.8 1.8 "virginica" 18 8.64 2 2.66666666666667
6.9 3.1 5.4 2.1 "virginica" 21.39 11.34 2.22580645161290 2.57142857142857
6.7 3.1 5.6 2.4 "virginica" 20.77 13.44 2.16129032258065 2.33333333333333
6.9 3.1 5.1 2.3 "virginica" 21.39 11.73 2.22580645161290 2.21739130434783
5.8 2.7 5.1 1.9 "virginica" 15.66 9.69 2.14814814814815 2.68421052631579
6.8 3.2 5.9 2.3 "virginica" 21.76 13.57 2.125 2.56521739130435
6.7 3.3 5.7 2.5 "virginica" 22.11 14.25 2.03030303030303 2.28
6.7 3 5.2 2.3 "virginica" 20.1 11.96 2.23333333333333 2.26086956521739
6.3 2.5 5 1.9 "virginica" 15.75 9.5 2.52 2.63157894736842
6.5 3 5.2 2 "virginica" 19.5 10.4 2.16666666666667 2.6
6.2 3.4 5.4 2.3 "virginica" 21.08 12.42 1.82352941176471 2.34782608695652
5.9 3 5.1 1.8 "virginica" 17.7 9.18 1.96666666666667 2.83333333333333
serie1/learning.png (28.9 KiB)
serie1/line.png (21.3 KiB)
# Perceptron learning rule
simon.cirilli@hes-so.ch - kiady.arintsoa@hes-so.ch
## lineparams
This function translates the weights and bias into a line equation; it returns the slope and the intercept of the line.
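The code itself is not shown on this page, but a minimal sketch of such a function could look like the following (Python and the names `w` and `b` are assumptions here): for a boundary defined by w1*x + w2*y + b = 0, the slope is -w1/w2 and the intercept is -b/w2.

```python
def lineparams(w, b):
    """Translate weights w = (w1, w2) and bias b into slope/intercept form.

    The decision boundary w1*x + w2*y + b = 0, solved for y,
    gives y = (-w1/w2) * x + (-b/w2).
    """
    slope = -w[0] / w[1]       # -w1 / w2
    intercept = -b / w[1]      # -b / w2
    return slope, intercept
```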
## predict
This function predicts the class of a point: it returns 1 if the point is above the line and 0 if it is below it.
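A hedged sketch of such a predict step (assuming a NumPy weight vector `w`, a scalar bias `b`, and a 2-D point `x`): the point is labelled 1 when w·x + b is non-negative, i.e. on or above the line, and 0 otherwise.

```python
import numpy as np

def predict(w, b, x):
    """Return 1 if x lies on or above the line w·x + b = 0, else 0."""
    return 1 if np.dot(w, x) + b >= 0 else 0
```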
## Update weights and bias
This function applies the perceptron learning rule to update the weights and bias, and returns the new weights and bias.
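As a sketch of the rule (the learning rate and variable names are assumptions, not the original code): the error term target − prediction is −1, 0, or +1, so the weights and bias only move when the point is misclassified.

```python
def update(w, b, x, target, lr=0.1):
    """Perceptron learning rule: nudge w and b toward the target class."""
    error = target - predict(w, b, x)     # -1, 0 or +1
    w = w + lr * error * np.asarray(x)    # shift the normal vector
    b = b + lr * error                    # shift the offset
    return w, b
```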
## Train code
The training code trains the perceptron; the stopping criterion is a fixed number of iterations. The function returns the weights, the bias, and the number of misclassifications per iteration.
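One plausible shape of such a training loop, reusing the sketches above (the iteration count, data layout, and helper names are assumptions): it sweeps the dataset a fixed number of times and records how many points were misclassified in each sweep.

```python
def train(X, y, n_iterations=30, lr=0.1):
    """Train a perceptron; stop after a fixed number of iterations."""
    w = np.zeros(X.shape[1])
    b = 0.0
    misclassified_per_iteration = []
    for _ in range(n_iterations):
        errors = 0
        for x_i, target in zip(X, y):
            if predict(w, b, x_i) != target:
                errors += 1
            w, b = update(w, b, x_i, target, lr)
        misclassified_per_iteration.append(errors)
    return w, b, misclassified_per_iteration
```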
## Test
We see that we have a pretty good line separating the two classes.
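The actual test data is not visible on this page; purely as an illustration, the sketches above could be exercised on two synthetic, linearly separable clouds, with `lineparams` used to draw the learned line (similar in spirit to line.png).

```python
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
X = np.vstack([rng.normal([1.0, 1.0], 0.5, size=(50, 2)),   # class 0 cloud
               rng.normal([4.0, 4.0], 0.5, size=(50, 2))])  # class 1 cloud
y = np.array([0] * 50 + [1] * 50)

w, b, errors = train(X, y, n_iterations=28)
slope, intercept = lineparams(w, b)

plt.scatter(X[:, 0], X[:, 1], c=y)
xs = np.linspace(X[:, 0].min(), X[:, 0].max(), 100)
plt.plot(xs, slope * xs + intercept)   # learned separating line
plt.show()
```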
## Plot of the misclassifications per iteration
![](learning.png)
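Continuing the illustrative snippet above, a learning curve like the one in learning.png could be produced along these lines (the styling of the original figure is not known):

```python
plt.plot(range(1, len(errors) + 1), errors, marker="o")
plt.xlabel("Iteration")
plt.ylabel("Misclassifications")
plt.title("Misclassifications per iteration")
plt.show()
```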
# To summarize
We have seen that the perceptron learning rule can find a separating line for linearly separable data: within 28 iterations the algorithm found an almost correct separating line. The misclassifications plot shows that the number of misclassifications generally decreases from one iteration to the next, although it sometimes goes back up, because an individual update of the weights and bias does not always move the boundary in exactly the right direction.