Neural network adapts to changes in the output instead of learning features

I have been trying to implement a neural network from scratch on the IRIS dataset. However, instead of learning features from the dataset, the network just keeps shifting its predicted probabilities to track the current target: as soon as a new target class appears, the probability of that class goes up while the others go down.
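For context, the log further down comes from per-sample gradient descent over the data in class-sorted order; here is a minimal sketch of the loop (a hypothetical reconstruction: X, Y, epochs, and alpha are placeholder names, and forwardpass/BackProp are the routines defined below):

# Hypothetical reconstruction of the training loop; X holds the IRIS feature
# rows and Y the one-hot labels, in class-sorted (unshuffled) order.
for epoch in range(epochs):
    for x, y in zip(X, Y):
        forwardpass([x], y)      # prints one "[probs][true class] error" line
        BackProp([x], y, alpha)  # one gradient step per sample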

Here are my squared-error, softmax, and squared-error-derivative functions:

import numpy as np

def SE(yHatarr, yarr):
    # Elementwise squared error; yHatarr is a [1, n] list-of-lists, yarr a length-n list.
    yHatarr = yHatarr[0]; ans = []
    for i in range(len(yarr)):
        ans.append(0.5 * (yHatarr[i] - yarr[i]) ** 2)
    return [ans]

def softmax(arr):
    # Softmax over a [1, n] row vector.
    arr = arr[0]
    ctr = sum([np.exp(i) for i in arr])
    ans = []
    for i in range(len(arr)):
        ans.append(np.exp(arr[i]) / ctr)
    return [ans]

def SEd(yHatarr, yarr):
    # Derivative of the squared error with respect to the prediction.
    yHatarr = yHatarr[0]; ans = []
    for i in range(len(yarr)):
        ans.append(yHatarr[i] - yarr[i])
    return [ans]
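To show the conventions (everything stays a [1, n] list-of-lists), here is a quick sanity check on made-up numbers:

p = softmax([[1.0, 2.0, 3.0]])
print(p)                  # [[0.0900..., 0.2447..., 0.6652...]]
print(SE(p, [0, 0, 1]))   # [[0.0040..., 0.0299..., 0.0560...]]
print(SEd(p, [0, 0, 1]))  # [[0.0900..., 0.2447..., -0.3347...]]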

Notice how Predicted_Probability shifts as soon as the true target changes from [0,0,1] to [1,0,0].

Here is a sample of the network's output during training:

Format:- [Prob_Class1,Prob_Class2,Prob_Class3] [True Class] Error

[0.08 0.87 0.05 ][0 0 1] 0.8355123456043121
[0.08 0.86 0.06 ][0 0 1] 0.8201579522256677
[0.09 0.85 0.07 ][0 0 1] 0.7995375042585138
[0.10 0.83 0.08 ][0 0 1] 0.7766517570676785
[0.10 0.81 0.09 ][0 0 1] 0.7481394857171886
[0.11 0.78 0.10 ][0 0 1] 0.7147686941406041
[0.12 0.76 0.12 ][0 0 1] 0.6782722653829699
[0.13 0.72 0.15 ][0 0 1] 0.6298704268963138
[0.14 0.68 0.18 ][0 0 1] 0.5776411760112319
[0.15 0.63 0.22 ][0 0 1] 0.5131507663831025
[0.16 0.58 0.26 ][0 0 1] 0.4493113237770373
[0.16 0.52 0.32 ][0 0 1] 0.3831032130730988
[0.16 0.47 0.37 ][0 0 1] 0.31710665107283553
[0.15 0.42 0.43 ][0 0 1] 0.26146891484458623
[0.15 0.37 0.48 ][0 0 1] 0.21150581119605297
[0.14 0.32 0.53 ][0 0 1] 0.1717355634863632
[0.13 0.29 0.58 ][0 0 1] 0.14168474990198907
[0.13 0.26 0.61 ][0 0 1] 0.11669936303086509
[0.12 0.24 0.64 ][0 0 1] 0.09947807429495423
[0.11 0.22 0.67 ][0 0 1] 0.08728858698649153
[0.11 0.20 0.69 ][0 0 1] 0.07416948011896661
[0.10 0.19 0.71 ][0 0 1] 0.06615338011546637
[0.10 0.18 0.73 ][0 0 1] 0.057605179280366875
[0.09 0.17 0.74 ][0 0 1] 0.05265220063231083
[0.09 0.16 0.75 ][0 0 1] 0.046785733140138064
[0.09 0.15 0.76 ][0 0 1] 0.042482200219891866
[0.08 0.14 0.77 ][0 0 1] 0.0396332613686496
[0.08 0.14 0.78 ][0 0 1] 0.03633626094906805
[0.08 0.13 0.79 ][0 0 1] 0.033367106642829145
[0.07 0.13 0.80 ][0 0 1] 0.03084241102227691
[0.07 0.12 0.81 ][0 0 1] 0.028763190819509295
[0.07 0.12 0.81 ][0 0 1] 0.026683698305330142
[0.07 0.11 0.82 ][0 0 1] 0.025496619777046432
[0.07 0.11 0.82 ][0 0 1] 0.02419286952170306
[0.07 0.11 0.83 ][0 0 1] 0.022735851321869192
[0.06 0.10 0.83 ][0 0 1] 0.021315209520257297
[0.06 0.10 0.84 ][0 0 1] 0.02031128142500064
[0.06 0.10 0.84 ][0 0 1] 0.01943104743984881
[0.06 0.10 0.84 ][0 0 1] 0.018748933049037395
[0.06 0.09 0.85 ][0 0 1] 0.017691330851158107
[0.06 0.09 0.85 ][0 0 1] 0.016885822752465505
[0.06 0.09 0.85 ][0 0 1] 0.016275209414762362
[0.06 0.09 0.86 ][0 0 1] 0.015776698478290732
[0.06 0.08 0.86 ][0 0 1] 0.01488144258331394
[0.05 0.08 0.86 ][0 0 1] 0.014335498578266635
[0.05 0.08 0.86 ][0 0 1] 0.013945092246224356
[0.05 0.08 0.87 ][0 0 1] 0.013625021482995811
[0.05 0.08 0.87 ][0 0 1] 0.013014934792770029
[0.05 0.08 0.87 ][0 0 1] 0.012500751350839438
[0.05 0.08 0.87 ][0 0 1] 0.012226730308986347
[0.05 0.08 0.87 ][1 0 0] 0.82624996642836
[0.06 0.08 0.85 ][1 0 0] 0.8072553055841852
[0.07 0.09 0.84 ][1 0 0] 0.7853602251838425
[0.08 0.10 0.82 ][1 0 0] 0.7619773134178103
[0.10 0.10 0.80 ][1 0 0] 0.7324226143564783
[0.11 0.11 0.77 ][1 0 0] 0.7001594230119894
[0.14 0.12 0.74 ][1 0 0] 0.6549562589875634
[0.16 0.13 0.71 ][1 0 0] 0.6087388567926084
[0.20 0.14 0.66 ][1 0 0] 0.5525454038670237
[0.23 0.15 0.62 ][1 0 0] 0.49434468417325117
[0.28 0.15 0.56 ][1 0 0] 0.4279669053631815
[0.33 0.16 0.51 ][1 0 0] 0.3640007597877611
[0.39 0.15 0.46 ][1 0 0] 0.3027515636324625
[0.45 0.15 0.40 ][1 0 0] 0.24527733282462752
[0.50 0.14 0.36 ][1 0 0] 0.1996751453608583
[0.54 0.14 0.32 ][1 0 0] 0.16499312667280702
[0.58 0.13 0.29 ][1 0 0] 0.13579401354879722
[0.62 0.12 0.26 ][1 0 0] 0.1158698305331455
[0.64 0.12 0.24 ][1 0 0] 0.09984949418039146
[0.67 0.11 0.22 ][1 0 0] 0.08490903381373108
[0.69 0.11 0.21 ][1 0 0] 0.0762273750979035
[0.71 0.10 0.19 ][1 0 0] 0.0650603698708109
[0.73 0.09 0.18 ][1 0 0] 0.056831046615074626
[0.74 0.09 0.17 ][1 0 0] 0.05322067257195143
[0.75 0.09 0.16 ][1 0 0] 0.049471523570271556
[0.76 0.09 0.15 ][1 0 0] 0.04501377897284836
[0.77 0.08 0.15 ][1 0 0] 0.039889681824408205
[0.78 0.08 0.14 ][1 0 0] 0.036874399389275316
[0.79 0.08 0.13 ][1 0 0] 0.03406101493411523
[0.79 0.08 0.13 ][1 0 0] 0.03253282476808503
[0.80 0.07 0.13 ][1 0 0] 0.030510155182873304
[0.81 0.07 0.12 ][1 0 0] 0.027467136216565744
[0.82 0.07 0.11 ][1 0 0] 0.02571578615714047
[0.82 0.07 0.11 ][1 0 0] 0.02394192291624685
[0.82 0.07 0.11 ][1 0 0] 0.024099987904135103
[0.83 0.06 0.11 ][1 0 0] 0.021953535472746984
[0.84 0.06 0.10 ][1 0 0] 0.02071075226022949
[0.84 0.06 0.10 ][1 0 0] 0.02069826140858322
[0.84 0.06 0.10 ][1 0 0] 0.0195093015800547
[0.85 0.06 0.10 ][1 0 0] 0.018330409091236972
[0.85 0.06 0.09 ][1 0 0] 0.017104438839642612
[0.85 0.06 0.09 ][1 0 0] 0.01792169480509333
[0.85 0.06 0.09 ][1 0 0] 0.016217031383768205
[0.86 0.05 0.09 ][1 0 0] 0.015117683236563869
[0.86 0.05 0.09 ][1 0 0] 0.01481093436641068
[0.86 0.05 0.08 ][1 0 0] 0.014563732083334013
[0.87 0.05 0.08 ][1 0 0] 0.013740893744623764
[0.87 0.05 0.08 ][1 0 0] 0.013584502802642152
[0.87 0.05 0.08 ][1 0 0] 0.012811420431731651
[0.87 0.05 0.08 ][1 0 0] 0.012588752270858152
[0.87 0.05 0.08 ][0 1 0] 0.8335434414722795
[0.86 0.06 0.08 ][0 1 0] 0.8166574457987319
[0.84 0.07 0.09 ][0 1 0] 0.7959609502247489
[0.82 0.08 0.10 ][0 1 0] 0.7667863185653575
[0.80 0.09 0.11 ][0 1 0] 0.7419803629500777
[0.78 0.11 0.12 ][0 1 0] 0.7064226958160007
[0.75 0.13 0.13 ][0 1 0] 0.6703650116595594
[0.71 0.16 0.14 ][0 1 0] 0.6145673227572149
[0.67 0.19 0.15 ][0 1 0] 0.5650973847474794
[0.62 0.23 0.15 ][0 1 0] 0.5035201128245061
[0.56 0.28 0.16 ][0 1 0] 0.42696760456385946
[0.51 0.32 0.16 ][0 1 0] 0.37426984304835553
[0.45 0.39 0.16 ][0 1 0] 0.30104776794364024
[0.41 0.44 0.16 ][0 1 0] 0.253447906925031
[0.36 0.49 0.15 ][0 1 0] 0.20762783659417763
[0.32 0.54 0.14 ][0 1 0] 0.1697684204367431
[0.29 0.58 0.13 ][0 1 0] 0.13994110531897877
[0.26 0.62 0.13 ][0 1 0] 0.11478501478921287
[0.23 0.65 0.12 ][0 1 0] 0.09665604878828593
[0.22 0.67 0.11 ][0 1 0] 0.08380306576614041
[0.20 0.69 0.11 ][0 1 0] 0.07547345419618587
[0.19 0.71 0.10 ][0 1 0] 0.06518680085909506
[0.17 0.73 0.10 ][0 1 0] 0.057208830432325906
[0.17 0.74 0.09 ][0 1 0] 0.05156602530369503
[0.16 0.75 0.09 ][0 1 0] 0.04706089350030288
[0.15 0.76 0.09 ][0 1 0] 0.04303197953293731
[0.14 0.77 0.08 ][0 1 0] 0.03903237092002031
[0.14 0.78 0.08 ][0 1 0] 0.03641615165069422
[0.13 0.79 0.08 ][0 1 0] 0.033529912323920094
[0.13 0.80 0.08 ][0 1 0] 0.030884518362202438
[0.12 0.81 0.07 ][0 1 0] 0.028481220398680705
[0.12 0.81 0.07 ][0 1 0] 0.02670197699209554
[0.11 0.82 0.07 ][0 1 0] 0.025503271958091277
[0.11 0.82 0.07 ][0 1 0] 0.024024101104994443
[0.11 0.83 0.07 ][0 1 0] 0.022999049596384023
[0.10 0.83 0.07 ][0 1 0] 0.02205762731388365
[0.10 0.84 0.06 ][0 1 0] 0.02066619909329867
[0.10 0.84 0.06 ][0 1 0] 0.019008093621003776
[0.10 0.84 0.06 ][0 1 0] 0.018782938063013493
[0.09 0.85 0.06 ][0 1 0] 0.01761650910622183
[0.09 0.85 0.06 ][0 1 0] 0.016808075776727417
[0.09 0.85 0.06 ][0 1 0] 0.016412550550525176
[0.09 0.86 0.06 ][0 1 0] 0.015557087328353427
[0.08 0.86 0.06 ][0 1 0] 0.014981225852649351
[0.08 0.86 0.05 ][0 1 0] 0.014474167980246613
[0.08 0.86 0.05 ][0 1 0] 0.014076842379816792
[0.08 0.87 0.05 ][0 1 0] 0.013565080535187103
[0.08 0.87 0.05 ][0 1 0] 0.013069692653491805
[0.08 0.87 0.05 ][0 1 0] 0.012850853194447856
[0.08 0.87 0.05 ][0 1 0] 0.012231305425899792

Here are the forward-pass and backpropagation routines I implemented:

ILi: input-layer input. Shape: [1,4]

ILo: input-layer output. Shape: [1,4]

HLi: hidden-layer input. Shape: [1,100]

HLo: hidden-layer output. Shape: [1,100]

OLi: output-layer input. Shape: [1,3]

OLo: output-layer output. Shape: [1,3]

Wij: weights between the input layer and the hidden layer. Shape: [4,100]

Wjk: weights between the hidden layer and the output layer. Shape: [100,3]

from scipy.special import expit  # logistic sigmoid, used as the hidden activation

Wij = np.random.rand(4, 100)
Wjk = np.random.rand(100, 3)

def forwardpass(ILi, Outpute):
    global ILo, HLo, HLi, OLi, OLo, Wij, Wjk
    ILo = np.array(ILi)           # shape [1,4]
    HLi = np.matmul(ILi, Wij)     # [1,4] x [4,100] -> [1,100]
    HLo = np.array(expit(HLi))    # shape [1,100]
    OLi = np.matmul(HLo, Wjk)     # [1,100] x [100,3] -> [1,3]
    OLo = np.array(softmax(OLi))  # shape [1,3]
    Error = SE(OLo, Outpute)      # shape [1,3]
    print('[', end="")
    for i in OLo[0]:
        print("%.2f" % i, end=" ")
    print(']', end="")
    print(Outpute, sum(Error[0]))

def BackProp(ILi, Outpute, alpha):
    global ILo, HLo, HLi, OLi, OLo, Wij, Wjk
    dEdOLo = np.array(SEd(OLo, Outpute))  # dE/dOLo, shape [1,3]
    dOLoOLi = OLo * (1 - OLo)             # output-activation derivative, shape [1,3]
    dWjk = np.dot(HLo.T, (dEdOLo * dOLoOLi)) * alpha
    dWij = np.dot(ILo.T, np.dot(Wjk, (OLo * (1 - OLo) * dEdOLo).T).T * HLo * (1 - HLo)) * alpha
    Wij -= dWij
    Wjk -= dWjk

I don't understand what's going on. I have tried reducing the number of neurons in the hidden layer and playing with the learning rate alpha [0.1, 0.01, 0.001], but nothing seems to help. As soon as the target changes from [1,0,0] to [0,1,0], the predicted probability for class 2 starts increasing at a rate proportional to alpha, i.e. the larger alpha is, the faster it grows (which is pretty much what alpha is supposed to do anyway).
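One way to see that proportionality directly is a single-step check from identical starting weights (a hypothetical check; the sample values are made up):

# Hypothetical check: the size of one weight update scales linearly with alpha.
x, y = [[5.1, 3.5, 1.4, 0.2]], [1, 0, 0]
Wij0, Wjk0 = Wij.copy(), Wjk.copy()
for alpha in [0.1, 0.01, 0.001]:
    Wij, Wjk = Wij0.copy(), Wjk0.copy()  # reset weights for a fair comparison
    forwardpass(x, y)
    BackProp(x, y, alpha)
    print(alpha, np.abs(Wjk - Wjk0).max())  # step size shrinks ~10x as alpha does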

Here is what happens when I finally run it on my test data:

Whatever the Class_Probability state was at the end of training is what gets predicted for every subsequent case. In this run, since Class2_Probability was highest at the end of training, it classifies all further test_inputs as Class2.
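The evaluation loop that prints these lines is roughly the following (a hypothetical reconstruction; Xtest and Ytest are placeholder names for the held-out features and one-hot labels):

# Hypothetical reconstruction of the evaluation loop.
correct = 0
for x, y in zip(Xtest, Ytest):
    forwardpass([x], y)  # sets the global OLo (and does its own printing)
    ok = int(np.argmax(OLo[0])) == int(np.argmax(y))
    correct += ok
    print(y, OLo[0], "Correct" if ok else "InCorrect")
print("Accuracy:-", 100 * correct / len(Xtest), "%")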

[1 0 0] [0.04600215 0.94253181 0.01146605] InCorrect
[1 0 0] [0.04574746 0.94220717 0.01204537] InCorrect
[1 0 0] [0.04709778 0.94090963 0.01199259] InCorrect
[1 0 0] [0.0449623  0.94279201 0.01224569] InCorrect
[1 0 0] [0.04614702 0.94239695 0.01145603] InCorrect
[1 0 0] [0.04279    0.94586446 0.01134554] InCorrect
[1 0 0] [0.04582037 0.9421256  0.01205403] InCorrect
[1 0 0] [0.04499829 0.94336797 0.01163374] InCorrect
[1 0 0] [0.04601034 0.94135087 0.01263879] InCorrect
[1 0 0] [0.0452387  0.94293738 0.01182392] InCorrect
[1 0 0] [0.04504582 0.94377873 0.01117545] InCorrect
[1 0 0] [0.04413799 0.94404294 0.01181907] InCorrect
[1 0 0] [0.0462843  0.94174267 0.01197303] InCorrect
[1 0 0] [0.05024135 0.93734185 0.0124168 ] InCorrect
[1 0 0] [0.04781183 0.94132715 0.01086101] InCorrect
[1 0 0] [0.04485557 0.94426659 0.01087784] InCorrect
[1 0 0] [0.04635582 0.94238965 0.01125453] InCorrect
[1 0 0] [0.04560167 0.9428036  0.01159472] InCorrect
[1 0 0] [0.04293738 0.94593326 0.01112936] InCorrect
[1 0 0] [0.04492806 0.94367631 0.01139563] InCorrect
[1 0 0] [0.04297452 0.94558126 0.01144422] InCorrect
[1 0 0] [0.04444936 0.94395257 0.01159807] InCorrect
[1 0 0] [0.05080601 0.93744935 0.01174464] InCorrect
[1 0 0] [0.04175144 0.9460157  0.01223286] InCorrect
[1 0 0] [0.04139108 0.94659725 0.01201168] InCorrect
[1 0 0] [0.04366487 0.94428704 0.01204808] InCorrect
[1 0 0] [0.0432279  0.94479472 0.01197739] InCorrect
[1 0 0] [0.04497833 0.94359783 0.01142384] InCorrect
[1 0 0] [0.04586366 0.94266019 0.01147615] InCorrect
[1 0 0] [0.04398904 0.94391234 0.01209863] InCorrect
[1 0 0] [0.04384402 0.94403927 0.01211671] InCorrect
[1 0 0] [0.04402129 0.94433795 0.01164076] InCorrect
[1 0 0] [0.04583841 0.94322418 0.01093741] InCorrect
[1 0 0] [0.0462944  0.94284241 0.01086319] InCorrect
[1 0 0] [0.0452387  0.94293738 0.01182392] InCorrect
[1 0 0] [0.04794889 0.94030179 0.01174931] InCorrect
[1 0 0] [0.04673692 0.94203717 0.01122592] InCorrect
[1 0 0] [0.0452387  0.94293738 0.01182392] InCorrect
[1 0 0] [0.04721021 0.94030648 0.01248331] InCorrect
[1 0 0] [0.04494438 0.94349066 0.01156496] InCorrect
[1 0 0] [0.04666355 0.94169285 0.0116436 ] InCorrect
[1 0 0] [0.04618719 0.94025169 0.01356111] InCorrect
[1 0 0] [0.04734783 0.94039558 0.01225659] InCorrect
[1 0 0] [0.04252896 0.94527146 0.01219958] InCorrect
[1 0 0] [0.0410784  0.94719177 0.01172983] InCorrect
[1 0 0] [0.04533436 0.94236908 0.01229656] InCorrect
[1 0 0] [0.04438119 0.94432203 0.01129678] InCorrect
[1 0 0] [0.04609738 0.9418017  0.01210092] InCorrect
[1 0 0] [0.04509783 0.94367338 0.01122879] InCorrect
[1 0 0] [0.04590867 0.942401   0.01169032] InCorrect
[0 1 0] [0.0228392  0.95683902 0.02032178] Correct
[0 1 0] [0.02290028 0.95622627 0.02087344] Correct
[0 1 0] [0.02195316 0.95638371 0.02166313] Correct
[0 1 0] [0.02354612 0.9546604  0.02179348] Correct
[0 1 0] [0.02223699 0.95592689 0.02183613] Correct
[0 1 0] [0.02263986 0.95532912 0.02203102] Correct
[0 1 0] [0.02223671 0.95595921 0.02180408] Correct
[0 1 0] [0.02702391 0.95370156 0.01927453] Correct
[0 1 0] [0.02277198 0.95638812 0.0208399 ] Correct
[0 1 0] [0.0239913  0.95454039 0.02146831] Correct
[0 1 0] [0.02568777 0.95363505 0.02067718] Correct
[0 1 0] [0.02335134 0.95560515 0.02104351] Correct
[0 1 0] [0.02427133 0.95557703 0.02015164] Correct
[0 1 0] [0.02214374 0.95565245 0.0222038 ] Correct
[0 1 0] [0.02576075 0.95522355 0.0190157 ] Correct
[0 1 0] [0.02344731 0.9566133  0.01993939] Correct
[0 1 0] [0.02243672 0.95513521 0.02242808] Correct
[0 1 0] [0.02444209 0.95581059 0.01974732] Correct
[0 1 0] [0.02187808 0.95508379 0.02303814] Correct
[0 1 0] [0.02458706 0.95522332 0.02018962] Correct
[0 1 0] [0.0214855 0.955105  0.0234095] Correct
[0 1 0] [0.02431472 0.95590259 0.01978269] Correct
[0 1 0] [0.02126984 0.95524599 0.02348416] Correct
[0 1 0] [0.02241108 0.95583502 0.0217539 ] Correct
[0 1 0] [0.02357835 0.95629567 0.02012597] Correct
[0 1 0] [0.02326596 0.95643409 0.02029995] Correct
[0 1 0] [0.02203126 0.95624361 0.02172513] Correct
[0 1 0] [0.02124352 0.95574319 0.02301329] Correct
[0 1 0] [0.0224344  0.95551672 0.02204887] Correct
[0 1 0] [0.02669121 0.95521888 0.01808991] Correct
[0 1 0] [0.02480339 0.95495376 0.02024285] Correct
[0 1 0] [0.02545837 0.95499661 0.01954502] Correct
[0 1 0] [0.02466058 0.95555258 0.01978684] Correct
[0 1 0] [0.02087918 0.95483589 0.02428494] Correct
[0 1 0] [0.02240548 0.95488623 0.02270828] Correct
[0 1 0] [0.02276104 0.95584266 0.0213963 ] Correct
[0 1 0] [0.02237121 0.9563328  0.02129599] Correct
[0 1 0] [0.02258582 0.9556093  0.02180488] Correct
[0 1 0] [0.02401389 0.955494   0.02049211] Correct
[0 1 0] [0.02374459 0.95487506 0.02138034] Correct
[0 1 0] [0.02287425 0.95504713 0.02207862] Correct
[0 1 0] [0.02249167 0.95580278 0.02170555] Correct
[0 1 0] [0.02417834 0.95548273 0.02033893] Correct
[0 1 0] [0.02691897 0.95375005 0.01933099] Correct
[0 1 0] [0.02335798 0.95519802 0.021444  ] Correct
[0 1 0] [0.02395313 0.95572731 0.02031956] Correct
[0 1 0] [0.0236058  0.95551662 0.02087759] Correct
[0 1 0] [0.02348489 0.95607945 0.02043566] Correct
[0 1 0] [0.02843568 0.95348119 0.01808313] Correct
[0 1 0] [0.02380982 0.95543285 0.02075733] Correct
[0 0 1] [0.01928846 0.9540461  0.02666544] InCorrect
[0 0 1] [0.02054192 0.95427898 0.02517909] InCorrect
[0 0 1] [0.01953975 0.95485475 0.02560549] InCorrect
[0 0 1] [0.02006166 0.95469965 0.02523869] InCorrect
[0 0 1] [0.01955431 0.95439148 0.02605421] InCorrect
[0 0 1] [0.01895904 0.95467582 0.02636514] InCorrect
[0 0 1] [0.02176962 0.95347429 0.0247561 ] InCorrect
[0 0 1] [0.019349   0.95494324 0.02570776] InCorrect
[0 0 1] [0.01967753 0.95464815 0.02567433] InCorrect
[0 0 1] [0.0192938  0.95470464 0.02600156] InCorrect
[0 0 1] [0.02075158 0.95523926 0.02400916] InCorrect
[0 0 1] [0.02026045 0.95474032 0.02499923] InCorrect
[0 0 1] [0.01997406 0.95492516 0.02510077] InCorrect
[0 0 1] [0.02050339 0.95396308 0.02553353] InCorrect
[0 0 1] [0.02011159 0.95385235 0.02603606] InCorrect
[0 0 1] [0.02010898 0.95464838 0.02524264] InCorrect
[0 0 1] [0.0202475  0.95501687 0.02473563] InCorrect
[0 0 1] [0.01909665 0.95492155 0.0259818 ] InCorrect
[0 0 1] [0.01859911 0.95428594 0.02711495] InCorrect
[0 0 1] [0.02089861 0.95459811 0.02450329] InCorrect
[0 0 1] [0.01966197 0.95476637 0.02557166] InCorrect
[0 0 1] [0.02078649 0.95413697 0.02507654] InCorrect
[0 0 1] [0.0188903  0.95467145 0.02643826] InCorrect
[0 0 1] [0.02099487 0.95502981 0.02397532] InCorrect
[0 0 1] [0.01986686 0.95488956 0.02524358] InCorrect
[0 0 1] [0.01978704 0.95529619 0.02491677] InCorrect
[0 0 1] [0.02123522 0.95507595 0.02368883] InCorrect
[0 0 1] [0.02117206 0.95507665 0.02375129] InCorrect
[0 0 1] [0.01976324 0.95441051 0.02582625] InCorrect
[0 0 1] [0.02013448 0.95559489 0.02427063] InCorrect
[0 0 1] [0.0194322  0.95500786 0.02555993] InCorrect
[0 0 1] [0.01957681 0.95554078 0.0248824 ] InCorrect
[0 0 1] [0.01968592 0.95431578 0.0259983 ] InCorrect
[0 0 1] [0.02110093 0.95536086 0.02353821] InCorrect
[0 0 1] [0.02031518 0.95476479 0.02492002] InCorrect
[0 0 1] [0.01924208 0.95490102 0.02585691] InCorrect
[0 0 1] [0.01975263 0.95440082 0.02584655] InCorrect
[0 0 1] [0.02029343 0.95500019 0.02470639] InCorrect
[0 0 1] [0.02135021 0.95503637 0.02361342] InCorrect
[0 0 1] [0.02017182 0.95515471 0.02467347] InCorrect
[0 0 1] [0.01963861 0.95452695 0.02583444] InCorrect
[0 0 1] [0.02039665 0.95515188 0.02445147] InCorrect
[0 0 1] [0.02054192 0.95427898 0.02517909] InCorrect
[0 0 1] [0.01945533 0.95455845 0.02598622] InCorrect
[0 0 1] [0.019537   0.95448224 0.02598076] InCorrect
[0 0 1] [0.0201648  0.95483331 0.02500189] InCorrect
[0 0 1] [0.02059044 0.95468262 0.02472694] InCorrect
[0 0 1] [0.02046433 0.95500623 0.02452944] InCorrect
[0 0 1] [0.02006918 0.9545484  0.02538243] InCorrect
[0 0 1] [0.02080652 0.95473361 0.02445987] InCorrect
Accuracy:- 33.33333333333333 %
...