Author: admin · Posted: 2018-04-19 11:42:04
Category: Machine Learning

RNN Text Generation

In:
import numpy as np
import matplotlib.pyplot as plt

from keras.models import Sequential
from keras.layers import Dense, Embedding, LSTM, Dropout
from keras.optimizers import RMSprop
from keras.utils import np_utils
from keras.preprocessing import sequence
from keras.preprocessing.text import Tokenizer

from nltk.tokenize import sent_tokenize
from konlpy.corpus import kolaw
from konlpy.tag import Twitter
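
These imports target the Keras 2.x and konlpy versions current when this post was written (2018). On newer stacks the same pieces live elsewhere; roughly (version-dependent, so treat this as a hint rather than a drop-in replacement):

# TensorFlow 2.x / newer konlpy equivalents (approximate):
# from tensorflow.keras.models import Sequential
# from tensorflow.keras.utils import to_categorical   # replaces keras.utils.np_utils
# from konlpy.tag import Okt                          # Twitter was renamed Okt in konlpy 0.5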

Corpus

In:
c = kolaw.open('constitution.txt').read()
In:
sentences = sent_tokenize(c)
In:
sentences[3]
Out:
'제2조 ① 대한민국의 국민이 되는 요건은 법률로 정한다.'
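
Note: sent_tokenize relies on NLTK's punkt sentence-boundary models. If they are not installed yet, a one-time download is needed before the cell above will run:

import nltk
nltk.download('punkt')  # one-time download of the sentence tokenizer models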

Preprocessing

In:
twitter = Twitter()
# Tokenize each sentence into morphemes, dropping number and foreign-character
# tokens as well as the article/clause markers "제" and "조".
doc0 = [" ".join([w for w, t in twitter.pos(s)
                  if t not in ['Number', 'Foreign'] and w not in ["제", "조"]])
        for s in sentences]
In:
len(doc0)
Out:
357
In:
doc0[3]
Out:
'대한민국 의 국민 이 되는 요건 은 법률 로 정한 다 .'
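
To see what the filter in the previous cell is acting on, it can help to inspect the raw (morpheme, tag) pairs for the same sentence (output omitted here; the exact tags depend on the konlpy version):

# Raw morpheme/POS pairs for the sentence shown above; the preprocessing cell
# drops pairs tagged 'Number' or 'Foreign' plus the tokens "제" and "조".
print(twitter.pos(sentences[3]))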
In:
tokenizer = Tokenizer()
tokenizer.fit_on_texts(doc0)
doc = [l for l in tokenizer.texts_to_sequences(doc0) if len(l) > 1]
In:
len(doc)
Out:
354
In:
doc[3]
Out:
[101, 1, 19, 6, 177, 653, 5, 9, 20, 37, 3]
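
As a sanity check, the integer sequence can be decoded back into words using the inverse of word_index (the Test section below builds the same mapping). Note that doc[3] has 11 tokens while doc0[3] has 12: the Tokenizer's default filters strip punctuation, so the trailing '.' disappears.

# Invert word_index to decode integer sequences back into words.
reverse_word_map = dict(map(reversed, tokenizer.word_index.items()))
print(" ".join(reverse_word_map[i] for i in doc[3]))
# expected: '대한민국 의 국민 이 되는 요건 은 법률 로 정한 다'  (the '.' was filtered out)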
In:
maxlen = max(len(x) - 1 for x in doc)        # longest prefix: a sentence minus its last word
vocab_size = len(tokenizer.word_index) + 1   # +1 because index 0 is reserved for padding
In:
maxlen, vocab_size
Out:
(188, 1205)

Data Generation

In:
def generate_data(X, maxlen, vocab_size):
    # For each sentence, yield one batch: every proper prefix of the sentence
    # (left-padded to maxlen) paired with the one-hot encoding of the word
    # that follows it.
    for sentence in X:
        inputs = []
        targets = []
        for i in range(1, len(sentence)):
            inputs.append(sentence[0:i])
            targets.append(sentence[i])
        y = np_utils.to_categorical(targets, vocab_size)
        inputs_sequence = sequence.pad_sequences(inputs, maxlen=maxlen)
        yield (inputs_sequence, y)
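
To make the prefix/target scheme concrete, here is a toy run on a made-up 4-token sentence (the token ids are hypothetical, not from the corpus):

toy = [[7, 3, 9, 2]]                      # hypothetical 4-token "sentence"
tx, ty = next(generate_data(toy, 5, 10))  # maxlen=5, vocab_size=10
print(tx)                                 # prefixes [7], [7,3], [7,3,9], left-padded
print(ty.argmax(axis=1))                  # matching targets: [3 9 2]

In general a sentence of length k contributes k - 1 such pairs.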
In:
for i, (x, y) in enumerate(generate_data(doc, maxlen, vocab_size)):
    print("i", i)
    print("x", x.shape, "\n", x)
    print("y", y.shape, "\n", y)
    if i > 1:
        break
i 0
x (188, 188) 
 [[  0   0   0 ...   0   0 101]
 [  0   0   0 ...   0 101  24]
 [  0   0   0 ... 101  24 607]
 ...
 [  0   0 101 ... 155   2  18]
 [  0 101  24 ...   2  18 176]
 [101  24 607 ...  18 176   7]]
y (188, 1205) 
 [[0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 ...
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]]
i 1
x (5, 188) 
 [[  0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
    0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
    0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
    0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
    0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
    0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
    0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
    0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
    0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
    0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
    0   0   0   0   0   0   0  46]
 [  0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
    0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
    0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
    0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
    0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
    0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
    0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
    0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
    0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
    0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
    0   0   0   0   0   0  46 648]
 [  0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
    0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
    0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
    0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
    0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
    0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
    0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
    0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
    0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
    0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
    0   0   0   0   0  46 648 101]
 [  0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
    0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
    0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
    0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
    0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
    0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
    0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
    0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
    0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
    0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
    0   0   0   0  46 648 101   5]
 [  0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
    0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
    0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
    0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
    0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
    0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
    0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
    0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
    0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
    0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0   0
    0   0   0  46 648 101   5 649]]
y (5, 1205) 
 [[0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]]
i 2
x (13, 188) 
 [[  0   0   0 ...   0   0 101]
 [  0   0   0 ...   0 101   1]
 [  0   0   0 ... 101   1 437]
 ...
 [  0   0   0 ... 651   5  19]
 [  0   0   0 ...   5  19 326]
 [  0   0   0 ...  19 326 652]]
y (13, 1205) 
 [[0. 1. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 ...
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]]
In:
X = []
Y = []
for x, y in generate_data(doc, maxlen, vocab_size):
    X.append(x)
    Y.append(y)

X = np.concatenate(X)
Y = np.concatenate(Y)
In:
X.shape, Y.shape
Out:
((6923, 188), (6923, 1205))
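
The row count matches what the generator promises, one (prefix, next-word) pair per non-initial token:

assert X.shape[0] == sum(len(s) - 1 for s in doc)  # 6923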

Model

In:
model = Sequential()
model.add(Embedding(vocab_size, 100, input_length=maxlen))
model.add(LSTM(100, return_sequences=False))
model.add(Dropout(0.5))
model.add(Dense(vocab_size, activation='softmax'))
In:
model.summary()
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
embedding_4 (Embedding)      (None, 188, 100)          120500    
_________________________________________________________________
lstm_4 (LSTM)                (None, 100)               80400     
_________________________________________________________________
dropout_2 (Dropout)          (None, 100)               0         
_________________________________________________________________
dense_4 (Dense)              (None, 1205)              121705    
=================================================================
Total params: 322,605
Trainable params: 322,605
Non-trainable params: 0
_________________________________________________________________
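
The parameter counts in the summary can be verified by hand with the standard Keras formulas:

# Embedding: vocab_size * embedding_dim          = 1205 * 100          = 120,500
# LSTM:      4 * (input_dim + units + 1) * units = 4 * 201 * 100       =  80,400
# Dense:     (units + 1) * vocab_size            = 101 * 1205          = 121,705
# Total                                                                = 322,605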

Training

In:
model.compile(loss='categorical_crossentropy', optimizer=RMSprop(), metrics=["accuracy"])
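
Since the run below trains for 500 epochs without a validation split, checkpointing the best weights along the way is a cheap safeguard. A minimal sketch, not used in this run (the filename is hypothetical):

from keras.callbacks import ModelCheckpoint

# Keep the weights with the lowest training loss seen so far.
checkpoint = ModelCheckpoint("rnn_text_gen_best.hdf5",
                             monitor="loss", save_best_only=True)
# Pass callbacks=[checkpoint] to model.fit() to enable it.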
In:
%%time
hist = model.fit(X, Y, epochs=500, batch_size=800, verbose=2)
Epoch 1/500
 - 4s - loss: 6.7324 - acc: 0.0403
Epoch 2/500
 - 3s - loss: 5.8547 - acc: 0.0506
Epoch 3/500
 - 3s - loss: 5.7481 - acc: 0.0478
...
Epoch 100/500
 - 3s - loss: 3.0740 - acc: 0.3643
...
Epoch 200/500
 - 3s - loss: 1.7557 - acc: 0.6013
...
Epoch 300/500
 - 3s - loss: 1.1589 - acc: 0.7224
...
Epoch 400/500
 - 3s - loss: 0.8439 - acc: 0.7921
...
Epoch 499/500
 - 3s - loss: 0.6589 - acc: 0.8392
Epoch 500/500
 - 3s - loss: 0.6989 - acc: 0.8236
CPU times: user 34min 3s, sys: 5min 45s, total: 39min 48s
Wall time: 28min 29s
In:
plt.plot(hist.history['acc'])
plt.show()
In:
model.save("rnn_text_gen.hdf5")
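
Note that save() stores only the network; the Tokenizer, and with it the word-to-index mapping, must be persisted separately if the model is to be reused in a fresh session. A minimal sketch with pickle (the filename is hypothetical):

import pickle

# Persist the word <-> index mapping alongside the model weights.
with open("rnn_text_gen_tokenizer.pkl", "wb") as f:
    pickle.dump(tokenizer, f)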

Test

In:
from keras.models import load_model
model = load_model("rnn_text_gen.hdf5")
In:
word_list = '대한민국 의 국민 이 되는 요건 은 법률 로 정한 다 .'.split(" ")
word_list
Out:
['대한민국', '의', '국민', '이', '되는', '요건', '은', '법률', '로', '정한', '다', '.']
In:
reverse_word_map = dict(map(reversed, tokenizer.word_index.items()))
In:
len(reverse_word_map)
Out:
1204
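
There are 1204 entries rather than 1205 because index 0 is reserved for padding and never appears in word_index:

assert len(reverse_word_map) == vocab_size - 1  # index 0 = padding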
In:
x = sequence.pad_sequences([[tokenizer.word_index[w] for w in word_list[:2]]], maxlen=maxlen)
x
Out:
array([[  0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,
          0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,
          0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,
          0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,
          0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,
          0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,
          0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,
          0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,
          0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,
          0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,
          0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,
          0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,
          0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,
          0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,   0,
          0,   0,   0,   0, 101,   1]], dtype=int32)
In:
p = model.predict(x)[0]
p
Out:
array([5.1314508e-08, 1.1153760e-03, 3.6256557e-04, ..., 6.6387229e-06,
       1.3235493e-07, 1.3641418e-05], dtype=float32)
In:
idx = np.flip(np.argsort(p), 0)  # word indices sorted by descending probability
idx
Out:
array([ 438,  437,   19, ...,  605, 1170,  286])
In:
p[idx]
Out:
array([1.76298395e-01, 1.67277738e-01, 5.77470176e-02, ...,
       1.06850376e-10, 7.99032368e-11, 7.51542578e-11], dtype=float32)
In:
for i in idx[:5]:
    print(reverse_word_map[i])
영토
주권
국민
조직
종류
In:
def predict_word(i, n=1):
    # Predict the next word from the first i words of word_list and print
    # the top-n candidates with their probabilities.
    x = sequence.pad_sequences([[tokenizer.word_index[w] for w in word_list[:i]]], maxlen=maxlen)
    p = model.predict(x)[0]
    idx = np.flip(np.argsort(p), 0)
    for j in idx[:n]:
        print('"', " ".join(word_list[:i]), '"', reverse_word_map[j], " (p={:4.2f}%)".format(100 * p[j]))
In:
predict_word(1, n=3)
" 대한민국 " 의  (p=62.61%)
" 대한민국 " 은  (p=30.96%)
" 대한민국 " 헌법  (p=1.82%)
In:
predict_word(2, n=3)
" 대한민국 의 " 영토  (p=17.63%)
" 대한민국 의 " 주권  (p=16.73%)
" 대한민국 의 " 국민  (p=5.77%)
In:
predict_word(3, n=3)
" 대한민국 의 국민 " 은  (p=62.33%)
" 대한민국 의 국민 " 이  (p=17.43%)
" 대한민국 의 국민 " 의  (p=11.92%)
In:
predict_word(4, n=3)
" 대한민국 의 국민 이 " 되는  (p=74.99%)
" 대한민국 의 국민 이 " 헌법  (p=10.13%)
" 대한민국 의 국민 이 " 에  (p=2.34%)
In:
predict_word(5, n=3)
" 대한민국 의 국민 이 되는 " 요건  (p=98.50%)
" 대한민국 의 국민 이 되는 " 계약  (p=0.34%)
" 대한민국 의 국민 이 되는 " 되는  (p=0.21%)
In:
predict_word(6, n=3)
" 대한민국 의 국민 이 되는 요건 " 은  (p=95.34%)
" 대한민국 의 국민 이 되는 요건 " 의  (p=2.49%)
" 대한민국 의 국민 이 되는 요건 " 이  (p=1.59%)
In:
predict_word(7, n=3)
" 대한민국 의 국민 이 되는 요건 은 " 법률  (p=97.38%)
" 대한민국 의 국민 이 되는 요건 은 " 국가  (p=1.09%)
" 대한민국 의 국민 이 되는 요건 은 " 대통령  (p=0.53%)
In:
predict_word(8, n=3)
" 대한민국 의 국민 이 되는 요건 은 법률 " 로  (p=97.31%)
" 대한민국 의 국민 이 되는 요건 은 법률 " 이  (p=1.63%)
" 대한민국 의 국민 이 되는 요건 은 법률 " 로써  (p=0.68%)
In:
predict_word(9, n=3)
" 대한민국 의 국민 이 되는 요건 은 법률 로 " 정한  (p=99.61%)
" 대한민국 의 국민 이 되는 요건 은 법률 로 " 정하되  (p=0.11%)
" 대한민국 의 국민 이 되는 요건 은 법률 로 " 인하여  (p=0.02%)
In:
predict_word(10, n=3)
" 대한민국 의 국민 이 되는 요건 은 법률 로 정한 " 다  (p=99.90%)
" 대한민국 의 국민 이 되는 요건 은 법률 로 정한 " 국회  (p=0.03%)
" 대한민국 의 국민 이 되는 요건 은 법률 로 정한 " 후  (p=0.01%)
In:
def generate(w, n, seed=None):

    if seed is not None:
        np.random.seed(seed)

    def _predict_word(sent):
        x = sequence.pad_sequences([[tokenizer.word_index[w] for w in sent]], maxlen=maxlen)
        p = model.predict(x)[0]
        # Renormalize so the probabilities sum to exactly 1 (the float32
        # softmax output carries rounding error), then drop index 0, which
        # is the padding token and corresponds to no word.
        logp = np.log(p)
        p = np.exp(logp) / np.sum(np.exp(logp))
        p = p[1:]
        # word_index preserves frequency order, so its keys line up with indices 1..N.
        return np.random.choice(list(tokenizer.word_index.keys()), p=p)

    # Start from the seed word and repeatedly sample the next word.
    sent = [w]
    for i in range(n):
        w = _predict_word(sent)
        sent.append(w)

    return " ".join(sent)
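
A common refinement of this sampling scheme, not used in the runs below, is a temperature parameter that sharpens (temperature < 1) or flattens (temperature > 1) the distribution before sampling. A sketch under the same variable names:

def generate_with_temperature(w, n, temperature=1.0, seed=None):
    # Same loop as generate(), but rescale log-probabilities by 1/temperature.
    if seed is not None:
        np.random.seed(seed)
    sent = [w]
    for _ in range(n):
        x = sequence.pad_sequences([[tokenizer.word_index[v] for v in sent]], maxlen=maxlen)
        p = model.predict(x)[0][1:]              # drop the padding index
        logp = np.log(p + 1e-12) / temperature   # rescale in log space
        p = np.exp(logp) / np.sum(np.exp(logp))  # softmax back to probabilities
        sent.append(np.random.choice(list(tokenizer.word_index.keys()), p=p))
    return " ".join(sent)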
In:
generate("대한민국", 10, seed=1)
Out:
'대한민국 의 영토 의 피고인 은 법률 이 정하는 바 에'
In:
generate("대한민국", 10, seed=2)
Out:
'대한민국 의 국민 은 법률 이 정하는 바 에 의하여 국가'
In:
generate("대한민국", 10, seed=3)
Out:
'대한민국 의 영토 는 한반도 와 그 부속 도서 로 한'
In:
generate("국민", 10, seed=1)
Out:
'국민 은 국무회의 의 항 과 국가 의 기본 적 인'
In:
generate("국민", 10, seed=2)
Out:
'국민 은 이 경우 를 제외 하고는 심사 하여 그 정책'
In:
generate("국민", 10, seed=3)
Out:
'국민 은 국무회의 의 의장 국무총리 국무위원 의 동의 를 얻어'
In:
generate("세계", 10, seed=3)
Out:
'세계 조직 과 집회 결사 에 대한 집회 가 그 절차'
