## Max 75% accuracy! help!


I am trying to train a deep neural network to learn that if the first two columns of X contain a 1 and a 0 (in either order) the output is 1, otherwise it is 0. But I'm only getting 75% accuracy on the model!

```python
import numpy as np
import tflearn

X = [[0, 0, 1],
     [0, 1, 1],
     [1, 0, 1],
     [1, 1, 1]]

Y = [[0, 1],
     [1, 1],
     [1, 0],
     [0, 1]]

Xtest = np.array([[1, 1, 1],
                  [0, 1, 1],
                  [1, 0, 1],
                  [0, 1, 1]])

# Build neural network
net = tflearn.input_data(shape=[None, 3])
net = tflearn.fully_connected(net, 32, activation='sigmoid')
net = tflearn.fully_connected(net, 32, activation='sigmoid')
net = tflearn.fully_connected(net, 2, activation='softmax')

# Define model
model = tflearn.DNN(net)
# Start training (apply gradient descent algorithm)
model.fit(X, Y, n_epoch=5000, batch_size=16, show_metric=True)

pred = model.predict(Xtest)
for i in range(4):
    print(pred[i])
```


The output should be [0, 1, 1, 1], but this is what I get instead:

```
Training Step: 4999  | total loss: 0.50493 | time: 0.004s
| Adam | epoch: 4999 | loss: 0.50493 - acc: 0.7813 -- iter: 4/4
--
0.01631585881114006
0.4872587323188782
0.9684665203094482
0.019177177920937538
```



It looks like you are training this as a multiclass classifier to represent a binary choice. In that case, your Y value is wrong:

```python
Y = [[0, 1],
     [1, 1],
     [1, 0],
     [0, 1]]
```


Here your second label, [1, 1], is not self-consistent, so it is impossible to predict with a softmax output layer (where the outputs must sum to 1). The best the network can do for that example is output [0.5, 0.5], and you can see it actually got close to that in your test. A self-consistent set of labels would be:

```python
Y = [[0, 1],
     [1, 0],
     [1, 0],
     [0, 1]]
```
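
To see why a [1, 1] target is unreachable, here is a small standalone check of the softmax constraint (plain NumPy, independent of your tflearn code; the `softmax` helper is just for illustration):

```python
import numpy as np

def softmax(z):
    # shift by the max for numerical stability
    e = np.exp(z - np.max(z))
    return e / e.sum()

# Whatever the logits are, the softmax outputs always sum to 1,
# so a target of [1, 1] can never be matched exactly.
print(softmax(np.array([3.0, 3.0])))   # [0.5 0.5] -- the closest fit to [1, 1]
print(softmax(np.array([5.0, -2.0])))  # ~[0.999 0.001]
```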


A few asides...

• Your example inputs all have the same third column (always 1). That column carries no information, so you could drop it.

• Your network is more complex than it needs to be for this task. A single hidden layer with only a few neurons should be sufficient.

• For this specific task you could instead use a single output neuron with a sigmoid activation (and then Y needs only one column); see the sketch after this list.
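
Putting those asides together, here is a minimal sketch of the simplified setup. Treat it as an illustration rather than a drop-in fix: it adds an explicit `tflearn.regression` layer with `binary_crossentropy` (your code doesn't show one), and the hidden layer size of 4 is just a reasonable guess:

```python
import tflearn

# Constant third column dropped; Y reduced to a single column (this is XOR).
X = [[0, 0],
     [0, 1],
     [1, 0],
     [1, 1]]
Y = [[0],
     [1],
     [1],
     [0]]

# One small hidden layer and a single sigmoid output in [0, 1]
net = tflearn.input_data(shape=[None, 2])
net = tflearn.fully_connected(net, 4, activation='sigmoid')
net = tflearn.fully_connected(net, 1, activation='sigmoid')
net = tflearn.regression(net, loss='binary_crossentropy')

model = tflearn.DNN(net)
model.fit(X, Y, n_epoch=5000, batch_size=4, show_metric=True)

# Predictions come back as values near 0 or 1; threshold at 0.5
print(model.predict(X))
```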