Same classification given for neural network regardless of the input


I am following the MNIST classification tutorial on the TensorFlow website to build my own classifier that predicts a footballer's value bracket from the FIFA 19 dataset. However, when I run my program, it assigns the same class to every player in my testing dataset, even though the players have very different values. I inspected the probabilities produced by the last (softmax) layer of the network, and for every player they are all 0 except for one class, which is the class predicted for all players. How do I fix this?
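For reference, this is roughly how I inspected the output probabilities (a minimal sketch; model and testing_data are the objects defined in the full program below):

probs = model.predict(testing_data)
print(probs[0])             # prints something like [0. 0. 0. 0. 1. 0. 0. 0. 0. 0.]
print(np.argmax(probs[0]))  # the single class that gets all the probability mass

The full program follows.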



import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers
from keras.models import Sequential
from keras.layers import Dense, Activation

# import numpy library
import numpy as np

# import pandas library
import pandas as pd

# import seaborn library
import seaborn as sns

# import pyplot library
import matplotlib.pyplot as plt

# store the paths to the data files in variables
DATAPATH = '../../Data/fifa19/dataNumerical.csv'
VALUES = '../../Data/fifa19/player values for classification.csv'

# define the column names for the dataset
def init_column_names():
    columnNames = ['Age', 'CountryWorldRankingPoints', 'Overall', 'Potential', 'ClubGoals',
                   'Value(€M)', 'Wage(€K)', 'Special', 'PreferredFoot', 'InternationalReputation',
                   'WeakFoot', 'SkillMoves', 'WorkRate', 'BodyType', 'Position', 'JerseyNumber',
                   'Height(cm)', 'Weight(lbs)',
                   # per-position ratings
                   'LS', 'ST', 'RS', 'LW', 'LF', 'CF', 'RF', 'RW', 'LAM', 'CAM', 'RAM', 'LM',
                   'LCM', 'CM', 'RCM', 'RM', 'LWB', 'LDM', 'CDM', 'RDM', 'RWB', 'LB', 'LCB',
                   'CB', 'RCB', 'RB',
                   # skill attributes
                   'Crossing', 'Finishing', 'HeadingAccuracy', 'ShortPassing', 'Volleys',
                   'Dribbling', 'Curve', 'FKAccuracy', 'LongPassing', 'BallControl',
                   'Acceleration', 'SprintSpeed', 'Agility', 'Reactions', 'Balance',
                   'ShotPower', 'Jumping', 'Stamina', 'Strength', 'LongShots', 'Aggression',
                   'Interceptions', 'Positioning', 'Vision', 'Penalties', 'Composure',
                   'Marking', 'StandingTackle', 'SlidingTackle',
                   # goalkeeping attributes
                   'GKDiving', 'GKHandling', 'GKKicking', 'GKPositioning', 'GKReflexes',
                   'ReleaseClause(€M)']
    return columnNames


class_names = ['0-100K', '101-500K', '500K-1M', '1-5M', '5-10M', '10-25M',
               '25-50M', '50-75M', '75-100M', '100M+']

print('==================================================')
print('READ DATA CSV')
column_names = init_column_names()
raw_dataset = pd.read_csv(DATAPATH, names=column_names, na_values="?", comment='t',
                          sep=",", skipinitialspace=True, encoding='latin-1')

print('==================================================')
print('READ PLAYER VALUES CSV')
playerValues = pd.read_csv(VALUES, names=['values'], na_values="?", comment='t',
                           sep=",", skipinitialspace=True, encoding='latin-1')

# pd.set_option('display.max_columns', 177)
# pd.set_option('display.max_columns', 15)
pd.set_option('display.max_rows', 5)

# make a copy of the dataset to leave the original unaffected
dataset = raw_dataset.copy()

# clean the dataset by removing rows with unknown values
dataset = dataset.dropna()
# print(type(dataset))

# create classification labels for the dataset
classification_labels = playerValues.values
# print(classification_labels)
# print(type(classification_labels))


# create an empty list, then append each player's class index to it.
class_labels_array = []
print('==================================================')
print('CREATING CLASS LABELS')
for x in classification_labels:
    if x <= 0.1:
        class_labels_array.append(0)
    elif x < 0.5:
        class_labels_array.append(1)
    elif x < 1:
        class_labels_array.append(2)
    elif x < 5:
        class_labels_array.append(3)
    elif x < 10:
        class_labels_array.append(4)
    elif x < 25:
        class_labels_array.append(5)
    elif x < 50:
        class_labels_array.append(6)
    elif x < 75:
        class_labels_array.append(7)
    elif x < 100:
        class_labels_array.append(8)
    else:
        class_labels_array.append(9)

# print(class_labels_array)
class_labels = pd.DataFrame(data=class_labels_array, columns=['class_labels'])
# print(class_labels)
# print("end of class labels")
# printing the dataset
# print(dataset.tail())


def position_one_hot(dataset):
    position = dataset.pop("Position")
    # add one indicator column per playing position
    for pos in ['CAM', 'CB', 'CDM', 'CF', 'CM', 'GK', 'LB', 'LCB', 'LCM', 'LDM',
                'LF', 'LM', 'LS', 'LW', 'LWB', 'RB', 'RCB', 'RCM', 'RDM', 'RM',
                'RS', 'RW', 'RWB', 'ST']:
        dataset[pos] = (position == pos) * 1.0

    return dataset

# convert the categorical Position column into one-hot columns.
dataset = position_one_hot(dataset)

# print(dataset.tail())

# print(type(dataset))

# split the data into training and testing datasets.
training_data = dataset.sample(frac=0.8, random_state=0)
print("=============================")
print("TRAINING DATA EXAMPLES")
print(training_data.tail())
testing_data = dataset.drop(training_data.index)
print("=============================")
print("TESTING DATA EXAMPLES")
print(testing_data.tail())

# remove the value we are trying to predict from the features.
training_data.pop('Value(€M)')
testing_data.pop('Value(€M)')

# split the labels into training and testing labels.
training_labels = class_labels.sample(frac=0.8, random_state=0)
print("=============================")
print("TRAINING LABELS")
print(training_labels.tail())
testing_labels = class_labels.drop(training_labels.index)
print("=============================")
print("TESTING LABELS")
print(testing_labels.tail())



# build and compile the classification model
def buildModel():
    model = keras.Sequential([
        keras.layers.Flatten(input_shape=[len(training_data.keys())]),
        keras.layers.Dense(128, activation=tf.nn.relu),
        keras.layers.Dense(64, activation=tf.nn.relu),
        keras.layers.Dense(10, activation=tf.nn.softmax)
    ])

    # optimizer = tf.keras.optimizers.RMSprop(0.001)
    model.compile(loss='sparse_categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
    return model


# build the model and store it in a variable.
model = buildModel()

print('==================================================')
print('TRAINING')
# train the model
model.fit(training_data, training_labels, epochs=320, batch_size=32)

print('==================================================')
print('ACCURACY TESTING')
# check accuracy
test_loss, test_accuracy = model.evaluate(testing_data, testing_labels)
print('Test accuracy:', test_accuracy)

# make predictions for every player in the testing dataset
predictions = model.predict(testing_data)
# print(predictions[0])
print('==================================================')
print('CLASSIFICATION')

# for x in range(10):
for x in range(10):
    # print the strongest class from the prediction
    print('PREDICTION: ', class_names[np.argmax(predictions[x])])

    # print the player's true class to check whether the prediction was correct.
    # print(testing_labels.tail())
    print('ACTUAL: ', class_names[int(testing_labels.iloc[x]['class_labels'])])
    print()

print('==================================================')
print('END')
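As an aside, the label-bucketing loop above can be written as a single vectorized pd.cut call. A minimal sketch, assuming playerValues['values'] holds the player value in millions of euros (behaviour at the exact bin edges may differ slightly from the if/elif chain):

import numpy as np
import pandas as pd

# bin edges matching the 10 value brackets in class_names
bins = [-np.inf, 0.1, 0.5, 1, 5, 10, 25, 50, 75, 100, np.inf]
codes = pd.cut(playerValues['values'], bins=bins, labels=False)
class_labels = pd.DataFrame({'class_labels': codes})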


TRAINING AND TESTING OUTPUT



TRAINING
Epoch 1/320

32/519 [>.............................] - ETA: 1s - loss: 14.0409 - acc: 0.1250
519/519 [==============================] - 0s 241us/sample - loss: 12.8769 - acc: 0.1965
Epoch 2/320

32/519 [>.............................] - ETA: 0s - loss: 14.1033 - acc: 0.1250
519/519 [==============================] - 0s 60us/sample - loss: 12.4266 - acc: 0.2274
Epoch 3/320

32/519 [>.............................] - ETA: 0s - loss: 14.1033 - acc: 0.1250
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 4/320

32/519 [>.............................] - ETA: 0s - loss: 12.5923 - acc: 0.2188
519/519 [==============================] - 0s 30us/sample - loss: 12.3603 - acc: 0.2331
Epoch 5/320

32/519 [>.............................] - ETA: 0s - loss: 11.5849 - acc: 0.2812
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 6/320

32/519 [>.............................] - ETA: 0s - loss: 13.0960 - acc: 0.1875
519/519 [==============================] - 0s 30us/sample - loss: 12.3603 - acc: 0.2331
Epoch 7/320

32/519 [>.............................] - ETA: 0s - loss: 11.0812 - acc: 0.3125
519/519 [==============================] - 0s 69us/sample - loss: 12.3603 - acc: 0.2331
Epoch 8/320

32/519 [>.............................] - ETA: 0s - loss: 12.5923 - acc: 0.2188
519/519 [==============================] - 0s 30us/sample - loss: 12.3603 - acc: 0.2331
Epoch 9/320

32/519 [>.............................] - ETA: 0s - loss: 13.0960 - acc: 0.1875
519/519 [==============================] - 0s 30us/sample - loss: 12.3603 - acc: 0.2331
Epoch 10/320

32/519 [>.............................] - ETA: 0s - loss: 14.1033 - acc: 0.1250
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 11/320

32/519 [>.............................] - ETA: 0s - loss: 12.0886 - acc: 0.2500
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 12/320

32/519 [>.............................] - ETA: 0s - loss: 11.0812 - acc: 0.3125
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 13/320

32/519 [>.............................] - ETA: 0s - loss: 13.0960 - acc: 0.1875
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 14/320

32/519 [>.............................] - ETA: 0s - loss: 10.5775 - acc: 0.3438
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 15/320

32/519 [>.............................] - ETA: 0s - loss: 13.5996 - acc: 0.1562
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 16/320

32/519 [>.............................] - ETA: 0s - loss: 12.5923 - acc: 0.2188
519/519 [==============================] - 0s 30us/sample - loss: 12.3603 - acc: 0.2331
Epoch 17/320

32/519 [>.............................] - ETA: 0s - loss: 11.0812 - acc: 0.3125
519/519 [==============================] - 0s 30us/sample - loss: 12.3603 - acc: 0.2331
Epoch 18/320

32/519 [>.............................] - ETA: 0s - loss: 12.0886 - acc: 0.2500
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 19/320

32/519 [>.............................] - ETA: 0s - loss: 11.0812 - acc: 0.3125
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 20/320

32/519 [>.............................] - ETA: 0s - loss: 11.5849 - acc: 0.2812
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 21/320

32/519 [>.............................] - ETA: 0s - loss: 13.0960 - acc: 0.1875
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 22/320

32/519 [>.............................] - ETA: 0s - loss: 11.0812 - acc: 0.3125
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 23/320

32/519 [>.............................] - ETA: 0s - loss: 11.5849 - acc: 0.2812
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 24/320

32/519 [>.............................] - ETA: 0s - loss: 9.0664 - acc: 0.4375
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 25/320

32/519 [>.............................] - ETA: 0s - loss: 13.0960 - acc: 0.1875
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 26/320

32/519 [>.............................] - ETA: 0s - loss: 12.5923 - acc: 0.2188
519/519 [==============================] - 0s 30us/sample - loss: 12.3603 - acc: 0.2331
Epoch 27/320

32/519 [>.............................] - ETA: 0s - loss: 12.0886 - acc: 0.2500
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 28/320

32/519 [>.............................] - ETA: 0s - loss: 12.0886 - acc: 0.2500
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 29/320

32/519 [>.............................] - ETA: 0s - loss: 12.5923 - acc: 0.2188
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 30/320

32/519 [>.............................] - ETA: 0s - loss: 12.5923 - acc: 0.2188
519/519 [==============================] - 0s 30us/sample - loss: 12.3603 - acc: 0.2331
Epoch 31/320

32/519 [>.............................] - ETA: 0s - loss: 10.5775 - acc: 0.3438
519/519 [==============================] - 0s 30us/sample - loss: 12.3603 - acc: 0.2331
Epoch 32/320

32/519 [>.............................] - ETA: 0s - loss: 12.5923 - acc: 0.2188
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 33/320

32/519 [>.............................] - ETA: 0s - loss: 13.5996 - acc: 0.1562
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 34/320

32/519 [>.............................] - ETA: 0s - loss: 12.5923 - acc: 0.2188
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 35/320

32/519 [>.............................] - ETA: 0s - loss: 12.0886 - acc: 0.2500
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 36/320

32/519 [>.............................] - ETA: 0s - loss: 12.0886 - acc: 0.2500
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 37/320

32/519 [>.............................] - ETA: 0s - loss: 14.1033 - acc: 0.1250
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 38/320

32/519 [>.............................] - ETA: 0s - loss: 12.5923 - acc: 0.2188
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 39/320

32/519 [>.............................] - ETA: 0s - loss: 13.5996 - acc: 0.1562
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 40/320

32/519 [>.............................] - ETA: 0s - loss: 14.1033 - acc: 0.1250
519/519 [==============================] - 0s 95us/sample - loss: 12.3603 - acc: 0.2331
Epoch 41/320

32/519 [>.............................] - ETA: 0s - loss: 13.0960 - acc: 0.1875
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 42/320

32/519 [>.............................] - ETA: 0s - loss: 14.1033 - acc: 0.1250
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 43/320

32/519 [>.............................] - ETA: 0s - loss: 11.0812 - acc: 0.3125
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 44/320

32/519 [>.............................] - ETA: 0s - loss: 13.0960 - acc: 0.1875
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 45/320

32/519 [>.............................] - ETA: 0s - loss: 11.5849 - acc: 0.2812
519/519 [==============================] - 0s 30us/sample - loss: 12.3603 - acc: 0.2331
Epoch 46/320

32/519 [>.............................] - ETA: 0s - loss: 12.0886 - acc: 0.2500
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 47/320

32/519 [>.............................] - ETA: 0s - loss: 12.5923 - acc: 0.2188
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 48/320

32/519 [>.............................] - ETA: 0s - loss: 10.0738 - acc: 0.3750
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 49/320

32/519 [>.............................] - ETA: 0s - loss: 11.5849 - acc: 0.2812
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 50/320

32/519 [>.............................] - ETA: 0s - loss: 12.0886 - acc: 0.2500
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 51/320

32/519 [>.............................] - ETA: 0s - loss: 11.5849 - acc: 0.2812
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 52/320

32/519 [>.............................] - ETA: 0s - loss: 13.0960 - acc: 0.1875
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 53/320

32/519 [>.............................] - ETA: 0s - loss: 13.0960 - acc: 0.1875
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 54/320

32/519 [>.............................] - ETA: 0s - loss: 10.5775 - acc: 0.3438
519/519 [==============================] - 0s 30us/sample - loss: 12.3603 - acc: 0.2331
Epoch 55/320

32/519 [>.............................] - ETA: 0s - loss: 12.5923 - acc: 0.2188
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 56/320

32/519 [>.............................] - ETA: 0s - loss: 11.5849 - acc: 0.2812
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 57/320

32/519 [>.............................] - ETA: 0s - loss: 13.0960 - acc: 0.1875
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 58/320

32/519 [>.............................] - ETA: 0s - loss: 12.5923 - acc: 0.2188
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 59/320

32/519 [>.............................] - ETA: 0s - loss: 14.6070 - acc: 0.0938
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 60/320

32/519 [>.............................] - ETA: 0s - loss: 12.5923 - acc: 0.2188
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 61/320

32/519 [>.............................] - ETA: 0s - loss: 12.0886 - acc: 0.2500
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 62/320

32/519 [>.............................] - ETA: 0s - loss: 13.5996 - acc: 0.1562
519/519 [==============================] - 0s 30us/sample - loss: 12.3603 - acc: 0.2331
Epoch 63/320

32/519 [>.............................] - ETA: 0s - loss: 13.0960 - acc: 0.1875
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 64/320

32/519 [>.............................] - ETA: 0s - loss: 11.5849 - acc: 0.2812
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 65/320

32/519 [>.............................] - ETA: 0s - loss: 10.5775 - acc: 0.3438
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 66/320

32/519 [>.............................] - ETA: 0s - loss: 9.5701 - acc: 0.4062
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 67/320

32/519 [>.............................] - ETA: 0s - loss: 13.5996 - acc: 0.1562
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 68/320

32/519 [>.............................] - ETA: 0s - loss: 13.5996 - acc: 0.1562
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 69/320

32/519 [>.............................] - ETA: 0s - loss: 12.0886 - acc: 0.2500
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 70/320

32/519 [>.............................] - ETA: 0s - loss: 14.6070 - acc: 0.0938
519/519 [==============================] - 0s 30us/sample - loss: 12.3603 - acc: 0.2331
Epoch 71/320

32/519 [>.............................] - ETA: 0s - loss: 12.5923 - acc: 0.2188
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 72/320

32/519 [>.............................] - ETA: 0s - loss: 12.5923 - acc: 0.2188
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 73/320

32/519 [>.............................] - ETA: 0s - loss: 12.0886 - acc: 0.2500
519/519 [==============================] - 0s 73us/sample - loss: 12.3603 - acc: 0.2331
Epoch 74/320

32/519 [>.............................] - ETA: 0s - loss: 11.0812 - acc: 0.3125
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 75/320

32/519 [>.............................] - ETA: 0s - loss: 12.0886 - acc: 0.2500
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 76/320

32/519 [>.............................] - ETA: 0s - loss: 14.1033 - acc: 0.1250
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 77/320

32/519 [>.............................] - ETA: 0s - loss: 12.0886 - acc: 0.2500
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 78/320

32/519 [>.............................] - ETA: 0s - loss: 14.1033 - acc: 0.1250
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 79/320

32/519 [>.............................] - ETA: 0s - loss: 11.5849 - acc: 0.2812
519/519 [==============================] - 0s 30us/sample - loss: 12.3603 - acc: 0.2331
Epoch 80/320

32/519 [>.............................] - ETA: 0s - loss: 11.0812 - acc: 0.3125
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 81/320

32/519 [>.............................] - ETA: 0s - loss: 10.0738 - acc: 0.3750
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 82/320

32/519 [>.............................] - ETA: 0s - loss: 12.5923 - acc: 0.2188
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 83/320

32/519 [>.............................] - ETA: 0s - loss: 12.5923 - acc: 0.2188
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 84/320

32/519 [>.............................] - ETA: 0s - loss: 12.0886 - acc: 0.2500
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 85/320

32/519 [>.............................] - ETA: 0s - loss: 10.5775 - acc: 0.3438
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 86/320

32/519 [>.............................] - ETA: 0s - loss: 12.5923 - acc: 0.2188
519/519 [==============================] - 0s 30us/sample - loss: 12.3603 - acc: 0.2331
Epoch 87/320

32/519 [>.............................] - ETA: 0s - loss: 13.5996 - acc: 0.1562
519/519 [==============================] - 0s 30us/sample - loss: 12.3603 - acc: 0.2331
Epoch 88/320

32/519 [>.............................] - ETA: 0s - loss: 13.0960 - acc: 0.1875
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 89/320

32/519 [>.............................] - ETA: 0s - loss: 13.0960 - acc: 0.1875
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 90/320

32/519 [>.............................] - ETA: 0s - loss: 12.0886 - acc: 0.2500
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 91/320

32/519 [>.............................] - ETA: 0s - loss: 12.0886 - acc: 0.2500
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 92/320

32/519 [>.............................] - ETA: 0s - loss: 14.6070 - acc: 0.0938
519/519 [==============================] - 0s 30us/sample - loss: 12.3603 - acc: 0.2331
Epoch 93/320

32/519 [>.............................] - ETA: 0s - loss: 10.5775 - acc: 0.3438
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 94/320

32/519 [>.............................] - ETA: 0s - loss: 14.1033 - acc: 0.1250
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331

***************REST OF TRAINING REMOVED DUE TO WORD LIMIT*****************

==================================================
ACCURACY TESTING

32/130 [======>.......................] - ETA: 0s - loss: 12.5923 - acc: 0.2188
130/130 [==============================] - 0s 481us/sample - loss: 13.0185 - acc: 0.1923
Test accuracy: 0.1923077
==================================================
CLASSIFICATION
PREDICTION: 5-10M
ACTUAL: 500K-1M

PREDICTION: 5-10M
ACTUAL: 101-500K

PREDICTION: 5-10M
ACTUAL: 25-50M

PREDICTION: 5-10M
ACTUAL: 1-5M

PREDICTION: 5-10M
ACTUAL: 10-25M

PREDICTION: 5-10M
ACTUAL: 10-25M

PREDICTION: 5-10M
ACTUAL: 25-50M

PREDICTION: 5-10M
ACTUAL: 5-10M

PREDICTION: 5-10M
ACTUAL: 1-5M

PREDICTION: 5-10M
ACTUAL: 1-5M

==================================================
END
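One more observation on the output above: the training loss plateaus at exactly 12.3603 while accuracy sits at 0.2331. That is precisely the sparse categorical cross-entropy you get when the network outputs a hard 1.0 for a single class on every sample. A worked check (my own arithmetic, assuming Keras' default clipping of log(0) at epsilon = 1e-7):

import numpy as np

eps = 1e-7
train_acc = 0.2331                   # the plateaued training accuracy above
loss_wrong = -np.log(eps)            # ~16.118: loss for a confidently wrong sample
print((1 - train_acc) * loss_wrong)  # ~12.36: matches the plateaued training loss

This is consistent with the all-zeros-except-one probabilities described at the top of the question.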









  • MNIST is a computer-vision dataset; your data is structured tabular data, and neural networks are not the best approach for it. Try XGBoost or random forests and you will probably get better results. – Pedro Henrique Monforte, Apr 11 at 14:46
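A minimal sketch of the baseline suggested in that comment, using scikit-learn's RandomForestClassifier on the frames already built in the question (the .values.ravel() calls are my assumption about flattening the single-column label frames):

from sklearn.ensemble import RandomForestClassifier

# tree ensembles are insensitive to feature scale, so no normalization is needed
rf = RandomForestClassifier(n_estimators=200, random_state=0)
rf.fit(training_data, training_labels.values.ravel())
print('RF test accuracy:', rf.score(testing_data, testing_labels.values.ravel()))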















0












$begingroup$


I am using the MNIST classification tutorials on the TensorFlow website to create my own classification program to predict a footballers value using the FIFA 19 dataset. However, when I run my program, it always picks the same classification for every player in my testing dataset even when the players should have different values. I checked the probabilities that were predicted in the last layer of the neural network which has the probabilities for each classification and it shows that all probabilities are 0 except for one, which is the class that is predicted for all players. How do I fix this?



import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers
from keras.models import Sequential
from keras.layers import Dense, Activation

# import numpy library
import numpy as np

# import pandas library
import pandas as pd

# import seaborn library
import seaborn as sns

# import pyplot library
import matplotlib.pyplot as plt

# store directory containing data in a variable
DATAPATH = '../../Data/fifa19/dataNumerical.csv'
VALUES = '../../Data/fifa19/player values for classification.csv'

# import the dataset using pandas
def init_column_names():
columnNames = ['Age',
'CountryWorldRankingPoints',
'Overall',
'Potential',
'ClubGoals',
'Value(€M)',
'Wage(€K)',
'Special',
'PreferredFoot',
'InternationalReputation',
'WeakFoot',
'SkillMoves',
'WorkRate',
'BodyType',
'Position',
'JerseyNumber',
'Height(cm)',
'Weight(lbs)',
'LS',
'ST',
'RS',
'LW',
'LF',
'CF',
'RF',
'RW',
'LAM',
'CAM',
'RAM',
'LM',
'LCM',
'CM',
'RCM',
'RM',
'LWB',
'LDM',
'CDM',
'RDM',
'RWB',
'LB',
'LCB',
'CB',
'RCB',
'RB',
'Crossing',
'Finishing',
'HeadingAccuracy',
'ShortPassing',
'Volleys',
'Dribbling',
'Curve',
'FKAccuracy',
'LongPassing',
'BallControl',
'Acceleration',
'SprintSpeed',
'Agility',
'Reactions',
'Balance',
'ShotPower',
'Jumping',
'Stamina',
'Strength',
'LongShots',
'Aggression',
'Interceptions',
'Positioning',
'Vision',
'Penalties',
'Composure',
'Marking',
'StandingTackle',
'SlidingTackle',
'GKDiving',
'GKHandling',
'GKKicking',
'GKPositioning',
'GKReflexes',
'ReleaseClause(€M)'
]
return columnNames


class_names = ['0-100K', '101-500K', '500K-1M', '1-5M', '5-10M', '10-25M',
'25-50M', '50-75M', '75-100M', '100M+']

print('==================================================')
print('READ DATA CSV')
column_names = init_column_names()
raw_dataset = pd.read_csv(DATAPATH, names=column_names, na_values="?", comment='t', sep=",", skipinitialspace=True,
encoding='latin-1'
)

print('==================================================')
print('READ PLAYER VALUES CSV')
playerValues = pd.read_csv(VALUES, names=['values'], na_values="?", comment='t', sep=",", skipinitialspace=True,
encoding='latin-1'
)

# pd.set_option('display.max_columns', 177)
# pd.set_option('display.max_columns', 15)
pd.set_option('display.max_rows', 5)

# make a copy of the dataset to leave the original unaffected
dataset = raw_dataset.copy()

# clean the dataset by removing unknown values
dataset = dataset.dropna();
# print(type(dataset))

# create classification labels for the dataset
classification_labels = playerValues.values
# print(classification_labels)
# print(type(classification_labels))


# create empty array then append the different classes into the array.
class_labels_array = []
print('==================================================')
print('CREATING CLASS LABELS')
for x in classification_labels:
if x <= 0.1:
class_labels_array.append(0)
elif x < 0.5:
class_labels_array.append(1)
elif x < 1:
class_labels_array.append(2)
elif x < 5:
class_labels_array.append(3)
elif x < 10:
class_labels_array.append(4)
elif x < 25:
class_labels_array.append(5)
elif x < 50:
class_labels_array.append(6)
elif x < 75:
class_labels_array.append(7)
elif x < 100:
class_labels_array.append(8)
else:
class_labels_array.append(9)

# print(class_labels_array)
class_labels = pd.DataFrame(data=class_labels_array, columns=['class_labels'])
# print(class_labels)
# print("end of class labels")
# printing the dataset
# print(dataset.tail())


def position_one_hot(dataset):
position = dataset.pop("Position")
dataset['CAM'] = (position == 'CAM') * 1.0
dataset['CB'] = (position == 'CB') * 1.0
dataset['CDM'] = (position == 'CDM') * 1.0
dataset['CF'] = (position == 'CF') * 1.0
dataset['CM'] = (position == 'CM') * 1.0
dataset['GK'] = (position == 'GK') * 1.0
dataset['LB'] = (position == 'LB') * 1.0
dataset['LCB'] = (position == 'LCB') * 1.0
dataset['LCM'] = (position == 'LCM') * 1.0
dataset['LDM'] = (position == 'LDM') * 1.0
dataset['LF'] = (position == 'LF') * 1.0
dataset['LM'] = (position == 'LM') * 1.0
dataset['LS'] = (position == 'LS') * 1.0
dataset['LW'] = (position == 'LW') * 1.0
dataset['LWB'] = (position == 'LWB') * 1.0
dataset['RB'] = (position == 'RB') * 1.0
dataset['RCB'] = (position == 'RCB') * 1.0
dataset['RCM'] = (position == 'RCM') * 1.0
dataset['RDM'] = (position == 'RDM') * 1.0
dataset['RM'] = (position == 'RM') * 1.0
dataset['RS'] = (position == 'RS') * 1.0
dataset['RW'] = (position == 'RW') * 1.0
dataset['RWB'] = (position == 'RWB') * 1.0
dataset['ST'] = (position == 'ST') * 1.0

return dataset

# convert the categorical position column into a one-hot.
dataset = position_one_hot(dataset)

# print(dataset.tail())

# print(type(dataset))

# split the data into training and testing datasets.
training_data = dataset.sample(frac=0.8, random_state=0)
print("=============================")
print("TRAINING DATA EXAMPLES")
print(training_data.tail())
testing_data = dataset.drop(training_data.index)
print("=============================")
print("TESTING DATA EXAMPLES")
print(testing_data.tail())

# remove value we are trying to predict from the dataset.
training_data.pop('Value(€M)')
testing_data.pop('Value(€M)')

# split the labels into a training and testing labels.
training_labels = class_labels.sample(frac=0.8, random_state=0)
print("=============================")
print("TRAINING LABELS")
print(training_labels.tail())
testing_labels = class_labels.drop(training_labels.index)
print("=============================")
print("TESTING LABELS")
print(testing_labels.tail())



# new method which can be called to build the model
def buildModel():
model = keras.Sequential([

keras.layers.Flatten(input_shape=[len(training_data.keys())]),
keras.layers.Dense(128, activation=tf.nn.relu),
keras.layers.Dense(64, activation=tf.nn.relu),
keras.layers.Dense(10, activation=tf.nn.softmax)
])

# optimizer = tf.keras.optimizers.RMSprop(0.001)
model.compile(loss='sparse_categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
return model


# build the model and store it in a variable.
model = buildModel()

print('==================================================')
print('TRAINING')
# train the model
model.fit(training_data, training_labels, epochs=320, batch_size=32)

print('==================================================')
print('ACCURACY TESTING')
# check accuracy
test_loss, test_accuracy = model.evaluate(testing_data, testing_labels)
print('Test accuracy:', test_accuracy)

# make prediction for first player in dataset
predictions = model.predict(testing_data)
# print(predictions[0])
print('==================================================')
print('CLASSIFICATION')

# for x in range(len(predictions)):
for x in range(10):
# print the strongest class from the prediction
print('PREDICTION: ', class_names[np.argmax(predictions[x])])

# print the correct classification of the player that the prediction was made on to see if classification was correct.
# print(testing_labels.tail())
print('ACTUAL: ', class_names[int(testing_labels.iloc[x]['class_labels'])])
print()

print('==================================================')
print('END')


TRAINING AND TESTING OUTPUT



TRAINING
Epoch 1/320

32/519 [>.............................] - ETA: 1s - loss: 14.0409 - acc: 0.1250
519/519 [==============================] - 0s 241us/sample - loss: 12.8769 - acc: 0.1965
Epoch 2/320

32/519 [>.............................] - ETA: 0s - loss: 14.1033 - acc: 0.1250
519/519 [==============================] - 0s 60us/sample - loss: 12.4266 - acc: 0.2274
Epoch 3/320

32/519 [>.............................] - ETA: 0s - loss: 14.1033 - acc: 0.1250
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 4/320

32/519 [>.............................] - ETA: 0s - loss: 12.5923 - acc: 0.2188
519/519 [==============================] - 0s 30us/sample - loss: 12.3603 - acc: 0.2331
Epoch 5/320

32/519 [>.............................] - ETA: 0s - loss: 11.5849 - acc: 0.2812
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 6/320

32/519 [>.............................] - ETA: 0s - loss: 13.0960 - acc: 0.1875
519/519 [==============================] - 0s 30us/sample - loss: 12.3603 - acc: 0.2331
Epoch 7/320

32/519 [>.............................] - ETA: 0s - loss: 11.0812 - acc: 0.3125
519/519 [==============================] - 0s 69us/sample - loss: 12.3603 - acc: 0.2331
Epoch 8/320

32/519 [>.............................] - ETA: 0s - loss: 12.5923 - acc: 0.2188
519/519 [==============================] - 0s 30us/sample - loss: 12.3603 - acc: 0.2331
Epoch 9/320

32/519 [>.............................] - ETA: 0s - loss: 13.0960 - acc: 0.1875
519/519 [==============================] - 0s 30us/sample - loss: 12.3603 - acc: 0.2331
Epoch 10/320

32/519 [>.............................] - ETA: 0s - loss: 14.1033 - acc: 0.1250
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 11/320

32/519 [>.............................] - ETA: 0s - loss: 12.0886 - acc: 0.2500
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 12/320

32/519 [>.............................] - ETA: 0s - loss: 11.0812 - acc: 0.3125
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 13/320

32/519 [>.............................] - ETA: 0s - loss: 13.0960 - acc: 0.1875
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 14/320

32/519 [>.............................] - ETA: 0s - loss: 10.5775 - acc: 0.3438
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 15/320

32/519 [>.............................] - ETA: 0s - loss: 13.5996 - acc: 0.1562
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 16/320

32/519 [>.............................] - ETA: 0s - loss: 12.5923 - acc: 0.2188
519/519 [==============================] - 0s 30us/sample - loss: 12.3603 - acc: 0.2331
Epoch 17/320

32/519 [>.............................] - ETA: 0s - loss: 11.0812 - acc: 0.3125
519/519 [==============================] - 0s 30us/sample - loss: 12.3603 - acc: 0.2331
Epoch 18/320

32/519 [>.............................] - ETA: 0s - loss: 12.0886 - acc: 0.2500
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 19/320

32/519 [>.............................] - ETA: 0s - loss: 11.0812 - acc: 0.3125
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 20/320

32/519 [>.............................] - ETA: 0s - loss: 11.5849 - acc: 0.2812
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 21/320

32/519 [>.............................] - ETA: 0s - loss: 13.0960 - acc: 0.1875
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 22/320

32/519 [>.............................] - ETA: 0s - loss: 11.0812 - acc: 0.3125
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 23/320

32/519 [>.............................] - ETA: 0s - loss: 11.5849 - acc: 0.2812
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 24/320

32/519 [>.............................] - ETA: 0s - loss: 9.0664 - acc: 0.4375
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 25/320

32/519 [>.............................] - ETA: 0s - loss: 13.0960 - acc: 0.1875
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 26/320

32/519 [>.............................] - ETA: 0s - loss: 12.5923 - acc: 0.2188
519/519 [==============================] - 0s 30us/sample - loss: 12.3603 - acc: 0.2331
Epoch 27/320

32/519 [>.............................] - ETA: 0s - loss: 12.0886 - acc: 0.2500
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 28/320

32/519 [>.............................] - ETA: 0s - loss: 12.0886 - acc: 0.2500
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 29/320

32/519 [>.............................] - ETA: 0s - loss: 12.5923 - acc: 0.2188
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 30/320

32/519 [>.............................] - ETA: 0s - loss: 12.5923 - acc: 0.2188
519/519 [==============================] - 0s 30us/sample - loss: 12.3603 - acc: 0.2331
Epoch 31/320

32/519 [>.............................] - ETA: 0s - loss: 10.5775 - acc: 0.3438
519/519 [==============================] - 0s 30us/sample - loss: 12.3603 - acc: 0.2331
Epoch 32/320

32/519 [>.............................] - ETA: 0s - loss: 12.5923 - acc: 0.2188
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 33/320

32/519 [>.............................] - ETA: 0s - loss: 13.5996 - acc: 0.1562
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 34/320

32/519 [>.............................] - ETA: 0s - loss: 12.5923 - acc: 0.2188
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 35/320

32/519 [>.............................] - ETA: 0s - loss: 12.0886 - acc: 0.2500
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 36/320

32/519 [>.............................] - ETA: 0s - loss: 12.0886 - acc: 0.2500
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 37/320

32/519 [>.............................] - ETA: 0s - loss: 14.1033 - acc: 0.1250
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 38/320

32/519 [>.............................] - ETA: 0s - loss: 12.5923 - acc: 0.2188
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 39/320

32/519 [>.............................] - ETA: 0s - loss: 13.5996 - acc: 0.1562
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 40/320

32/519 [>.............................] - ETA: 0s - loss: 14.1033 - acc: 0.1250
519/519 [==============================] - 0s 95us/sample - loss: 12.3603 - acc: 0.2331
Epoch 41/320

32/519 [>.............................] - ETA: 0s - loss: 13.0960 - acc: 0.1875
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 42/320

32/519 [>.............................] - ETA: 0s - loss: 14.1033 - acc: 0.1250
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 43/320

32/519 [>.............................] - ETA: 0s - loss: 11.0812 - acc: 0.3125
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 44/320

32/519 [>.............................] - ETA: 0s - loss: 13.0960 - acc: 0.1875
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 45/320

32/519 [>.............................] - ETA: 0s - loss: 11.5849 - acc: 0.2812
519/519 [==============================] - 0s 30us/sample - loss: 12.3603 - acc: 0.2331
Epoch 46/320

32/519 [>.............................] - ETA: 0s - loss: 12.0886 - acc: 0.2500
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 47/320

32/519 [>.............................] - ETA: 0s - loss: 12.5923 - acc: 0.2188
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 48/320

32/519 [>.............................] - ETA: 0s - loss: 10.0738 - acc: 0.3750
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 49/320

32/519 [>.............................] - ETA: 0s - loss: 11.5849 - acc: 0.2812
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 50/320

32/519 [>.............................] - ETA: 0s - loss: 12.0886 - acc: 0.2500
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 51/320

32/519 [>.............................] - ETA: 0s - loss: 11.5849 - acc: 0.2812
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 52/320

32/519 [>.............................] - ETA: 0s - loss: 13.0960 - acc: 0.1875
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 53/320

32/519 [>.............................] - ETA: 0s - loss: 13.0960 - acc: 0.1875
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 54/320

32/519 [>.............................] - ETA: 0s - loss: 10.5775 - acc: 0.3438
519/519 [==============================] - 0s 30us/sample - loss: 12.3603 - acc: 0.2331
Epoch 55/320

32/519 [>.............................] - ETA: 0s - loss: 12.5923 - acc: 0.2188
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 56/320

32/519 [>.............................] - ETA: 0s - loss: 11.5849 - acc: 0.2812
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 57/320

32/519 [>.............................] - ETA: 0s - loss: 13.0960 - acc: 0.1875
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 58/320

32/519 [>.............................] - ETA: 0s - loss: 12.5923 - acc: 0.2188
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 59/320

32/519 [>.............................] - ETA: 0s - loss: 14.6070 - acc: 0.0938
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 60/320

32/519 [>.............................] - ETA: 0s - loss: 12.5923 - acc: 0.2188
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 61/320

32/519 [>.............................] - ETA: 0s - loss: 12.0886 - acc: 0.2500
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 62/320

32/519 [>.............................] - ETA: 0s - loss: 13.5996 - acc: 0.1562
519/519 [==============================] - 0s 30us/sample - loss: 12.3603 - acc: 0.2331
Epoch 63/320

32/519 [>.............................] - ETA: 0s - loss: 13.0960 - acc: 0.1875
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 64/320

32/519 [>.............................] - ETA: 0s - loss: 11.5849 - acc: 0.2812
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 65/320

32/519 [>.............................] - ETA: 0s - loss: 10.5775 - acc: 0.3438
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 66/320

32/519 [>.............................] - ETA: 0s - loss: 9.5701 - acc: 0.4062
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 67/320

32/519 [>.............................] - ETA: 0s - loss: 13.5996 - acc: 0.1562
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 68/320

32/519 [>.............................] - ETA: 0s - loss: 13.5996 - acc: 0.1562
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 69/320

32/519 [>.............................] - ETA: 0s - loss: 12.0886 - acc: 0.2500
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 70/320

32/519 [>.............................] - ETA: 0s - loss: 14.6070 - acc: 0.0938
519/519 [==============================] - 0s 30us/sample - loss: 12.3603 - acc: 0.2331
Epoch 71/320

32/519 [>.............................] - ETA: 0s - loss: 12.5923 - acc: 0.2188
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 72/320

32/519 [>.............................] - ETA: 0s - loss: 12.5923 - acc: 0.2188
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 73/320

32/519 [>.............................] - ETA: 0s - loss: 12.0886 - acc: 0.2500
519/519 [==============================] - 0s 73us/sample - loss: 12.3603 - acc: 0.2331
Epoch 74/320

32/519 [>.............................] - ETA: 0s - loss: 11.0812 - acc: 0.3125
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 75/320

32/519 [>.............................] - ETA: 0s - loss: 12.0886 - acc: 0.2500
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 76/320

32/519 [>.............................] - ETA: 0s - loss: 14.1033 - acc: 0.1250
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 77/320

32/519 [>.............................] - ETA: 0s - loss: 12.0886 - acc: 0.2500
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 78/320

32/519 [>.............................] - ETA: 0s - loss: 14.1033 - acc: 0.1250
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 79/320

32/519 [>.............................] - ETA: 0s - loss: 11.5849 - acc: 0.2812
519/519 [==============================] - 0s 30us/sample - loss: 12.3603 - acc: 0.2331
Epoch 80/320

32/519 [>.............................] - ETA: 0s - loss: 11.0812 - acc: 0.3125
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 81/320

32/519 [>.............................] - ETA: 0s - loss: 10.0738 - acc: 0.3750
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 82/320

32/519 [>.............................] - ETA: 0s - loss: 12.5923 - acc: 0.2188
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 83/320

32/519 [>.............................] - ETA: 0s - loss: 12.5923 - acc: 0.2188
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 84/320

32/519 [>.............................] - ETA: 0s - loss: 12.0886 - acc: 0.2500
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 85/320

32/519 [>.............................] - ETA: 0s - loss: 10.5775 - acc: 0.3438
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 86/320

32/519 [>.............................] - ETA: 0s - loss: 12.5923 - acc: 0.2188
519/519 [==============================] - 0s 30us/sample - loss: 12.3603 - acc: 0.2331
Epoch 87/320

32/519 [>.............................] - ETA: 0s - loss: 13.5996 - acc: 0.1562
519/519 [==============================] - 0s 30us/sample - loss: 12.3603 - acc: 0.2331
Epoch 88/320

32/519 [>.............................] - ETA: 0s - loss: 13.0960 - acc: 0.1875
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 89/320

32/519 [>.............................] - ETA: 0s - loss: 13.0960 - acc: 0.1875
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 90/320

32/519 [>.............................] - ETA: 0s - loss: 12.0886 - acc: 0.2500
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 91/320

32/519 [>.............................] - ETA: 0s - loss: 12.0886 - acc: 0.2500
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 92/320

32/519 [>.............................] - ETA: 0s - loss: 14.6070 - acc: 0.0938
519/519 [==============================] - 0s 30us/sample - loss: 12.3603 - acc: 0.2331
Epoch 93/320

32/519 [>.............................] - ETA: 0s - loss: 10.5775 - acc: 0.3438
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331
Epoch 94/320

32/519 [>.............................] - ETA: 0s - loss: 14.1033 - acc: 0.1250
519/519 [==============================] - 0s 60us/sample - loss: 12.3603 - acc: 0.2331

***************REST OF TRAINING REMOVED DUE TO WORD LIMIT*****************

==================================================
ACCURACY TESTING

32/130 [======>.......................] - ETA: 0s - loss: 12.5923 - acc: 0.2188
130/130 [==============================] - 0s 481us/sample - loss: 13.0185 - acc: 0.1923
Test accuracy: 0.1923077
==================================================
CLASSIFICATION
PREDICTION: 5-10M
ACTUAL: 500K-1M

PREDICTION: 5-10M
ACTUAL: 101-500K

PREDICTION: 5-10M
ACTUAL: 25-50M

PREDICTION: 5-10M
ACTUAL: 1-5M

PREDICTION: 5-10M
ACTUAL: 10-25M

PREDICTION: 5-10M
ACTUAL: 10-25M

PREDICTION: 5-10M
ACTUAL: 25-50M

PREDICTION: 5-10M
ACTUAL: 5-10M

PREDICTION: 5-10M
ACTUAL: 1-5M

PREDICTION: 5-10M
ACTUAL: 1-5M

==================================================
END









share|improve this question











$endgroup$











  • $begingroup$
    MNIST is a Computer Vision problem dataset, your data is well structured and neural networks are not the best approach to it. Try XGBoost or Random Forests and you will probably get better results.
    $endgroup$
    – Pedro Henrique Monforte
    Apr 11 at 14:46













1 Answer
There are many potential issues that could cause this. What I would do is actually look at the training set you have created (write it out to a CSV, perhaps) and make sure that everything is populating as expected. Is there a roughly equal distribution across the different classes? Do you have a good amount of data for each level of each categorical variable (in your case, player positions)? You only have 519 samples, so you could first simplify things by fitting a logistic regression and seeing whether that works. Regardless, to get anything worthwhile I would recommend acquiring more data, as it appears you have at best around 50 examples per class.

– stefanLopez, answered Apr 11 at 14:51
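
A minimal sketch of those checks, assuming the question's dataset, class_labels and split variables are in scope. Note in particular that dropna() removes rows from dataset while class_labels is built from the unfiltered values file, so the two may no longer line up row-for-row; that is exactly the kind of "is everything populating as expected" problem worth ruling out first:

# Minimal sketch (assumes the question's dataset, class_labels,
# training_data, training_labels, testing_data, testing_labels).
from sklearn.linear_model import LogisticRegression

# 1. Class balance: a skewed distribution would explain a model
#    that collapses onto a single majority class.
print(class_labels['class_labels'].value_counts().sort_index())

# 2. Alignment: dropna() shrank dataset, but nothing was removed
#    from class_labels, so the lengths may differ.
print(len(dataset), len(class_labels))

# 3. A simple linear baseline before debugging the network further.
baseline = LogisticRegression(max_iter=1000)
baseline.fit(training_data, training_labels.values.ravel())
print('Logistic regression test accuracy:', baseline.score(testing_data, testing_labels))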





