Transfer Learning on Jankenpon Data

!pip install split-folders[full] matplotlib jupyter_http_over_ws imutils
# import all needed libraries
import zipfile, os, shutil, splitfolders, re, random
import tensorflow as tf
import matplotlib.pyplot as plt
import numpy as np

from imutils import paths
from tensorflow.keras.preprocessing.image import ImageDataGenerator
from tensorflow.keras.applications.vgg16 import preprocess_input
from tensorflow.keras.applications import VGG16
from tensorflow.keras.layers import Dropout
from tensorflow.keras.layers import Flatten
from tensorflow.keras.layers import Dense
from tensorflow.keras.layers import Input
from tensorflow.keras.models import Model
from tensorflow.keras.optimizers import SGD
from tensorflow.keras.backend import clear_session
from tensorflow.keras.callbacks import EarlyStopping, ReduceLROnPlateau
from sklearn.metrics import classification_report
# defining directories
local_dir = '/tmp/'
dataset_name = 'rockpaperscissors'
local_data = local_dir + dataset_name
local_zip = local_data + '.zip'
# download dataset (skip if it already exists)
!test -f $local_zip || wget --no-check-certificate \
 https://github.com/dicodingacademy/assets/releases/download/release/rockpaperscissors.zip \
 -O $local_zip
# extract dataset
with zipfile.ZipFile(local_zip, 'r') as zip_ref:
  zip_ref.extractall(local_dir)
# prepare train & validation split
shutil.rmtree(local_data + '/rps-cv-images')
os.remove(local_data + '/README_rpc-cv-images.txt')
print(os.listdir(local_data))

base_dir = os.path.join(local_dir, 'rps')
if os.path.exists(base_dir):
  shutil.rmtree(base_dir)

splitfolders.ratio(local_data, base_dir, ratio=(.6,.4))
train_dir = os.path.join(base_dir, 'train')
print('amount of training sample : ', sum(len(files) for _, _, files in os.walk(train_dir)))
validation_dir = os.path.join(base_dir, 'val')
print('amount of validation sample : ', sum(len(files) for _, _, files in os.walk(validation_dir)))
['rock', 'scissors', 'paper']
amount of training sample :  1312
amount of validation sample :  876
Copying files: 2188 files [00:01, 2052.05 files/s]
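Under the hood, `splitfolders.ratio` shuffles each class folder independently and cuts it at the given fractions. A minimal pure-Python sketch of the same idea (a hypothetical `ratio_split` helper, not the library's actual implementation):

```python
import random

def ratio_split(items, train_frac=0.6, seed=42):
    """Shuffle one class's file list and cut it into train/val parts."""
    rng = random.Random(seed)
    shuffled = items[:]
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * train_frac)
    return shuffled[:cut], shuffled[cut:]

files = [f"rock_{i:03d}.png" for i in range(100)]
train, val = ratio_split(files)
print(len(train), len(val))  # 60 40
```

Applying this per class keeps the 60/40 ratio inside every category, which is why the train and val folders above end up with the same three class subdirectories.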
# prepare test folder for the classification report
files_list = []

# collect the image paths of each class directory
for class_name in os.listdir(local_data):
  class_dir = os.path.join(local_data, class_name)
  if not os.path.isdir(class_dir):
    continue
  cat_list = [os.path.join(class_dir, f) for f in os.listdir(class_dir)
              if f.endswith(('.jpg', '.png', '.jpeg'))]
  files_list.append(cat_list)

# copy 4 random images per class into the test folder
for imgs in files_list:
  category = os.path.basename(os.path.dirname(imgs[0]))
  filesToCopy = random.sample(imgs, 4)
  destPath = os.path.join(base_dir, 'test', category)
  os.makedirs(destPath, exist_ok=True)
  for file in filesToCopy:
    shutil.copy(file, destPath)

test_dir = os.path.join(base_dir, 'test')
# directory summary
print(os.listdir(train_dir))
print(os.listdir(validation_dir))
print(os.listdir(test_dir))
['rock', 'scissors', 'paper']
['rock', 'scissors', 'paper']
['rock', 'scissors', 'paper']
# preparing generators; preprocess_input already performs the VGG16
# (ImageNet) mean subtraction, so it is applied consistently to every
# generator and no extra 1/255 rescaling is added on top of it
train_datagen = ImageDataGenerator(
    rotation_range = 20,
    horizontal_flip = True,
    shear_range = 0.2,
    fill_mode = 'nearest',
    preprocessing_function = preprocess_input,
)

test_datagen = ImageDataGenerator(
    preprocessing_function = preprocess_input)
# flow data to generators
BATCH_SIZE = 32
train_generator = train_datagen.flow_from_directory(
    train_dir,
    target_size = (224,224),
    batch_size = BATCH_SIZE,
    class_mode = 'categorical',
    shuffle = True,  # shuffle training batches so each epoch sees a new order
)

validation_generator = test_datagen.flow_from_directory(
    validation_dir,
    target_size = (224,224),
    batch_size = BATCH_SIZE,
    class_mode = 'categorical',
    shuffle = False,
)

test_generator = test_datagen.flow_from_directory(
    test_dir,
    target_size = (224,224),
    batch_size = BATCH_SIZE,
    class_mode = 'categorical',
    shuffle = False,
)

totalTrain = len(list(paths.list_images(train_dir)))
totalVal = len(list(paths.list_images(validation_dir)))
totalTest = len(list(paths.list_images(test_dir)))
Found 1312 images belonging to 3 classes.
Found 876 images belonging to 3 classes.
Found 12 images belonging to 3 classes.
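VGG16's `preprocess_input` uses Keras's "caffe" mode: channels are reordered RGB to BGR and the per-channel ImageNet means are subtracted, with no 1/255 scaling. A per-pixel sketch of the equivalent arithmetic (illustrative only, not the Keras source):

```python
# ImageNet channel means in BGR order, as used by "caffe"-mode preprocessing
IMAGENET_MEAN_BGR = (103.939, 116.779, 123.68)

def vgg16_preprocess_pixel(rgb):
    """rgb: (r, g, b) values in 0-255; returns mean-subtracted BGR."""
    r, g, b = rgb
    bgr = (b, g, r)                                      # RGB -> BGR reorder
    return tuple(c - m for c, m in zip(bgr, IMAGENET_MEAN_BGR))

print(vgg16_preprocess_pixel((128, 128, 128)))
```

Because the network's pretrained weights expect inputs centered this way, the same function must be applied to the train, validation, and test generators alike.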
# preparing baseModel
baseModel = VGG16(weights="imagenet", include_top=False,
    input_tensor=Input(shape=(224, 224, 3)))
headModel = baseModel.output
headModel = Flatten(name="flatten")(headModel)
headModel = Dense(512, activation="relu")(headModel)
headModel = Dropout(0.5)(headModel)
headModel = Dense(3, activation="softmax")(headModel)
model = Model(inputs=baseModel.input, outputs=headModel)
Downloading data from https://storage.googleapis.com/tensorflow/keras-applications/vgg16/vgg16_weights_tf_dim_ordering_tf_kernels_notop.h5
58889256/58889256 [==============================] - 0s 0us/step
# freeze the convolutional base so its pretrained features are preserved
for layer in baseModel.layers:
    layer.trainable = False
# prepare callbacks
EarlyStop = EarlyStopping(
    monitor = 'val_loss',
    patience = 4,
    verbose = 1,
    restore_best_weights = True,
    min_delta = 0.1
)

ReduceLR = ReduceLROnPlateau(
    monitor = 'val_loss',
    factor = 0.5,
    patience = 1,
    verbose = 1
)

callbacks = [EarlyStop, ReduceLR]
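With `factor=0.5` and `patience=1`, `ReduceLROnPlateau` halves the learning rate after a single epoch without a new best `val_loss`, which matches the 1e-4 to 5e-5 to 2.5e-5 steps visible in the training log below. A simplified simulation of that schedule (ignoring `min_delta` and cooldown):

```python
def simulate_reduce_lr(val_losses, lr=1e-4, factor=0.5, patience=1):
    """Simplified ReduceLROnPlateau: scale lr by `factor` once `patience`
    epochs pass without a new best val_loss (no min_delta, no cooldown)."""
    best, wait, history = float('inf'), 0, []
    for loss in val_losses:
        if loss < best:
            best, wait = loss, 0
        else:
            wait += 1
            if wait >= patience:
                lr *= factor
                wait = 0
        history.append(lr)
    return history

# losses that improve, stall twice, then improve again
print(simulate_reduce_lr([1.0, 0.9, 0.95, 0.93, 0.8]))
```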
# compile the model with frozen layers
print("[INFO] compiling model...")
opt = SGD(learning_rate=1e-4, momentum=0.9)
model.compile(loss="categorical_crossentropy", optimizer=opt,
    metrics=["accuracy"])
print("[INFO] training head...")
H = model.fit(
    x=train_generator,
    steps_per_epoch=totalTrain // BATCH_SIZE,
    validation_data=validation_generator,
    validation_steps=totalVal // BATCH_SIZE,
    epochs=25,
    callbacks=callbacks,
)
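`steps_per_epoch` and `validation_steps` are just the number of whole batches per epoch; with the sample counts from the split above, floor division reproduces the 41 training steps shown in the log:

```python
# sample counts taken from the split output above
totalTrain, totalVal, BATCH_SIZE = 1312, 876, 32
print(totalTrain // BATCH_SIZE, totalVal // BATCH_SIZE)  # 41 27
```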
[INFO] compiling model...
[INFO] training head...
Epoch 1/25
41/41 [==============================] - 38s 606ms/step - loss: 1.6104 - accuracy: 0.3948 - val_loss: 1.1712 - val_accuracy: 0.3299 - lr: 1.0000e-04
Epoch 2/25
41/41 [==============================] - 24s 587ms/step - loss: 1.1336 - accuracy: 0.4223 - val_loss: 0.9012 - val_accuracy: 0.5764 - lr: 1.0000e-04
Epoch 3/25
41/41 [==============================] - ETA: 0s - loss: 0.9340 - accuracy: 0.5777
Epoch 3: ReduceLROnPlateau reducing learning rate to 4.999999873689376e-05.
41/41 [==============================] - 22s 532ms/step - loss: 0.9340 - accuracy: 0.5777 - val_loss: 1.0275 - val_accuracy: 0.3738 - lr: 1.0000e-04
Epoch 4/25
41/41 [==============================] - 24s 596ms/step - loss: 0.8847 - accuracy: 0.5998 - val_loss: 0.7451 - val_accuracy: 0.7083 - lr: 5.0000e-05
Epoch 5/25
41/41 [==============================] - 22s 541ms/step - loss: 0.7422 - accuracy: 0.7919 - val_loss: 0.6416 - val_accuracy: 0.9375 - lr: 5.0000e-05
Epoch 6/25
41/41 [==============================] - 21s 520ms/step - loss: 0.6953 - accuracy: 0.7698 - val_loss: 0.5985 - val_accuracy: 0.9201 - lr: 5.0000e-05
Epoch 7/25
41/41 [==============================] - ETA: 0s - loss: 0.6743 - accuracy: 0.7790
Epoch 7: ReduceLROnPlateau reducing learning rate to 2.499999936844688e-05.
41/41 [==============================] - 22s 532ms/step - loss: 0.6743 - accuracy: 0.7790 - val_loss: 0.6319 - val_accuracy: 0.8194 - lr: 5.0000e-05
Epoch 8/25
41/41 [==============================] - 23s 565ms/step - loss: 0.7432 - accuracy: 0.7096 - val_loss: 0.5822 - val_accuracy: 0.8646 - lr: 2.5000e-05
Epoch 9/25
41/41 [==============================] - 21s 518ms/step - loss: 0.6423 - accuracy: 0.8056 - val_loss: 0.5275 - val_accuracy: 0.9225 - lr: 2.5000e-05
Epoch 10/25
41/41 [==============================] - 22s 534ms/step - loss: 0.5945 - accuracy: 0.8697 - val_loss: 0.5073 - val_accuracy: 0.9363 - lr: 2.5000e-05
Epoch 11/25
41/41 [==============================] - ETA: 0s - loss: 0.5495 - accuracy: 0.8933
Epoch 11: ReduceLROnPlateau reducing learning rate to 1.249999968422344e-05.
41/41 [==============================] - 24s 589ms/step - loss: 0.5495 - accuracy: 0.8933 - val_loss: 0.5212 - val_accuracy: 0.9271 - lr: 2.5000e-05
Epoch 12/25
41/41 [==============================] - 22s 537ms/step - loss: 0.5244 - accuracy: 0.9002 - val_loss: 0.4796 - val_accuracy: 0.9340 - lr: 1.2500e-05
Epoch 13/25
41/41 [==============================] - ETA: 0s - loss: 0.5346 - accuracy: 0.8941Restoring model weights from the end of the best epoch: 9.

Epoch 13: ReduceLROnPlateau reducing learning rate to 6.24999984211172e-06.
41/41 [==============================] - 21s 522ms/step - loss: 0.5346 - accuracy: 0.8941 - val_loss: 0.4812 - val_accuracy: 0.9213 - lr: 1.2500e-05
Epoch 13: early stopping
# create classification_report for trained model
print("[INFO] evaluating after fine-tuning network head...")
test_generator.reset()
predIdxs = model.predict(x=test_generator,
    steps=(totalTest // BATCH_SIZE) + 1)
predIdxs = np.argmax(predIdxs, axis=1)
print(classification_report(test_generator.classes, predIdxs,
    target_names=test_generator.class_indices.keys()))
[INFO] evaluating after fine-tuning network head...
1/1 [==============================] - 3s 3s/step
              precision    recall  f1-score   support

       paper       1.00      1.00      1.00         4
        rock       1.00      1.00      1.00         4
    scissors       1.00      1.00      1.00         4

    accuracy                           1.00        12
   macro avg       1.00      1.00      1.00        12
weighted avg       1.00      1.00      1.00        12
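`classification_report` derives per-class precision and recall from true/predicted label pairs. As a sanity check, the same numbers can be computed by hand on toy labels (these are illustrative counts, not the model's actual predictions):

```python
def precision_recall(y_true, y_pred, label):
    """Per-class precision and recall from parallel label lists."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == label and p == label)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != label and p == label)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == label and p != label)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

y_true = ['rock', 'rock', 'paper', 'paper', 'scissors', 'scissors']
y_pred = ['rock', 'paper', 'paper', 'paper', 'scissors', 'rock']
print(precision_recall(y_true, y_pred, 'paper'))
```

Note that with only 4 test images per class, a perfect 1.00 report is weak evidence on its own; the validation accuracy over 876 images is the more reliable signal.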
# unfreeze the top convolutional block of the base model for fine-tuning
train_generator.reset()
validation_generator.reset()
for layer in baseModel.layers[15:]:
    layer.trainable = True
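`baseModel.layers[15:]` targets the final convolutional block. Assuming Keras's standard VGG16 layer ordering for `include_top=False` (an input layer followed by five conv blocks), the slice covers block5 only, which can be checked against a plain list of the layer names:

```python
# layer order of Keras's VGG16 with include_top=False (assumed ordering)
VGG16_LAYERS = (
    ['input'] +
    ['block1_conv1', 'block1_conv2', 'block1_pool'] +
    ['block2_conv1', 'block2_conv2', 'block2_pool'] +
    ['block3_conv1', 'block3_conv2', 'block3_conv3', 'block3_pool'] +
    ['block4_conv1', 'block4_conv2', 'block4_conv3', 'block4_pool'] +
    ['block5_conv1', 'block5_conv2', 'block5_conv3', 'block5_pool']
)

print(VGG16_LAYERS[15:])  # the slice the unfreezing loop touches
```

Unfreezing only the topmost block fine-tunes the most task-specific features while the generic early-layer filters stay fixed.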
# recompile and retrain the model after unfreezing layers
print("[INFO] re-compiling model...")
opt = SGD(learning_rate=1e-4, momentum=0.9)
model.compile(loss="categorical_crossentropy", optimizer=opt,
    metrics=["accuracy"])
H = model.fit(
    x=train_generator,
    steps_per_epoch=totalTrain // BATCH_SIZE,
    validation_data=validation_generator,
    validation_steps=totalVal // BATCH_SIZE,
    epochs=20,
    callbacks=callbacks,
)
[INFO] re-compiling model...
Epoch 1/20
41/41 [==============================] - 26s 562ms/step - loss: 0.5498 - accuracy: 0.8255 - val_loss: 0.5427 - val_accuracy: 0.7882 - lr: 1.0000e-04
Epoch 2/20
41/41 [==============================] - 22s 535ms/step - loss: 0.3568 - accuracy: 0.9093 - val_loss: 0.2297 - val_accuracy: 0.9525 - lr: 1.0000e-04
Epoch 3/20
41/41 [==============================] - 21s 514ms/step - loss: 0.2275 - accuracy: 0.9459 - val_loss: 0.1615 - val_accuracy: 0.9618 - lr: 1.0000e-04
Epoch 4/20
41/41 [==============================] - 23s 557ms/step - loss: 0.1356 - accuracy: 0.9688 - val_loss: 0.1183 - val_accuracy: 0.9711 - lr: 1.0000e-04
Epoch 5/20
41/41 [==============================] - 24s 571ms/step - loss: 0.1202 - accuracy: 0.9726 - val_loss: 0.1065 - val_accuracy: 0.9803 - lr: 1.0000e-04
Epoch 6/20
41/41 [==============================] - 24s 576ms/step - loss: 0.1012 - accuracy: 0.9771 - val_loss: 0.0918 - val_accuracy: 0.9838 - lr: 1.0000e-04
Epoch 7/20
41/41 [==============================] - 22s 527ms/step - loss: 0.0926 - accuracy: 0.9787 - val_loss: 0.0674 - val_accuracy: 0.9873 - lr: 1.0000e-04
Epoch 8/20
41/41 [==============================] - ETA: 0s - loss: 0.0648 - accuracy: 0.9870Restoring model weights from the end of the best epoch: 4.
41/41 [==============================] - 23s 557ms/step - loss: 0.0648 - accuracy: 0.9870 - val_loss: 0.0622 - val_accuracy: 0.9873 - lr: 1.0000e-04
Epoch 8: early stopping
# create classification_report for full model
print("[INFO] evaluating after fine-tuning network...")
test_generator.reset()
predIdxs = model.predict(x=test_generator,
    steps=(totalTest // BATCH_SIZE) + 1)
predIdxs = np.argmax(predIdxs, axis=1)
print(classification_report(test_generator.classes, predIdxs,
    target_names=test_generator.class_indices.keys()))
[INFO] evaluating after fine-tuning network...
1/1 [==============================] - 0s 188ms/step
              precision    recall  f1-score   support

       paper       1.00      1.00      1.00         4
        rock       1.00      1.00      1.00         4
    scissors       1.00      1.00      1.00         4

    accuracy                           1.00        12
   macro avg       1.00      1.00      1.00        12
weighted avg       1.00      1.00      1.00        12
from tensorflow.keras.preprocessing import image
%matplotlib inline

testdir = test_dir
uploaded = list(paths.list_images(testdir))
fig = plt.figure(figsize= (10, 10))
plt.subplots_adjust(left=0.1,
                    bottom=0.1,
                    right=0.9,
                    top=0.9,
                    wspace=0.4,
                    hspace=0.4)
for i in range(len(uploaded)):
    path = uploaded[i]
    img = image.load_img(path, target_size = (224,224))

    ax = fig.add_subplot(4, 4, i+1)
    ax.imshow(img)
    images = image.img_to_array(img)
    images = np.expand_dims(images, axis=0)
    images = preprocess_input(images)
    pred = model.predict(images)
    ax.set_title(list(train_generator.class_indices.keys())[np.argmax(pred, axis=1)[0]])
1/1 [==============================] - 1s 1s/step
1/1 [==============================] - 0s 19ms/step
1/1 [==============================] - 0s 19ms/step
1/1 [==============================] - 0s 17ms/step
1/1 [==============================] - 0s 17ms/step
1/1 [==============================] - 0s 17ms/step
1/1 [==============================] - 0s 19ms/step
1/1 [==============================] - 0s 17ms/step
1/1 [==============================] - 0s 18ms/step
1/1 [==============================] - 0s 18ms/step
1/1 [==============================] - 0s 21ms/step
1/1 [==============================] - 0s 17ms/step
