Earlier posts have covered image prediction, object detection, and object detection in video with ImageAI.
As before, training a custom prediction model only takes a few lines of code. (This post mainly walks through the process.)
ImageAI GitHub address: https://github.com/OlafenwaMoses/ImageAI
For environment setup and ImageAI installation, see the earlier posts in this series:
ImageAI (1): image prediction
ImageAI (2): object detection
ImageAI (3): object detection in video
Custom Model Training
As before, ImageAI provides four algorithms (SqueezeNet, ResNet, InceptionV3 and DenseNet) that can be used to train a custom prediction model.
ResNet is used here.
To train a custom prediction model you first need to prepare training data. This post uses the IdenProf dataset provided by the ImageAI author, which contains images of workers from ten different professions.
The dataset has to be arranged in the directory structure below (a short code sketch for creating such a layout follows the tree):
==IdenProf
    ==test
        ==chef
            ->chef-1...
        ==doctor
            ->doctor-1...
        ...
    ==train
        ==chef
            ->chef-1...
        ==doctor
            ->doctor-1...
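If you want to train on your own images instead of IdenProf, the same train/test layout can be created with a few lines of Python. This is only a sketch; the dataset name and class names below are placeholders:

import os

# Sketch: create an empty train/test folder structure for a custom dataset
base_dir = "my_dataset"                          # placeholder dataset name
class_names = ["chef", "doctor", "firefighter"]  # placeholder class names
for split in ("train", "test"):
    for class_name in class_names:
        os.makedirs(os.path.join(base_dir, split, class_name), exist_ok=True)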
The download address originally given in the sample code no longer works; the dataset can now be downloaded from https://github.com/OlafenwaMoses/IdenProf/releases
After downloading, unzip the archives and the data is ready to use.
Due to limited machine performance, only three of the ten classes are used in the demonstration below.
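The class selection itself is not shown in the code that follows; after extracting the archives you can simply delete the class folders you do not want to train on. A minimal sketch (the three class names kept here are only an assumption for illustration):

import os
import shutil

# Sketch only: keep three assumed classes and remove the rest of the extracted folders
KEEP = {"chef", "doctor", "firefighter"}
for split_dir in ("idenprof/train", "idenprof/test"):
    for class_name in os.listdir(split_dir):
        if class_name not in KEEP:
            shutil.rmtree(os.path.join(split_dir, class_name))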
The code is as follows
from io import open
import requests
import shutil
from zipfile import ZipFile
import os
from imageai.Prediction.Custom import ModelTraining
# Import ModelTraining class
##############################################################################
############################# Download and prepare the dataset ################################
execution_path = os.getcwd()
TRAIN_ZIP_ONE = os.path.join(execution_path, "idenprof-train1.zip")
TRAIN_ZIP_TWO = os.path.join(execution_path, "idenprof-train2.zip")
TEST_ZIP = os.path.join(execution_path, "idenprof-test.zip")
DATASET_DIR = os.path.join(execution_path, "idenprof")
DATASET_TRAIN_DIR = os.path.join(DATASET_DIR, "train")
DATASET_TEST_DIR = os.path.join(DATASET_DIR, "test")
if(os.path.exists(DATASET_DIR) == False):
    os.mkdir(DATASET_DIR)
if(os.path.exists(DATASET_TRAIN_DIR) == False):
    os.mkdir(DATASET_TRAIN_DIR)
if(os.path.exists(DATASET_TEST_DIR) == False):
    os.mkdir(DATASET_TEST_DIR)
if(len(os.listdir(DATASET_TRAIN_DIR)) < 3):
    if(os.path.exists(TRAIN_ZIP_ONE) == False):
        print("Downloading idenprof-train1.zip")
        data = requests.get("https://github.com/OlafenwaMoses/IdenProf/releases/download/v1.0/idenprof-train1.zip", stream=True)
        with open(TRAIN_ZIP_ONE, "wb") as file:
            shutil.copyfileobj(data.raw, file)
        del data
    if(os.path.exists(TRAIN_ZIP_TWO) == False):
        print("Downloading idenprof-train2.zip")
        data = requests.get("https://github.com/OlafenwaMoses/IdenProf/releases/download/v1.0/idenprof-train2.zip", stream=True)
        with open(TRAIN_ZIP_TWO, "wb") as file:
            shutil.copyfileobj(data.raw, file)
        del data
    print("Extracting idenprof-train1.zip")
    extract1 = ZipFile(TRAIN_ZIP_ONE)
    extract1.extractall(DATASET_TRAIN_DIR)
    extract1.close()
    print("Extracting idenprof-train2.zip")
    extract2 = ZipFile(TRAIN_ZIP_TWO)
    extract2.extractall(DATASET_TRAIN_DIR)
    extract2.close()
if(len(os.listdir(DATASET_TEST_DIR)) < 3):
    if(os.path.exists(TEST_ZIP) == False):
        print("Downloading idenprof-test.zip")
        data = requests.get("https://github.com/OlafenwaMoses/IdenProf/releases/download/v1.0/idenprof-test.zip", stream=True)
        with open(TEST_ZIP, "wb") as file:
            shutil.copyfileobj(data.raw, file)
        del data
    print("Extracting idenprof-test.zip")
    extract = ZipFile(TEST_ZIP)
    extract.extractall(DATASET_TEST_DIR)
    extract.close()
#################### End of training and test data preparation ##################################
##############################################################################
model_trainer = ModelTraining()
# Create a ModelTraining instance
model_trainer.setModelTypeAsResNet()
# Set the model type to ResNet; the other supported model types can be used instead:
# setModelTypeAsSqueezeNet()
# setModelTypeAsInceptionV3()
# setModelTypeAsDenseNet()
model_trainer.setDataDirectory(DATASET_DIR)
# Set the dataset directory (must contain the train and test folders)
model_trainer.trainModel(num_objects=3, num_experiments=10, enhance_data=True, batch_size=32, show_network_summary=True)
# Train the model; the parameters are explained below
num_objects : the number of classes in the image dataset (only three classes are used here for simplicity)
num_experiments : the number of training epochs
enhance_data (optional) : whether to generate modified copies of the training images (data augmentation) for better accuracy; see the sketch after this list
batch_size : the number of images processed in each training batch
show_network_summary : whether to print the network structure to the console
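ImageAI is built on Keras, and setting enhance_data=True roughly corresponds to standard Keras image augmentation (the training log below prints "Using Enhanced Data Generation"). The exact transformations are ImageAI internals; the sketch below only illustrates the idea, and the specific parameter values are assumptions:

from keras.preprocessing.image import ImageDataGenerator

# Illustration only: these augmentation settings are assumptions,
# not the exact values ImageAI uses internally
train_datagen = ImageDataGenerator(
    rescale=1.0 / 255,       # scale pixel values to [0, 1]
    horizontal_flip=True,    # randomly mirror images
    width_shift_range=0.1,   # small random horizontal shifts
    height_shift_range=0.1)  # small random vertical shifts

train_generator = train_datagen.flow_from_directory(
    "idenprof/train", target_size=(224, 224), batch_size=32)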
The network structure is printed first, followed by the training results:
__________________________________________________________________________________________________
Layer (type) Output Shape Param # Connected to
==================================================================================================
input_2 (InputLayer) (None, 224, 224, 3) 0
__________________________________________________________________________________________________
conv2d_54 (Conv2D) (None, 112, 112, 64) 9472 input_2[0][0]
__________________________________________________________________________________________________
batch_normalization_54 (BatchNo (None, 112, 112, 64) 256 conv2d_54[0][0]
__________________________________________________________________________________________________
activation_51 (Activation) (None, 112, 112, 64) 0 batch_normalization_54[0][0]
__________________________________________________________________________________________________
max_pooling2d_2 (MaxPooling2D) (None, 55, 55, 64) 0 activation_51[0][0]
__________________________________________________________________________________________________
conv2d_56 (Conv2D) (None, 55, 55, 64) 4160 max_pooling2d_2[0][0]
__________________________________________________________________________________________________
batch_normalization_56 (BatchNo (None, 55, 55, 64) 256 conv2d_56[0][0]
__________________________________________________________________________________________________
activation_52 (Activation) (None, 55, 55, 64) 0 batch_normalization_56[0][0]
__________________________________________________________________________________________________
conv2d_57 (Conv2D) (None, 55, 55, 64) 36928 activation_52[0][0]
__________________________________________________________________________________________________
batch_normalization_57 (BatchNo (None, 55, 55, 64) 256 conv2d_57[0][0]
__________________________________________________________________________________________________
activation_53 (Activation) (None, 55, 55, 64) 0 batch_normalization_57[0][0]
__________________________________________________________________________________________________
conv2d_58 (Conv2D) (None, 55, 55, 256) 16640 activation_53[0][0]
__________________________________________________________________________________________________
conv2d_55 (Conv2D) (None, 55, 55, 256) 16640 max_pooling2d_2[0][0]
__________________________________________________________________________________________________
batch_normalization_58 (BatchNo (None, 55, 55, 256) 1024 conv2d_58[0][0]
__________________________________________________________________________________________________
batch_normalization_55 (BatchNo (None, 55, 55, 256) 1024 conv2d_55[0][0]
__________________________________________________________________________________________________
add_17 (Add) (None, 55, 55, 256) 0 batch_normalization_58[0][0]
batch_normalization_55[0][0]
__________________________________________________________________________________________________
activation_54 (Activation) (None, 55, 55, 256) 0 add_17[0][0]
__________________________________________________________________________________________________
conv2d_59 (Conv2D) (None, 55, 55, 64) 16448 activation_54[0][0]
__________________________________________________________________________________________________
batch_normalization_59 (BatchNo (None, 55, 55, 64) 256 conv2d_59[0][0]
__________________________________________________________________________________________________
activation_55 (Activation) (None, 55, 55, 64) 0 batch_normalization_59[0][0]
__________________________________________________________________________________________________
conv2d_60 (Conv2D) (None, 55, 55, 64) 36928 activation_55[0][0]
__________________________________________________________________________________________________
batch_normalization_60 (BatchNo (None, 55, 55, 64) 256 conv2d_60[0][0]
__________________________________________________________________________________________________
activation_56 (Activation) (None, 55, 55, 64) 0 batch_normalization_60[0][0]
__________________________________________________________________________________________________
conv2d_61 (Conv2D) (None, 55, 55, 256) 16640 activation_56[0][0]
__________________________________________________________________________________________________
batch_normalization_61 (BatchNo (None, 55, 55, 256) 1024 conv2d_61[0][0]
__________________________________________________________________________________________________
add_18 (Add) (None, 55, 55, 256) 0 batch_normalization_61[0][0]
activation_54[0][0]
__________________________________________________________________________________________________
activation_57 (Activation) (None, 55, 55, 256) 0 add_18[0][0]
__________________________________________________________________________________________________
conv2d_62 (Conv2D) (None, 55, 55, 64) 16448 activation_57[0][0]
__________________________________________________________________________________________________
batch_normalization_62 (BatchNo (None, 55, 55, 64) 256 conv2d_62[0][0]
__________________________________________________________________________________________________
activation_58 (Activation) (None, 55, 55, 64) 0 batch_normalization_62[0][0]
__________________________________________________________________________________________________
conv2d_63 (Conv2D) (None, 55, 55, 64) 36928 activation_58[0][0]
__________________________________________________________________________________________________
batch_normalization_63 (BatchNo (None, 55, 55, 64) 256 conv2d_63[0][0]
__________________________________________________________________________________________________
activation_59 (Activation) (None, 55, 55, 64) 0 batch_normalization_63[0][0]
__________________________________________________________________________________________________
conv2d_64 (Conv2D) (None, 55, 55, 256) 16640 activation_59[0][0]
__________________________________________________________________________________________________
batch_normalization_64 (BatchNo (None, 55, 55, 256) 1024 conv2d_64[0][0]
__________________________________________________________________________________________________
add_19 (Add) (None, 55, 55, 256) 0 batch_normalization_64[0][0]
activation_57[0][0]
__________________________________________________________________________________________________
activation_60 (Activation) (None, 55, 55, 256) 0 add_19[0][0]
__________________________________________________________________________________________________
conv2d_66 (Conv2D) (None, 28, 28, 128) 32896 activation_60[0][0]
__________________________________________________________________________________________________
batch_normalization_66 (BatchNo (None, 28, 28, 128) 512 conv2d_66[0][0]
__________________________________________________________________________________________________
activation_61 (Activation) (None, 28, 28, 128) 0 batch_normalization_66[0][0]
__________________________________________________________________________________________________
conv2d_67 (Conv2D) (None, 28, 28, 128) 147584 activation_61[0][0]
__________________________________________________________________________________________________
batch_normalization_67 (BatchNo (None, 28, 28, 128) 512 conv2d_67[0][0]
__________________________________________________________________________________________________
activation_62 (Activation) (None, 28, 28, 128) 0 batch_normalization_67[0][0]
__________________________________________________________________________________________________
conv2d_68 (Conv2D) (None, 28, 28, 512) 66048 activation_62[0][0]
__________________________________________________________________________________________________
conv2d_65 (Conv2D) (None, 28, 28, 512) 131584 activation_60[0][0]
__________________________________________________________________________________________________
batch_normalization_68 (BatchNo (None, 28, 28, 512) 2048 conv2d_68[0][0]
__________________________________________________________________________________________________
batch_normalization_65 (BatchNo (None, 28, 28, 512) 2048 conv2d_65[0][0]
__________________________________________________________________________________________________
add_20 (Add) (None, 28, 28, 512) 0 batch_normalization_68[0][0]
batch_normalization_65[0][0]
__________________________________________________________________________________________________
activation_63 (Activation) (None, 28, 28, 512) 0 add_20[0][0]
__________________________________________________________________________________________________
conv2d_69 (Conv2D) (None, 28, 28, 128) 65664 activation_63[0][0]
__________________________________________________________________________________________________
batch_normalization_69 (BatchNo (None, 28, 28, 128) 512 conv2d_69[0][0]
__________________________________________________________________________________________________
activation_64 (Activation) (None, 28, 28, 128) 0 batch_normalization_69[0][0]
__________________________________________________________________________________________________
conv2d_70 (Conv2D) (None, 28, 28, 128) 147584 activation_64[0][0]
__________________________________________________________________________________________________
batch_normalization_70 (BatchNo (None, 28, 28, 128) 512 conv2d_70[0][0]
__________________________________________________________________________________________________
activation_65 (Activation) (None, 28, 28, 128) 0 batch_normalization_70[0][0]
__________________________________________________________________________________________________
conv2d_71 (Conv2D) (None, 28, 28, 512) 66048 activation_65[0][0]
__________________________________________________________________________________________________
batch_normalization_71 (BatchNo (None, 28, 28, 512) 2048 conv2d_71[0][0]
__________________________________________________________________________________________________
add_21 (Add) (None, 28, 28, 512) 0 batch_normalization_71[0][0]
activation_63[0][0]
__________________________________________________________________________________________________
activation_66 (Activation) (None, 28, 28, 512) 0 add_21[0][0]
__________________________________________________________________________________________________
conv2d_72 (Conv2D) (None, 28, 28, 128) 65664 activation_66[0][0]
__________________________________________________________________________________________________
batch_normalization_72 (BatchNo (None, 28, 28, 128) 512 conv2d_72[0][0]
__________________________________________________________________________________________________
activation_67 (Activation) (None, 28, 28, 128) 0 batch_normalization_72[0][0]
__________________________________________________________________________________________________
conv2d_73 (Conv2D) (None, 28, 28, 128) 147584 activation_67[0][0]
__________________________________________________________________________________________________
batch_normalization_73 (BatchNo (None, 28, 28, 128) 512 conv2d_73[0][0]
__________________________________________________________________________________________________
activation_68 (Activation) (None, 28, 28, 128) 0 batch_normalization_73[0][0]
__________________________________________________________________________________________________
conv2d_74 (Conv2D) (None, 28, 28, 512) 66048 activation_68[0][0]
__________________________________________________________________________________________________
batch_normalization_74 (BatchNo (None, 28, 28, 512) 2048 conv2d_74[0][0]
__________________________________________________________________________________________________
add_22 (Add) (None, 28, 28, 512) 0 batch_normalization_74[0][0]
activation_66[0][0]
__________________________________________________________________________________________________
activation_69 (Activation) (None, 28, 28, 512) 0 add_22[0][0]
__________________________________________________________________________________________________
conv2d_75 (Conv2D) (None, 28, 28, 128) 65664 activation_69[0][0]
__________________________________________________________________________________________________
batch_normalization_75 (BatchNo (None, 28, 28, 128) 512 conv2d_75[0][0]
__________________________________________________________________________________________________
activation_70 (Activation) (None, 28, 28, 128) 0 batch_normalization_75[0][0]
__________________________________________________________________________________________________
conv2d_76 (Conv2D) (None, 28, 28, 128) 147584 activation_70[0][0]
__________________________________________________________________________________________________
batch_normalization_76 (BatchNo (None, 28, 28, 128) 512 conv2d_76[0][0]
__________________________________________________________________________________________________
activation_71 (Activation) (None, 28, 28, 128) 0 batch_normalization_76[0][0]
__________________________________________________________________________________________________
conv2d_77 (Conv2D) (None, 28, 28, 512) 66048 activation_71[0][0]
__________________________________________________________________________________________________
batch_normalization_77 (BatchNo (None, 28, 28, 512) 2048 conv2d_77[0][0]
__________________________________________________________________________________________________
add_23 (Add) (None, 28, 28, 512) 0 batch_normalization_77[0][0]
activation_69[0][0]
__________________________________________________________________________________________________
activation_72 (Activation) (None, 28, 28, 512) 0 add_23[0][0]
__________________________________________________________________________________________________
conv2d_79 (Conv2D) (None, 14, 14, 256) 131328 activation_72[0][0]
__________________________________________________________________________________________________
batch_normalization_79 (BatchNo (None, 14, 14, 256) 1024 conv2d_79[0][0]
__________________________________________________________________________________________________
activation_73 (Activation) (None, 14, 14, 256) 0 batch_normalization_79[0][0]
__________________________________________________________________________________________________
conv2d_80 (Conv2D) (None, 14, 14, 256) 590080 activation_73[0][0]
__________________________________________________________________________________________________
batch_normalization_80 (BatchNo (None, 14, 14, 256) 1024 conv2d_80[0][0]
__________________________________________________________________________________________________
activation_74 (Activation) (None, 14, 14, 256) 0 batch_normalization_80[0][0]
__________________________________________________________________________________________________
conv2d_81 (Conv2D) (None, 14, 14, 1024) 263168 activation_74[0][0]
__________________________________________________________________________________________________
conv2d_78 (Conv2D) (None, 14, 14, 1024) 525312 activation_72[0][0]
__________________________________________________________________________________________________
batch_normalization_81 (BatchNo (None, 14, 14, 1024) 4096 conv2d_81[0][0]
__________________________________________________________________________________________________
batch_normalization_78 (BatchNo (None, 14, 14, 1024) 4096 conv2d_78[0][0]
__________________________________________________________________________________________________
add_24 (Add) (None, 14, 14, 1024) 0 batch_normalization_81[0][0]
batch_normalization_78[0][0]
__________________________________________________________________________________________________
activation_75 (Activation) (None, 14, 14, 1024) 0 add_24[0][0]
__________________________________________________________________________________________________
conv2d_82 (Conv2D) (None, 14, 14, 256) 262400 activation_75[0][0]
__________________________________________________________________________________________________
batch_normalization_82 (BatchNo (None, 14, 14, 256) 1024 conv2d_82[0][0]
__________________________________________________________________________________________________
activation_76 (Activation) (None, 14, 14, 256) 0 batch_normalization_82[0][0]
__________________________________________________________________________________________________
conv2d_83 (Conv2D) (None, 14, 14, 256) 590080 activation_76[0][0]
__________________________________________________________________________________________________
batch_normalization_83 (BatchNo (None, 14, 14, 256) 1024 conv2d_83[0][0]
__________________________________________________________________________________________________
activation_77 (Activation) (None, 14, 14, 256) 0 batch_normalization_83[0][0]
__________________________________________________________________________________________________
conv2d_84 (Conv2D) (None, 14, 14, 1024) 263168 activation_77[0][0]
__________________________________________________________________________________________________
batch_normalization_84 (BatchNo (None, 14, 14, 1024) 4096 conv2d_84[0][0]
__________________________________________________________________________________________________
add_25 (Add) (None, 14, 14, 1024) 0 batch_normalization_84[0][0]
activation_75[0][0]
__________________________________________________________________________________________________
activation_78 (Activation) (None, 14, 14, 1024) 0 add_25[0][0]
__________________________________________________________________________________________________
conv2d_85 (Conv2D) (None, 14, 14, 256) 262400 activation_78[0][0]
__________________________________________________________________________________________________
batch_normalization_85 (BatchNo (None, 14, 14, 256) 1024 conv2d_85[0][0]
__________________________________________________________________________________________________
activation_79 (Activation) (None, 14, 14, 256) 0 batch_normalization_85[0][0]
__________________________________________________________________________________________________
conv2d_86 (Conv2D) (None, 14, 14, 256) 590080 activation_79[0][0]
__________________________________________________________________________________________________
batch_normalization_86 (BatchNo (None, 14, 14, 256) 1024 conv2d_86[0][0]
__________________________________________________________________________________________________
activation_80 (Activation) (None, 14, 14, 256) 0 batch_normalization_86[0][0]
__________________________________________________________________________________________________
conv2d_87 (Conv2D) (None, 14, 14, 1024) 263168 activation_80[0][0]
__________________________________________________________________________________________________
batch_normalization_87 (BatchNo (None, 14, 14, 1024) 4096 conv2d_87[0][0]
__________________________________________________________________________________________________
add_26 (Add) (None, 14, 14, 1024) 0 batch_normalization_87[0][0]
activation_78[0][0]
__________________________________________________________________________________________________
activation_81 (Activation) (None, 14, 14, 1024) 0 add_26[0][0]
__________________________________________________________________________________________________
conv2d_88 (Conv2D) (None, 14, 14, 256) 262400 activation_81[0][0]
__________________________________________________________________________________________________
batch_normalization_88 (BatchNo (None, 14, 14, 256) 1024 conv2d_88[0][0]
__________________________________________________________________________________________________
activation_82 (Activation) (None, 14, 14, 256) 0 batch_normalization_88[0][0]
__________________________________________________________________________________________________
conv2d_89 (Conv2D) (None, 14, 14, 256) 590080 activation_82[0][0]
__________________________________________________________________________________________________
batch_normalization_89 (BatchNo (None, 14, 14, 256) 1024 conv2d_89[0][0]
__________________________________________________________________________________________________
activation_83 (Activation) (None, 14, 14, 256) 0 batch_normalization_89[0][0]
__________________________________________________________________________________________________
conv2d_90 (Conv2D) (None, 14, 14, 1024) 263168 activation_83[0][0]
__________________________________________________________________________________________________
batch_normalization_90 (BatchNo (None, 14, 14, 1024) 4096 conv2d_90[0][0]
__________________________________________________________________________________________________
add_27 (Add) (None, 14, 14, 1024) 0 batch_normalization_90[0][0]
activation_81[0][0]
__________________________________________________________________________________________________
activation_84 (Activation) (None, 14, 14, 1024) 0 add_27[0][0]
__________________________________________________________________________________________________
conv2d_91 (Conv2D) (None, 14, 14, 256) 262400 activation_84[0][0]
__________________________________________________________________________________________________
batch_normalization_91 (BatchNo (None, 14, 14, 256) 1024 conv2d_91[0][0]
__________________________________________________________________________________________________
activation_85 (Activation) (None, 14, 14, 256) 0 batch_normalization_91[0][0]
__________________________________________________________________________________________________
conv2d_92 (Conv2D) (None, 14, 14, 256) 590080 activation_85[0][0]
__________________________________________________________________________________________________
batch_normalization_92 (BatchNo (None, 14, 14, 256) 1024 conv2d_92[0][0]
__________________________________________________________________________________________________
activation_86 (Activation) (None, 14, 14, 256) 0 batch_normalization_92[0][0]
__________________________________________________________________________________________________
conv2d_93 (Conv2D) (None, 14, 14, 1024) 263168 activation_86[0][0]
__________________________________________________________________________________________________
batch_normalization_93 (BatchNo (None, 14, 14, 1024) 4096 conv2d_93[0][0]
__________________________________________________________________________________________________
add_28 (Add) (None, 14, 14, 1024) 0 batch_normalization_93[0][0]
activation_84[0][0]
__________________________________________________________________________________________________
activation_87 (Activation) (None, 14, 14, 1024) 0 add_28[0][0]
__________________________________________________________________________________________________
conv2d_94 (Conv2D) (None, 14, 14, 256) 262400 activation_87[0][0]
__________________________________________________________________________________________________
batch_normalization_94 (BatchNo (None, 14, 14, 256) 1024 conv2d_94[0][0]
__________________________________________________________________________________________________
activation_88 (Activation) (None, 14, 14, 256) 0 batch_normalization_94[0][0]
__________________________________________________________________________________________________
conv2d_95 (Conv2D) (None, 14, 14, 256) 590080 activation_88[0][0]
__________________________________________________________________________________________________
batch_normalization_95 (BatchNo (None, 14, 14, 256) 1024 conv2d_95[0][0]
__________________________________________________________________________________________________
activation_89 (Activation) (None, 14, 14, 256) 0 batch_normalization_95[0][0]
__________________________________________________________________________________________________
conv2d_96 (Conv2D) (None, 14, 14, 1024) 263168 activation_89[0][0]
__________________________________________________________________________________________________
batch_normalization_96 (BatchNo (None, 14, 14, 1024) 4096 conv2d_96[0][0]
__________________________________________________________________________________________________
add_29 (Add) (None, 14, 14, 1024) 0 batch_normalization_96[0][0]
activation_87[0][0]
__________________________________________________________________________________________________
activation_90 (Activation) (None, 14, 14, 1024) 0 add_29[0][0]
__________________________________________________________________________________________________
conv2d_98 (Conv2D) (None, 7, 7, 512) 524800 activation_90[0][0]
__________________________________________________________________________________________________
batch_normalization_98 (BatchNo (None, 7, 7, 512) 2048 conv2d_98[0][0]
__________________________________________________________________________________________________
activation_91 (Activation) (None, 7, 7, 512) 0 batch_normalization_98[0][0]
__________________________________________________________________________________________________
conv2d_99 (Conv2D) (None, 7, 7, 512) 2359808 activation_91[0][0]
__________________________________________________________________________________________________
batch_normalization_99 (BatchNo (None, 7, 7, 512) 2048 conv2d_99[0][0]
__________________________________________________________________________________________________
activation_92 (Activation) (None, 7, 7, 512) 0 batch_normalization_99[0][0]
__________________________________________________________________________________________________
conv2d_100 (Conv2D) (None, 7, 7, 2048) 1050624 activation_92[0][0]
__________________________________________________________________________________________________
conv2d_97 (Conv2D) (None, 7, 7, 2048) 2099200 activation_90[0][0]
__________________________________________________________________________________________________
batch_normalization_100 (BatchN (None, 7, 7, 2048) 8192 conv2d_100[0][0]
__________________________________________________________________________________________________
batch_normalization_97 (BatchNo (None, 7, 7, 2048) 8192 conv2d_97[0][0]
__________________________________________________________________________________________________
add_30 (Add) (None, 7, 7, 2048) 0 batch_normalization_100[0][0]
batch_normalization_97[0][0]
__________________________________________________________________________________________________
activation_93 (Activation) (None, 7, 7, 2048) 0 add_30[0][0]
__________________________________________________________________________________________________
conv2d_101 (Conv2D) (None, 7, 7, 512) 1049088 activation_93[0][0]
__________________________________________________________________________________________________
batch_normalization_101 (BatchN (None, 7, 7, 512) 2048 conv2d_101[0][0]
__________________________________________________________________________________________________
activation_94 (Activation) (None, 7, 7, 512) 0 batch_normalization_101[0][0]
__________________________________________________________________________________________________
conv2d_102 (Conv2D) (None, 7, 7, 512) 2359808 activation_94[0][0]
__________________________________________________________________________________________________
batch_normalization_102 (BatchN (None, 7, 7, 512) 2048 conv2d_102[0][0]
__________________________________________________________________________________________________
activation_95 (Activation) (None, 7, 7, 512) 0 batch_normalization_102[0][0]
__________________________________________________________________________________________________
conv2d_103 (Conv2D) (None, 7, 7, 2048) 1050624 activation_95[0][0]
__________________________________________________________________________________________________
batch_normalization_103 (BatchN (None, 7, 7, 2048) 8192 conv2d_103[0][0]
__________________________________________________________________________________________________
add_31 (Add) (None, 7, 7, 2048) 0 batch_normalization_103[0][0]
activation_93[0][0]
__________________________________________________________________________________________________
activation_96 (Activation) (None, 7, 7, 2048) 0 add_31[0][0]
__________________________________________________________________________________________________
conv2d_104 (Conv2D) (None, 7, 7, 512) 1049088 activation_96[0][0]
__________________________________________________________________________________________________
batch_normalization_104 (BatchN (None, 7, 7, 512) 2048 conv2d_104[0][0]
__________________________________________________________________________________________________
activation_97 (Activation) (None, 7, 7, 512) 0 batch_normalization_104[0][0]
__________________________________________________________________________________________________
conv2d_105 (Conv2D) (None, 7, 7, 512) 2359808 activation_97[0][0]
__________________________________________________________________________________________________
batch_normalization_105 (BatchN (None, 7, 7, 512) 2048 conv2d_105[0][0]
__________________________________________________________________________________________________
activation_98 (Activation) (None, 7, 7, 512) 0 batch_normalization_105[0][0]
__________________________________________________________________________________________________
conv2d_106 (Conv2D) (None, 7, 7, 2048) 1050624 activation_98[0][0]
__________________________________________________________________________________________________
batch_normalization_106 (BatchN (None, 7, 7, 2048) 8192 conv2d_106[0][0]
__________________________________________________________________________________________________
add_32 (Add) (None, 7, 7, 2048) 0 batch_normalization_106[0][0]
activation_96[0][0]
__________________________________________________________________________________________________
activation_99 (Activation) (None, 7, 7, 2048) 0 add_32[0][0]
__________________________________________________________________________________________________
global_avg_pooling (GlobalAvera (None, 2048) 0 activation_99[0][0]
__________________________________________________________________________________________________
dense_2 (Dense) (None, 3) 6147 global_avg_pooling[0][0]
__________________________________________________________________________________________________
activation_100 (Activation) (None, 3) 0 dense_2[0][0]
==================================================================================================
Total params: 23,593,859
Trainable params: 23,540,739
Non-trainable params: 53,120
__________________________________________________________________________________________________
Using Enhanced Data Generation
Found 300 images belonging to 3 classes.
Found 120 images belonging to 3 classes.
JSON Mapping for the model classes saved to F:\nn\OlafenwaMoses\customModelTraining\idenprof\json\model_class.json
Number of experiments (Epochs) : 10
Epoch 1/10
8/9 [=========================>....] - ETA: 52s - loss: 1.9142 - acc: 0.3464
Epoch 00001: saving model to F:\nn\OlafenwaMoses\customModelTraining\idenprof\models\model_ex-001_acc-0.354167.h5
9/9 [==============================] - 533s 59s/step - loss: 1.8041 - acc: 0.3671 - val_loss: 9.9489 - val_acc: 0.3542
Epoch 2/10
8/9 [=========================>....] - ETA: 56s - loss: 1.2488 - acc: 0.5117
Epoch 00002: saving model to F:\nn\OlafenwaMoses\customModelTraining\idenprof\models\model_ex-002_acc-0.322917.h5
9/9 [==============================] - 549s 61s/step - loss: 1.2566 - acc: 0.5019 - val_loss: 1.1253 - val_acc: 0.3229
Epoch 3/10
8/9 [=========================>....] - ETA: 52s - loss: 1.3101 - acc: 0.4570
Epoch 00003: saving model to F:\nn\OlafenwaMoses\customModelTraining\idenprof\models\model_ex-003_acc-0.322917.h5
9/9 [==============================] - 517s 57s/step - loss: 1.3536 - acc: 0.4545 - val_loss: 1.2735 - val_acc: 0.3229
Epoch 4/10
8/9 [=========================>....] - ETA: 56s - loss: 1.1787 - acc: 0.4531
Epoch 00004: saving model to F:\nn\OlafenwaMoses\customModelTraining\idenprof\models\model_ex-004_acc-0.302083.h5
9/9 [==============================] - 550s 61s/step - loss: 1.1517 - acc: 0.4792 - val_loss: 1.4054 - val_acc: 0.3021
Epoch 5/10
8/9 [=========================>....] - ETA: 47s - loss: 1.0439 - acc: 0.5065
Epoch 00005: saving model to F:\nn\OlafenwaMoses\customModelTraining\idenprof\models\model_ex-005_acc-0.302083.h5
9/9 [==============================] - 468s 52s/step - loss: 1.0489 - acc: 0.5072 - val_loss: 1.6589 - val_acc: 0.3021
Epoch 6/10
8/9 [=========================>....] - ETA: 55s - loss: 1.0135 - acc: 0.5586
Epoch 00006: saving model to F:\nn\OlafenwaMoses\customModelTraining\idenprof\models\model_ex-006_acc-0.302083.h5
9/9 [==============================] - 546s 61s/step - loss: 1.0391 - acc: 0.5625 - val_loss: 1.5787 - val_acc: 0.3021
Epoch 7/10
8/9 [=========================>....] - ETA: 50s - loss: 1.1323 - acc: 0.4935
Epoch 00007: saving model to F:\nn\OlafenwaMoses\customModelTraining\idenprof\models\model_ex-007_acc-0.302083.h5
9/9 [==============================] - 498s 55s/step - loss: 1.1025 - acc: 0.4967 - val_loss: 1.5267 - val_acc: 0.3021
Epoch 8/10
8/9 [=========================>....] - ETA: 51s - loss: 1.1020 - acc: 0.5195
Epoch 00008: saving model to F:\nn\OlafenwaMoses\customModelTraining\idenprof\models\model_ex-008_acc-0.302083.h5
9/9 [==============================] - 508s 56s/step - loss: 1.0763 - acc: 0.5315 - val_loss: 1.4837 - val_acc: 0.3021
Epoch 9/10
8/9 [=========================>....] - ETA: 52s - loss: 0.9851 - acc: 0.4648
Epoch 00009: saving model to F:\nn\OlafenwaMoses\customModelTraining\idenprof\models\model_ex-009_acc-0.302083.h5
9/9 [==============================] - 513s 57s/step - loss: 0.9761 - acc: 0.4879 - val_loss: 1.4427 - val_acc: 0.3021
Epoch 10/10
8/9 [=========================>....] - ETA: 51s - loss: 1.0747 - acc: 0.4961
Epoch 00010: saving model to F:\nn\OlafenwaMoses\customModelTraining\idenprof\models\model_ex-010_acc-0.302083.h5
9/9 [==============================] - 507s 56s/step - loss: 1.0479 - acc: 0.5035 - val_loss: 1.4121 - val_acc: 0.3021
The results above are poor because very little data and very few epochs were used; they are shown only for reference.
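After training, the saved .h5 weights under idenprof/models and the model_class.json under idenprof/json can be loaded with ImageAI's CustomImagePrediction class to classify new images. A short sketch (the model file name is taken from the training log above, and "test.jpg" is a placeholder image):

from imageai.Prediction.Custom import CustomImagePrediction

predictor = CustomImagePrediction()
predictor.setModelTypeAsResNet()
# Use one of the .h5 files saved during training plus the generated class mapping
predictor.setModelPath("idenprof/models/model_ex-010_acc-0.302083.h5")
predictor.setJsonPath("idenprof/json/model_class.json")
predictor.loadModel(num_objects=3)  # must match the number of trained classes

# "test.jpg" is a placeholder for the image you want to classify
predictions, probabilities = predictor.predictImage("test.jpg", result_count=3)
for each_prediction, each_probability in zip(predictions, probabilities):
    print(each_prediction, " : ", each_probability)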
IdenProf image dataset and sample code: https://pan.baidu.com/s/1vHXNDaSugbGjMaWWratQqA (extraction code: x8m7)
If you have any questions, feel free to leave a comment to discuss.