
Hosting Deep Learning apps on AWS (Stage 5): Deep Learning (CNN) Training with TensorFlow & Keras

Updated: May 26, 2023


We have a model-ready app up and running on AWS, but no model.


This stage describes how to train an image classification model for predicting German traffic signs using Convolutional Neural Networks (CNNs) with TensorFlow/Keras. The dataset can be swapped out for any suitably catalogued image dataset, and the model training process can be completed first or as Stage 5, as described here.





i. Import the training notebook to Google Colab


Copy the notebook below directly into Colab:



Colab has built-in support for TensorFlow (no installation needed!), so it's the easiest place to run Python deep learning training notebooks.


ii. Run through the modelling process


Make use of the hyperlinked table of contents in the right-hand panel of the Colab UI. The notebook walks through importing the data from Kaggle, performing EDA, model prep and image wrangling, building and running the CNN using transfer learning, and finally performing inference.
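The notebook's exact architecture may differ, but as a sketch of the transfer-learning step: a pretrained, frozen convolutional base with a small trainable classification head on top. Here the base is assumed to be MobileNetV2 and the head sizes are illustrative; the 43 output classes correspond to the GTSRB traffic-sign categories.

```python
import tensorflow as tf

def build_model(num_classes: int = 43, weights: str = "imagenet") -> tf.keras.Model:
    """Frozen ImageNet base + small classification head (transfer learning)."""
    base = tf.keras.applications.MobileNetV2(
        input_shape=(224, 224, 3), include_top=False, weights=weights)
    base.trainable = False  # freeze the pretrained convolutional layers

    model = tf.keras.Sequential([
        base,
        tf.keras.layers.GlobalAveragePooling2D(),  # collapse feature maps
        tf.keras.layers.Dropout(0.2),              # light regularisation
        tf.keras.layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```

Freezing the base means only the small head is trained, which is what makes transfer learning fast and practical on a modest dataset.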


iii. Run Early Stopping Criteria to avoid overfitting


The second run, with early stopping criteria applied, produces a more generalisable model, so it is advised to train the model on that basis rather than relying on the first (overfit) run.
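In Keras, early stopping is applied via a callback passed to fit(). A minimal sketch (the patience value and monitored metric are assumptions; the notebook's settings may differ):

```python
import tensorflow as tf

# Stop training once validation loss stops improving for a few epochs,
# and roll the model back to the best epoch's weights
early_stop = tf.keras.callbacks.EarlyStopping(
    monitor="val_loss",
    patience=3,
    restore_best_weights=True,
)

# Passed to fit() alongside the usual arguments, e.g.:
# model.fit(train_images, train_labels, validation_split=0.2,
#           epochs=50, callbacks=[early_stop])
```

With restore_best_weights=True, the exported model reflects the best validation epoch rather than the final (potentially overfit) one.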


iv. Export the model and convert to .tflite format


We need the trained model exported from this notebook, but at ~170 MB it is too big for most purposes, including the Docker container image we push from our EC2 Linux instance to ECR. We can compress it to under 60 MB using the same Python code we used when deploying an app to Heroku, i.e.


import tensorflow as tf
from pathlib import Path
from tensorflow.keras.models import load_model

# Load the trained Keras model
myModel = load_model('myModel.h5')

# Create a TFLiteConverter object from the Keras model
converter = tf.lite.TFLiteConverter.from_keras_model(myModel)

# Convert the model to TensorFlow Lite's compact flatbuffer format
myModel_tflite = converter.convert()

# Save the converted model to disk
tflite_model_file = Path('tfliteConv-model.tflite')
tflite_model_file.write_bytes(myModel_tflite)

Run the above code in Colab after exporting the large model (renamed as myModel.h5), then download the resulting tfliteConv-model.tflite to your local drive.


We then need to upload this to S3 (acting here as a staging layer for our AWS-hosted model) before pulling it into our prepared code base on EC2 (in our project > python folder).


Upload the .tflite file to S3 via the AWS Console, then click on the uploaded object and copy its S3 URI.


Finally, in your EC2 Linux instance run the command:


aws s3 cp [S3 URI to .tflite model] python
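Once the model has been copied into the python folder, the app can load it with TensorFlow Lite's interpreter. A minimal sketch (the function name and path are illustrative; the image must already be resized and normalised to the model's expected input shape):

```python
import numpy as np
import tensorflow as tf

def classify(model_path: str, image: np.ndarray) -> int:
    """Run one preprocessed image through a .tflite model and
    return the index of the highest-probability class."""
    interpreter = tf.lite.Interpreter(model_path=model_path)
    interpreter.allocate_tensors()
    inp = interpreter.get_input_details()[0]
    out = interpreter.get_output_details()[0]
    interpreter.set_tensor(inp["index"], image.astype(np.float32))
    interpreter.invoke()
    probs = interpreter.get_tensor(out["index"])
    return int(np.argmax(probs))

# e.g. classify('python/tfliteConv-model.tflite', batch_of_one_image)
```

The interpreter is much lighter than the full Keras runtime, which is what makes the compressed .tflite model practical inside the Docker container.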









