To create a derivative container, you'll use a process similar to this:

1. Create the initial Dockerfile and run modification commands.

   To start, you create a Deep Learning Containers container using one of the available image types; the example after this list shows one way to list the images that are available. Then use conda, pip, or Jupyter commands to modify the container image for your needs.

2. Build and push the container image.

   Build the container image, and then push it to a location that is accessible to your Compute Engine service account.
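If you're not sure which image type to start from, one way to browse the published Deep Learning Containers images is to list the public repository with gcloud. This is a minimal sketch; the gcr.io path below is the public mirror of the images, so adjust it if you pull from a different registry:

# List the published Deep Learning Containers images (public gcr.io mirror)
gcloud container images list --repository="gcr.io/deeplearning-platform-release"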
Create the initial Dockerfile and run modification commands
Use the following commands to select a Deep Learning Containers image type and make a small change to the container image. This example starts with a TensorFlow image and updates it to the latest version of TensorFlow. Write the following commands to the Dockerfile:
FROM us-docker.pkg.dev/deeplearning-platform-release/gcr.io/tf-gpu:latest

# Uninstall the container's TensorFlow version and install the latest version
RUN pip install --upgrade pip && \
    pip uninstall -y tensorflow && \
    pip install tensorflow
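Before pushing the image, you can optionally build it locally and confirm that the change took effect. This is a minimal sketch: the tf-custom:test tag is an arbitrary placeholder, and the version check assumes that python is on the container's PATH and that the image doesn't define an entrypoint that intercepts the command.

# Build the derivative image from the Dockerfile in the current directory
docker build . -t tf-custom:test

# Print the installed TensorFlow version as a quick smoke test
docker run --rm tf-custom:test python -c "import tensorflow as tf; print(tf.__version__)"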
Build and push the container image
Use the following commands to build and push the container image to Artifact Registry, where it can be accessed by your Compute Engine service account.
Create and authenticate the repository:

export PROJECT=$(gcloud config list project --format "value(core.project)")
gcloud artifacts repositories create REPOSITORY_NAME \
    --repository-format=docker \
    --location=LOCATION
gcloud auth configure-docker LOCATION-docker.pkg.dev

Replace the following:

- LOCATION: The regional or multi-regional location of the repository, for example us. To view a list of supported locations, run the command gcloud artifacts locations list.
- REPOSITORY_NAME: The name of the repository that you want to create, for example my-tf-repo.

Then, build and push the image:

export IMAGE_NAME="LOCATION-docker.pkg.dev/${PROJECT}/REPOSITORY_NAME/tf-custom:v1"
docker build . -t $IMAGE_NAME
docker push $IMAGE_NAME
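After the push completes, you can optionally confirm that the image is available in the repository. This is a minimal sketch that assumes the same LOCATION, PROJECT, and REPOSITORY_NAME values used above:

# List the images in the repository to confirm that tf-custom:v1 was pushed
gcloud artifacts docker images list LOCATION-docker.pkg.dev/${PROJECT}/REPOSITORY_NAME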