Starting April 29, 2025, Gemini 1.5 Pro and Gemini 1.5 Flash models are not available in projects that have no prior usage of these models, including new projects. For details, see Model versions and lifecycle.
Quickstart: Send text prompts to Gemini using Vertex AI Studio
You can use Vertex AI Studio to design, test, and manage prompts for Google's Gemini large language models (LLMs) and third-party models. Vertex AI Studio supports certain third-party models that are offered on Vertex AI as models as a service (MaaS), such as Anthropic's Claude models and Meta's Llama models. On your first use of a third-party model, Vertex AI prompts you to accept that provider's terms and conditions; you must do this once for each third-party provider before using their models.
In this quickstart, you:
Send prompts to the Gemini API using samples from the generative AI prompt gallery, including the following:
A summarization text prompt
A code generation prompt
View the code used to generate the responses
Before you begin prompting in Vertex AI Studio
This quickstart requires you to complete the following steps to set up a Google Cloud project and enable the Vertex AI API.
Sign in to your Google Cloud account. If you're new to Google Cloud, create an account to evaluate how our products perform in real-world scenarios. New customers also get $300 in free credits to run, test, and deploy workloads.
In the Google Cloud console, on the project selector page, select or create a Google Cloud project. If you don't plan to keep the resources that you create in this procedure, create a new project instead of selecting an existing project. After you finish these steps, you can delete the project, which removes all resources associated with it.
Verify that billing is enabled for your Google Cloud project.
Enable the Vertex AI API.
A prompt is a natural language request submitted to a language model that generates a response. Prompts can contain questions, instructions, contextual information, few-shot examples, and partial input for the model to complete. After the model receives a prompt, depending on the type of model used, it can generate text, embeddings, code, images, videos, music, and more.
The sample prompts in the Vertex AI Studio prompt gallery are designed to demonstrate model capabilities. Each prompt is preconfigured with specific model and parameter values, so you can open a sample prompt and click Submit to generate a response.
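If you want to see what a prompt looks like outside the console, the following is a minimal sketch of sending a text prompt to a Gemini model on Vertex AI with the Google Gen AI SDK for Python, one of the languages available from Get code. The project ID, region, and prompt text are placeholders, not part of any gallery sample.

```python
# Minimal sketch: send a text prompt to Gemini on Vertex AI using the
# Google Gen AI SDK for Python (pip install google-genai).
# Assumes the Vertex AI API is enabled and application-default credentials
# are configured (for example, via `gcloud auth application-default login`).
from google import genai

# Placeholder project ID and region; replace with your own values.
client = genai.Client(vertexai=True, project="your-project-id", location="us-central1")

response = client.models.generate_content(
    model="gemini-2.0-flash-001",
    contents="Explain what a prompt is in two sentences.",
)
print(response.text)
```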
Test the Gemini flash model using a summarization text prompt
Send a summarization text prompt to the Gemini API in Vertex AI. A summarization task extracts the most important information from text. You can provide information in the prompt to help the model create a summary, or ask the model to create a summary on its own.
Go to the Prompt gallery page from the Vertex AI section in the Google Cloud console. Go to prompt gallery
In the Tasks drop-down menu, select Summarize.
Open the Audio summarization card.
This sample prompt includes an audio file and requests a summary of the file contents in a bulleted list.
Notice that in the settings panel, the model's default value is set to Gemini-2.0-flash-001. You can choose a different Gemini model by clicking Switch model.
Click Submit to generate the summary.
The output is displayed in the response.
To view the Vertex AI API code used to generate the transcript summary, click Build with code > Get code.
In the Get code panel, you can choose your preferred language to get the sample code for the prompt, or you can open the Python code in a Colab Enterprise notebook.
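The Python sample that Get code produces follows the Vertex AI Gemini API pattern for multimodal input. As a hedged sketch rather than the gallery's exact code, a comparable audio summarization request with the Google Gen AI SDK looks like the following; the Cloud Storage URI and project ID are placeholders.

```python
# Sketch of an audio summarization request using the Google Gen AI SDK for
# Python. The bucket path and project ID are placeholders, not the gallery's
# actual sample file.
from google import genai
from google.genai import types

client = genai.Client(vertexai=True, project="your-project-id", location="us-central1")

audio_part = types.Part.from_uri(
    file_uri="gs://your-bucket/meeting-recording.mp3",  # placeholder audio file
    mime_type="audio/mpeg",
)

response = client.models.generate_content(
    model="gemini-2.0-flash-001",
    contents=[audio_part, "Summarize the contents of this audio file as a bulleted list."],
)
print(response.text)
```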
Test the Gemini flash model using a code generation prompt
Send a code generation prompt to the Gemini API in Vertex AI. A code generation task generates code from a natural language description.
Go to the Prompt gallery page from the Vertex AI section in the Google Cloud console. Go to prompt gallery
In the Tasks drop-down menu, select Code.
Open the Generate code from comments card.
This sample prompt includes a system instruction that tells the model how to respond and some incomplete Java methods.
Notice that in the settings panel, the model's default value is set to Gemini-2.0-flash-001. You can choose a different Gemini model by clicking Switch model.
To complete each method by generating code in the areas marked <WRITE CODE HERE>, click Submit.
The output is displayed in the response.
To view the Vertex AI API code used to generate the code completions, click Build with code > Get code.
In the Get code panel, you can choose your preferred language to get the sample code for the prompt, or you can open the Python code in a Colab Enterprise notebook.
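The code generation sample follows the same request pattern, with the gallery's system instruction passed in the request configuration. The following is a hedged sketch using the Google Gen AI SDK for Python; the system instruction and the Java method are simplified stand-ins for the gallery prompt, and the project ID is a placeholder.

```python
# Sketch of a code generation request with a system instruction, using the
# Google Gen AI SDK for Python. The system instruction and Java snippet are
# simplified stand-ins for the gallery sample; the project ID is a placeholder.
from google import genai
from google.genai import types

client = genai.Client(vertexai=True, project="your-project-id", location="us-central1")

java_snippet = """
// Returns the sum of all elements in the array.
public static int sum(int[] values) {
    <WRITE CODE HERE>
}
"""

response = client.models.generate_content(
    model="gemini-2.0-flash-001",
    contents=java_snippet,
    config=types.GenerateContentConfig(
        system_instruction=(
            "You are a senior Java developer. Complete each method by replacing "
            "<WRITE CODE HERE> with working code, and return only the completed code."
        ),
        temperature=0.2,
    ),
)
print(response.text)
```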
[[["Easy to understand","easyToUnderstand","thumb-up"],["Solved my problem","solvedMyProblem","thumb-up"],["Other","otherUp","thumb-up"]],[["Hard to understand","hardToUnderstand","thumb-down"],["Incorrect information or sample code","incorrectInformationOrSampleCode","thumb-down"],["Missing the information/samples I need","missingTheInformationSamplesINeed","thumb-down"],["Other","otherDown","thumb-down"]],["Last updated 2025-08-27 UTC."],[],[],null,["# Quickstart: Send text prompts to Gemini using Vertex AI Studio\n\nYou can use Vertex AI Studio to design, test, and manage prompts for\nGoogle's [Gemini](/vertex-ai/generative-ai/docs/overview) large language models\n(LLMs) and third-party models. Vertex AI Studio supports certain\nthird-party models that are offered on Vertex AI as [models as a\nservice (MaaS)](/vertex-ai/generative-ai/docs/partner-models/use-partner-models), such as\nAnthropic's Claude models and Meta's Llama models.\n| **Note:** On your initial use for third-party models, Vertex AI prompts you to accept the third-party's terms and conditions. You must do this once for each third-party provider to start using their models.\n\nIn this quickstart, you:\n\n- Send these prompts to the Gemini API using samples from the generative AI prompt gallery, including the following:\n - A summarization text prompt\n - A code generation prompt\n- View the code used to generate the responses\n\nBefore you begin prompting in Vertex AI Studio\n----------------------------------------------\n\nThis quickstart requires you to complete the following steps to set up a\nGoogle Cloud project and enable the Vertex AI API.\n\n- Sign in to your Google Cloud account. If you're new to Google Cloud, [create an account](https://console.cloud.google.com/freetrial) to evaluate how our products perform in real-world scenarios. New customers also get $300 in free credits to run, test, and deploy workloads.\n- In the Google Cloud console, on the project selector page,\n select or create a Google Cloud project.\n\n | **Note**: If you don't plan to keep the resources that you create in this procedure, create a project instead of selecting an existing project. After you finish these steps, you can delete the project, removing all resources associated with the project.\n\n [Go to project selector](https://console.cloud.google.com/projectselector2/home/dashboard)\n-\n [Verify that billing is enabled for your Google Cloud project](/billing/docs/how-to/verify-billing-enabled#confirm_billing_is_enabled_on_a_project).\n\n-\n\n\n Enable the Vertex AI API.\n\n\n [Enable the API](https://console.cloud.google.com/flows/enableapi?apiid=aiplatform.googleapis.com)\n\n- In the Google Cloud console, on the project selector page,\n select or create a Google Cloud project.\n\n | **Note**: If you don't plan to keep the resources that you create in this procedure, create a project instead of selecting an existing project. 
After you finish these steps, you can delete the project, removing all resources associated with the project.\n\n [Go to project selector](https://console.cloud.google.com/projectselector2/home/dashboard)\n-\n [Verify that billing is enabled for your Google Cloud project](/billing/docs/how-to/verify-billing-enabled#confirm_billing_is_enabled_on_a_project).\n\n-\n\n\n Enable the Vertex AI API.\n\n\n [Enable the API](https://console.cloud.google.com/flows/enableapi?apiid=aiplatform.googleapis.com)\n\n\u003cbr /\u003e\n\nSample prompts in Vertex AI Studio\n----------------------------------\n\nA prompt is a natural language request submitted to a language model that\ngenerates a response. Prompts can contain questions, instructions, contextual\ninformation, [few-shot examples](/vertex-ai/generative-ai/docs/learn/prompts/introduction-prompt-design#few-shot-examples),\nand partial input for the model to complete. After the model receives a prompt,\ndepending on the type of model used, it can generate text, embeddings, code,\nimages, videos, music, and more.\n\nThe sample prompts in Vertex AI Studio [prompt gallery](/vertex-ai/generative-ai/docs/prompt-gallery)\nare predesigned to help demonstrate model capabilities. Each prompt is\npreconfigured with specified model and parameter values so you can open the\nsample prompt and click **Submit** to generate a response.\n\nTest the Gemini flash model using a summarization text prompt\n-------------------------------------------------------------\n\nSend a summarization text prompt to the Gemini API in Vertex AI. A summarization\ntask extracts the most important information from text. You can provide\ninformation in the prompt to help the model create a summary, or ask the model\nto create a summary on its own.\n\n1. Go to the **Prompt gallery** page from the Vertex AI\n section in the Google Cloud console. \n\n [Go to prompt gallery](https://console.cloud.google.com/vertex-ai/studio/prompt-gallery)\n\n2. In the **Tasks** drop-down menu, select **Summarize**.\n\n3. Open the **Audio summarization** card.\n\n This sample prompt includes an audio file and requests a summary of the file\n contents in a bulleted list.\n\n4. Notice that in the settings panel, the model's default value is set to\n **Gemini-2.0-flash-001** . You can choose a different Gemini model\n by clicking **Switch model**.\n\n5. Click **Submit** to generate the summary.\n\n The output is displayed in the response.\n6. To view the Vertex AI API code used to generate the transcript\n summary, click **Build with code** \\\u003e **Get code**.\n\n In the **Get code** panel, you can choose your preferred language to get the\n sample code for the prompt, or you can open the Python code in a\n Colab Enterprise notebook.\n\nTest the Gemini flash model using a code generation prompt\n----------------------------------------------------------\n\nSend a code generation prompt to the Gemini API in Vertex AI. A code generation task generates code\nusing a natural language description.\n\n1. Go to the **Prompt gallery** page from the Vertex AI\n section in the Google Cloud console. \n\n [Go to prompt gallery](https://console.cloud.google.com/vertex-ai/studio/prompt-gallery)\n\n2. In the **Tasks** drop-down menu, select **Code**.\n\n3. Open the **Generate code from comments** card.\n\n This sample prompt includes a [system instruction](/vertex-ai/generative-ai/docs/learn/prompts/system-instruction-introduction)\n that tells the model how to respond and some incomplete Java methods.\n\n4. 
Notice that in the settings panel, the model's default value is set to\n **Gemini-2.0-flash-001** . You can choose a different Gemini model\n by clicking **Switch model**.\n\n5. To complete each method by generating code in the areas marked\n `\u003cWRITE CODE HERE\u003e`, click **Submit** .\n\n The output is displayed in the response.\n6. To view the Vertex AI API code used to generate the transcript\n summary, click **Build with code** \\\u003e **Get code**.\n\n In the **Get code** panel, you can choose your preferred language to get the\n sample code for the prompt, or you can open the Python code in a\n Colab Enterprise notebook.\n\nDiscover what's next with prompts\n---------------------------------\n\n- See an [introduction to prompt design](/vertex-ai/generative-ai/docs/learn/prompts/introduction-prompt-design).\n- Learn about [designing multimodal prompts](/vertex-ai/generative-ai/docs/multimodal/design-multimodal-prompts) and [chat prompts](/vertex-ai/generative-ai/docs/chat/chat-prompts)."]]