Grounding with Google Search
This page explains how to ground a model's responses using Google Search, which uses publicly available web data.
If you want to connect your model to world knowledge, a wide range of possible topics, or up-to-date information on the internet, use Grounding with Google Search.
To learn more about model grounding in Vertex AI, see the Grounding overview.
Supported models
This section lists the models that support Grounding with Google Search.
- Gemini 2.5 Flash-Lite (Preview)
- Gemini 2.5 Flash with Live API native audio (Preview)
- Gemini 2.0 Flash with Live API (Preview)
- Gemini 2.5 Pro
- Gemini 2.5 Flash
- Gemini 2.0 Flash
Supported languages
For a list of supported languages, see Languages.
Ground your model with Google Search
Use the following instructions to ground a model with publicly available web data.
Considerations
To use Grounding with Google Search, you must enable Google Search Suggestions. For more information, see Use Google Search suggestions.
For ideal results, use a temperature of 1.0. To learn more about setting this configuration, see the Gemini API request body in the model reference. The sketch after the Python example on this page shows one way to set this value.
Grounding with Google Search has a limit of one million queries per day. If you require more queries, contact Google Cloud support for assistance.
Console
To use Grounding with Google Search in Vertex AI Studio, follow these steps:
- In the Google Cloud console, go to the Vertex AI Studio page.
- Click the Freeform tab.
- In the side panel, click the Ground model responses toggle.
- Click Customize and set Google Search as the source.
- Enter your prompt in the text box and click Submit.
Your prompt responses are now grounded with Google Search.
Gen AI SDK for Python
Install
pip install --upgrade google-genai
To learn more, see the SDK reference documentation.
Set environment variables to use the Gen AI SDK with Vertex AI:
# Replace the `GOOGLE_CLOUD_PROJECT` and `GOOGLE_CLOUD_LOCATION` values
# with appropriate values for your project.
export GOOGLE_CLOUD_PROJECT=GOOGLE_CLOUD_PROJECT
export GOOGLE_CLOUD_LOCATION=global
export GOOGLE_GENAI_USE_VERTEXAI=True
from google import genai
from google.genai.types import (
    GenerateContentConfig,
    GoogleSearch,
    HttpOptions,
    Tool,
)

client = genai.Client(http_options=HttpOptions(api_version="v1"))

response = client.models.generate_content(
    model="gemini-2.5-flash",
    contents="When is the next total solar eclipse in the United States?",
    config=GenerateContentConfig(
        tools=[
            # Use Google Search Tool
            Tool(google_search=GoogleSearch())
        ],
    ),
)

print(response.text)
# Example response:
# 'The next total solar eclipse in the United States will occur on ...'
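To follow the temperature recommendation from the Considerations section, you can set temperature on the same GenerateContentConfig that enables the Google Search tool. The following is a minimal sketch based on the example above; the prompt text is only a placeholder, and the client relies on the environment variables set earlier:

from google import genai
from google.genai.types import GenerateContentConfig, GoogleSearch, Tool

client = genai.Client()

response = client.models.generate_content(
    model="gemini-2.5-flash",
    contents="What changed in the latest stable Chrome release?",
    config=GenerateContentConfig(
        # A temperature of 1.0 is recommended when grounding with Google Search.
        temperature=1.0,
        tools=[Tool(google_search=GoogleSearch())],
    ),
)
print(response.text)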
Gen AI SDK for Go
Learn how to install or update the Gen AI SDK for Go.
To learn more, see the SDK reference documentation.
Set environment variables to use the Gen AI SDK with Vertex AI:
# Replace the `GOOGLE_CLOUD_PROJECT` and `GOOGLE_CLOUD_LOCATION` values
# with appropriate values for your project.
export GOOGLE_CLOUD_PROJECT=GOOGLE_CLOUD_PROJECT
export GOOGLE_CLOUD_LOCATION=global
export GOOGLE_GENAI_USE_VERTEXAI=True
import (
	"context"
	"fmt"
	"io"

	genai "google.golang.org/genai"
)

// generateWithGoogleSearch shows how to generate text using Google Search.
func generateWithGoogleSearch(w io.Writer) error {
	ctx := context.Background()

	client, err := genai.NewClient(ctx, &genai.ClientConfig{
		HTTPOptions: genai.HTTPOptions{APIVersion: "v1"},
	})
	if err != nil {
		return fmt.Errorf("failed to create genai client: %w", err)
	}

	modelName := "gemini-2.0-flash-001"
	contents := []*genai.Content{
		{Parts: []*genai.Part{
			{Text: "When is the next total solar eclipse in the United States?"},
		}},
	}
	config := &genai.GenerateContentConfig{
		Tools: []*genai.Tool{
			{GoogleSearch: &genai.GoogleSearch{}},
		},
	}

	resp, err := client.Models.GenerateContent(ctx, modelName, contents, config)
	if err != nil {
		return fmt.Errorf("failed to generate content: %w", err)
	}

	respText, err := resp.Text()
	if err != nil {
		return fmt.Errorf("failed to convert model response to text: %w", err)
	}

	fmt.Fprintln(w, respText)
	// Example response:
	// The next total solar eclipse in the United States will occur on March 30, 2033, but it will only ...

	return nil
}
REST
Before using any of the request data, make the following replacements:
- LOCATION: The region where you want to process the request.
- PROJECT_ID: Your project ID.
- MODEL_ID: The model ID of the multimodal model.
- TEXT: The text instructions to include in the prompt.
HTTP method and URL:
POST https://LOCATION-aiplatform.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/publishers/google/models/MODEL_ID:generateContent
Request JSON body:
{ "contents": [{ "role": "user", "parts": [{ "text": "TEXT" }] }], "tools": [{ "googleSearch": {} }], "model": "projects/PROJECT_ID/locations/LOCATION/publishers/google/models/MODEL_ID"}
To send your request, choose one of these options:
curl (Linux, macOS, or Cloud Shell)
Save the request body in a file named request.json, and execute the following command:
curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://LOCATION-aiplatform.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/publishers/google/models/MODEL_ID:generateContent"
PowerShell (Windows)
Save the request body in a file named request.json, and execute the following command:
$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://LOCATION-aiplatform.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION/publishers/google/models/MODEL_ID:generateContent" | Select-Object -Expand Content
You should receive a JSON response similar to the following:
{ "candidates": [ { "content": { "role": "model", "parts": [ { "text": "The weather in Chicago this weekend, will be partly cloudy. The temperature will be between 49°F (9°C) and 55°F (13°C) on Saturday and between 51°F (11°C) and 56°F (13°C) on Sunday. There is a slight chance of rain on both days.\n" } ] }, "finishReason": "STOP", "groundingMetadata": { "webSearchQueries": [ "weather in Chicago this weekend" ], "searchEntryPoint": { "renderedContent": "..." }, "groundingChunks": [ { "web": { "uri": "https://www.google.com/search?q=weather+in+Chicago,+IL", "title": "Weather information for locality: Chicago, administrative_area: IL", "domain": "google.com" } }, { "web": { "uri": "...", "title": "weatherbug.com", "domain": "weatherbug.com" } } ], "groundingSupports": [ { "segment": { "startIndex": 85, "endIndex": 214, "text": "The temperature will be between 49°F (9°C) and 55°F (13°C) on Saturday and between 51°F (11°C) and 56°F (13°C) on Sunday." }, "groundingChunkIndices": [ 0 ], "confidenceScores": [ 0.8662828 ] }, { "segment": { "startIndex": 215, "endIndex": 261, "text": "There is a slight chance of rain on both days." }, "groundingChunkIndices": [ 1, 0 ], "confidenceScores": [ 0.62836814, 0.6488607 ] } ], "retrievalMetadata": {} } } ], "usageMetadata": { "promptTokenCount": 10, "candidatesTokenCount": 98, "totalTokenCount": 108, "trafficType": "ON_DEMAND", "promptTokensDetails": [ { "modality": "TEXT", "tokenCount": 10 } ], "candidatesTokensDetails": [ { "modality": "TEXT", "tokenCount": 98 } ] }, "modelVersion": "gemini-2.0-flash", "createTime": "2025-05-19T14:42:55.000643Z", "responseId": "b0MraIMFoqnf-Q-D66G4BQ"}
Understand your response
If your model prompt successfully grounds with Google Search from Vertex AI Studio or from the API, then the responses include metadata with source links (web URLs). However, there are several reasons this metadata might not be provided, in which case the prompt response isn't grounded. These reasons include low source relevance or incomplete information within the model's response.
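With the Gen AI SDK for Python, you can check for this metadata on the response object. The following sketch assumes the response value from the earlier Python example; the attribute names mirror the groundingMetadata fields shown in the REST response above:

# Assumes `response` was returned by a generate_content call that used the
# Google Search tool, as in the earlier Python example.
metadata = response.candidates[0].grounding_metadata

if metadata is None or not metadata.grounding_chunks:
    print("The response wasn't grounded; no grounding metadata was returned.")
else:
    print("Web search queries:", metadata.web_search_queries)
    for chunk in metadata.grounding_chunks:
        # Each grounding chunk references a web source that supported the answer.
        print(f"- {chunk.web.title}: {chunk.web.uri}")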
Grounding support
Displaying grounding support is recommended because it helps you validate responses from the publishers and adds avenues for further learning.
Grounding support for responses from Google Search sources should be shown both inline and in aggregate. For example, see the following image as a suggestion on how to do this.
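One way to build the inline portion is to append source markers to each supported text segment by using the grounding_supports and grounding_chunks fields from the example response above. The following Python sketch is only an illustration, not a required presentation; it assumes the response object from the earlier example and ignores byte-offset subtleties for non-ASCII text:

def add_inline_citations(response):
    """Appends markers such as [1] after each grounded segment of the answer."""
    metadata = response.candidates[0].grounding_metadata
    text = response.text
    if metadata is None or not metadata.grounding_supports:
        return text

    # Work backward through the segments so that earlier offsets stay valid
    # as markers are inserted.
    supports = sorted(
        metadata.grounding_supports, key=lambda s: s.segment.end_index, reverse=True
    )
    for support in supports:
        markers = "".join(f"[{i + 1}]" for i in support.grounding_chunk_indices)
        end = support.segment.end_index
        text = text[:end] + markers + text[end:]
    return text

For the aggregate view, you can list the numbered sources after the answer by iterating over grounding_chunks in the same order that the markers reference.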
Use of alternative search engine options
Customer's use of Grounding with Google Search does not prevent Customer from offering alternative search engine options, making alternative search options the default option for Customer Applications, or displaying their own or third-party search suggestions or search results in Customer Applications, provided that any such non-Google Search services or associated results are displayed separately from the Grounded Results and Search Suggestions and can't reasonably be attributed to, or confused with, results provided by Google.
Benefits
When you use Google Search as a tool, you can handle complex prompts and workflows that require planning, reasoning, and thinking:
- You can ground to help ensure responses are based on the latest and most accurate information.
- You can retrieve artifacts from the web to do analysis.
- You can find relevant images, videos, or other media to assist in multimodal reasoning or task generation.
- You can perform coding, technical troubleshooting, and other specialized tasks.
- You can find region-specific information, or assist in translating content accurately.
- You can find relevant websites for browsing.
What's next
- To learn more about grounding, see Grounding overview.
- To learn how to send chat prompt requests, see Multiturn chat.
- To learn about responsible AI best practices and Vertex AI's safety filters,see Safety best practices.