Upload objects from memory

This page shows you how to use client libraries to upload objects to your Cloud Storage bucket from memory. Uploading from memory is useful when you want to avoid unnecessary writes from memory to your local file system.

An uploaded object consists of the data you want to store along with any associated metadata. For a conceptual overview, including how to choose the optimal upload method based on file size, see Uploads and downloads.

Required roles

To get the permissions that you need to upload objects from memory to a bucket, ask your administrator to grant you the Storage Object User (roles/storage.objectUser) IAM role on the bucket. This predefined role contains the permissions required to upload an object to a bucket. To see the exact permissions that are required, expand the Required permissions section:

Required permissions

  • storage.objects.create
  • storage.objects.delete
    • This permission is only required for uploads that overwrite an existing object.

You might also be able to get these permissions with custom roles.

For information about granting roles on buckets, see Use IAM with buckets.
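
If you manage bucket IAM programmatically rather than through the console, the following is a minimal sketch of granting this role with the Python client library. It assumes the bucket already exists and that the principal string (the hypothetical user:jane@example.com below) identifies whoever needs to upload; adjust both for your environment.

from google.cloud import storage


def grant_object_user_role(bucket_name, member):
    """Sketch: grant roles/storage.objectUser on a bucket to one principal."""
    client = storage.Client()
    bucket = client.bucket(bucket_name)

    # Read the current IAM policy; version 3 is required to handle policies
    # that may contain conditional role bindings.
    policy = bucket.get_iam_policy(requested_policy_version=3)

    # Add a binding for the predefined Storage Object User role.
    policy.bindings.append(
        {"role": "roles/storage.objectUser", "members": {member}}
    )

    bucket.set_iam_policy(policy)
    print(f"Granted roles/storage.objectUser on {bucket_name} to {member}")


# Example call with a hypothetical principal:
# grant_object_user_role("my-bucket", "user:jane@example.com")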

Upload an object from memory

Client libraries

C++

For more information, see the Cloud Storage C++ API reference documentation.

To authenticate to Cloud Storage, set up Application Default Credentials. For more information, see Set up authentication for client libraries.

namespace gcs = ::google::cloud::storage;
using ::google::cloud::StatusOr;
[](gcs::Client client, std::string const& bucket_name,
   std::string const& object_name) {
  std::string const text = "Lorem ipsum dolor sit amet";
  // For small uploads where the data is contiguous in memory use
  // `InsertObject()`. For more specific size recommendations see
  //     https://cloud.google.com/storage/docs/uploads-downloads#size
  auto metadata = client.InsertObject(bucket_name, object_name, text);
  if (!metadata) throw std::move(metadata).status();
  std::cout << "Successfully wrote to object " << metadata->name()
            << " its size is: " << metadata->size() << "\n";

  // For larger uploads, or uploads where the data is not contiguous in
  // memory, use `WriteObject()`. Consider using `std::ostream::write()` for
  // best performance.
  std::vector<std::string> v(100, text);
  gcs::ObjectWriteStream stream =
      client.WriteObject(bucket_name, object_name);
  std::copy(v.begin(), v.end(), std::ostream_iterator<std::string>(stream));
  stream.Close();

  metadata = std::move(stream).metadata();
  if (!metadata) throw std::move(metadata).status();
  std::cout << "Successfully wrote to object " << metadata->name()
            << " its size is: " << metadata->size()
            << "\nFull metadata: " << *metadata << "\n";
}

C#

For more information, see the Cloud Storage C# API reference documentation.

To authenticate to Cloud Storage, set up Application Default Credentials. For more information, see Set up authentication for client libraries.

using Google.Cloud.Storage.V1;
using System;
using System.IO;
using System.Text;

public class UploadObjectFromMemorySample
{
    public void UploadObjectFromMemory(
        string bucketName = "unique-bucket-name",
        string objectName = "file-name",
        string contents = "Hello world!")
    {
        var storage = StorageClient.Create();
        byte[] byteArray = Encoding.UTF8.GetBytes(contents);
        MemoryStream stream = new MemoryStream(byteArray);
        storage.UploadObject(bucketName, objectName, "application/octet-stream", stream);

        Console.WriteLine($"{objectName} uploaded to bucket {bucketName} with contents: {contents}");
    }
}

Go

For more information, see the Cloud Storage Go API reference documentation.

To authenticate to Cloud Storage, set up Application Default Credentials. For more information, see Set up authentication for client libraries.

import (
	"bytes"
	"context"
	"fmt"
	"io"
	"time"

	"cloud.google.com/go/storage"
)

// streamFileUpload uploads an object via a stream.
func streamFileUpload(w io.Writer, bucket, object string) error {
	// bucket := "bucket-name"
	// object := "object-name"
	ctx := context.Background()
	client, err := storage.NewClient(ctx)
	if err != nil {
		return fmt.Errorf("storage.NewClient: %w", err)
	}
	defer client.Close()

	b := []byte("Hello world.")
	buf := bytes.NewBuffer(b)

	ctx, cancel := context.WithTimeout(ctx, time.Second*50)
	defer cancel()

	// Upload an object with storage.Writer.
	wc := client.Bucket(bucket).Object(object).NewWriter(ctx)
	wc.ChunkSize = 0 // note retries are not supported for chunk size 0.

	if _, err = io.Copy(wc, buf); err != nil {
		return fmt.Errorf("io.Copy: %w", err)
	}
	// Data can continue to be added to the file until the writer is closed.
	if err := wc.Close(); err != nil {
		return fmt.Errorf("Writer.Close: %w", err)
	}
	fmt.Fprintf(w, "%v uploaded to %v.\n", object, bucket)

	return nil
}

Java

For more information, see the Cloud Storage Java API reference documentation.

To authenticate to Cloud Storage, set up Application Default Credentials. For more information, see Set up authentication for client libraries.

import com.google.cloud.storage.BlobId;
import com.google.cloud.storage.BlobInfo;
import com.google.cloud.storage.Storage;
import com.google.cloud.storage.StorageOptions;
import java.io.IOException;
import java.nio.charset.StandardCharsets;

public class UploadObjectFromMemory {
  public static void uploadObjectFromMemory(
      String projectId, String bucketName, String objectName, String contents) throws IOException {
    // The ID of your GCP project
    // String projectId = "your-project-id";

    // The ID of your GCS bucket
    // String bucketName = "your-unique-bucket-name";

    // The ID of your GCS object
    // String objectName = "your-object-name";

    // The string of contents you wish to upload
    // String contents = "Hello world!";

    Storage storage = StorageOptions.newBuilder().setProjectId(projectId).build().getService();
    BlobId blobId = BlobId.of(bucketName, objectName);
    BlobInfo blobInfo = BlobInfo.newBuilder(blobId).build();
    byte[] content = contents.getBytes(StandardCharsets.UTF_8);

    // Optional: set a generation-match precondition to enable automatic retries and avoid
    // potential race conditions and data corruption. The request returns a 412 error if the
    // preconditions are not met.
    Storage.BlobTargetOption precondition;
    if (storage.get(bucketName, objectName) == null) {
      // For a target object that does not yet exist, set the DoesNotExist precondition.
      // This will cause the request to fail if the object is created before the request runs.
      precondition = Storage.BlobTargetOption.doesNotExist();
    } else {
      // If the destination already exists in your bucket, instead set a generation-match
      // precondition. This will cause the request to fail if the existing object's generation
      // changes before the request runs.
      precondition =
          Storage.BlobTargetOption.generationMatch(
              storage.get(bucketName, objectName).getGeneration());
    }
    storage.create(blobInfo, content, precondition);

    System.out.println(
        "Object "
            + objectName
            + " uploaded to bucket "
            + bucketName
            + " with contents "
            + contents);
  }
}

Node.js

For more information, see the Cloud Storage Node.js API reference documentation.

To authenticate to Cloud Storage, set up Application Default Credentials. For more information, see Set up authentication for client libraries.

/**
 * TODO(developer): Uncomment the following lines before running the sample.
 */
// The ID of your GCS bucket
// const bucketName = 'your-unique-bucket-name';

// The contents that you want to upload
// const contents = 'these are my contents';

// The new ID for your GCS file
// const destFileName = 'your-new-file-name';

// Imports the Google Cloud Node.js client library
const {Storage} = require('@google-cloud/storage');

// Creates a client
const storage = new Storage();

async function uploadFromMemory() {
  await storage.bucket(bucketName).file(destFileName).save(contents);

  console.log(
    `${destFileName} with contents ${contents} uploaded to ${bucketName}.`
  );
}

uploadFromMemory().catch(console.error);

PHP

For more information, see the Cloud Storage PHP API reference documentation.

To authenticate to Cloud Storage, set up Application Default Credentials. For more information, see Set up authentication for client libraries.

use Google\Cloud\Storage\StorageClient;

/**
 * Upload an object from memory buffer.
 *
 * @param string $bucketName The name of your Cloud Storage bucket.
 *        (e.g. 'my-bucket')
 * @param string $objectName The name of your Cloud Storage object.
 *        (e.g. 'my-object')
 * @param string $contents The contents to upload to the file.
 *        (e.g. 'these are my contents')
 */
function upload_object_from_memory(
    string $bucketName,
    string $objectName,
    string $contents
): void {
    $storage = new StorageClient();
    if (!$stream = fopen('data://text/plain,' . $contents, 'r')) {
        throw new \InvalidArgumentException('Unable to open file for reading');
    }
    $bucket = $storage->bucket($bucketName);
    $bucket->upload($stream, [
        'name' => $objectName,
    ]);
    printf('Uploaded %s to gs://%s/%s' . PHP_EOL, $contents, $bucketName, $objectName);
}

Python

For more information, see the Cloud Storage Python API reference documentation.

To authenticate to Cloud Storage, set up Application Default Credentials. For more information, see Set up authentication for client libraries.

from google.cloud import storage


def upload_blob_from_memory(bucket_name, contents, destination_blob_name):
    """Uploads in-memory contents to the bucket."""

    # The ID of your GCS bucket
    # bucket_name = "your-bucket-name"

    # The contents to upload to the file
    # contents = "these are my contents"

    # The ID of your GCS object
    # destination_blob_name = "storage-object-name"

    storage_client = storage.Client()
    bucket = storage_client.bucket(bucket_name)
    blob = bucket.blob(destination_blob_name)

    blob.upload_from_string(contents)

    print(
        f"{destination_blob_name} with contents {contents} uploaded to {bucket_name}."
    )
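
The Java sample above guards the upload with a generation-match precondition. If you want similar protection in Python, the sketch below passes the if_generation_match keyword to upload_from_string; it is a hedged variant of the sample above, assuming you prefer a 412 Precondition Failed error over silently overwriting an object that changed concurrently.

from google.cloud import storage


def upload_from_memory_with_precondition(bucket_name, contents, destination_blob_name):
    """Sketch: upload from memory, failing with a 412 if the object changed."""
    storage_client = storage.Client()
    bucket = storage_client.bucket(bucket_name)

    existing = bucket.get_blob(destination_blob_name)
    blob = bucket.blob(destination_blob_name)

    if existing is None:
        # if_generation_match=0 means "only create; fail if the object already exists".
        blob.upload_from_string(contents, if_generation_match=0)
    else:
        # Only overwrite if the object's generation is unchanged since we read it.
        blob.upload_from_string(contents, if_generation_match=existing.generation)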

Ruby

For more information, see the Cloud Storage Ruby API reference documentation.

To authenticate to Cloud Storage, set up Application Default Credentials. For more information, see Set up authentication for client libraries.

# The ID of your GCS bucket
# bucket_name = "your-unique-bucket-name"

# The ID of your GCS object
# file_name = "your-file-name"

# The contents to upload to your file
# file_content = "Hello, world!"

require "google/cloud/storage"

storage = Google::Cloud::Storage.new
bucket  = storage.bucket bucket_name, skip_lookup: true

file = bucket.create_file StringIO.new(file_content), file_name

puts "Uploaded file #{file.name} to bucket #{bucket_name} with content: #{file_content}"

What's next