Creating a Google Cloud Storage bucket

How to create a Google Cloud Storage bucket using the Google Cloud Console 

Setup 

Using Google Cloud Storage requires:

  1. Access to the Northwestern University Google Cloud organization
  2. Permissions to deploy resources in a Google Cloud project where the bucket, or other cloud resources, can be created

To request access to the NU Google Cloud organization, contact consultant@northwestern.edu.

Using the Google Cloud Platform Console 

To log in to the Google Cloud Platform:

  1. Go to https://console.cloud.google.com/ and log in with your Northwestern identity.

  2. You will be logged into the console and can see your active project in the upper left corner of the page. If you are a member of multiple projects, you can select the project where you would like to deploy the Cloud Storage bucket by clicking the project box and choosing the project.

 


To Access Cloud Storage 

  1. Click on Cloud Storage in the Quick access menu. The console will take you to the Cloud Storage view, which lists any buckets that already exist in the project.
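
If you prefer to work from a script rather than the console, the same listing is available through Google's Python client library for Cloud Storage. The sketch below is illustrative only: it assumes you have installed the google-cloud-storage package, authenticated with credentials that can see the project, and replaced the hypothetical project ID with your own.

    # List the buckets that already exist in a project.
    from google.cloud import storage

    # "my-nu-project" is a placeholder; use your own project ID.
    client = storage.Client(project="my-nu-project")

    for bucket in client.list_buckets():
        print(bucket.name)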


  2. Next, click Create.


  3. You will be taken to a new screen where you can set the attributes of your storage bucket (a scripted example of this step follows the storage class descriptions below):

    • Name your bucket: The name must be globally unique, 3-63 characters long, and may contain only lowercase letters, numbers, dashes, underscores, and dots.
    • Choose where to store your data: If you use other GCP resources, pick the same region you use for that work. Otherwise, pick the region geographically closest to where you do your work.
    • Choose a storage class for your data: You will be presented with four options. Each storage class has its own balance of performance, availability, and cost, allowing you to choose the best fit for your specific data storage requirements:

    • Standard Storage: The default storage class, offering high performance and low-latency access for frequently accessed data.
    • Nearline Storage: Designed for data that is accessed less frequently but still needs quick retrieval, making it suitable for backups and archives. Minimum storage duration is 30 days.
    • Coldline Storage: Ideal for long-term storage of data that is rarely accessed, with a lower cost but slightly longer retrieval times. Minimum storage duration is 90 days.
    • Archive Storage: The most cost-effective option for long-term storage of data that is accessed very infrequently, with longer retrieval times. Minimum storage duration is 365 days.

Note: You can delete, replace, or move your data before the minimum storage duration has elapsed, but you will still be charged as if the data had been stored for the full minimum duration.
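
The same attributes (name, location, and storage class) can also be set from Python if you would rather create the bucket programmatically. This is a minimal sketch, not a required step; the project ID, bucket name, and region shown are placeholders, and it assumes the google-cloud-storage package and working credentials.

    # Create a bucket with a chosen location and storage class.
    from google.cloud import storage

    client = storage.Client(project="my-nu-project")    # placeholder project ID

    bucket = client.bucket("my-unique-bucket-name")      # name must be globally unique
    bucket.storage_class = "NEARLINE"                    # or "STANDARD", "COLDLINE", "ARCHIVE"

    new_bucket = client.create_bucket(bucket, location="us-central1")
    print(f"Created {new_bucket.name} in {new_bucket.location}")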

  4. Choose how to control access to objects: By default, the bucket prevents its data from being publicly accessible over the internet, and access is granted on a case-by-case basis to those who need it. We recommend keeping the default setting unless the bucket will be used for web hosting.

You are also given two options for Access control: Uniform and Fine-grained. Uniform access control treats all objects in a bucket the same in terms of access, while fine-grained access control lets you customize access permissions for each object separately. The choice between the two depends on whether you need a consistent level of access across all objects or if you require more specific control over who can access each object. 
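
For reference, both of these settings can also be applied to an existing bucket from Python. The sketch below is an assumption-laden example: the project ID and bucket name are placeholders, and it assumes a reasonably recent version of the google-cloud-storage package (public access prevention was added to the library more recently than uniform access control).

    # Keep a bucket private and switch it to uniform access control.
    from google.cloud import storage

    client = storage.Client(project="my-nu-project")      # placeholder project ID
    bucket = client.get_bucket("my-unique-bucket-name")   # placeholder bucket name

    bucket.iam_configuration.uniform_bucket_level_access_enabled = True
    bucket.iam_configuration.public_access_prevention = "enforced"
    bucket.patch()  # push the configuration change to GCP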

  5. Choose how to protect object data: You can optionally enable object versioning or define a data retention policy for the bucket. Enable object versioning if you need previous versions of files after they have been deleted or changed; storage costs apply to each stored version. You may also select the encryption key method you would like to use with the bucket. A Google-managed encryption key is selected by default, but you also have the option to manage your own encryption key. In most cases a Google-managed encryption key will suffice.
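
If you want to apply these protections outside the console, the Python client exposes the same options. The sketch below is illustrative: the project ID, bucket name, retention length, and key name are placeholders, and versioning and a retention policy are alternatives rather than a pair.

    # Protect object data on an existing bucket.
    from google.cloud import storage

    client = storage.Client(project="my-nu-project")
    bucket = client.get_bucket("my-unique-bucket-name")

    bucket.versioning_enabled = True  # keep previous versions of changed or deleted objects
    # Alternatively, define a retention policy (value in seconds), e.g. 30 days:
    # bucket.retention_period = 30 * 24 * 3600
    # To use a customer-managed key instead of the Google-managed default (placeholder resource name):
    # bucket.default_kms_key_name = "projects/<project>/locations/<loc>/keyRings/<ring>/cryptoKeys/<key>"
    bucket.patch()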

Once you have defined the properties of your bucket, click Create at the bottom of the page. GCP will then create the bucket and display the bucket details. For more information on granting additional users read/write permissions on this bucket, please reference the Google IAM permissions documentation.
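
As a concrete example of granting access, the bucket's IAM policy can also be modified with the Python client. This is only a sketch under assumptions: the email address, project ID, and bucket name are placeholders, and roles/storage.objectViewer grants read-only access to objects (a role such as roles/storage.objectAdmin would grant read/write).

    # Grant another user read access to the objects in a bucket.
    from google.cloud import storage

    client = storage.Client(project="my-nu-project")
    bucket = client.get_bucket("my-unique-bucket-name")

    policy = bucket.get_iam_policy(requested_policy_version=3)
    policy.bindings.append({
        "role": "roles/storage.objectViewer",           # read-only access to objects
        "members": {"user:colleague@northwestern.edu"},  # placeholder email address
    })
    bucket.set_iam_policy(policy)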

 


Northwestern IT offers support, training, and workshops on research data management topics.