0 votes
0 answers
45 views

I have a Dockerfile that builds an image based on golang:bookworm. It installs the google-cloud-cli package from the https://packages.cloud.google.com/apt repository. I used it to build an image that'...
derat • 218
Advice
0 votes
1 reply
70 views

I receive ZIP files every day to a bucket. Part of my Python pipeline extracts these into individual CSVs. However, I'm wondering if there's a quicker way? There are roughly 20 files in each ZIP,...
Aaron • 5
Advice
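A common pattern for the ZIP-extraction question above is to stream each archive through memory rather than the filesystem. A minimal sketch; only the pure extraction step is shown here, and the download/upload sides are assumed to use the `google-cloud-storage` client:

```python
import io
import zipfile

def extract_csvs(zip_bytes):
    """Return {member_name: bytes} for every CSV inside a ZIP payload."""
    out = {}
    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as zf:
        for name in zf.namelist():
            if name.lower().endswith(".csv"):
                out[name] = zf.read(name)
    return out
```

With the GCS client this pairs with `blob.download_as_bytes()` on the ZIP and `bucket.blob(name).upload_from_string(data)` per CSV, so each archive's ~20 members are extracted in one pass with no temp files on disk.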
0 votes
0 replies
35 views

I am setting up lifecycle policies for some of my buckets via terraform. I have this strategy: standard > nearline > deletion. The thing is, I am not sure if the AGE property of a file in the ...
Dasph • 460
Best practices
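On the lifecycle question above: per the GCS documentation, the Age condition is measured from the object's creation time, and a lifecycle SetStorageClass transition does not reset it, because creation time is immutable. A sketch of how a standard > nearline > deletion chain evaluates, with hypothetical day thresholds:

```python
from datetime import datetime, timedelta, timezone

def fired_actions(created, rules, now=None):
    """Return the lifecycle actions whose Age condition has been met.
    Age counts whole days since object creation and is not reset when
    an earlier rule changes the storage class."""
    now = now or datetime.now(timezone.utc)
    age_days = (now - created).days
    return [action for min_age, action in rules if age_days >= min_age]
```

For example, `[(30, "SetStorageClass:NEARLINE"), (90, "Delete")]` deletes at 90 days after creation, not 90 days after the transition to Nearline.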
0 votes
0 replies
59 views

My use case is simple in nature. I have a platform where users can upload any files up to 20GB. My current solution is: the Frontend Client asks for a presigned URL, which the Backend generates ...
Asif Alam
1 vote
0 answers
43 views

I have a Frontend Client which lets users upload any number of files of any size (think up to a 100 GB file). Currently I am using GCS bucket presigned URLs to upload the files. My current ...
Asif Alam
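For both multi-gigabyte upload questions above, the usual alternatives to a single signed PUT are GCS resumable uploads (retryable, chunk by chunk) or parallel composite uploads: upload N chunk objects concurrently, then combine them with the bucket's compose operation (limited to 32 source objects per compose call, so very large files need a tree of composes). The range arithmetic for the chunking step, as a sketch with an assumed chunk size:

```python
def chunk_ranges(total_size, chunk_size):
    """Byte ranges [(start, end_exclusive), ...] covering the whole file,
    one range per temporary chunk object in a parallel composite upload."""
    if chunk_size <= 0:
        raise ValueError("chunk_size must be positive")
    return [(start, min(start + chunk_size, total_size))
            for start in range(0, total_size, chunk_size)]
```

Each range can then be uploaded as its own signed-URL PUT, which also gives per-chunk retry for flaky client connections.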
0 votes
0 answers
31 views

I'm trying to deploy Cloud Storage CORS settings using the Firebase CLI, but it consistently fails with a generic "Error: An unexpected error has occurred." Project ID: oa-maintenance-v2 ...
プラス田口
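On the CORS question above: the Firebase CLI does not manage Cloud Storage bucket CORS; it is normally set with `gcloud storage buckets update gs://BUCKET --cors-file=cors.json` or with the Python client (`bucket.cors = policy; bucket.patch()`). A sketch that builds the policy document; the origins, methods, and headers here are placeholders:

```python
def cors_policy(origins, methods=("GET", "HEAD"), max_age=3600):
    """Build the CORS configuration list that bucket.cors expects."""
    return [{
        "origin": list(origins),
        "method": list(methods),
        "responseHeader": ["Content-Type"],
        "maxAgeSeconds": max_age,
    }]
```

The same JSON shape works as the `--cors-file` payload for the gcloud route.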
0 votes
0 answers
41 views

I am trying to create a virtual image from a tar file that is present in the storage bucket. When I try it from my Java SDK code, which is my actual requirement, I get: Required ‘read’ permission for ‘${...
ROHAN ACHAR V
2 votes
1 answer
89 views

I wrote this code that uses Google's Cloud API to get an object from my bucket and download it. It worked perfectly when I had my bucket set to public (allUsers added to Principal with all the required ...
Art T. • 31
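For the private-bucket download above, the usual fix is to authenticate the client instead of relying on allUsers: grant the service account `roles/storage.objectViewer` and build the client from its key (or Application Default Credentials). A Python sketch; the URI helper is the only part exercised here, and the key and file names are placeholders:

```python
def parse_gs_uri(uri):
    """Split a gs://bucket/object URI into (bucket_name, object_name)."""
    if not uri.startswith("gs://"):
        raise ValueError("expected a gs:// URI")
    bucket_name, _, object_name = uri[len("gs://"):].partition("/")
    return bucket_name, object_name

def download(uri, destination, client):
    """Download one object using an authenticated google-cloud-storage client,
    e.g. client = storage.Client.from_service_account_json("key.json")."""
    bucket_name, object_name = parse_gs_uri(uri)
    client.bucket(bucket_name).blob(object_name).download_to_filename(destination)
```

With credentials in place the bucket can stay fully private; no allUsers binding is needed.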
0 votes
1 answer
74 views

Context: using distcp, I am trying to copy an HDFS directory, including files, to a GCP bucket. I am using hadoop distcp -Dhadoop.security.credential.provider.path=jceks://$JCEKS_FILE hdfs://nameservice1/...
Jhon • 49
1 vote
2 answers
102 views

I was trying to use Google Cloud Storage in a Python virtual environment. I tried installing google-cloud-storage, and whenever I run the code I always get the error ModuleNotFoundError: No module ...
Asem Shaath
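The ModuleNotFoundError above is almost always an interpreter mismatch: the PyPI package is named `google-cloud-storage`, but the import path is `google.cloud.storage`, and pip must target the same interpreter the venv runs. A quick diagnostic:

```python
import sys

# Print which interpreter is executing. If this is not the venv's python,
# the package was installed into a different environment. The reliable
# install command is:
#     python -m pip install google-cloud-storage
# and the import to use afterwards is:
#     from google.cloud import storage
print(sys.executable)
```

`python -m pip` ties the install to the printed interpreter, which sidesteps the common case of a global `pip` shadowing the venv's.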
0 votes
1 answer
94 views

I have a requirement to move files from a source folder to a destination folder in different GCS buckets. I am using GCSToGCSOperator with the following config: source_bucket: "source_bucket" ...
A B • 1,936
1 vote
0 answers
300 views

I have been trying to run some models from Hugging Face locally. The script is being hosted on Google Cloud Run. Since running the instance multiple times triggers rate limiting, I have downloaded the ...
GentleClash
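For the Cloud Run rate-limit issue above, the usual approach is to bake the already-downloaded model into the container image (or mount it from GCS) and load it with `local_files_only=True`, plus the offline environment flags, so the Hub is never contacted at runtime. A sketch of the path convention; the directory layout is an assumption:

```python
from pathlib import Path

def local_model_dir(base, model_id):
    """Directory where a pre-downloaded Hugging Face model was saved,
    e.g. via save_pretrained() at image build time."""
    return Path(base) / model_id.replace("/", "__")

def offline_env():
    """Environment that forces transformers/huggingface_hub fully offline."""
    return {"HF_HUB_OFFLINE": "1", "TRANSFORMERS_OFFLINE": "1"}
```

Loading then becomes `AutoModel.from_pretrained(local_model_dir("/models", "org/name"), local_files_only=True)`; with the env vars set, a cold-start container makes no Hub requests at all.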
0 votes
0 answers
49 views

I have a small app where I am using Firebase Functions to upload an image into Firebase Storage. Once done, I store this image URL against an object in the Firebase DB and then reuse this image in the app ...
feeyam • 1
1 vote
0 answers
108 views

I'm trying to upload files to a Google Cloud Storage bucket from a Ballerina application. Right now, the only way I’ve found to authenticate is by manually generating an access token using a service-...
Virul Nirmala Wickremesinghe
0 votes
1 answer
85 views

On my Django project I have a model with a property used to store videos in a specific, single Google Cloud Storage bucket, using FileField. The model is defined like this: from storages....
Raul Chiarella
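For the Django question above, a configuration sketch of the usual django-storages way to pin one FileField to a single, specific GCS bucket: pass `bucket_name` to a dedicated storage instance rather than changing the project-wide default. The bucket name and model names are placeholders:

```python
# models.py — pin one FileField to a single, specific bucket
from django.db import models
from storages.backends.gcloud import GoogleCloudStorage

# Dedicated storage instance; other fields keep the project default storage.
video_storage = GoogleCloudStorage(bucket_name="my-video-bucket")

class Post(models.Model):
    video = models.FileField(storage=video_storage, upload_to="videos/")
```

This keeps the default storage untouched, so only the video field writes to that bucket.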
