You can copy your files from the dispersed DT Google buckets to your own Google API GCS bucket using a Unix shell script. There are two options:
In gsutil, if you are using a Unix system, run the following for all of your buckets daily:

```bash
$ day=$(date --date="1 days ago" +"%m-%d-%Y")
$ gsutil -m cp gs://{<dcmhashid_A>,<dcmhashid_B>,etc.}/*$day*.log.gz gs://<client_bucket>/
```
Alternatively, a solution that is a little trickier is to use a bash file:
```bash
#!/bin/bash
set -x

# Include all of your DT bucket hash IDs in this array.
buckets=(dfa_-hashid_A dfa_-hashid_B ...)
day=$(date --date="1 days ago" +"%m-%d-%Y")
for b in "${buckets[@]}"; do
  gsutil -m cp gs://$b/*$day*.log.gz gs://<client_bucket>/
done
```
Access data programmatically
----------------------------

[Google Cloud Storage](/storage) has APIs and [samples](/storage/docs/json_api/v1/libraries) for many programming languages that allow you to access your data in a programmatic way. Below are the steps specific to Data Transfer v2.0 that you must take to build a working integration.
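For a concrete picture of where these steps lead, here is a minimal sketch that lists and downloads DT log files with the google-cloud-storage Python client library (one of several available client libraries, not the only option). The bucket name and file prefix are placeholders, and the sketch assumes credentials are already configured with the service account described in the steps below.

```python
# Minimal sketch, assuming the google-cloud-storage Python library and a
# service account configured via the GOOGLE_APPLICATION_CREDENTIALS
# environment variable (see the steps below). Bucket and prefix are placeholders.
from google.cloud import storage

BUCKET_NAME = "dfa_-<dcmhashid>"            # placeholder DT bucket name
PREFIX = "dcm_account6837_impression_"      # placeholder file name prefix

client = storage.Client()

# List the matching log files and download each one to the current directory.
for blob in client.list_blobs(BUCKET_NAME, prefix=PREFIX):
    print(blob.name)
    blob.download_to_filename(blob.name.split("/")[-1])
```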
### Get a service account
To get started using Data Transfer v2.0, you need to first [use the setup tool](https://console.cloud.google.com/start/api?id=storage_component&credential=client_key), which guides you through creating a project in the Google API Console, enabling the API, and creating credentials.
To set up a new service account, do the following:
1. Click **Create credentials > Service account key**.
2. Choose whether to download the service account's public/private key as a standard P12 file, or as a JSON file that can be loaded by a Google API client library.
Your new public/private key pair is generated and downloaded to your machine; it serves as the only copy of this key. You are responsible for storing it securely.
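The service account's email address, which you will need when adding the account to your Google Group in the next step, is also embedded in the downloaded JSON key. A minimal sketch of reading it, assuming a placeholder file name:

```python
# Sketch: read the service account email from the downloaded JSON key file.
# "your-service-account-key.json" is a placeholder file name.
import json

with open("your-service-account-key.json") as key_file:
    key = json.load(key_file)

# The "client_email" field holds the service account's email address.
print(key["client_email"])
```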
[[["이해하기 쉬움","easyToUnderstand","thumb-up"],["문제가 해결됨","solvedMyProblem","thumb-up"],["기타","otherUp","thumb-up"]],[["필요한 정보가 없음","missingTheInformationINeed","thumb-down"],["너무 복잡함/단계 수가 너무 많음","tooComplicatedTooManySteps","thumb-down"],["오래됨","outOfDate","thumb-down"],["번역 문제","translationIssue","thumb-down"],["샘플/코드 문제","samplesCodeIssue","thumb-down"],["기타","otherDown","thumb-down"]],["최종 업데이트: 2025-08-31(UTC)"],[[["\u003cp\u003eData Transfer v2.0 provides access to your data files stored in a Google Cloud Storage bucket, controlled by a Google Group you specify.\u003c/p\u003e\n"],["\u003cp\u003eYou can access your data using the gsutil command-line utility or programmatically via Google Cloud Storage APIs.\u003c/p\u003e\n"],["\u003cp\u003eTo access data programmatically, you'll need to set up a service account and grant it read-only access to your Google Group and Cloud Storage bucket.\u003c/p\u003e\n"]]],[],null,["# Get Started\n\nWhen you work with your sales or support contact to setup access to Data Transfer v2.0,\nyou will be provided with a bucket name. You will need to provide your sales contact a\n[Google Group](http://groups.google.com/) which enables you to control\naccess to your data files in [Google Cloud Storage](//cloud.google.com/storage/).\n\n\nYou can choose to access your data using a [utility](#access-data-using-gsutil)\nor you can write your own [code.](#access-data-programmatically)\n\nAccess data using gsutil\n------------------------\n\nThe gsutil tool is a command-line application, written in Python, that\nlets you access your data without having to do any coding. You\ncould, for example, use gsutil as part of a script or batch file instead of\ncreating custom applications.\n\n\nTo get started with gsutil read the [gsutil\ndocumentation](/storage/docs/gsutil). The tool will prompt you for your credentials the first time\nyou use it and then store them for use later on.\n\n### gsutil examples\n\nYou can list all of your files using gsutil as follows:\n`gsutil ls gs://[bucket_name]/[object name/file name]`\n\ngsutil uses much of the same syntax as UNIX, including the wildcard\nasterisk (\\*), so you can list all NetworkImpression files:\n`gsutil ls gs://[bucket_name]/dcm_account6837_impression_*`\n\nIt's also easy to download a file:\n`gsutil cp gs://[bucket_name]/dcm_account6837_impression_2015120100.log.gz`\n\nYou can copy your files from the dispersed DT Google buckets to your own Google API GCS Bucket\nusing a Unix shell script, there are two options:\n\n- In gsutil, if you are using a Unix System, run the following for all your buckets daily:\n\n ```bash\n $ day=$(date --date=\"1 days ago\" +\"%m-%d-%Y\")\n $ gsutil -m cp gs://{\u003cdcmhashid_A\u003e,\u003cdcmhashid_B\u003e,etc.}/*$day*.log.gz gs://\u003cclient_bucket\u003e/\n ```\n- Alternatively, a solution that is a little trickier is to use a bash file:\n\n ```bash\n #!/bin/bash\n\n set -x\n\n buckets={dfa_-hasid_A dfa_-hashid_B,...} #include all hash ids\n day=$(date --date=\"1 days ago\" +\"%m-%d-%Y\")\n for b in ${buckets[@]}; do /\n gsutil -m cp gs://$b/*$day*.log.gz gs:/// /\n done\n ```\n\nAccess data programmatically\n----------------------------\n\n\n[Google Cloud Storage](/storage) has APIs and [samples](/storage/docs/json_api/v1/libraries) for many programming\nlanguages that allow you to access your data in a programmatic way. 
Below are\nthe steps specific to Data Transfer v2.0 that you must take to build a\nworking integration.\n\n### Get a service account\n\n\nTo get started using Data Transfer v2.0, you need to first\n[use\nthe setup tool](https://console.cloud.google.com/start/api?id=storage_component&credential=client_key), which guides you through creating a project in the\nGoogle API Console, enabling the API, and creating credentials.\n\n\u003cbr /\u003e\n\n\nTo set up a new service account, do the following:\n\n1. Click **Create credentials \\\u003e Service account key**.\n2. Choose whether to download the service account's public/private key as a standard P12 file, or as a JSON file that can be loaded by a Google API client library.\n\nYour new public/private key pair is generated and downloaded to your machine;\nit serves as the only copy of this key. You are responsible for storing it\nsecurely.\n\n\u003cbr /\u003e\n\n|\n| **Note:** If you plan to access Google Cloud Storage using the\n| [JSON API](/storage/docs/json_api), then you must also verify\n| that the [Google Cloud Storage JSON API](//console.developers.google.com//project/_/apiui/apiview/storage_api/overview) component is activated as well.\n\nBe sure to keep this window open, you will need the service account email\nin the next step.\n\n\n### Add a service account to your group\n\n- Go to [Google Group](http://groups.google.com/)\n- Click on My Groups and select the group you use for managing access to your DT v2.0 Cloud Storage Bucket\n- Click Manage\n- **Do not click Invite Members!**\n- Click Direct add members\n- Copy the service account email from the previous step into the members box\n- Select No email\n- Click the Add button\n\n#### I accidentally clicked Invite Members\n\n[More...]()\n\n- Don't Panic! You can fix it\n- Head back to the Manage screen as before\n- Click on Outstanding Invitations\n- Find the service account and select it\n- Click Revoke invitation at the top of the screen\n- Click Direct add members and resume steps above\n\n### Scope\n\n\n**Any scopes passed to Cloud Storage must be Read Only**\n\nFor example, when using the Java client library the correct scope to\nuse is: \n\n```scdoc\nStorageScopes.DEVSTORAGE_READ_ONLY\n```\n\n\u003cbr /\u003e"]]
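If you are not working with the Java client library, the same restriction applies. For example, here is a minimal Python sketch that builds a read-only Cloud Storage client, assuming the google-auth and google-cloud-storage libraries; the key file name and project ID are placeholders.

```python
# Sketch: create a Cloud Storage client restricted to the read-only scope,
# the OAuth equivalent of StorageScopes.DEVSTORAGE_READ_ONLY.
# The key file name and project ID below are placeholders.
from google.cloud import storage
from google.oauth2 import service_account

READ_ONLY_SCOPE = "https://www.googleapis.com/auth/devstorage.read_only"

credentials = service_account.Credentials.from_service_account_file(
    "your-service-account-key.json",
    scopes=[READ_ONLY_SCOPE],
)
client = storage.Client(project="your-project-id", credentials=credentials)
```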