Google Cloud Build is a service that executes your builds on Google Cloud Platform infrastructure. Cloud Build can import source code from Google Cloud Storage or Cloud Source Repositories, execute a build to your specifications, and produce artifacts such as Docker containers or Java archives.
To use these operators, you must do a few things:
Select or create a Cloud Platform project using Cloud Console.
Enable billing for your project, as described in Google Cloud documentation.
Enable the API, as described in Cloud Console documentation.
Install API libraries via pip.
pip install 'apache-airflow[gcp]'
Detailed information is available in the Installation documentation.
In order to trigger a build, it is necessary to pass the build configuration.
airflow/contrib/example_dags/example_gcp_cloud_build.py
create_build_from_storage_body = {
    "source": {"storageSource": GCP_SOURCE_ARCHIVE_URL},
    "steps": [
        {
            "name": "gcr.io/cloud-builders/docker",
            "args": ["build", "-t", "gcr.io/$PROJECT_ID/{}".format(GCP_SOURCE_BUCKET_NAME), "."],
        }
    ],
    "images": ["gcr.io/$PROJECT_ID/{}".format(GCP_SOURCE_BUCKET_NAME)],
}
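Since the configuration is a plain Python dictionary, it can be assembled with a small helper when several DAGs need similar builds. A minimal sketch, assuming a hypothetical helper name `make_docker_build_body` (not part of the Airflow API):

```python
def make_docker_build_body(source_archive_url, image_name):
    """Build a Cloud Build request body that builds a Docker image
    from a source archive in GCS and pushes it to Container Registry."""
    image = "gcr.io/$PROJECT_ID/{}".format(image_name)
    return {
        "source": {"storageSource": source_archive_url},
        "steps": [
            {
                "name": "gcr.io/cloud-builders/docker",
                "args": ["build", "-t", image, "."],
            }
        ],
        "images": [image],
    }

body = make_docker_build_body("gs://my-bucket/src.tar.gz", "my-image")
```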
The source code for the build can come from Google Cloud Storage:
"source": {"storageSource": {"bucket": "bucket-name", "object": "object-name.tar.gz"}},
It is also possible to specify it using the URL:
"source": {"storageSource": "gs://bucket-name/object-name.tar.gz"},
In addition, a build can refer to source stored in Google Cloud Source Repositories.
"source": {
"repoSource": {
"projectId": "airflow-project",
"repoName": "airflow-repo",
"branchName": "master",
}
},
It is also possible to specify it using the URL:
"source": {"repoSource": "https://source.developers.google.com/p/airflow-project/r/airflow-repo"},
Read Build Configuration Overview to understand all the fields you can include in a build config file.
A build is triggered with the CloudBuildCreateBuildOperator operator.
airflow/contrib/example_dags/example_gcp_cloud_build.py
create_build_from_storage = CloudBuildCreateBuildOperator(
    task_id="create_build_from_storage", project_id=GCP_PROJECT_ID, body=create_build_from_storage_body
)
You can use Jinja templating with the body, gcp_conn_id, and api_version parameters, which allows you to determine values dynamically. The result is saved to XCom, which allows it to be used by other operators.
airflow/contrib/example_dags/example_gcp_cloud_build.py
create_build_from_storage_result = BashOperator(
    bash_command="echo '{{ task_instance.xcom_pull('create_build_from_storage')['images'][0] }}'",
    task_id="create_build_from_storage_result",
)
For further information, look at: