Deploying Apache Airflow with Docker

https://airflow.apache.org/docs/apache-airflow/stable/start/docker.html#

docker-compose.yaml

This file contains several service definitions (a simplified sketch follows the list):

  • airflow-scheduler – The scheduler monitors all tasks and DAGs, then triggers the task instances once their dependencies are complete.

  • airflow-webserver – The webserver is available at http://localhost:8080.

  • airflow-worker – The worker that executes the tasks given by the scheduler.

  • airflow-init – The initialization service.

  • flower – The flower app for monitoring the environment. It is available at http://localhost:5555.

  • postgres – The database.

  • redis – The redis broker that forwards messages from the scheduler to the worker.
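As a rough, simplified sketch of how these services are laid out in docker-compose.yaml (image tags, environment blocks, healthchecks, and the shared x-airflow-common anchor from the official file are omitted or abbreviated here):

```yaml
# Simplified sketch only; the official docker-compose.yaml is considerably longer.
services:
  postgres:
    image: postgres:13            # metadata database
  redis:
    image: redis:latest           # Celery broker between scheduler and workers
  airflow-scheduler:
    image: apache/airflow         # monitors DAGs and triggers task instances
  airflow-webserver:
    image: apache/airflow
    ports:
      - "8080:8080"               # UI at http://localhost:8080
  airflow-worker:
    image: apache/airflow         # Celery worker that executes the tasks
  airflow-init:
    image: apache/airflow         # one-shot initialization (DB migration, first user)
  flower:
    image: apache/airflow
    ports:
      - "5555:5555"               # Flower monitoring UI at http://localhost:5555
```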

In general, if you want to use Airflow locally, your DAGs may try to connect to servers which are running on the host. In order to achieve that, an extra configuration must be added in docker-compose.yaml. For example, on Linux the configuration must be added to the services: airflow-worker section as extra_hosts: - "host.docker.internal:host-gateway", and you should use host.docker.internal instead of localhost. This configuration varies across platforms; please see the Docker documentation for Windows and Mac for further information.
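As a minimal sketch of the Linux variant of that change (only the extra_hosts key is new; the rest of the airflow-worker service stays as in the official file):

```yaml
services:
  airflow-worker:
    # ...image, environment, volumes, etc. unchanged from the official compose file...
    extra_hosts:
      # Map host.docker.internal to the host's gateway IP so DAGs running in the
      # container can reach services listening on the Docker host (Linux only).
      - "host.docker.internal:host-gateway"
```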

Some directories in the container are mounted, which means that their contents are synchronized between your computer and the container (see the sketch after this list).

  • ./dags – you can put your DAG files here.

  • ./logs – contains logs from task execution and scheduler.

  • ./plugins – you can put your custom plugins here.
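In the official file these mounts are declared once (under the shared x-airflow-common section) and reused by every Airflow service; as a sketch, the volumes entries amount to something like:

```yaml
# Sketch of the bind mounts; host paths are relative to the directory
# containing docker-compose.yaml.
volumes:
  - ./dags:/opt/airflow/dags        # DAG files picked up by the scheduler
  - ./logs:/opt/airflow/logs        # task and scheduler logs written back to the host
  - ./plugins:/opt/airflow/plugins  # custom plugins loaded at startup
```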

This file uses the latest Airflow image (apache/airflow). If you need to install a new Python library or system library, you can build your own image.
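One way to do that (sketched here, not taken verbatim from the official file) is to swap the image: line for a build: directive that points at a local Dockerfile based on apache/airflow:

```yaml
services:
  airflow-webserver:              # repeat (or share via a YAML anchor) for scheduler/worker/etc.
    # image: apache/airflow       # replaced by a local build
    build: .                      # expects a Dockerfile next to docker-compose.yaml, e.g.:
                                  #   FROM apache/airflow
                                  #   RUN pip install --no-cache-dir <extra-python-libs>
```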

See also: https://www.cnblogs.com/braveym/p/13503549.html

Original: https://www.cnblogs.com/youxin/p/16182944.html
Author: youxin
Title: Docker部署Apache Airflow (Deploying Apache Airflow with Docker)

