No xcom_push or xcom_pull needed – the TaskFlow wiring handles it. With traditional operators, you must push/pull manually.
def pull_function(**context):
    user_id = context['ti'].xcom_pull(task_ids='push_task', key='user_id')
    print(f"Received {user_id}")
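The pull above assumes an upstream task pushed a value under the key 'user_id'. A minimal sketch of that counterpart (the key name and the value 42 are illustrative), using the TaskInstance's xcom_push:

```python
def push_function(**context):
    # Explicitly push a value under a named key; a downstream task can
    # retrieve it with xcom_pull(task_ids='push_task', key='user_id').
    context['ti'].xcom_push(key='user_id', value=42)
```

In a real run, `context['ti']` is the TaskInstance Airflow injects; because the function only calls `xcom_push` on it, any object with that method works, which makes the callable easy to unit-test outside Airflow.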
process(extract())  # XCom passed implicitly
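Under the hood, TaskFlow stores each task's return value as an XCom under the reserved key 'return_value' and pulls it back to fill the downstream argument. A simplified, framework-free illustration of that mechanism (the dict stands in for Airflow's metadata database; the task names are illustrative):

```python
# Toy XCom store: maps (task_id, key) -> value, mimicking the metadata DB.
xcom_store = {}

def run_extract():
    value = {'user_id': 42}  # what a @task-decorated extract() would return
    # Implicit push: the return value is stored under 'return_value'.
    xcom_store[('extract', 'return_value')] = value

def run_process():
    # Implicit pull: the downstream task reads the upstream return value.
    data = xcom_store[('extract', 'return_value')]
    return f"processed {data['user_id']}"

run_extract()
result = run_process()
```

This is why `process(extract())` needs no explicit push or pull: the wiring is the same key-value exchange, just generated for you.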
@task
def aggregate(results: list[str]):
    print(f"All results: {results}")
push = PythonOperator(task_id='push_task', python_callable=push_function)
pull = PythonOperator(task_id='pull_task', python_callable=pull_function)
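For completeness, here is a sketch of how these two operators might be wired into a DAG. The `dag_id`, dates, and the `schedule=None` argument are assumptions (the keyword is `schedule` in Airflow 2.4+; older versions use `schedule_interval`), and `push_function`/`pull_function` are the callables shown earlier:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

with DAG(
    dag_id="xcom_demo",            # illustrative name
    start_date=datetime(2024, 1, 1),
    schedule=None,                 # manual trigger only
    catchup=False,
) as dag:
    push = PythonOperator(task_id='push_task', python_callable=push_function)
    pull = PythonOperator(task_id='pull_task', python_callable=pull_function)

    # Ordering matters: the XCom must exist before pull_task reads it.
    push >> pull
```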
Mastering XComs in Apache Airflow: Cross‑Task Communication Without the Pain

One of the first surprises when learning Airflow is that tasks run isolated from each other. You can’t just set task_2.data = task_1.data. So how do you pass a value from one task to another? XComs.
XCom (short for cross‑communication) is Airflow’s built‑in mechanism for exchanging small pieces of data between tasks. When used wisely, XComs unlock powerful patterns; when abused, they break your DAGs. Let’s see how to use them correctly.

XComs are key‑value pairs stored in Airflow’s metadata database. A task can push an XCom (write a value under a key), and another task can pull that value (read it).
XComs are for coordination, not data transfer.

Final Takeaway

XComs are Airflow’s glue. They turn a set of isolated tasks into a coherent pipeline. Use them for small control signals, IDs, and results. Keep them light. And when you’re tempted to pass a big blob of data – stop, and ask yourself: should this be in object storage instead?
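The object-storage pattern the takeaway points at can be sketched as: write the heavy payload out, pass only its key through XCom. In this sketch the `storage` client, function names, and object key are all illustrative (a dict-backed stand-in works the same as an S3/GCS client with `put`/`get`):

```python
def export_report(storage, rows):
    # Write the heavy payload to object storage...
    object_key = "reports/2024-01-01/report.csv"   # illustrative key
    storage.put(object_key, "\n".join(rows))
    # ...and return only the lightweight pointer. Under TaskFlow, this
    # return value is what travels through XCom: a short string, not the blob.
    return object_key

def load_report(storage, object_key):
    # The downstream task fetches the payload itself, using the key it
    # received via XCom.
    return storage.get(object_key).splitlines()
```

The XCom stays a few bytes regardless of how large the report grows, and the metadata database is never used as a data plane.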