Apache Airflow Jobs in the North East

Data Engineer

Hiring Organisation
Leonardo
Location
Newcastle Upon Tyne, England, United Kingdom
... with CI/CD pipelines, containerisation, and workflow orchestration. Familiar with ETL/ELT frameworks and experienced with big data processing tools (e.g. Spark, Airflow, Hive). Knowledge of programming languages (e.g. Java, Python, SQL). Hands-on experience with SQL/NoSQL database design. Degree in STEM, or similar ...
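The workflow-orchestration and ETL/ELT skills listed above are typically exercised through DAG definitions like the sketch below. This is a minimal, hypothetical example assuming Airflow 2.4+ with the TaskFlow API; the DAG name, schedule, and extract/transform/load bodies are invented for illustration, not taken from the listing.

```python
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def daily_etl():
    """Hypothetical daily ETL pipeline: extract -> transform -> load."""

    @task
    def extract() -> list[dict]:
        # Stand-in for a pull from an upstream system (API, database, files).
        return [{"id": 1, "value": 42}, {"id": 2, "value": -5}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # Example transformation: drop rows with non-positive values.
        return [r for r in rows if r["value"] > 0]

    @task
    def load(rows: list[dict]) -> None:
        # Stand-in for a write to a SQL/NoSQL target.
        print(f"loading {len(rows)} rows")

    # Chaining the calls is what builds the task dependencies in TaskFlow.
    load(transform(extract()))


daily_etl()
```

Airflow infers the extract >> transform >> load dependency graph from the function calls, and the intermediate lists are passed between tasks via XCom, so they must stay JSON-serialisable.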

Data Engineer

Hiring Organisation
Noir
Location
Newcastle Upon Tyne, Tyne and Wear, England, United Kingdom
Employment Type
Full-Time
Salary
£45,000 - £80,000 per annum
Data Engineer - FinTech - Newcastle (Tech stack: SQL, Python, AWS, Git, Airflow, Data Pipelines, Data Platforms) Our client is a trailblazer in the FinTech space, known for delivering innovative technology solutions to global financial markets. They are expanding their engineering capability in Newcastle ...

Data Engineer - Databricks (Newcastle or Erskine)

Hiring Organisation
DXC Technology
Location
Newcastle, Co. Down, UK
... using Databricks. Manage and optimize Databricks Workspace, Clusters, Jobs, Repos, and Delta Live Tables. Develop modular ETL workflows using dbt and orchestrate them with Apache Airflow. Write advanced PySpark code, including UDFs and Pandas UDFs, for efficient data processing. Ensure data quality, reliability, and performance across all systems. Collaborate … Experience: Proven experience with Databricks and its ecosystem. Strong proficiency in PySpark, especially with UDFs and Pandas UDFs. Hands-on experience with dbt and Airflow. Databricks Certified Data Engineer Associate certification, or willingness to achieve one. Excellent problem-solving skills and a passion for data and AI. Bachelor ...
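The UDF versus Pandas UDF distinction this listing emphasises is illustrated by the sketch below: a minimal, hypothetical PySpark 3.x example (pyarrow must be installed for the Pandas UDF); the column names and the VAT-style scaling are invented for illustration.

```python
import pandas as pd
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import DoubleType

spark = SparkSession.builder.appName("udf-sketch").getOrCreate()
df = spark.createDataFrame([(1, 10.0), (2, 20.0)], ["id", "amount"])


# Plain UDF: invoked once per row, paying Python/JVM serialisation cost per row.
@F.udf(returnType=DoubleType())
def add_vat(amount: float) -> float:
    return amount * 1.2


# Pandas UDF: invoked once per Arrow batch, operating on whole pandas Series,
# which is why it is usually much faster for numeric work like this.
@F.pandas_udf(DoubleType())
def add_vat_vectorised(amount: pd.Series) -> pd.Series:
    return amount * 1.2


df.select(
    "id",
    add_vat("amount").alias("gross"),
    add_vat_vectorised("amount").alias("gross_vec"),
).show()
```

Both columns come out identical; the difference is in how Spark ships the data to Python, which is where the efficiency gain from Pandas UDFs comes from.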
