Apache Airflow Jobs in Glasgow

4 of 4 Apache Airflow Jobs in Glasgow

Data Engineer

Glasgow, Scotland, United Kingdom
Seargin
- REST APIs and integration techniques
- Familiarity with data visualization tools and libraries (e.g., Power BI)
- Background in database administration or performance tuning
- Familiarity with data orchestration tools, such as Apache Airflow
- Previous exposure to big data technologies (e.g., Hadoop, Spark) for large data processing
- Strong analytical skills, including a thorough understanding of how to interpret customer business requirements

Data Engineer

Glasgow, Scotland, United Kingdom
Hybrid/Remote Options
NLB Services
- REST APIs and integration techniques
- Familiarity with data visualization tools and libraries (e.g., Power BI)
- Background in database administration or performance tuning
- Familiarity with data orchestration tools, such as Apache Airflow
- Previous exposure to big data technologies (e.g., Hadoop, Spark) for large data processing
- Strong analytical skills, including a thorough understanding of how to interpret customer business requirements

Data Engineer (DataBricks)

Glasgow, Scotland, United Kingdom
Synechron
APIs and integration techniques.

Additional Assets (Preferred):
- Familiarity with data visualization tools (e.g., Power BI)
- Experience in database administration or performance tuning
- Knowledge of data orchestration tools like Apache Airflow
- Exposure to big data technologies such as Hadoop or Spark

Why Join Synechron? Be part of a dynamic, innovative team driving digital transformation in the financial sector. We …

AWS Lead Data Engineer

Glasgow, Scotland, United Kingdom
Hybrid/Remote Options
Square One Resources
Responsibilities:
- optimise, and maintain large-scale data pipelines
- Develop and enhance data lakes and data warehouses
- Implement scalable SQL and Python/PySpark-based solutions
- Use orchestration tools such as Airflow and AWS Step Functions
- Analyse current processes and propose end-to-end technical improvements
- Engage with stakeholders to translate business needs into technical designs
- Support migrations into Databricks and …

Requirements:
- CloudFormation (mandatory)
- UI development experience (mandatory)
- Strong SQL, Python, and PySpark
- Experience with GitLab and unit testing
- Knowledge of modern data engineering patterns and best practices

Desirable (Databricks Track):
- Apache Spark
- Databricks (Delta Lake, Unity Catalog, MLflow)
- Experience with Databricks migration or development
- AI/ML understanding (nice to have)