more than 90 million passengers this year, we employ over 10,000 people. It's big-scale stuff and we're still growing. Job Purpose With a big investment into Databricks, and with a large amount of interesting data, this is the chance for you to come and be part of an exciting transformation in the way we store, analyse and … solutions. Job Accountabilities Develop robust, scalable data pipelines to serve the easyJet analyst and data science community. Highly competent hands-on experience with relevant Data Engineering technologies, such as Databricks, Spark, Spark API, Python, SQL Server, Scala. Work with data scientists, machine learning engineers and DevOps engineers to develop and deploy machine learning models and algorithms aimed at addressing … workflow and knowledge of when and how to use dedicated hardware. Significant experience with Apache Spark or any other distributed data programming framework (e.g. Flink, Hadoop, Beam). Familiarity with Databricks as a data and AI platform or the Lakehouse Architecture. Experience with data quality and/or data lineage frameworks like Great Expectations, dbt data quality, OpenLineage or Marquez.
more than 90 million passengers this year, we employ over 10,000 people. It's big-scale stuff and we're still growing. Job Purpose With a big investment into Databricks and a large amount of interesting data, this is the chance for you to come and be part of an exciting transformation in the way we store, analyse and use … engineering practices (e.g. TDD, CI/CD). Experience with Apache Spark or any other distributed data programming framework. Comfortable writing efficient SQL and debugging on cloud warehouses like Databricks SQL or Snowflake. Experience with cloud infrastructure like AWS or Azure. Experience with Linux and containerisation (e.g. Docker, shell scripting). Understanding of data modelling and data cataloguing principles. Understanding of … end monitoring, quality checks, lineage tracking and automated alerts to ensure reliable and trustworthy data across the platform. Experience of building a data transformation framework with dbt. Familiarity with Databricks as a data and AI platform or the Lakehouse Architecture. What you'll get in return Competitive base salary Up to 20% bonus 25 days holiday BAYE, SAYE & Performance share …
Hertfordshire, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
Needed to apply for the role: Expert SQL, Power BI, and data modelling. Familiarity working in an Azure environment (ADF ideal). Strong communication and stakeholder management skills. Databricks or Python desirable (not essential at all). My client is looking to book in first-stage interviews for later this week and slots are already filling up fast.
Cambridge, Cambridgeshire, United Kingdom Hybrid / WFH Options
Softwire
firm Modeling data for a civil service department replacing a legacy HR system Experience and qualifications Technical 3+ years' experience in data or software engineering Knowledge of Python, SQL, Databricks, Snowflake, and major cloud platforms (AWS/Azure/GCP) Ability to learn quickly and adapt to new technologies and sectors Understanding of data engineering best practices and system design
Luton, Bedfordshire, South East, United Kingdom Hybrid / WFH Options
Anson Mccade
processes effectively Desirable Skills: GCP Professional Data Engineer certification Exposure to Agentic AI systems or intelligent/autonomous data workflows Experience with BI tools such as Looker Exposure to Databricks, Snowflake, AWS, Azure or DBT Academic background in Computer Science, Mathematics or a related field This is an opportunity to work in a forward-thinking environment with access to cutting …
. Strong background in data architecture, including data modeling, warehousing, real-time and batch processing, and big data frameworks. Proficiency with modern data tools and technologies such as Spark, Databricks, Kafka, or Snowflake (bonus). Knowledge of cloud security, networking, and cost optimization as it relates to data platforms. Experience in total cost of ownership estimation and managing its impact …
Peterborough, Cambridgeshire, UK Hybrid / WFH Options
Amtis - Digital, Technology, Transformation
with clients to uncover data challenges and craft tailored solutions using Microsoft Fabric. Lead the design of cutting-edge data architectures that integrate tools like Azure Synapse, Data Factory, Databricks, and Power BI. Drive end-to-end solution delivery—from concept to deployment—ensuring performance, scalability, and seamless integration. Build intelligent data models and pipelines that power real-time insights …
Cambridge, Cambridgeshire, UK Hybrid / WFH Options
Amtis - Digital, Technology, Transformation
with clients to uncover data challenges and craft tailored solutions using Microsoft Fabric. Lead the design of cutting-edge data architectures that integrate tools like Azure Synapse, Data Factory, Databricks, and Power BI. Drive end-to-end solution delivery—from concept to deployment—ensuring performance, scalability, and seamless integration. Build intelligent data models and pipelines that power real-time insights …