In this role, you will be responsible for designing, building, and maintaining robust data pipelines and infrastructure on the Azure cloud platform. You will leverage your expertise in PySpark, Apache Spark, and Apache Airflow to process and orchestrate large-scale data workloads, ensuring data quality, efficiency, and scalability. If you have a passion for data engineering and … significant impact, we encourage you to apply! Job Responsibilities ETL/ELT Pipeline Development: Design, develop, and optimize efficient and scalable ETL/ELT pipelines using Python, PySpark, and Apache Airflow. Implement batch and real-time data processing solutions using Apache Spark. Ensure data quality, governance, and security throughout the data lifecycle. Cloud Data Engineering: Manage and optimize … effectiveness. Implement and maintain CI/CD pipelines for data workflows to ensure smooth and reliable deployments. Big Data & Analytics: Develop and optimize large-scale data processing pipelines using Apache Spark and PySpark. Implement data partitioning, caching, and performance tuning techniques to enhance Spark-based workloads. Work with diverse data formats (structured and unstructured) to support advanced More ❯
In this role, you will be responsible for designing, building, and maintaining robust data pipelines and infrastructure on the Azure cloud platform. You will leverage your expertise in PySpark, Apache Spark, and Apache Airflow to process and orchestrate large-scale data workloads, ensuring data quality, efficiency, and scalability. If you have a passion for data engineering and … to apply! Job Responsibilities Data Engineering & Data Pipeline Development: Design, develop, and optimize scalable data workflows using Python, PySpark, and Airflow. Implement real-time and batch data processing using Spark. Enforce best practices for data quality, governance, and security throughout the data lifecycle. Ensure data availability, reliability, and performance through monitoring and automation. Cloud Data Engineering: Manage cloud infrastructure … data processing workloads. Implement CI/CD pipelines for data workflows to ensure smooth and reliable deployments. Big Data & Analytics: Build and optimize large-scale data processing pipelines using Apache Spark and PySpark. Implement data partitioning, caching, and performance tuning for Spark-based workloads. Work with diverse data formats (structured and unstructured) to support advanced analytics and More ❯
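As an illustration of the kind of Airflow-orchestrated PySpark batch pipeline these responsibilities describe, here is a minimal sketch assuming Airflow 2.x; the storage paths, dataset names, and schedule are hypothetical placeholders rather than details from the posting.

```python
# Minimal sketch of a daily batch ETL: an Airflow DAG triggering a PySpark
# transform. Paths, dataset names, and the schedule are illustrative only.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def run_daily_transform(ds: str, **_) -> None:
    """Read one day of raw events, keep valid rows, and write curated Parquet."""
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("daily_events_transform").getOrCreate()
    raw = spark.read.parquet(f"abfss://raw@examplelake.dfs.core.windows.net/events/{ds}")
    curated = (
        raw.dropDuplicates(["event_id"])          # de-duplicate on the business key
           .filter(F.col("event_type").isNotNull())  # drop malformed rows
           .withColumn("ingest_date", F.lit(ds))
    )
    curated.write.mode("overwrite").parquet(
        f"abfss://curated@examplelake.dfs.core.windows.net/events/{ds}"
    )
    spark.stop()


with DAG(
    dag_id="daily_events_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="transform_events", python_callable=run_daily_transform)
```

Keeping the Spark import inside the task callable keeps the DAG file cheap to parse; in practice a dedicated Spark-submit or Databricks operator would often replace the plain PythonOperator.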
Nursling, Southampton, Hampshire, England, United Kingdom Hybrid / WFH Options
Ordnance Survey
lead Support the Ordnance Survey Testing Community, with common standards such as metrics and use of test tools. Here is a snapshot of the technologies that we use: Scala, Apache Spark, Databricks, Apache Parquet, YAML, Azure Cloud Platform, Azure DevOps (Test Plans, Backlogs, Pipelines), Git, GeoJSON. What we're looking for: Highly skilled in creating, maintaining and More ❯
in data engineering, architecture, or platform management roles, with 5+ years in leadership positions. Expertise in modern data platforms (e.g., Azure, AWS, Google Cloud) and big data technologies (e.g., Spark, Kafka, Hadoop). Strong knowledge of data governance frameworks, regulatory compliance (e.g., GDPR, CCPA), and data security best practices. Proven experience in enterprise-level architecture design and implementation. Hands More ❯
South East London, London, United Kingdom Hybrid / WFH Options
TEN10 SOLUTIONS LIMITED
Driven Projects: Collaborate on exciting and complex data pipelines, platforms, and automation solutions across industries including finance, retail, and government. Cutting-Edge Tech: Work hands-on with tools like Spark, Databricks, Azure Deequ, and more. Career Development: We invest in your growth with dedicated training, mentoring, and support for certifications. Collaborative Culture: Join a diverse, inclusive team that thrives … you'll do: Maintain test automation frameworks tailored for data-intensive environments. Implement validation tests for data pipelines, data quality, and data transformation logic. Use tools like Azure Deequ, Spark, and Databricks to ensure data accuracy and completeness. Write robust, scalable test scripts in Scala, Python, and Java. Integrate testing into CI/CD pipelines and support infrastructure … and data validation techniques. Experience using test automation frameworks for data pipelines and ETL workflows. Strong communication and stakeholder management skills. Nice-to-Have: Hands-on experience with Databricks, Apache Spark, and Azure Deequ. Familiarity with Big Data tools and distributed data processing. Experience with data observability and data quality monitoring. Proficiency with CI/CD tools More ❯
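For context, the data-quality validation work described above might look something like the following plain-PySpark pytest sketch — a stand-in for the kinds of checks tools like Deequ formalise; the table path and column rules are hypothetical.

```python
# Sketch of data-quality tests over a curated table using plain PySpark and
# pytest. The path and the column rules are illustrative assumptions.
import pytest
from pyspark.sql import SparkSession, functions as F


@pytest.fixture(scope="session")
def spark():
    return SparkSession.builder.master("local[2]").appName("dq_tests").getOrCreate()


def test_orders_quality(spark):
    orders = spark.read.parquet("/data/curated/orders")  # hypothetical path

    # Completeness: the business key must never be null.
    null_keys = orders.filter(F.col("order_id").isNull()).count()
    assert null_keys == 0, f"{null_keys} rows have a null order_id"

    # Uniqueness: exactly one row per order.
    assert orders.count() == orders.select("order_id").distinct().count()

    # Validity: order amounts must be non-negative.
    assert orders.filter(F.col("amount") < 0).count() == 0
```

Wired into a CI/CD pipeline, a failing assertion here blocks deployment of the pipeline change that introduced the bad data.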
Reading, Oxfordshire, United Kingdom Hybrid / WFH Options
Henderson Drake
e.g., Azure Data Factory, Synapse, Databricks, Fabric) Data warehousing and lakehouse design ETL/ELT pipelines SQL, Python for data manipulation and machine learning Big Data frameworks (e.g., Hadoop, Spark) Data visualisation (e.g., Power BI) Understanding of statistical analysis and predictive modelling Experience: 5+ years working with Microsoft data platforms 5+ years in a customer-facing consulting or professional More ❯
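To illustrate the lakehouse/ELT side of this skill set, below is a minimal bronze-to-silver transform sketch, assuming a Databricks/Delta environment; the table and column names are hypothetical.

```python
# Illustrative lakehouse ELT step: clean a raw (bronze) table into a curated
# (silver) Delta table that SQL models and Power BI reports can consume.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("bronze_to_silver").getOrCreate()

bronze = spark.read.table("bronze.sales_raw")  # hypothetical source table

silver = (
    bronze.dropDuplicates(["sale_id"])                      # remove replayed records
          .withColumn("sale_date", F.to_date("sale_timestamp"))
          .filter(F.col("quantity") > 0)                    # drop obviously invalid rows
)

# Overwrite the curated table so downstream consumers always read clean data.
silver.write.format("delta").mode("overwrite").saveAsTable("silver.sales")
```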
Haywards Heath, Sussex, United Kingdom Hybrid / WFH Options
First Central Services
to define Bicep templates and deploy Azure infrastructure effectively. Experience creating Azure DevOps YAML pipelines. Knowledge of source control, preferably Git. Strong skills in SQL, Azure Synapse, Data Factory, Spark, and Databricks. Experience establishing coding standards and testing practices. Deep understanding of the full data lifecycle. Involvement in implementing CI/CD methodologies. Experience working in agile, rapid development More ❯
management and monitoring. Hands-on experience with AWS. Have a good grasp of IaC (Infrastructure-as-Code) tools like Terraform and CloudFormation. Previous exposure to additional technologies like Python, Spark, Docker, Kubernetes is desirable. Ability to develop across a diverse technology stack and willingness and ability to take on new technologies. Demonstrated experience participating on cross-functional teams in More ❯
Reigate, Surrey, England, United Kingdom Hybrid / WFH Options
esure Group
time, taking project briefs and refining them to strong results. Exposure to the Python data science stack. Knowledge and working experience of Agile methodologies. Proficient with SQL. Familiarity with Databricks, Spark, and geospatial data/modelling is a plus. Interview process (subject to change): 1) 15-minute conversation with the Talent Team 2) 30-minute interview with the hiring manager 3) You'll be More ❯
to join our defence & security client on a contract basis. You will help design, develop, and maintain secure and scalable data pipelines using the Elastic Stack (Elasticsearch, Logstash, Kibana) and Apache NiFi. These roles support our client's team in Worcester (fully onsite) and require active UK DV clearance. Key Responsibilities: Design, develop, and maintain secure and scalable data pipelines using the Elastic Stack (Elasticsearch, Logstash, Kibana) and Apache NiFi. Implement data ingestion, transformation, and integration processes, ensuring data quality and security. Collaborate with data architects and security teams to ensure compliance with security policies and data governance standards. Manage and monitor large-scale data flows in real-time, ensuring system performance, reliability, and data integrity. Develop robust data … Experience working as a Data Engineer in secure or regulated environments. Expertise in the Elastic Stack (Elasticsearch, Logstash, Kibana) for data ingestion, transformation, indexing, and visualization. Strong experience with Apache NiFi for building and managing complex data flows and integration processes. Knowledge of security practices for handling sensitive data, including encryption, anonymization, and access control. Familiarity with data governance More ❯
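As a rough illustration of the ingestion/indexing step such a pipeline performs, the sketch below uses the official Elasticsearch Python client; the host, credentials, index name, and document shape are hypothetical placeholders.

```python
# Sketch of bulk-indexing parsed log events into Elasticsearch. Connection
# details, index name, and event fields are illustrative assumptions only.
from datetime import datetime, timezone

from elasticsearch import Elasticsearch, helpers

es = Elasticsearch("https://elastic.example.internal:9200", api_key="REDACTED")


def event_actions(events):
    """Yield bulk-index actions for a batch of parsed log events."""
    for event in events:
        yield {
            "_index": "ops-logs-2024",
            "_id": event["event_id"],  # stable id keeps re-ingestion idempotent
            "_source": {
                **event,
                "ingested_at": datetime.now(timezone.utc).isoformat(),
            },
        }


batch = [{"event_id": "1", "level": "INFO", "message": "pipeline started"}]
ok, errors = helpers.bulk(es, event_actions(batch), raise_on_error=False)
print(f"indexed={ok} errors={len(errors)}")
```

In a NiFi-fed deployment, a flow would typically hand batches like this to the indexing step (or post directly to the bulk API), with the monitoring described above watching the error count.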