London, South East, England, United Kingdom (Hybrid/Remote Options)
Involved Solutions
Desirable Skills for the AWS Data Engineer:
- Experience with Databricks, Kafka, or Kinesis for real-time data streaming (see the consumer sketch below)
- Knowledge of containerisation (Docker, ECS) and modern orchestration tools such as Airflow
- Familiarity with machine learning model deployment pipelines or data lakehouse architectures

Data Engineer, AWS Data Engineer …
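For context on the streaming skills this listing names, here is a minimal sketch of a Kafka consumer in Python using the kafka-python library; the broker address and topic name are hypothetical placeholders, not details from the listing:

```python
# Minimal Kafka consumer sketch using kafka-python.
# The topic ("events") and broker address are hypothetical placeholders.
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "events",                                  # hypothetical topic name
    bootstrap_servers=["localhost:9092"],      # hypothetical broker
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="earliest",
)

for message in consumer:
    # Each record arrives as a deserialized dict; a real pipeline would
    # validate, transform, and land it in a sink (S3, warehouse, etc.).
    print(message.value)
```

A Kinesis equivalent would follow the same shape with boto3's `get_records` loop; the point is the continuous consume-transform-land pattern the role describes.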
… Azure, or GCP, with hands-on experience in cloud-based data services.
- Proficiency in SQL and Python for data manipulation and transformation.
- Experience with modern data engineering tools, including Apache Spark, Kafka, and Airflow (a Spark sketch follows below).
- Strong understanding of data modelling, schema design, and data warehousing concepts.
- Familiarity with data governance, privacy, and compliance frameworks (e.g., GDPR, ISO 27001).
- Hands-on …
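To illustrate the kind of Spark work this listing implies, a minimal PySpark batch-transform sketch; the S3 paths and column names (created_at, amount, region) are hypothetical:

```python
# Minimal PySpark aggregation sketch; paths and columns are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_rollup").getOrCreate()

# Read raw order events from a (hypothetical) data-lake path.
orders = spark.read.parquet("s3://example-bucket/orders/")

# Roll up order amounts to one row per day and region.
daily_totals = (
    orders
    .withColumn("order_date", F.to_date("created_at"))
    .groupBy("order_date", "region")
    .agg(F.sum("amount").alias("total_amount"))
)

daily_totals.write.mode("overwrite").parquet("s3://example-bucket/daily_totals/")
```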
London, South East, England, United Kingdom (Hybrid/Remote Options)
Robert Half
… /ETL processes for improved performance, cost efficiency, and scalability.
* Troubleshoot and resolve data pipeline issues swiftly and effectively across the data platform.
* Work with orchestration tools such as Airflow, ADF, or Prefect to schedule and automate workflows (see the DAG sketch after this list).
* Keep abreast of industry trends and emerging technologies in data engineering, and continuously improve your skills and knowledge.

Profile
* Minimum …
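As a concrete picture of the orchestration work described above, a minimal Airflow 2.x-style DAG sketch; the DAG id, schedule, and task bodies are hypothetical placeholders:

```python
# Minimal Airflow DAG sketch; dag_id, schedule, and task logic are
# hypothetical placeholders for the kind of workflow the role describes.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pull data from source")    # placeholder extract step


def load():
    print("write data to warehouse")  # placeholder load step


with DAG(
    dag_id="example_etl",             # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task         # extract runs before load
```

Prefect and ADF express the same idea (tasks with dependencies on a schedule) in their own idioms.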
London, South East, England, United Kingdom (Hybrid/Remote Options)
Lorien
… data storytelling and operational insights.
- Optimise data workflows across cloud and on-prem environments, ensuring performance and reliability.

Skills & Experience:
- Strong experience in ETL pipeline development using tools like Apache Airflow, Informatica, or similar (a simple ETL step is sketched below).
- Advanced SQL skills and experience with large-scale relational and cloud-based databases.
- Hands-on experience with Tableau for data visualisation and dashboarding.
- Exposure …
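For the ETL-plus-dashboarding combination this listing asks for, a minimal pandas/SQLAlchemy sketch that aggregates an extract and loads it into a reporting table a tool like Tableau could read; the connection string, source file, and table name are hypothetical:

```python
# Minimal ETL-step sketch: extract a CSV, aggregate, load to PostgreSQL.
# The DSN, file name, and table name are hypothetical placeholders.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql://etl@localhost/analytics")  # hypothetical

sales = pd.read_csv("sales_extract.csv")                        # hypothetical extract
sales["order_date"] = pd.to_datetime(sales["order_date"])

# Aggregate to one row per calendar month.
monthly = (
    sales.groupby(sales["order_date"].dt.to_period("M"))["amount"]
    .sum()
    .reset_index(name="total_amount")
)
monthly["order_date"] = monthly["order_date"].astype(str)

# Load the aggregate into a reporting table for the dashboard layer.
monthly.to_sql("monthly_sales", engine, if_exists="replace", index=False)
```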
London, South East, England, United Kingdom (Hybrid/Remote Options)
Searchability
… are giving express consent for us to process (subject to required skills) your application to our client in conjunction with this vacancy only.

KEY SKILLS: GCP | Python | SQL | MongoDB | Airflow | dbt | Terraform | Docker | ETL | AI | Machine Learning …
… Machine Learning: Experience supporting and understanding ML pipelines and models in a production setting.
- Direct experience with Google Cloud Platform, BigQuery, and associated tooling (a BigQuery sketch follows below).
- Experience with workflow tools like Airflow or Kubeflow.
- Familiarity with dbt (Data Build Tool).

Please send your CV for more information on these roles.

Reasonable Adjustments: Respect and equality are core values to us. …
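As context for the BigQuery tooling mentioned above, a minimal query sketch using the google-cloud-bigquery client; the project, dataset, and table names are hypothetical placeholders:

```python
# Minimal BigQuery query sketch using google-cloud-bigquery.
# Project, dataset, and table names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client()  # uses ambient GCP credentials

query = """
    SELECT user_id, COUNT(*) AS event_count
    FROM `example_project.analytics.events`   -- hypothetical table
    GROUP BY user_id
    ORDER BY event_count DESC
    LIMIT 10
"""

# Run the query and iterate the result rows.
for row in client.query(query).result():
    print(row.user_id, row.event_count)
```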
… and dimensional data modelling (SCDs, fact/dim, conformed dimensions)
- Experience with PostgreSQL optimisation
- Advanced Python skills

ETL/ELT Pipelines:
- Hands-on experience building pipelines using SSIS, dbt, Airflow, or similar (a batched-load sketch follows this list)
- Strong understanding of enterprise ETL frameworks, lineage, and data quality

Cloud & Infrastructure:
- Experience designing and supporting AWS-based analytical infrastructure
- Skilled in working with S3 and integrating …
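To make the PostgreSQL pipeline work above concrete, a minimal batched-upsert sketch with psycopg2; the DSN, table name (dim_customer), and columns are hypothetical placeholders:

```python
# Minimal PostgreSQL load sketch using psycopg2; DSN, table, and
# columns are hypothetical placeholders.
import psycopg2
from psycopg2.extras import execute_values

rows = [(1, "Alice", "London"), (2, "Bob", "Leeds")]  # hypothetical batch

conn = psycopg2.connect("dbname=analytics user=etl")  # hypothetical DSN
with conn, conn.cursor() as cur:
    # Batched upsert: far fewer round-trips than row-by-row INSERTs,
    # and ON CONFLICT keeps reruns of the load idempotent.
    execute_values(
        cur,
        """
        INSERT INTO dim_customer (customer_id, name, city)
        VALUES %s
        ON CONFLICT (customer_id) DO UPDATE
        SET name = EXCLUDED.name, city = EXCLUDED.city
        """,
        rows,
    )
conn.close()
```

The same pattern underpins slowly changing dimension (SCD type 1) maintenance in the dimensional models the listing mentions.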
… libraries, and container environments for production readiness.

Deployment & Delivery Oversight
- Provide technical leadership across the full deployment life cycle.
- Partner with data centre operations to ensure correct rack layouts, cabling, airflow and power design.
- Support delivery teams during build-out phases, ensuring the design is executed correctly.
- Participate in factory acceptance tests (FAT), site acceptance tests (SAT), and operational readiness …
… cross-functional teams to build scalable data pipelines and contribute to digital transformation initiatives across government departments.

Key Responsibilities
- Design, develop and maintain robust data pipelines using PostgreSQL and Airflow or Apache Spark (a combined sketch follows this listing)
- Collaborate with frontend/backend developers using Node.js or React
- Implement best practices in data modelling, ETL processes and performance optimisation
- Contribute to containerised deployments … essential
- Operate within Agile teams and support DevOps practices

What We're Looking For
- Proven experience as a Data Engineer in complex environments
- Strong proficiency in PostgreSQL and either Airflow or Spark
- Solid understanding of Node.js or React for integration and tooling
- Familiarity with containerisation technologies (Docker/Kubernetes) is a plus
- Excellent communication and stakeholder engagement skills
- Experience …
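For the PostgreSQL-plus-Spark pairing this role names, a sketch of a Spark job ingesting a PostgreSQL table over JDBC into a data lake; the JDBC URL, credentials, table, and output path are hypothetical, and it assumes the PostgreSQL JDBC driver jar is on Spark's classpath:

```python
# Sketch of a Spark-reads-PostgreSQL ingestion step via JDBC.
# URL, credentials, table, and output path are hypothetical placeholders;
# assumes the PostgreSQL JDBC driver jar is available to Spark.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("pg_to_lake").getOrCreate()

events = (
    spark.read.format("jdbc")
    .option("url", "jdbc:postgresql://localhost:5432/govdata")  # hypothetical
    .option("dbtable", "public.events")                         # hypothetical
    .option("user", "etl")
    .option("password", "secret")
    .load()
)

# Land the table as Parquet for downstream analytical pipelines.
events.write.mode("overwrite").parquet("/data/lake/events/")    # hypothetical
```

In practice a job like this would be one task in the Airflow DAG pattern sketched earlier, scheduled and monitored by the orchestrator.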