South West London, London, United Kingdom (Hybrid/Remote Options)
ARC IT Recruitment Ltd
… AWS Platform Build: Demonstrable experience designing and building modern data platforms in AWS. ETL/Orchestration Expertise: Expertise in ETL/ELT design and data orchestration, specifically with Apache Airflow. SQL Mastery: Strong SQL skills with significant experience in query tuning and performance optimisation. Programming Proficiency: Proficiency in Python and Bash (for data processing, scripting, and automation). …
London, South East, England, United Kingdom (Hybrid/Remote Options)
Arc IT Recruitment
… AWS Platform Build: Demonstrable experience designing and building modern data platforms in AWS. ETL/Orchestration Expertise: Expertise in ETL/ELT design and data orchestration, specifically with Apache Airflow. SQL Mastery: Strong SQL skills with significant experience in query tuning and performance optimisation. Programming Proficiency: Proficiency in Python and Bash (for data processing, scripting, and automation). …
East London, London, United Kingdom (Hybrid/Remote Options)
Client Server
London office. About you: You have strong Python backend software engineering skills You have experience working with large data sets You have experience using PySpark and ideally also Apache Spark You believe in automating wherever possible You're a collaborative problem solver with great communication skills Other technology in the stack includes: FastAPI, Django, Airflow, Kafka, ETL, CI …
AWS (S3, Lambda, Glue, Redshift) and/or Azure (Data Lake, Synapse). Programming & Scripting: Proficiency in Python, SQL, PySpark, etc. ETL/ELT & Streaming: Expertise in technologies like Apache Airflow, Glue, Kafka, Informatica, EventBridge, etc. Industrial Data Integration: Familiarity with OT data schema originating from OSIsoft PI, SCADA, MES, and Historian systems. Information Modeling: Experience in defining semantic …
Google Cloud, Databricks) are a strong plus. Technical Skills: • Proficiency in SQL and experience with relational databases (e.g., MySQL, PostgreSQL) • Familiarity with data pipeline and workflow management tools (e.g., Apache Airflow) • Experience with programming languages such as Python, Java, or Scala. Python is highly preferred • Basic understanding of cloud platforms and services (e.g., AWS, Azure, Google Cloud) • Knowledge of …
City of Westminster, London, United Kingdom (Hybrid/Remote Options)
Additional Resources
of Kubernetes, Docker, and cloud-native data ecosystems. Demonstrable experience with Infrastructure as Code tools (Terraform, Ansible). Hands-on experience with PostgreSQL and familiarity with lakehouse technologies (e.g. Apache Parquet, Delta Tables). Exposure to Spark, Databricks, and data lake/lakehouse environments. Understanding of Agile development methods, CI/CD pipelines, GitHub, and automated testing. Practical experience …
using AWS PaaS such as Glue, EMR, SageMaker, Redshift, Aurora and Snowflake Building data processing and analytics pipelines as code, using Python, SQL, PySpark, Spark, CloudFormation, Lambda, Step Functions, Apache Airflow Monitoring and reporting on the data platform performance, usage and security Designing and applying security and access control architectures to secure sensitive data You will have: 6+ years …
could deploy infrastructure into different environments Owning the cloud infrastructure underpinning data systems through a DevOps approach Knowledge of handling and transforming various data types (JSON, CSV, etc.) with Apache Spark, Databricks or Hadoop Good understanding of the possible architectures involved in modern data system design (e.g. Data Warehouses, Data Lakes and Data Meshes) and the different use cases …
of automation IT WOULD BE NICE FOR THE SENIOR SOFTWARE ENGINEER TO HAVE. Cloud-based experience Microservice architecture or serverless architecture Big Data/Messaging technologies such as Apache NiFi/MiNiFi/Kafka TO BE CONSIDERED. Please either apply by clicking online or emailing me directly to For further information please call me on 07704 152 640. …
Desirable Terraform or other Infrastructure as Code tools Experience with NoSQL databases (e.g., MongoDB) or SQL (e.g., Postgres) Understanding of web standards, accessibility and development best practice Experience with Apache NiFi Bonus Skills Microservices in C# Integrating LLMs (e.g., LangChain, Vercel AI) Experience using the GOV.UK Design System and Nunjucks What You'll Love You'll be part of …
experience Preferred Experience with policy languages such as Rego or Cedar Experience with cloud databases such as Amazon DynamoDB, Aurora or Snowflake Experience with data platform tools such as Apache Flink and Kafka Experience with Java/Python CI/CD pipelines – CloudBees/Jenkins SailPoint is an equal opportunity employer and we welcome all qualified candidates to apply …
GCP/BigQuery or other cloud data warehouses (e.g., Snowflake, Redshift). Familiarity with data orchestration tools (e.g., Airflow). Experience with data visualisation platforms (such as Preset.io/Apache Superset or other). Exposure to CI/CD pipelines, ideally using GitLab CI. Background working with media, marketing, or advertising data. The Opportunity: Work alongside smart, supportive teammates …
London, Oxford Circus, United Kingdom (Hybrid/Remote Options)
Datatech
GCP/BigQuery or other cloud data warehouses (e.g., Snowflake, Redshift). · Familiarity with data orchestration tools (e.g., Airflow). · Experience with data visualisation platforms (such as Preset.io/Apache Superset or other). · Exposure to CI/CD pipelines, ideally using GitLab CI. · Background working with media, marketing, or advertising data. The Opportunity: · Work alongside smart, supportive teammates …
teams to build scalable data pipelines and contribute to digital transformation initiatives across government departments. Key Responsibilities Design, develop and maintain robust data pipelines using PostgreSQL and Airflow or Apache Spark Collaborate with frontend/backend developers using Node.js or React Implement best practices in data modelling, ETL processes and performance optimisation Contribute to containerised deployments (Docker/Kubernetes …
London, South East, England, United Kingdom (Hybrid/Remote Options)
Harnham - Data & Analytics Recruitment
Proven experience designing and implementing end-to-end MLOps processes in a production environment. Cloud ML Stack: Expert proficiency with Databricks and MLflow. Big Data/Coding: Expert Apache Spark and Python engineering experience on large datasets. Core Engineering: Strong experience with Git for version control and building CI/CD/release pipelines. Data Fundamentals: Excellent SQL …
ensure data integrity and reliability. Optimise data workflows for performance, cost efficiency, and maintainability using modern data-engineering tools and platforms (e.g., Azure Data Factory, AWS Data Pipeline, Databricks, Apache Spark). Support the integration of data into visualisation platforms and analytical environments (e.g., Power BI, ServiceNow). Ensure adherence to data governance, security, and privacy policies. Document data …
of containerisation and orchestration (e.g., Docker, Kubernetes, OpenShift). Experience with CI/CD pipelines (e.g., Jenkins, TeamCity, Concourse). Familiarity with web/application servers such as NGINX, Apache, or JBoss. Exposure to monitoring and logging tools (ELK, Nagios, Splunk, DataDog, New Relic, etc.). Understanding of security and identity management (OAuth2, SSO, ADFS, Keycloak, etc.). Experience …
GitHub, Jenkins) Proficiency with Git and cloud technologies Experience with Docker and agile delivery Desirable Skills Terraform or other IaC tools NoSQL (ideally MongoDB) or relational databases (e.g., Postgres) Apache NiFi Web standards, accessibility and development best practice Bonus Experience Microservices in C#/.NET Integrating LLMs with tools like LangChain or Vercel AI Additional Information You must be …
to learn new technologies IT WOULD BE NICE FOR THE DATA ENGINEER TO HAVE... Cloud-based architectures Microservice architecture or serverless architecture Messaging/routing technologies such as Apache NiFi/RabbitMQ TO BE CONSIDERED... Please either apply by clicking online or emailing me directly to For further information please call me on . I can make myself …
Proven experience in platform migration projects and delivering secure, scalable cloud solutions. Nice to have: Immediate availability Understanding of Synapse Spark Architecture Familiarity with a variety of technologies, including Apache Spark, Spark ML, Serverless Spark Pools, T-SQL, ETL Hays Specialist Recruitment Limited acts as an employment agency for permanent recruitment and an employment business for the supply of temporary workers. …
Employment Type: Contract
Rate: £600.0 - £800.0 per day, market rate (Inside IR35)
optimization of SQL queries. Experience in capturing reporting requirements, designing reports, and developing dashboards using an interactive and agile approach. Experience with at least one BI platform, such as Apache Superset, Looker, Metabase, SAP BI/SAC, MicroStrategy, Microsoft Power BI, Tableau, or Qlik. Familiarity with agile development methodologies. Nice to have Experience in Java Experience in test driven …
Strong understanding of Frontend Technologies such as React, Redux, TypeScript, Material UI, Storybook, Jest, and Scala.js. Proven working knowledge of Backend & Databases such as Scala (v2), Play Framework, ScalaTest, Apache Pekko (including testkits), Mockito, PostgreSQL, and MongoDB. Familiarity with DevOps & Cloud tools like Docker, Kubernetes, Sysdig, OpenSearch, and AWS. Excellent communication and stakeholder management skills. Nice to have: Immediate …
Employment Type: Contract
Rate: Up to £425 per day (Inside IR35)