City of London, London, United Kingdom Hybrid / WFH Options
Fortice
between the data warehouse and other systems. Create deployable data pipelines that are tested and robust, using a variety of techniques depending on the available technologies (NiFi, Spark). Build analytics tools that utilise the data pipeline to provide actionable insights into client requirements, operational efficiency, and other key business performance metrics. Complete onsite client visits and provide
Central London, London, England, United Kingdom Hybrid / WFH Options
Reed
align tech strategy with business objectives and cost efficiency.
- Security & Compliance: Strong understanding of GDPR, API authentication, and observability.
- Big Data: Experience with data lakes, warehouses, and tools like Spark, Kafka, and Airflow.
- ETL Expertise: Ability to evaluate and optimise data ingestion and transformation pipelines.
- DevOps & CI/CD: Hands-on experience with Jenkins, GitHub Actions, Terraform, and CloudFormation.
Employment Type: Full-Time
Salary: £120,000 - £150,000 per annum, inc. benefits
City of London, London, United Kingdom Hybrid / WFH Options
Anson Mccade
knowledge of Kafka, Confluent, Databricks, Unity Catalog, and cloud-native architecture.
- Skilled in Data Mesh, Data Fabric, and product-led data strategy design.
- Experience with big data tools (e.g., Spark), ETL/ELT, SQL/NoSQL, and data visualisation.
- Confident communicator with a background in consultancy, stakeholder management, and Agile delivery.
Want to hear more? Message me anytime. LinkedIn
City of London, London, United Kingdom Hybrid / WFH Options
Omnis Partners
deployment.
🛠️ Key Responsibilities
- Build and maintain high-performance data pipelines to power AI/ML use cases
- Architect cloud-native data platforms using tools like Databricks, Airflow, Snowflake, and Spark
- Collaborate with AI/ML teams to align data processing with model requirements
- Develop ETL/ELT workflows to support feature engineering, model training, and inference
- Optimise data workflows … with Scala or Java
- Experience supporting AI/ML workflows and working with Data Scientists
- Exposure to cloud platforms: AWS, Azure, or GCP
- Hands-on with modern data tooling: Spark, Databricks, Snowflake, Airflow
- Solid grasp of data modelling, orchestration, and infrastructure-as-code (Terraform, Docker, CI/CD)
- Excellent communication and client-facing skills; comfortable leading on technical delivery