and BI. Advanced skills in SQL and Python, with hands-on experience in relational databases across cloud and on-prem environments. Familiarity with modern data technologies such as Apache Spark, Kafka, or Snowflake. A comprehensive understanding of the data engineering lifecycle, including Agile delivery, DevOps, Git, APIs, containers, microservices, and pipeline orchestration. Nice to have: DP …
City of London, London, United Kingdom Hybrid / WFH Options
Medialab Group
data warehouses (e.g. Snowflake, Redshift). Familiarity with data pipelining and orchestration systems (e.g. Airflow). Understanding of modern analytics architectures and data visualisation tools (we use Preset.io/Apache Superset). Exposure to CI/CD pipelines (GitLab CI preferred). Experience with advertising or media data is a plus.
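As a purely illustrative sketch of the orchestration tooling this listing names, here is a minimal Airflow DAG — assuming Airflow 2.x; the dag_id, schedule, and task body are hypothetical, not from the listing:

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_and_load():
    # Placeholder body; a real task would pull from the warehouse
    # (e.g. Snowflake/Redshift) and load a reporting layer.
    print("extract + load step")


# Hypothetical daily pipeline with basic retry behaviour.
with DAG(
    dag_id="daily_warehouse_refresh",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    load = PythonOperator(task_id="extract_and_load", python_callable=extract_and_load)
```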
with SPARQL and Python. Strong understanding of graph databases (e.g., RDF, Neo4j, GraphDB). Experience in data modeling and schema design. Familiarity with data pipeline tools and frameworks (e.g., Apache Airflow, Luigi). Excellent problem-solving and analytical skills. Ability to work independently and collaboratively within cross-functional teams. Preferred Background: Experience in Health Informatics or Clinical Decision Support …
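As a sketch of the SPARQL-plus-Python combination this listing asks for, a minimal query using the rdflib library — the file name and query are illustrative only:

```python
from rdflib import Graph

# Load a hypothetical RDF file of clinical terms.
g = Graph()
g.parse("conditions.ttl", format="turtle")

# Simple SPARQL query: every subject that carries an rdfs:label.
query = """
    SELECT ?s ?label
    WHERE {
        ?s <http://www.w3.org/2000/01/rdf-schema#label> ?label .
    }
    LIMIT 10
"""
for row in g.query(query):
    print(row.s, row.label)
```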
web analytics, content management systems (CMS), subscription platforms, ad tech, and social media. Ability to automate and optimise data workflows, using modern ETL/ELT tools (e.g., Airflow, dbt, Apache Spark) to ensure timely and reliable delivery of data. Experience building robust data models and reporting layers to support performance dashboards, user engagement analytics, ad revenue tracking, and A … with 2+ years of hands-on experience in a data engineering role. Tools & Technologies:
- Databases: Proficient in relational SQL databases.
- Workflow Management Tools: Exposure to orchestration platforms such as Apache Airflow.
- Programming Languages: Skilled in one or more of the following languages: Python, Java, Scala.
- Cloud Infrastructure: Understanding of cloud infrastructure such as GCP and tools within the …
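To illustrate the kind of ETL/ELT workflow described above, a minimal PySpark sketch that rolls raw events up into a reporting table — the paths and column names are hypothetical:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("engagement_rollup").getOrCreate()

# Hypothetical raw event data; location and schema are illustrative only.
events = spark.read.parquet("s3://bucket/raw/page_views/")

# Aggregate to a daily engagement mart for dashboards.
daily = (
    events
    .withColumn("day", F.to_date("event_ts"))
    .groupBy("day", "article_id")
    .agg(
        F.countDistinct("user_id").alias("unique_users"),
        F.count("*").alias("views"),
    )
)

daily.write.mode("overwrite").parquet("s3://bucket/marts/daily_engagement/")
```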
for Financial Services, Manufacturing, Life Sciences and Healthcare, Technology and Services, Telecom and Media, Retail and CPG, and Public Services. Consolidated revenues of $13+ billion. Location - London. Skill - Apache Hadoop. We are looking for open-source contributors to Apache projects who have an in-depth understanding of the code behind the Apache ecosystem, should have experience … possess in-depth knowledge of the big data tech stack. Requirements:
- Experience of platform engineering along with application engineering (hands-on).
- Experience in the design of an open-source platform based on the Apache framework for Hadoop.
- Experience in integrating Infrastructure-as-Code in their platform (bespoke implementation from scratch).
- Experience of design and architecture work for the open-source Apache platform … in a hybrid cloud environment.
- Ability to debug and fix code in the open-source Apache codebase, and should be an individual contributor to open-source projects.
Job description: The Apache Hadoop project requires up to 3 individuals with experience in designing and building platforms, and supporting applications both in cloud environments and on-premises. These resources are expected …
City of London, London, United Kingdom Hybrid / WFH Options
Rise Technical Recruitment Limited
a trusted partner across a wide range of businesses. In this role you'll take ownership of the reliability and performance of large-scale data pipelines built on AWS, Apache Flink, Kafka, and Python. You'll play a key role in diagnosing incidents, optimising system behaviour, and ensuring reporting data is delivered on time and without failure. The ideal candidate will have strong experience working with streaming and batch data systems, a solid understanding of monitoring and observability, and hands-on experience working with AWS, Apache Flink, Kafka, and Python. This is a fantastic opportunity to step into an SRE role focused on data reliability in a modern cloud-native environment, with full ownership of incident management … with various other departments and teams to architect scalable, fault-tolerant data solutions. The Person:
*Experience in a data-focused SRE, Data Platform, or DevOps role
*Strong knowledge of Apache Flink, Kafka, and Python in production environments
*Hands-on experience with AWS (Lambda, EMR, Step Functions, Redshift, etc.)
*Comfortable with monitoring tools, distributed systems debugging, and incident response …
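As a minimal sketch of the kind of Kafka consumption such a reporting pipeline involves — assuming the kafka-python client; the topic, broker, and group names are hypothetical:

```python
import json

from kafka import KafkaConsumer

# Hypothetical topic and broker; names are illustrative only.
consumer = KafkaConsumer(
    "report-events",
    bootstrap_servers=["broker1:9092"],
    group_id="reporting-sre",
    auto_offset_reset="earliest",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)

for message in consumer:
    # In an SRE context this handler would emit metrics (consumer lag,
    # error counts) to the monitoring stack rather than printing.
    print(message.topic, message.partition, message.offset, message.value)
```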
City of London, London, United Kingdom Hybrid / WFH Options
QiH Group
unsupervised, and reinforcement learning methods. Experience with GCP services such as Vertex AI, BigQuery ML, Dataflow, AI Platform Pipelines, and Dataproc. Solid knowledge of distributed systems, data streaming (e.g., Apache Beam, Kafka), and large-scale data processing. ML Ops: Hands-on experience with continuous integration/deployment (CI/CD) for ML, model versioning, and monitoring. Business Acumen: Ability to understand marketing and advertising concepts like customer lifetime value (CLV), attribution modeling, real-time bidding (RTB), and audience targeting. Strong understanding of data pipeline orchestration tools (e.g., Apache Airflow, Kubernetes). You thrive when working as part of a team. Comfortable in a fast-paced environment. Have excellent written and verbal English skills. Last but not least, you …
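To illustrate the Apache Beam programming model referenced above, a toy batch pipeline in Python — the data and step labels are illustrative; a production job would read from a source such as Pub/Sub and run on Dataflow by swapping the runner:

```python
import apache_beam as beam

# Count ad events per type with a keyed combine.
with beam.Pipeline() as pipeline:
    (
        pipeline
        | "Create" >> beam.Create([("click", 1), ("view", 1), ("click", 1)])
        | "SumPerKey" >> beam.CombinePerKey(sum)
        | "Print" >> beam.Map(print)
    )
```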