… field. Technical Skills Required: Hands-on software development experience with Python, and experience with modern software development and release engineering practices (e.g. TDD, CI/CD). Experience with Apache Spark or other distributed data processing frameworks. Comfortable writing efficient SQL and debugging queries on cloud warehouses such as Databricks SQL or Snowflake. Experience with cloud infrastructure such as AWS or …
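To make the Spark and warehouse-SQL skills named above more concrete, here is a minimal PySpark sketch, assuming a hypothetical "orders" table with "order_ts" and "amount" columns; the equivalent SQL one might run interactively on Databricks SQL or Snowflake is shown in the trailing comment.

```python
# Minimal PySpark sketch: aggregate daily order revenue.
# Table and column names (orders, order_ts, amount) are illustrative placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily-revenue").getOrCreate()

orders = spark.table("orders")  # assumes a table registered in the warehouse catalog

daily_revenue = (
    orders
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("order_date")
    .agg(F.sum("amount").alias("revenue"))
    .orderBy("order_date")
)
daily_revenue.show()

# Roughly equivalent warehouse SQL:
# SELECT CAST(order_ts AS DATE) AS order_date, SUM(amount) AS revenue
# FROM orders GROUP BY 1 ORDER BY 1;
```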
… architectures (Lambda, Fargate, Cloud Run, et al.) and a clear understanding of when not to use them. Experience with message queues (SQS, Pub/Sub, RabbitMQ, etc.) and data pipelines (Kafka, Beam, Kinesis, etc.). You are an effective team player with strong communication, presentation and influencing skills. You have a passion for improving coding and development practices. You have worked with …
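As a small, hedged illustration of the message-queue experience mentioned above, the sketch below long-polls an SQS queue with boto3; the queue URL, region, and message handling are assumptions made for the example.

```python
# Illustrative SQS consumer: long-poll a queue, process messages, then delete them.
# The queue URL and process_message logic are placeholders for this sketch.
import boto3

sqs = boto3.client("sqs", region_name="eu-west-2")
QUEUE_URL = "https://sqs.eu-west-2.amazonaws.com/123456789012/example-queue"  # hypothetical

def process_message(body: str) -> None:
    print(f"processing: {body}")

while True:
    response = sqs.receive_message(
        QueueUrl=QUEUE_URL,
        MaxNumberOfMessages=10,
        WaitTimeSeconds=20,  # long polling reduces empty receives
    )
    for message in response.get("Messages", []):
        process_message(message["Body"])
        # Delete only after successful processing so failed messages are retried.
        sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=message["ReceiptHandle"])
```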
Central London, London, United Kingdom Hybrid / WFH Options
Singular Recruitment
… applications and high proficiency in SQL for complex querying and performance tuning. ETL/ELT Pipelines: Proven experience designing, building, and maintaining production-grade data pipelines using Google Cloud Dataflow (Apache Beam) or similar technologies. GCP Stack: Hands-on expertise with BigQuery, Cloud Storage, Pub/Sub, and orchestrating workflows with Composer or Vertex Pipelines. Data Architecture & Modelling: Ability …
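For readers less familiar with Dataflow, here is a minimal Apache Beam (Python SDK) sketch of the kind of pipeline described above; the bucket paths are hypothetical, and the same code runs locally with the default runner or on Dataflow when the runner options are changed.

```python
# Minimal Apache Beam batch pipeline: read text, count words, write results.
# Input/output paths are placeholders; pass --runner=DataflowRunner (plus project,
# region and temp_location options) to execute on Google Cloud Dataflow.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions()

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        | "Read" >> beam.io.ReadFromText("gs://example-bucket/input/*.txt")
        | "Split" >> beam.FlatMap(lambda line: line.split())
        | "PairWithOne" >> beam.Map(lambda word: (word, 1))
        | "Count" >> beam.CombinePerKey(sum)
        | "Format" >> beam.MapTuple(lambda word, count: f"{word},{count}")
        | "Write" >> beam.io.WriteToText("gs://example-bucket/output/word_counts")
    )
```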
… and infrastructure as code (Terraform) for rapid deployment. Experience that will give you an edge: Experience using Python on Google Cloud Platform for Big Data projects, including BigQuery, Dataflow (Apache Beam), Cloud Run, Cloud Functions, Cloud Workflows, and Cloud Composer. Strong SQL development skills. Proven expertise in data modeling, ETL development, and data warehousing. Knowledge of data management …
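As a small illustration of using Python with BigQuery as mentioned above, the sketch below runs a query with the google-cloud-bigquery client; the project, dataset, and table names are assumptions for the example.

```python
# Illustrative BigQuery query from Python; project, dataset and table names are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project id

query = """
    SELECT user_id, COUNT(*) AS events
    FROM `example-project.analytics.events`
    WHERE event_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)
    GROUP BY user_id
    ORDER BY events DESC
    LIMIT 10
"""

for row in client.query(query).result():
    print(row["user_id"], row["events"])
```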
… IV, IFRS 9, CRD4). Strong leadership and stakeholder engagement skills. 15+ years in software development and cloud engineering, ideally in financial services. Experience with big data frameworks (e.g., Apache Beam, Spark) and data governance tools. About working for us: Our ambition is to be the leading UK business for diversity, equity and inclusion, supporting our customers, colleagues …
Familiarity with big data technologies (e.g., Spark, Hadoop). Background in time-series analysis and forecasting. Experience with data governance and security best practices. Real-time data streaming is a plus (Kafka, Beam, Flink). Experience with Kubernetes is a plus. Energy/maritime domain knowledge is a plus. What We Offer: Competitive salary commensurate with experience and comprehensive benefits package (medical, dental, vision …
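Since real-time streaming with Kafka is called out above, here is a minimal, hedged consumer sketch using the kafka-python library; the topic name, broker address, and processing step are placeholders for the example.

```python
# Illustrative Kafka consumer using kafka-python; topic and broker address are placeholders.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "sensor-readings",                   # hypothetical topic name
    bootstrap_servers="localhost:9092",  # hypothetical broker
    group_id="example-consumer-group",
    auto_offset_reset="earliest",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

for message in consumer:
    reading = message.value
    # Placeholder processing: a real pipeline might update a window or forward downstream.
    print(message.topic, message.partition, message.offset, reading)
```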
… Java, SQL, search technologies, and messaging technologies, with a good understanding of Kubernetes and Docker, and preferably some exposure to GCP (Google Cloud Platform), ReactJS, Python, and big data technologies like Spark and Beam. Excellent personal organisation, a completer/finisher and consensus builder with excellent communication skills, able to lead global development teams both directly and via influence. Extensive experience practicing TDD (Test …
City of London, London, United Kingdom Hybrid / WFH Options
QiH Group
… unsupervised, and reinforcement learning methods. Experience with GCP services such as Vertex AI, BigQuery ML, Dataflow, AI Platform Pipelines, and Dataproc. Solid knowledge of distributed systems, data streaming (e.g., Apache Beam, Kafka), and large-scale data processing. ML Ops: Hands-on experience with continuous integration/deployment (CI/CD) for ML, model versioning, and monitoring. Business Acumen: Ability to understand marketing and advertising concepts like customer lifetime value (CLV), attribution modeling, real-time bidding (RTB), and audience targeting. Strong understanding of data pipeline orchestration tools (e.g., Apache Airflow, Kubernetes). You thrive when working as part of a team. Comfortable in a fast-paced environment. Have excellent written and verbal English skills. Last but not least …
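To illustrate the pipeline-orchestration experience mentioned above, here is a minimal Apache Airflow DAG sketch; the DAG id, schedule, and task bodies are assumptions for the example rather than anything taken from the listing.

```python
# Minimal Airflow DAG sketch: two dependent Python tasks on a daily schedule.
# DAG id, schedule, and task bodies are illustrative placeholders.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("extracting raw data")

def transform():
    print("transforming and loading data")

with DAG(
    dag_id="example_daily_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # Airflow 2.4+ keyword; older versions use schedule_interval
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)

    extract_task >> transform_task  # run transform only after extract succeeds
```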
… TensorFlow, PyTorch, scikit-learn). Experience with cloud platforms (AWS, GCP, Azure). Experience with CI/CD pipelines for machine learning (e.g., Vertex AI). Familiarity with data processing tools like Apache Beam/Dataflow. Strong understanding of monitoring and maintaining models in production environments. Experience with containerization tools (e.g., Docker). Problem-solving skills with the ability to troubleshoot model …
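As a small, hedged illustration of the model training and packaging step that an ML CI/CD pipeline (e.g., on Vertex AI) would automate, the sketch below fits a scikit-learn model, evaluates it, and serializes a versioned artifact with joblib; the dataset and output path are placeholders.

```python
# Illustrative training step: fit a model, evaluate it, and save a versioned artifact.
# The dataset and output filename are placeholders for this sketch.
import joblib
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = RandomForestClassifier(n_estimators=100, random_state=42)
model.fit(X_train, y_train)

accuracy = accuracy_score(y_test, model.predict(X_test))
print(f"held-out accuracy: {accuracy:.3f}")

# A CI/CD job might upload this artifact to a model registry and record the metric.
joblib.dump(model, "model-v1.joblib")
```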
… constructive feedback to foster accountability, growth, and collaboration within the team. Who You Are: Experienced with Data Processing Frameworks: Skilled with higher-level JVM-based frameworks such as Flink, Beam, Dataflow, or Spark. Comfortable with Ambiguity: Able to work through loosely defined problems and thrive in autonomous team environments. Skilled in Cloud-based Environments: Proficient with large-scale data …
… indexing, partitioning. Hands-on IaC development experience with Terraform or CloudFormation. Understanding of the ML development workflow and knowledge of when and how to use dedicated hardware. Significant experience with Apache Spark or other distributed data processing frameworks (e.g. Flink, Hadoop, Beam). Familiarity with Databricks as a data and AI platform or with the Lakehouse architecture. Experience with data …
Programming Mastery: Advanced skills in Python or another major language; writing clean, testable, production-grade ETL code at scale. Modern Data Pipelines: Experience with batch and streaming frameworks (e.g., Apache Spark, Flink, Kafka Streams, Beam), including orchestration via Airflow, Prefect, or Dagster. Data Modeling & Schema Management: Demonstrated expertise in designing, evolving, and documenting schemas (OLAP/OLTP, dimensional …
… backend engineering, while growing the ability to work effectively across both. Experience with processing large-scale transactional and financial data, using batch/streaming frameworks like Spark, Flink, or Beam (with Scala for data engineering), and building scalable backend systems in Java. You possess a foundational understanding of system design, data structures, and algorithms, coupled with a strong desire …
… and CNNs. Excellent communication skills. Degree in CS, maths, statistics, engineering, physics or similar. Desirable Requirements: NoSQL databases such as Elasticsearch or MongoDB (bonus); modern data tools such as Spark/Beam (bonus); streaming technologies such as Spark/Akka Streams (bonus). Tagged as: Industry, NLP, United Kingdom.