robust way possible! Diverse training opportunities and social benefits (e.g. UK pension scheme) What do you offer? Strong hands-on experience working with modern Big Data technologies such as Apache Spark, Trino, Apache Kafka, Apache Hadoop, Apache HBase, Apache NiFi, Apache Airflow, OpenSearch Proficiency in cloud-native technologies such as containerization and Kubernetes Strong …
data cataloging and metadata management using tools like AWS Glue Data Catalog. Demonstrated self-sufficiency in exploring new tools, troubleshooting issues, and continuously improving processes. Hands-on experience with Apache Airflow for orchestrating complex data workflows and ensuring reliable execution. Understanding of cloud security and governance practices including IAM, KMS, and data access policies. Experience with monitoring and observability …
/medical devices preferred but not required) Strong Python programming and data engineering skills (Pandas, PySpark, Dask) Proficiency with databases (SQL/NoSQL), ETL processes, and modern data frameworks (Apache Spark, Airflow, Kafka) Solid experience with cloud platforms (AWS, GCP, or Azure) and CI/CD for data pipelines Understanding of data privacy and healthcare compliance (GDPR, HIPAA, ISO …)
data quality, or other areas directly relevant to data engineering responsibilities and tasks Proven project experience developing and maintaining data warehouses in big data solutions (Snowflake) Expert knowledge in Apache technologies such as Kafka, Airflow, and Spark to build scalable and efficient data pipelines Ability to design, build, and deploy data solutions that capture, explore, transform, and utilize data …
Linux-based, concurrent, high-throughput, low-latency software systems Experience with pipeline orchestration frameworks (e.g. Airflow, Dagster) Experience with streaming platforms (e.g. Kafka), data lake platforms (e.g. Delta Lake, Apache Iceberg), and relational databases Have a Bachelor's or advanced degree in Computer Science, Mathematics, Statistics, Physics, Engineering, or equivalent work experience For more information about DRW's processing activities …
MySQL Exposure to Docker, Kubernetes, AWS, Helm, Terraform, Vault, Grafana, ELK Stack, New Relic Relevant experience in the maintenance of data APIs and data lake architectures, including experience with Apache Iceberg, Trino/Presto, ClickHouse, Snowflake, BigQuery Master's degree in Computer Science or Engineering-related field Get to know us better YouGov is a global online research company …
problem-solving skills We'd love to see: Knowledge of database systems and trade-offs in distributed systems Familiarity with API design Familiarity with orchestration frameworks such as Apache Airflow, Argo Workflows, Conductor, etc. Experience working with and designing systems utilizing AWS Bloomberg is an equal opportunity employer and we value diversity at our company. We do not …
data engineering tasks. Experience building and maintaining web scraping pipelines. Strong SQL skills, with expertise in performance tuning. Strong proficiency with dbt for data transformations. Hands-on experience with Apache Airflow or Prefect. Proficiency with GitHub, GitHub Actions, and CI/CD pipelines. Nice to have: Experience with GCP (BigQuery, Dataflow, Composer, Pub/Sub) or AWS. Familiarity with …
field. Technical Skills Required Hands-on software development experience with Python and experience with modern software development and release engineering practices (e.g. TDD, CI/CD). Experience with Apache Spark or any other distributed data programming frameworks. Comfortable writing efficient SQL and debugging on cloud warehouses like Databricks SQL or Snowflake. Experience with cloud infrastructure like AWS or …
Knowledge of MiFID II, Dodd-Frank regulations and controls. Knowledge/experience of FIX flows and RFQ workflows (TradeWeb, RFQ Hub, BlackRock, Bloomberg). Additional technology experience: React JS, Apache NiFi, Mongo, DBaaS, SaaS, Tibco/Solace or similar messaging middleware. Education Bachelor's degree or equivalent experience operating in a similar role. This job description provides a high …
of automation IT WOULD BE NICE FOR THE SENIOR SOFTWARE ENGINEER TO HAVE.... Cloud-based experience Microservice architecture or serverless architecture Big Data/Messaging technologies such as Apache NiFi/MiNiFi/Kafka TO BE CONSIDERED.... Please either apply by clicking online or emailing me directly to For further information please call me on . I can …
to solve complex client challenges Strong software engineering foundation in Python, JavaScript/TypeScript, SQL, and cloud platforms such as AWS, GCP, or Azure Familiarity with data technologies like Apache Spark or Databricks, and a structured, analytical approach to problem-solving If you're passionate about building AI-powered applications that positively impact millions of people and businesses, and …
applicable. Essential Skills & Experience Proven experience owning and operating production data platforms within AWS. Strong understanding of AWS core services: EventBridge, Lambda, EC2, S3, and MWAA (Managed Workflows for Apache Airflow). Experience with infrastructure reliability, observability tooling, and platform automation. Solid experience with CI/CD pipelines, preferably Bitbucket Pipelines. Familiarity with Snowflake administration and deployment practices. Comfortable …
data quality, security, and best practices Collaborate with cross-functional teams Implement and manage MLOps capabilities Essential Skills: Advanced Python programming skills Expertise in data engineering tools and frameworks (Apache Flink) Hands-on AWS experience (Serverless, CloudFormation, CDK) Strong understanding of containerization, CI/CD, and DevOps Modern data storage knowledge Sound like you? Please get your CV over …
and Data Practice. You will have the following experience: 8+ years of experience in data engineering or cloud development. Strong hands-on experience with AWS services. Proficiency in Databricks, Apache Spark, SQL, and Python. Experience with data modeling, data warehousing, and DevOps practices. Familiarity with Delta Lake, Unity Catalog, and Databricks REST APIs. Excellent problem-solving and communication skills.
Central London, London, United Kingdom Hybrid / WFH Options
Singular Recruitment
applications and high proficiency in SQL for complex querying and performance tuning. ETL/ELT Pipelines: Proven experience designing, building, and maintaining production-grade data pipelines using Google Cloud Dataflow (Apache Beam) or similar technologies. GCP Stack: Hands-on expertise with BigQuery, Cloud Storage, Pub/Sub, and orchestrating workflows with Composer or Vertex Pipelines. Data Architecture & Modelling: Ability to …
West London, London, United Kingdom Hybrid / WFH Options
Young's Employment Services Ltd
a Senior Data Engineer, Tech Lead, Data Engineering Manager, etc. Proven success with modern data infrastructure: distributed systems, batch and streaming pipelines Hands-on knowledge of tools such as Apache Spark, Kafka, Databricks, dbt or similar Experience building, defining, and owning data models, data lakes, and data warehouses Programming proficiency in Python, PySpark, Scala or Java Experience operating in …