Reigate, England, United Kingdom Hybrid / WFH Options
esure Group
cloud data warehouses such as Databricks/Snowflake, ideally on AWS Strong Python experience, including deep knowledge of the Python data ecosystem, with hands-on expertise in Spark and Airflow Hands-on experience in all phases of data modelling from conceptualization to database optimization supported by advanced SQL skills Hands-on experience with implementing CI/CD, using Git, Jenkins or …
in trust metrics or customer experience analysis Knowledge of dashboard design and data visualization best practices Experience with cloud-based data infrastructure (AWS) Familiarity with modern data stack tools (Airflow, dbt, etc.) Why This Role Matters Judge.me is at an inflection point. As the market leader in Shopify reviews, we've chosen to build our future with Shopify because …
have advanced SQL skills; experience with dbt and/or Looker is strongly preferred You will be proficient with modern data platforms (e.g., Snowflake, dbt, AWS, GCP, Looker, Tableau, Airflow) You will have experience with version control tools such as Git You will have exposure to Python for data or analytics engineering tasks (preferred) You will demonstrate excellent problem …
maturity across analytics and engineering teams. Builder at Heart: Comfortable rolling up your sleeves to code, model, and optimise - not just direct others. Bonus Points: Experience with orchestration frameworks (Airflow, Dagster, Prefect), Python, AWS ecosystem, and exposure to insurance, fintech, or regulated industries. Strategic Thinker: You thrive on aligning technical roadmaps with business goals, and you naturally think about …
Our technology stack Python and associated ML/DS libraries (Scikit-learn, NumPy, LightGBM, Pandas, TensorFlow, etc.) PySpark AWS cloud infrastructure: EMR, ECS, S3, Athena, etc. MLOps: Terraform, Docker, Airflow, MLflow, Jenkins More Information Enjoy fantastic perks like private healthcare & dental insurance, a generous work from abroad policy, 2-for-1 share purchase plans, extra festive time off, and …
Wideopen, England, United Kingdom Hybrid / WFH Options
Working Families Party
judgment and project ownership Preferred: Experience in political, movement, or campaign work is a plus Experience or familiarity with some or all of the following: orchestration tools (e.g. Prefect, Airflow) transformation layers (dbt) CI/CD tooling like GitHub Actions, Jenkins, CircleCI, etc. cloud-based infrastructure (AWS, GCP) and IaC tools like Terraform or Ansible Python API connectors, like the Parsons …
London, England, United Kingdom Hybrid / WFH Options
black.ai
in the face of many nuanced trade-offs and varied opinions. Experience in a range of toolsets comparable with our own: Database technologies: SQL, Redshift, Postgres, dbt, Dask, Airflow etc. AI Feature Development: LangChain, LangSmith, pandas, numpy, scikit-learn, scipy, Hugging Face, etc. Data visualization tools such as plotly, seaborn, streamlit, etc. You are able to chart …
intelligence tool, providing reliable data access to users throughout Trustpilot Design, build, maintain, and rigorously monitor robust data pipelines and transformation models using our modern data stack, including GCP, Airflow, dbt, and potentially emerging technologies like real-time streaming platforms Develop and manage reverse ETL processes to seamlessly integrate data with our commercial systems, ensuring operational efficiency Maintain and …
Experience designing scalable ML infrastructure on cloud platforms (AWS SageMaker, GCP AI Platform, Azure ML, or equivalent). Solid understanding of data-engineering concepts: SQL/NoSQL, data pipelines (Airflow, Prefect, or similar), and batch/streaming frameworks (Spark, Kafka). Leadership & Communication: Proven ability to lead cross-functional teams in ambiguous startup settings. Exceptional written and verbal communication …
London, England, United Kingdom Hybrid / WFH Options
Tasman
existing tooling, specific requirements and available budget. Some of the products and platforms that you are likely to come across at Tasman are: AWS, GCP and Azure cloud environments; Airflow and Prefect; Snowflake, BigQuery, Athena, and Redshift; Airbyte, Stitch, Fivetran, and Meltano; dbt (both Cloud and Core); Looker, Metabase, Tableau and Holistics; Docker and Kubernetes; Snowplow, Segment, Rudderstack, and …
London, England, United Kingdom Hybrid / WFH Options
Hays
Your new role My client is a Financial Services organisation with offices based in Central London, and is looking for a Senior Data Engineer to join their ranks. What you'll need to succeed Experience working with relational databases (Oracle …
About 9fin The world's largest asset class, debt, operates with the worst data. Technology has revolutionized equity markets with electronic trading, quant algos and instantaneous news. However, in debt capital markets, the picture is completely different. It still behaves …
City of London, England, United Kingdom Hybrid / WFH Options
Jefferson Frank
Lead Data Engineer - Snowflake, DBT, Airflow - London - Up to £100k I'm working with a key client of ours here at TRG who are looking to grow out their Data & Analytics function. My client is globally renowned for being a leader within their relative field. Whilst they are a very well recognised household brand, they are also known … happy employee means higher productivity! So they offer very flexible working arrangements through both WFH options and flexi working hours. Experience required... * Expert in Snowflake * Strong DBT experience * Strong Airflow experience * Expert knowledge and understanding of Data Warehousing * Strong AWS experience This is a great opportunity to join an outstanding organisation who pride themselves on being one of the best … is of interest then get in touch ASAP. Send across your CV to t.shahid@nigelfrank.com or alternatively, give me a call on 0191 3387551. Keywords: Snowflake, DBT, SQL, Airflow, AWS, Engineer, DWH, Data Warehouse, Data Warehousing, Architecture, London …
London, England, United Kingdom Hybrid / WFH Options
Widen the Net Limited
will develop scalable data pipelines, ensure data quality, and support business decision-making with high-quality datasets. Work across technology stack: SQL, Python, ETL, BigQuery, Spark, Hadoop, Git, Apache Airflow, Data Architecture, Data Warehousing Design and develop scalable ETL pipelines to automate data processes and optimize delivery Implement and manage data warehousing solutions, ensuring data integrity through … rigorous testing and validation Lead, plan and execute workflow migration and data orchestration using Apache Airflow Focus on data engineering and data analytics Requirements: 5+ years of experience in SQL 5+ years of development in Python MUST have strong experience in Apache Airflow Experience with ETL tools, data architecture, and data warehousing solutions This contract is …
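For a sense of the Airflow orchestration work this listing describes, here is a minimal DAG sketch, assuming Airflow 2.x. The DAG id, schedule, and task logic are hypothetical illustrations, not details taken from the role.

```python
# Minimal Airflow 2.x sketch of a three-step extract/transform/load flow.
# All names here are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder source read; the return value is pushed to XCom.
    return [{"id": 1, "value": 21}, {"id": 2, "value": 21}]


def transform(ti):
    rows = ti.xcom_pull(task_ids="extract")
    return [{**r, "value": r["value"] * 2} for r in rows]


def load(ti):
    rows = ti.xcom_pull(task_ids="transform")
    print(f"loading {len(rows)} rows")  # stand-in for a warehouse write


with DAG(
    dag_id="example_etl",           # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",              # `schedule_interval` on Airflow < 2.4
    catchup=False,
) as dag:
    extract_t = PythonOperator(task_id="extract", python_callable=extract)
    transform_t = PythonOperator(task_id="transform", python_callable=transform)
    load_t = PythonOperator(task_id="load", python_callable=load)

    extract_t >> transform_t >> load_t
```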
London, England, United Kingdom Hybrid / WFH Options
Circana
In this role, you will be responsible for designing, building, and maintaining robust data pipelines and infrastructure on the Azure cloud platform. You will leverage your expertise in PySpark, Apache Spark, and Apache Airflow to process and orchestrate large-scale data workloads, ensuring data quality, efficiency, and scalability. If you have a passion for data engineering and … significant impact, we encourage you to apply! Job Responsibilities ETL/ELT Pipeline Development: Design, develop, and optimize efficient and scalable ETL/ELT pipelines using Python, PySpark, and Apache Airflow. Implement batch and real-time data processing solutions using Apache Spark. Ensure data quality, governance, and security throughout the data lifecycle. Cloud Data Engineering: Manage and optimize … effectiveness. Implement and maintain CI/CD pipelines for data workflows to ensure smooth and reliable deployments. Big Data & Analytics: Develop and optimize large-scale data processing pipelines using Apache Spark and PySpark. Implement data partitioning, caching, and performance tuning techniques to enhance Spark-based workloads. Work with diverse data formats (structured and unstructured) to support advanced analytics and …
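As a rough illustration of the batch side of these responsibilities, the following is a minimal PySpark ETL sketch: read raw data, clean and aggregate it, and write a curated output. The paths, column names, and aggregation are hypothetical placeholders rather than details from the role.

```python
# Minimal PySpark batch ETL sketch; all paths and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("batch_etl_sketch").getOrCreate()

# Extract: read raw event data (placeholder path).
raw = spark.read.parquet("/data/raw/events")

# Transform: deduplicate, drop malformed rows, aggregate per day.
daily = (
    raw.dropDuplicates(["event_id"])
       .filter(F.col("event_ts").isNotNull())
       .withColumn("event_date", F.to_date("event_ts"))
       .groupBy("event_date")
       .agg(F.count("*").alias("events"))
)

# Load: write the curated table, partitioned for downstream file pruning.
daily.write.mode("overwrite").partitionBy("event_date").parquet(
    "/data/curated/daily_events"
)

spark.stop()
```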
In this role, you will be responsible for designing, building, and maintaining robust data pipelines and infrastructure on the Azure cloud platform. You will leverage your expertise in PySpark, Apache Spark, and Apache Airflow to process and orchestrate large-scale data workloads, ensuring data quality, efficiency, and scalability. If you have a passion for data engineering and … desire to make a significant impact, we encourage you to apply! Job Responsibilities Data Engineering & Data Pipeline Development Design, develop, and optimize scalable data workflows using Python, PySpark, and Airflow Implement real-time and batch data processing using Spark Enforce best practices for data quality, governance, and security throughout the data lifecycle Ensure data availability, reliability and performance through … data processing workloads Implement CI/CD pipelines for data workflows to ensure smooth and reliable deployments. Big Data & Analytics: Build and optimize large-scale data processing pipelines using Apache Spark and PySpark Implement data partitioning, caching, and performance tuning for Spark-based workloads. Work with diverse data formats (structured and unstructured) to support advanced analytics and machine learning …
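The partitioning, caching, and tuning techniques this listing names can be sketched briefly. This is an illustrative example only, assuming PySpark; the dataset, keys, and partition count are invented for the sketch.

```python
# Sketch of Spark partitioning and caching; dataset and names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("spark_tuning_sketch").getOrCreate()

df = spark.read.parquet("/data/raw/transactions")  # placeholder path

# Repartition on the key used downstream so the shuffle happens once;
# the partition count is workload-specific.
df = df.repartition(200, "customer_id")

# Cache the DataFrame because two separate aggregations reuse it.
df.cache()
df.count()  # action that materialises the cache

monthly = df.groupBy(F.trunc("tx_date", "month").alias("month")).agg(
    F.sum("amount").alias("total_amount")
)
per_customer = df.groupBy("customer_id").agg(F.count("*").alias("tx_count"))

# On-disk partitioning lets later readers prune files by month.
monthly.write.mode("overwrite").partitionBy("month").parquet(
    "/data/curated/monthly_totals"
)
per_customer.write.mode("overwrite").parquet("/data/curated/customer_tx_counts")

df.unpersist()
spark.stop()
```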
technologies – Azure, AWS, GCP, Snowflake, Databricks Must Have Hands-on experience on at least 2 Hyperscalers (GCP/AWS/Azure platforms) and specifically in Big Data processing services (Apache Spark, Beam or equivalent). In-depth knowledge of key technologies like BigQuery/Redshift/Synapse/Pub Sub/Kinesis/MQ/Event Hubs, Kafka … Dataflow/Airflow/ADF etc. Excellent consulting experience and ability to design and build solutions, actively contribute to RfP responses. Ability to be a SPOC for all technical discussions across industry groups. Excellent design experience, with entrepreneurship skills to own and lead solutions for clients Excellent ETL skills, data modeling skills Excellent communication skills Ability to define the … skills. A minimum of 5 years' experience in a similar role. Ability to lead and mentor the architects. Mandatory Skills [at least 2 Hyperscalers] GCP, AWS, Azure, Big Data, Apache Spark, Beam on BigQuery/Redshift/Synapse, Pub Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF Desirable Skills Designing Databricks-based …
technologies – Azure, AWS, GCP, Snowflake, Databricks Must Have Hands-on experience on at least 2 Hyperscalers (GCP/AWS/Azure platforms) and specifically in Big Data processing services (Apache Spark, Beam or equivalent). In-depth knowledge of key technologies like BigQuery/Redshift/Synapse/Pub Sub/Kinesis/MQ/Event Hubs, Kafka … Dataflow/Airflow/ADF etc. Excellent consulting experience and ability to design and build solutions, actively contribute to RfP responses. Ability to be a SPOC for all technical discussions across industry groups. Excellent design experience, with entrepreneurship skills to own and lead solutions for clients Ability to define the monitoring, alerting, deployment strategies for various services. Experience providing … skills. A minimum of 5 years' experience in a similar role. Ability to lead and mentor the architects. Mandatory Skills [at least 2 Hyperscalers] GCP, AWS, Azure, Big Data, Apache Spark, Beam on BigQuery/Redshift/Synapse, Pub Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF Designing Databricks-based solutions for …