scalable test automation frameworks, with a focus on backend, API, and data systems using tools like Pytest and Postman. Expertise in Pandas, SQL, and AWS analytics services (Glue, Athena, Redshift) for data profiling, transformation, and validation within data lakes. Solid experience with AWS (S3, Lambda, EMR, ECS/EKS, CloudFormation/Terraform) and understanding of cloud-native architectures and …
journey of our data platform (AWS) Cloud Proficiency: Hands-on experience with at least one major cloud platform (AWS, Azure, or GCP) and its core data services (e.g., S3, Redshift, Lambda/Functions, Glue). Data Modelling: Deep understanding of ELT/ETL patterns and data modelling techniques. CRM/Customer Data Focus: Experience working directly with data from …
Engineers and Associates, and lead technical discussions and design sessions. Key requirements: Must-Have: Strong experience with AWS services: Glue, Lambda, S3, Athena, Step Functions, EventBridge, EMR, EKS, RDS, Redshift, DynamoDB. Strong Python development skills. Proficient with Docker, containerization, and virtualization. Hands-on experience with CI/CD, especially GitLab CI. Solid experience with Infrastructure as Code (Terraform, CloudFormation …
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Snap Analytics
thought leader by authoring blogs, presenting at webinars, and engaging in external speaking opportunities. Technical Delivery Excellence You'll design and optimise cloud-based data architectures (e.g. Snowflake, Databricks, Redshift, Google BigQuery, Azure Synapse) to support advanced analytics. You'll build and automate scalable, secure, and high-performance data pipelines handling diverse data sources. You'll work closely with …
may either leverage third party tools such as Fivetran, Airbyte, Stitch or build custom pipelines. We use the main data warehouses for dbt modelling and have extensive experience with Redshift, BigQuery and Snowflake. Recently we've been rolling out a serverless implementation of dbt and progressing work on an internal product to build modular data platforms. When initially working with …
cases. Qualifications: 5+ years of experience in analytics engineering, data engineering, or a related field. Advanced SQL skills and experience architecting solutions on modern data warehouses (e.g., Snowflake, BigQuery, Redshift). Hands-on experience with advanced modelling techniques in dbt. A deep understanding of ETL/ELT processes and tools (e.g., Fivetran, Airbyte, Stitch). Experience with data visualisation …
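The ELT pattern these listings keep asking for boils down to: land source records untransformed first, then model them with SQL inside the warehouse. A minimal sketch of that split, using Python's stdlib `sqlite3` as a stand-in warehouse (table and column names are invented for illustration, not from any listing above):

```python
import sqlite3

# In-memory SQLite stands in for Snowflake/BigQuery/Redshift.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE raw_orders (order_id TEXT, amount_pence INTEGER, status TEXT)"
)

# Load step: land source rows as-is, including rows we filter later.
raw = [("o1", 1250, "complete"), ("o2", 990, "cancelled"), ("o3", 400, "complete")]
conn.executemany("INSERT INTO raw_orders VALUES (?, ?, ?)", raw)

# Transform step: build a modelled table from the raw layer with SQL,
# the same shape of work a dbt model would express.
conn.execute("""
    CREATE TABLE fct_orders AS
    SELECT order_id, amount_pence / 100.0 AS amount_gbp
    FROM raw_orders
    WHERE status = 'complete'
""")

rows = conn.execute(
    "SELECT order_id, amount_gbp FROM fct_orders ORDER BY order_id"
).fetchall()
print(rows)  # [('o1', 12.5), ('o3', 4.0)]
```

The point of the split is that raw data stays replayable: a fixed transform can simply be re-run against the landed layer.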
ELT processes, automation Technical Requirements: Strong proficiency in SQL and Python programming Extensive experience with data modeling and data warehouse concepts Advanced knowledge of AWS data services, including: S3, Redshift, AWS Glue, AWS Lambda Experience with Infrastructure as Code using AWS CDK Proficiency in ETL/ELT processes and best practices Experience with data visualization tools (QuickSight) Strong analytical …
or Kafka Streams)? Which statement best describes your hands-on responsibility for architecting and tuning cloud-native data lake/warehouse solutions (e.g., AWS S3 + Glue/Redshift, GCP BigQuery, Azure Synapse)? What best reflects your experience building ETL/ELT workflows with Apache Airflow (or similar) and integrating them into containerised CI/CD pipelines …
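Screening questions like the one above lean on Airflow's core abstraction: an ETL workflow is a directed acyclic graph of tasks, and the scheduler runs each task only after its dependencies complete. That ordering idea can be sketched with the stdlib alone (task names here are invented, not Airflow API calls):

```python
from graphlib import TopologicalSorter

# Toy ETL workflow as a DAG: each task maps to the set of tasks
# it depends on, mirroring how Airflow wires task dependencies.
dag = {
    "extract_orders": set(),
    "extract_customers": set(),
    "transform_join": {"extract_orders", "extract_customers"},
    "load_warehouse": {"transform_join"},
}

# static_order() yields a schedule in which every dependency
# runs before its dependents.
order = list(TopologicalSorter(dag).static_order())
print(order)
```

Both extracts always appear before the join, and the warehouse load always comes last; a real orchestrator adds retries, scheduling, and parallel execution of independent branches on top of exactly this ordering.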
Leeds, West Yorkshire, England, United Kingdom Hybrid / WFH Options
Robert Walters
data ecosystem. Key Skills & Experience: Proven experience as a Senior/Lead Data Engineer in a large-scale environment. Strong expertise with AWS data services (e.g., S3, Glue, Lambda, Redshift, Athena, EMR). Experience designing and building data lakes and modern data platforms. Proficiency with Python, SQL, and data pipeline orchestration tools (e.g., Airflow, dbt). Strong understanding of …
years experience with Python and/or Java Proficiency in SQL and relational databases Object-oriented design skills Experience with distributed systems, orchestration, micro-services AWS experience (e.g., S3, Redshift, Glue, Lambda) Bachelor's degree or equivalent in computer science or related field Experience with NoSQL databases is a plus AWS certifications are a plus Prior front office experience …
london (city of london), south east england, united kingdom
TGS International Group
systems (e.g., Salesforce, Netsuite, or similar). Strong grasp of data governance, quality assurance, and security best practices. Bonus: Experience with Microsoft Fabric, cloud data warehouses (Azure, Snowflake, BigQuery, Redshift), or orchestration tools like dbt or Airflow. Collaboration & Communication Ability to communicate technical insights clearly to non-technical audiences. Experience partnering with leadership and cross-functional teams. Mindset & Growth …
both at the Board/Executive level and at the business unit level. Key Responsibilities Design, develop, and maintain scalable ETL pipelines using technologies like dbt, Airbyte, Cube, DuckDB, Redshift, and Superset Work closely with stakeholders across the company to gather data requirements and set up dashboards Promote a data-driven culture at Notabene and train and upskill power-users across …
ies (e.g., Hadoop, Spark, Kafka). Proficiency in cloud platforms such as AWS, Azure, or Google Cloud and cloud-based data services (e.g., AWS Redshift, Azure Synapse Analytics, Google BigQuery). Experience with DataOps practices and tools, including CI/CD for data pipelines. Excellent leadership, communication, and interpersonal skills …
data infrastructure. What you'll be doing - your accountabilities Lead the design and implementation of robust, scalable, and secure data solutions using AWS services such as S3, Glue, Lambda, Redshift, EMR, Kinesis, and more, covering data pipelines, warehousing, and lakehouse architectures. Drive the migration of legacy data workflows to lakehouse architectures, leveraging Apache Iceberg to enable unified analytics and …
Reading, Berkshire, England, United Kingdom Hybrid / WFH Options
Opus Recruitment Solutions Ltd
pipelines and applications that process complex datasets from multiple operational systems. Key Responsibilities: Build and maintain AWS-based ETL/ELT pipelines using S3, Glue (PySpark/Python), Lambda, Athena, Redshift, and Step Functions Develop backend applications to automate and support compliance reporting Process and validate complex data formats including nested JSON, XML, and CSV Collaborate with stakeholders to deliver …
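"Process and validate complex data formats including nested JSON" usually means flattening hierarchical records into the flat columns that warehouse loaders like Glue and Redshift expect. A minimal recursive flattener, stdlib only; the record and its field names are invented for illustration:

```python
import json

def flatten(obj: dict, prefix: str = "") -> dict:
    """Flatten nested dicts into dotted-path keys, e.g. {"a": {"b": 1}} -> {"a.b": 1}."""
    flat = {}
    for key, value in obj.items():
        path = f"{prefix}{key}"
        if isinstance(value, dict):
            flat.update(flatten(value, f"{path}."))
        else:
            flat[path] = value
    return flat

# Hypothetical nested feed record from an operational system.
record = json.loads('{"id": 7, "meter": {"serial": "12-345", "readings": 3}}')
flat = flatten(record)
print(flat)  # {'id': 7, 'meter.serial': '12-345', 'meter.readings': 3}
```

A production pipeline would add handling for lists (explode to child rows or serialise), schema validation, and bad-record quarantining, but the dotted-path convention above is the common core.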
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Parking Network BV
re an experienced data engineering professional with a proven ability to lead and inspire teams. You bring deep technical skills in Python, SQL, and AWS services such as EC2, Redshift, Lambda and Kinesis, alongside strong stakeholder management and commercial awareness. You'll also bring: Proven experience designing and implementing data pipelines, ETL processes, and warehousing in cloud environments. The …