and 20-30% on leadership and communication, ensuring all key builds and improvements flow through this individual. Working with a modern tech stack including AWS, Snowflake, Python, SQL, dbt, Airflow, Spark, Kafka, and Terraform, you'll drive automation and end-to-end data solutions that power meaningful insights. Ideal for ambitious, proactive talent from scale-up or start-up …
Solid understanding of data modeling concepts (star/snowflake schemas, normalization). Experience with version control systems (e.g., Git) and CI/CD practices. Familiarity with orchestration tools (e.g., Airflow, dbt Cloud, Prefect) is a plus. About Capgemini Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable …
City of London, London, United Kingdom Hybrid / WFH Options
8Bit - Games Industry Recruitment
/AI, with strong proficiency in Python, SQL, and distributed data processing (e.g., PySpark). Hands-on experience with cloud data platforms (GCP, AWS, or Azure), orchestration frameworks (e.g., Airflow), and ELT/ETL tools. Familiarity with 2D and 3D data formats (e.g., OBJ, FBX, glTF), point clouds, and large-scale computer vision datasets, or the ability to quickly …
projects Support talent acquisition and continuous learning initiatives Knowledge and Experience Knowledge of ML model development and deployment frameworks (MLflow, Kubeflow). Advanced data querying (SQL) and data engineering pipelines (Airflow). Extensive experience with comprehensive unit testing, integration testing, and test coverage strategies. Experience working with Product Management teams and the ability to translate complex technical concepts for non-technical stakeholders …
of the below but we would love for you to have experience in some of the following areas: Proficiency in modern data engineering practices and technologies, including Prefect/Airflow, Python, dbt, Kubernetes, Kafka or similar Experience with Infrastructure as Code (IaC) and cloud-based services, e.g. deploying infrastructure on AWS using Terraform A deep understanding of data pipelines …
Newbury, Berkshire, England, United Kingdom Hybrid / WFH Options
Intuita
development efficiency and deployment effectiveness, including Azure DevOps or GitHub Considerable experience designing and building operationally efficient pipelines, utilising core Cloud components such as Azure Data Factory, BigQuery, Airflow, Google Cloud Composer and PySpark Proven experience in modelling data through a medallion-based architecture, with curated dimensional models in the gold layer built for analytical use Strong …
proficiency with data manipulation language including optimization techniques. Strong understanding of normalized/dimensional data modeling principles. Strong knowledge of multiple data storage subsystems. Strong experience with Terraform, AWS, Airflow, Docker, GitHub/GitHub Actions, Jenkins/TeamCity. Strong AWS-specific skills for Athena, Lambda, ECS, ECR, S3 and IAM. Strong knowledge of industry best practices in development and …
technology stack Python and associated ML/DS libraries (scikit-learn, NumPy, LightGBM, Pandas, TensorFlow, etc.) PySpark AWS cloud infrastructure: EMR, ECS, Athena, etc. MLOps/DevOps: Terraform, Docker, Airflow, MLflow, New Relic The interview process Recruiter Call (30 minutes) Meeting a Machine Learning Manager (30 minutes) Technical Interview with 2 x Engineers (90 mins) Final Interview with the Head …
computer vision. Big Data Tools: Experience with big data platforms like Spark (PySpark) for handling large-scale datasets. MLOps: Familiarity with MLOps tools and concepts (e.g., Docker, Kubernetes, MLflow, Airflow) for model deployment and lifecycle management. Financial Domain Knowledge: Direct experience with at least two of the following domains: Credit Risk Modeling, Fraud Detection, Anti-Money Laundering (AML), Know …
communicate technical solutions clearly to non-technical stakeholders Technical skills (a big plus): Knowledge of deep learning frameworks (PyTorch, TensorFlow), transformers, or LLMs Familiarity with MLOps tools (MLflow, SageMaker, Airflow, etc.) Experience with streaming data (Kafka, Kinesis) and distributed computing (Spark, Dask) Skills in data visualization apps (Streamlit, Dash) and dashboarding (Tableau, Looker) Domain experience in forecasting, optimisation, or …
manipulation, modelling and scripting Experience with cloud services such as AWS or Google Cloud for scalable AI/ML development and deployment, and knowledge of data pipelining (e.g. via Airflow) A minimum BA/BSc degree in Statistics, Mathematics, Physics, Computer Science or a related quantitative degree; Masters/PhD. What's in it for you? A range of flexible …
City of London, London, United Kingdom Hybrid / WFH Options
CBSbutler Holdings Limited
JIRA project tracking, maintain Confluence documentation, and oversee code design for efficient development and collaboration. - Lead technical design and implementation of data models, ETL processes, and visualisation tools (BigQuery, Airflow, SSIS, Tableau, SSRS). - Define and implement best practices in data modelling, including data warehouse and lakehouse integration (Medallion Architecture). Skills and Experience: - Strong understanding of structured data … operational dashboards. - Advanced proficiency with Microsoft BI Stack: SSIS, SSRS - Strong SQL Server skills and SQL querying experience - Hands-on experience with Google Cloud Platform tools including: BigQuery; Composer; Apache Airflow; Stream; Informatica; Vertex AI - Tableau dashboard development and reporting - Python programming for data analysis - Data modelling (warehouse, lakehouse, medallion architecture) - Understanding of financial and insurance data models …
that this Data Engineer position will require 2 days per week in Leeds city centre. The key skills required for this Data Engineer position are: Snowflake Python AWS dbt Airflow If you do have the relevant experience for this Data Engineer position, please do apply.
in data engineering – Python, SQL, and cloud platforms (GCP or Azure). Experience designing or managing data pipelines and ETL processes. Exposure to orchestration tools such as dbt, Airflow, or Azure Data Factory. Good understanding of data architecture and automation. AdTech or MarTech experience would be a real plus. Confident mentoring and developing junior engineers …
Coventry, West Midlands, England, United Kingdom Hybrid / WFH Options
Lorien
executing tests Requirements Strong experience as a Data Engineer (migrating legacy systems onto AWS, building data pipelines) Strong Python experience Tech stack experience required: AWS Glue, Redshift, Lambda, PySpark, Airflow SSIS or SAS experience (Desirable) Benefits Salary up to £57,500 + up to 20% bonus Hybrid working: 1 to 2 days a week in the office 28 days …