City of London, London, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
… in a commercial setting. Solid understanding of distributed systems (e.g. Hadoop, AWS, Kafka). Experience with SQL/NoSQL databases (e.g. PostgreSQL, Cassandra). Familiarity with orchestration tools (e.g. Airflow, Luigi) and cloud platforms (e.g. AWS, GCP). Passion for solving complex problems and mentoring others. Package: Salary from £(phone number removed) depending on experience. Remote-first with flexible …
Birmingham, West Midlands, West Midlands (County), United Kingdom Hybrid / WFH Options
Tenth Revolution Group
… in a commercial setting. Solid understanding of distributed systems (e.g. Hadoop, AWS, Kafka). Experience with SQL/NoSQL databases (e.g. PostgreSQL, Cassandra). Familiarity with orchestration tools (e.g. Airflow, Luigi) and cloud platforms (e.g. AWS, GCP). Passion for solving complex problems and mentoring others. Package: Salary from £(phone number removed) depending on experience. Remote-first with flexible …
… data analyst. Knowledge of a variety of financial instruments, in particular exposure to derivatives instruments. Experience working with SQL. Experience with cloud storage solutions. Experience with workflow management tools (Airflow/Argo). Prior experience writing documentation for senior stakeholders; the ability to accurately abstract and summarize technical information is critical. Python programming skills: PySpark, Pandas, Jupyter Notebooks (3+ years …
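As a minimal sketch of the PySpark skills this listing asks for — the input path and column names below are hypothetical assumptions, not taken from the posting:

from pyspark.sql import SparkSession, functions as F

# Hypothetical example: aggregate daily traded volume per derivatives instrument.
spark = SparkSession.builder.appName("derivatives-volume").getOrCreate()
trades = spark.read.parquet("s3://example-bucket/trades/")  # path is an assumption
daily_volume = (
    trades.withColumn("trade_date", F.to_date("executed_at"))
    .groupBy("trade_date", "instrument_id")
    .agg(F.sum("quantity").alias("total_volume"))
)
daily_volume.show()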
… applications to the Cloud (AWS). We'd love to hear from you if you: have strong experience with Python & SQL; have experience developing data pipelines using dbt, Spark and Airflow; have experience with data modelling (building optimised and efficient data marts and warehouses in the cloud); work with Infrastructure as Code (Terraform) and containerising applications (Docker); work with AWS, S3 …
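A hedged sketch of the kind of Airflow pipeline such roles involve — the DAG id, schedule, and task callables are illustrative assumptions, not from the listing:

from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

# Hypothetical callables standing in for real extract/transform steps.
def extract():
    print("pulling source data")

def transform():
    print("building the data mart")

with DAG(
    dag_id="example_daily_mart",  # name is an assumption
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    extract_task >> transform_task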
Cloud Data Warehouse: Snowflake. AWS Data Solutions: Kinesis, SNS, SQS, S3, ECS, Lambda. Data Governance & Quality: Collate & Monte Carlo. Infrastructure as Code: Terraform. Data Integration & Transformation: Python, DBT, Fivetran, Airflow. CI/CD: GitHub Actions/Jenkins. Business Intelligence: Looker. Experience and Attributes we'd like to see: Platform Engineering Expertise. Extensive experience in platform engineering; designing, building, and …
… concepts to diverse audiences and collaborate effectively across teams. Bonus Points For: Familiarity with data visualization tools (e.g., Tableau, Power BI, Looker). Experience with specific orchestration tools (e.g., Airflow, dbt). Experience working in Agile/Scrum development methodologies. Experience with Big Data Technologies & Frameworks. Join Us! This role can be based in either of our amazing offices …
… transformation. Deep understanding of cloud-based data architecture, particularly with GCP (BigQuery, Cloud Functions, Pub/Sub, etc.) or AWS equivalents. Hands-on experience with orchestration tools such as Airflow or DBT. 3+ years in data engineering, preferably including at least one role supporting a live or F2P game. Experience with analytics and marketing APIs (e.g. Appsflyer, Applovin, IronSource …
… Power markets. Expert in Python and SQL; strong experience with data engineering libraries (e.g., Pandas, PySpark, Dask). Deep knowledge of ETL/ELT frameworks and orchestration tools (e.g., Airflow, Azure Data Factory, Dagster). Proficient in cloud platforms (preferably Azure) and services such as Data Lake, Synapse, Event Hubs, and Functions. Authoring reports and dashboards with either open …
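A minimal sketch of the Pandas-style ETL work these libraries imply — the file name and column names are hypothetical assumptions:

import pandas as pd

# Hypothetical example: clean half-hourly power prices and compute a daily mean.
prices = pd.read_csv("power_prices.csv", parse_dates=["timestamp"])  # file name assumed
prices = prices.dropna(subset=["price_gbp_mwh"]).set_index("timestamp")
daily_mean = prices["price_gbp_mwh"].resample("1D").mean()
print(daily_mean.head())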
… pipelines. Implement and optimize automation processes using infrastructure-as-code (Terraform). Build and maintain data pipelines using Airflow. Manage our tech stack including Python, Node.js, PostgreSQL, MongoDB, Kafka, and Apache Iceberg. Optimize infrastructure costs and develop strategies for efficient resource utilization. Provide critical support by monitoring services and resolving production issues. Contribute to the development of new services as …
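As an illustrative sketch of working with the Kafka piece of this stack, using the kafka-python client — the topic name and broker address are assumptions:

from kafka import KafkaConsumer
import json

# Hypothetical example: consume JSON events from a topic and print them.
consumer = KafkaConsumer(
    "example-events",                    # topic name is an assumption
    bootstrap_servers="localhost:9092",  # broker address is an assumption
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="earliest",
)

for message in consumer:
    print(message.topic, message.offset, message.value)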
London, South East, England, United Kingdom Hybrid / WFH Options
Circle Recruitment
… and practices. Proficiency in Python and SQL for data engineering tasks. Experience with DBT and a good understanding of data modelling approaches (e.g. star schema, dimensional modelling). Familiarity with Airflow or similar orchestration tools. Comfortable working with AWS services such as Glue and S3, or equivalent cloud infrastructure. Experience using version control and CI/CD tools like Git …
… members to deliver outcomes. Highly motivated, with strong problem-solving skills and business acumen. Desirable Skills: Experience with data visualization tools (e.g., Looker, Tableau, Power BI). Familiarity with dbt, Airflow, or other modern data tools. Exposure to marketing or e-commerce industries. Additional Information: Publicis Groupe operates a hybrid working pattern with full time employees being office-based three …
… stack: Python and associated ML/DS libraries (scikit-learn, numpy, LightGBM, Pandas, LangChain/LangGraph, TensorFlow, etc.); PySpark; AWS cloud infrastructure: EMR, ECS, Athena, etc.; MLOps: Terraform, Docker, Airflow, MLflow. More information: Enjoy fantastic perks like private healthcare & dental insurance, a generous work from abroad policy, 2-for-1 share purchase plans, an EV Scheme to further reduce …
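A minimal sketch of the MLflow experiment-tracking workflow implied by this stack — the model choice, parameters, and metric are illustrative assumptions:

import mlflow
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Hypothetical example: log a scikit-learn training run to MLflow.
X, y = make_classification(n_samples=1_000, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

with mlflow.start_run():
    model = RandomForestClassifier(n_estimators=100, random_state=42)
    model.fit(X_train, y_train)
    accuracy = accuracy_score(y_test, model.predict(X_test))
    mlflow.log_param("n_estimators", 100)
    mlflow.log_metric("accuracy", accuracy)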
… Desirable Skills: Background in e-commerce, marketing, or operations analytics. Experience with data visualization tools (e.g., Looker, Tableau, Power BI). Familiarity with modern data stack tools such as dbt, Airflow, and Git-based workflows. Experience contributing to data product development, working closely with engineers and product managers. Additional Information: Publicis Groupe operates a hybrid working pattern with full time …
… stack: Python and associated ML/DS libraries (scikit-learn, numpy, LightGBM, Pandas, LangChain/LangGraph, TensorFlow, etc.); PySpark; AWS cloud infrastructure: EMR, ECS, Athena, etc.; MLOps: Terraform, Docker, Airflow, MLflow. More information: Enjoy fantastic perks like private healthcare & dental insurance, a generous work from abroad policy, 2-for-1 share purchase plans, an EV Scheme to further reduce …
… unsupervised learning, and operations research methods. Solid background in software engineering for data science products: version control (Git), testing (unit, regression, E2E), CI/CD (GitHub Actions), and orchestration (Airflow, Dagster). Proficient in SQL and cloud platforms (AWS preferred), with exposure to model/data versioning tools (e.g. DVC), containerised solutions (Docker, ECS), and experiment tracking (e.g. MLflow …
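A hedged sketch of the unit-testing practice named here, in the pytest style — the helper function and expected values are invented for illustration:

import numpy as np
from sklearn.linear_model import LinearRegression

def fit_slope(x: np.ndarray, y: np.ndarray) -> float:
    """Hypothetical helper: fit a simple linear model and return its slope."""
    model = LinearRegression().fit(x.reshape(-1, 1), y)
    return float(model.coef_[0])

def test_fit_slope_recovers_known_gradient():
    # y = 2x should yield a slope of 2 (within floating-point tolerance).
    x = np.arange(10, dtype=float)
    assert abs(fit_slope(x, 2 * x) - 2.0) < 1e-9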
… leading and managing technical teams, with excellent people development skills. Strong project management skills, with experience running complex data initiatives. Strong knowledge of modern data engineering, including SQL, Python, Airflow, Dataform/DBT, Terraform, or similar tools. Understanding of data architecture patterns (e.g., lakehouse, event-driven pipelines, star/snowflake schemas). Excellent communication and stakeholder management skills. Experience …
… in the face of many nuanced trade-offs and varied opinions. Experience in a range of tool sets comparable with our own: Database technologies: SQL, Redshift, Postgres, DBT, Dask, Airflow, etc. AI Feature Development: LangChain, LangSmith, pandas, numpy, scikit-learn, scipy, Hugging Face, etc. Data visualization tools such as plotly, seaborn, Streamlit, etc. You are able to chart …
… time event processing databases. On-prem data warehouse experience is valuable. Data Modeling: Expertise in designing and implementing data models for analytics. Data orchestration and transformation tools like dbt, Airflow, or similar. Product mindset: Able to work directly with business stakeholders to translate analytics needs into user-friendly, highly performant data products. Troubleshooter: Experience diagnosing and resolving data issues …
… systems (e.g. Git). Desirables: Experience with cloud-based ML infrastructure, particularly GCP (Vertex AI, BigQuery), or equivalent (e.g. AWS, Azure). Exposure to orchestration tools such as Kubeflow pipelines or Airflow. Familiarity with DBT or similar tools for modelling data in data warehouses. Desire to build interpretable and explainable ML models (using techniques such as SHAP). Desire to quantify the …
Cardiff, South Glamorgan, United Kingdom Hybrid / WFH Options
Starling Bank Limited
… systems (e.g. Git). Desirables: Experience with cloud-based ML infrastructure, particularly GCP (Vertex AI, BigQuery), or equivalent (e.g. AWS, Azure). Exposure to orchestration tools such as Kubeflow pipelines or Airflow. Familiarity with DBT or similar tools for modelling data in data warehouses. Desire to build interpretable and explainable ML models (using techniques such as SHAP). Desire to quantify the …
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Starling Bank Limited
… systems (e.g. Git). Desirables: Experience with cloud-based ML infrastructure, particularly GCP (Vertex AI, BigQuery), or equivalent (e.g. AWS, Azure). Exposure to orchestration tools such as Kubeflow pipelines or Airflow. Familiarity with DBT or similar tools for modelling data in data warehouses. Desire to build interpretable and explainable ML models (using techniques such as SHAP). Desire to quantify the …
… and reliability across our platform. Working format: full-time, remote. Schedule: Monday to Friday (the working day is 8+1 hours). Responsibilities: Design, develop, and maintain data pipelines using Apache Airflow. Create and support data storage systems (Data Lakes/Data Warehouses) based on AWS (S3, Redshift, Glue, Athena, etc.). Integrate data from various sources, including … pipeline into a standalone, scalable service. What we expect from you: 4+ years of experience as a Data Engineer, including 1+ year at a Senior level. Deep knowledge of Airflow: DAGs, custom operators, and monitoring. Strong command of PostgreSQL databases; familiarity with the AWS stack (S3, Glue or Redshift, Lambda, CloudWatch) is a significant plus. Excellent SQL skills and …
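Since this listing calls out custom Airflow operators specifically, here is a minimal hedged sketch of one — the operator's name and behaviour are illustrative, not from the posting:

from airflow.models.baseoperator import BaseOperator

class ExampleAuditOperator(BaseOperator):
    """Hypothetical operator that audits a table as one step in a DAG."""

    def __init__(self, table_name: str, **kwargs):
        super().__init__(**kwargs)
        self.table_name = table_name

    def execute(self, context):
        # A real implementation would query Redshift/PostgreSQL via a hook;
        # this stub just logs the table it would audit.
        self.log.info("Auditing table %s", self.table_name)
        return self.table_name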
… of ingestion across the portfolio. Key Requirements: Strong proficiency in Python and PySpark. Successful track record in a Data Engineering role, including Database Management (ideally SQL), Data Orchestration (ideally Apache Airflow or Dagster), Containerisation (ideally Kubernetes and Docker), and Data Pipelines (big data technologies and architectures). Experience in Financial Services (ideally Commodity) Trading. Bachelor's degree in Information Systems …
… Strong grasp of API design and integration methods (REST, GraphQL, Webhooks) - Knowledge of OAuth2, JWT, and secure authentication protocols - Experience with ETL/ELT pipelines, workflow orchestration tools (e.g., Apache Airflow) - Solid understanding of both SQL and NoSQL databases - Familiarity with AWS and its integration services - Strong problem-solving, communication, and collaboration skills - Agile team experience. Nice to …
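As a small illustration of the JWT requirement here, using the PyJWT library — the secret and claims are placeholder assumptions:

import jwt  # PyJWT

SECRET = "example-secret"  # placeholder; a real service would load this securely

# Hypothetical example: issue a signed token, then verify it.
token = jwt.encode({"sub": "user-123", "scope": "read"}, SECRET, algorithm="HS256")
claims = jwt.decode(token, SECRET, algorithms=["HS256"])
print(claims["sub"])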
… have the chance to work with a talented and engaged team on an innovative product that connects with external systems, partners, and platforms across the industry. Our Tech Stack: Apache Airflow, Python, Django, React/TypeScript, AWS (S3, RDS with PostgreSQL, ElastiCache, MSK, EC2, ECS, Fargate, Lambda, etc.), Snowflake, Terraform, CircleCI, Bitbucket. Your mission: Lead and scale multiple engineering …