of engineers, and work closely with cross-functional teams to deliver high-impact data solutions. Key Responsibilities: Architect and maintain robust data pipelines using AWS services (Glue, Lambda, S3, Airflow). Lead the migration and optimisation of data workflows into Snowflake. Collaborate with analysts, data scientists, and product teams to deliver clean, reliable data. Define and enforce best practices in …
Liverpool, Leeds, Newcastle upon Tyne, or Birmingham, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
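For illustration, a minimal sketch of the kind of pipeline the role above describes: an Airflow DAG that runs an AWS Glue job and then copies the curated output into Snowflake. The DAG id, Glue job name, connection id, stage, and table are hypothetical placeholders; it assumes a recent Airflow with the Amazon and Snowflake provider packages installed.

from datetime import datetime

from airflow import DAG
from airflow.providers.amazon.aws.operators.glue import GlueJobOperator
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

with DAG(
    dag_id="events_to_snowflake",  # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",             # "schedule_interval" on Airflow < 2.4
    catchup=False,
) as dag:
    # Run an existing Glue job that curates raw S3 data.
    transform = GlueJobOperator(
        task_id="run_glue_transform",
        job_name="curate_events",  # hypothetical Glue job
    )

    # Copy the curated files from an S3-backed stage into Snowflake.
    load = SnowflakeOperator(
        task_id="copy_into_snowflake",
        snowflake_conn_id="snowflake_default",
        sql="COPY INTO analytics.events FROM @raw_stage/events/;",
    )

    transform >> load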
Erwin, Lucidchart, PowerDesigner. Strong SQL and Python skills (Snowflake or similar). AWS experience (Lambda, SNS, S3, EKS, API Gateway). Familiarity with data governance (GDPR, HIPAA). Bonus points for: dbt, Airflow, Atlan, Iceberg, CI/CD, API modelling. The vibe: You'll be joining a collaborative, inclusive team that values technical excellence and continuous learning. Flexible working, strong L&D …
City of London, London, United Kingdom Hybrid / WFH Options
Anson McCade
London, South East, England, United Kingdom Hybrid / WFH Options
Salt Search
communicating customer insights. Comfortable working with large datasets from sources like CRM, web analytics, product telemetry, etc. Exposure to cloud platforms (AWS, GCP, Azure) and modern data pipelines (e.g. Airflow, dbt) is a plus. Soft Skills: Business-oriented thinker with strong communication skills. Able to clearly explain complex models to non-technical audiences. Skilled in stakeholder engagement and translating …
Hackajob, Welcome to The Jungle) and ATS platforms (Screenloop experience a plus). Solid understanding of tech stacks including Python, React.js, AWS/Azure, and data tools like dbt, Airflow, Snowflake. Ability to conduct structured interviews and technical assessments. Familiarity with software development practices, agile methodologies, DevOps culture, and AI/ML concepts. Exceptional communication and stakeholder management skills …
pipelines. Implement and optimize automation processes using infrastructure-as-code (Terraform). Build and maintain data pipelines using Airflow. Manage our tech stack including Python, Node.js, PostgreSQL, MongoDB, Kafka, and Apache Iceberg. Optimize infrastructure costs and develop strategies for efficient resource utilization. Provide critical support by monitoring services and resolving production issues. Contribute to the development of new services as …
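As a hedged illustration of one piece of such a stack, the sketch below uses the kafka-python client to publish JSON events to a topic. The broker address, topic, and payload are made-up placeholders, not details from the listing.

import json
from kafka import KafkaProducer

# Connect to a (hypothetical) local broker and serialize dicts as JSON bytes.
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# Publish one event to a made-up topic, then block until delivery.
producer.send("service-events", {"service": "billing", "status": "ok"})
producer.flush()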
business analysts and stakeholders, ensuring technical solutions meet business needs. Experience with data ingestion tools like Fivetran. Advantageous: Exposure to deploying applications with Kubernetes. Experience with Data Orchestrator tools (Airflow, Prefect, etc.). Experience with Data Observability tools (Monte Carlo, Great Expectations, etc.). Experience with Data Catalog tools (Amundsen, OpenMetadata, etc.). Interview Process: Call with the talent team. Take-home task …
have advanced SQL skills; experience with dbt and/or Looker is strongly preferred. You will be proficient with modern data platforms (e.g., Snowflake, dbt, AWS, GCP, Looker, Tableau, Airflow). You will have experience with version control tools such as Git. You will have exposure to Python for data or analytics engineering tasks (preferred). You will demonstrate excellent problem …
and model validation. Experience in data visualization tools such as plotly, seaborn, streamlit, etc. would be an advantage. Understanding of data modelling and exposure to tools like Metaflow or Airflow is desirable. Understanding of common software practices like version control, continuous deployment, and testing. Industry or equivalent academic experience in researching and developing Generative AI-powered products and services. …
New Malden, Surrey, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
hands-on background in data engineering, with 5+ years working on modern data platforms. Experience leading cloud data migrations - GCP and BigQuery strongly preferred. Proficiency in SQL, Python, dbt, Airflow, Terraform and other modern tooling. Excellent understanding of data architecture, governance, and DevOps best practices. Proven leadership or team management experience within a regulated or mid-to-large tech …
/snowflake schemas). Deep expertise in dbt - including documentation, testing, and CI/CD. Proficiency with Python or Bash for automation and orchestration. Familiarity with pipeline orchestration tools (e.g., Airflow). Knowledge of data governance, lineage, and quality assurance practices. Experience working in cloud-native environments (preferably AWS). Comfortable using Git-based workflows for version control. If you'd like …
Brighton, East Sussex, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
best practices in testing, data governance, and observability. Lead roadmap planning and explore emerging technologies (e.g. GenAI). Ensure operational stability and support incident resolution. Tech Stack: Python, SQL, Airflow, AWS, Fivetran, Snowflake, Looker, Docker. (You don't need to tick every box - if you've worked with comparable tools, that's great too.) What We're Looking For …
London, South East, England, United Kingdom Hybrid / WFH Options
Method Resourcing
contexts. Bonus Experience (Nice to Have): Exposure to large language models (LLMs) or foundational model adaptation. Previous work in cybersecurity, anomaly detection, or behavioural analytics. Familiarity with orchestration frameworks (Airflow or similar). Experience with scalable ML systems, pipelines, or real-time data processing. Advanced degree or equivalent experience in ML/AI research or applied science. Cloud platform …
Proficiency in SQL and Python. Experience with AWS cloud and analytics platforms such as Redshift, Dataiku, and Alation. Familiarity with open-source technologies like Presto and Airflow. A passion for data and coding, with a focus on user experience. A learning mindset, can-do attitude, and effective communication skills. The ability to work on innovative projects …
adaptable to fast-paced startup environment, comfortable with ambiguity and evolving responsibilities. Work Authorization: Must be eligible to work in US or UK. Preferred Experience: Data orchestration tools (e.g., Airflow, Prefect). Experience deploying, monitoring, and maintaining ML models in production environments (MLOps). Familiarity with big data technologies (e.g., Spark, Hadoop). Background in time-series analysis and forecasting. Experience with data …
You have experience with RAG (Retrieval-Augmented Generation) systems, vector databases, and embedding models for knowledge extraction. You can architect complex workflows. You have experience with workflow orchestration tools (Airflow, Prefect, Temporal) or have built custom pipeline systems for multi-step autonomous processes. You bridge science and engineering. You are comfortable with scientific computing libraries (NumPy, SciPy, pandas) and …
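As a rough sketch of the retrieval step such RAG systems rely on, with a stand-in embedding function in place of a real model, cosine-similarity search over an in-memory index might look like this (all document and query strings are invented examples):

import numpy as np

def embed(text: str) -> np.ndarray:
    # Stand-in for a real embedding model: a deterministic-within-process
    # random unit vector. It ignores semantics; swap in a real model in practice.
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    v = rng.normal(size=64)
    return v / np.linalg.norm(v)

docs = [
    "Rotate API keys every 90 days.",
    "Deploys run from the main branch.",
    "On-call rotations change weekly.",
]
index = np.stack([embed(d) for d in docs])  # one unit vector per document

def retrieve(query: str, k: int = 2) -> list[str]:
    # For unit vectors, cosine similarity reduces to a dot product.
    scores = index @ embed(query)
    return [docs[i] for i in np.argsort(scores)[::-1][:k]]

# Assemble retrieved context into a prompt for a downstream generator model.
context = "\n".join(retrieve("How often should keys be rotated?"))
prompt = f"Answer using only this context:\n{context}\n\nQuestion: ..."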
sources, whether that's batch files or real-time streams. You'll have set up and worked with ETL and ELT tools like Dagster, AWS Glue, Azure Data Factory, Airflow or dbt, and you can decide what tools are right for the job. You'll have an understanding of how Node.js and TypeScript fit into a modern development environment …
Real time streaming patterns. Strong background in Data Management, Data Governance, Transformation initiatives preferred. Preferred Experience/Familiarity with one or more of these tools: Big data platforms - Hadoop, Apache Kafka. Relational SQL, NoSQL, and Cloud Native databases - Postgres, Cassandra, Snowflake. Experience with data pipeline and orchestration tools - Azkaban, Luigi, or Airflow. Experience with stream-processing engines - Apache Spark, Apache Storm, or Apache Flink. Experience with ETL tools - Talend, Ab Initio. Experience with Data Analytics/visualization tools - Looker, Mode, or Tableau. What we can offer you: By joining Citi Dublin, you will not only be part of a business casual workplace with a hybrid working model (up to 2 days working at home …
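To make the stream-processing requirement concrete, here is a minimal Spark Structured Streaming sketch that counts events per key from a Kafka topic. The broker address and topic are placeholders, and it assumes the spark-sql-kafka connector package is on the classpath.

from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("clickstream-counts").getOrCreate()

# Subscribe to a (hypothetical) Kafka topic; key and value arrive as binary.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "clickstream")
    .load()
)

# Maintain a running count of events per key.
counts = events.select(col("key").cast("string")).groupBy("key").count()

# Print the running totals to the console as each micro-batch completes.
query = counts.writeStream.outputMode("complete").format("console").start()
query.awaitTermination()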
unsupervised, and reinforcement learning methods. Experience with GCP services such as Vertex AI, BigQuery ML, Dataflow, AI Platform Pipelines, and Dataproc. Solid knowledge of distributed systems, data streaming (e.g., Apache Beam, Kafka), and large-scale data processing. ML Ops: Hands-on experience with continuous integration/deployment (CI/CD) for ML, model versioning, and monitoring. Business Acumen: Ability to understand marketing and advertising concepts like customer lifetime value (CLV), attribution modeling, real-time bidding (RTB), and audience targeting. Strong understanding of data pipeline orchestration tools (e.g., Apache Airflow, Kubernetes). You thrive when working as part of a team. Comfortable in a fast-paced environment. Have excellent written and verbal English skills. Last but not least …
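As a small illustration of the Beam-style processing mentioned above, this sketch aggregates spend per customer (a toy stand-in for a lifetime-value feature) using the Apache Beam Python SDK; the in-memory records are invented test data rather than a real BigQuery or Kafka source.

import apache_beam as beam

# Toy order records standing in for a real streaming or warehouse source.
orders = [
    {"customer": "a", "spend": 12.0},
    {"customer": "b", "spend": 7.5},
    {"customer": "a", "spend": 3.0},
]

with beam.Pipeline() as pipeline:
    (
        pipeline
        | "CreateOrders" >> beam.Create(orders)
        | "KeyByCustomer" >> beam.Map(lambda o: (o["customer"], o["spend"]))
        | "SumSpend" >> beam.CombinePerKey(sum)  # total spend per customer
        | "Print" >> beam.Map(print)             # e.g. ('a', 15.0)
    )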