how product support and knowledge tooling gets measured The must-haves: 5+ years hands-on with SQL & Python Strong data modelling + visualisation (Tableau, MicroStrategy, etc.) Azure Data Factory, Airflow or similar integration tools Bonus: analytics mindset and experience working with customer support or product teams The Offer: Annual Salary up to £86,000 DOE and location Contract until
ideally with some prior management or lead responsibility. A real passion for coaching and developing engineers. Hands-on experience with their tech stack - any cloud, Snowflake (or equivalent), Python, Airflow, Docker Ability to juggle multiple products and effectively gather requirements. Experience with real-time data products is a big plus. Strong communication skills and a good academic background.
Lambda, Azure) Strong problem-solving skills and critical thinking Strategic planning abilities Additional Skills (desired): Experience with protein or DNA bioinformatics MLOps expertise Software engineering skills, data pipelining (e.g., Airflow), and cloud deployment experience Familiarity with Agile methodologies If you're interested in joining as our new Head of AI, we'd love to hear from you
such as Lightdash, Looker or Tableau Comfortable with version control, testing frameworks, and CI/CD in a data context Familiarity with Python and orchestration tools like Dagster or Airflow is highly desirable Experience with Ecommerce and/or Subscription services is desirable The Interview Process Meet & Greet Call with a Talent Partner Call with the Hiring Manager and
or dbt. Experience in data pipeline and data model development. Ability to build efficient, robust, and scalable solutions based on requirements. Knowledge of orchestration and version control tools like Airflow and GitHub. In-depth knowledge of data visualization tools such as Looker or Tableau. Collaborative working style with a desire to share knowledge and enable team growth. Positive, enthusiastic
cloud-native environment. Strong understanding of programmatic advertising, attribution modeling, campaign measurement, and media mix optimization. Familiarity with cloud platforms (especially GCP) and tools like BigQuery, Vertex AI, Dataflow, Airflow, and dbt. Excellent communication and stakeholder management skills, with the ability to align cross-functional teams around a shared vision. Please send in your latest CV. LA International is a
software architecture, perform thorough code reviews, and ensure high coding standards. Foster innovative thinking and strive to create robust and scalable systems. Your Skills: Proficiency in Python, Kubernetes, Docker, Airflow, Harness, and Jenkins. Knowledge and experience with CI/CD pipelines and methods. Experience building ML pipelines and working with classical ML algorithms. Ability to develop and maintain high
Data/Analytics Monthly Business Review (MBR) - Track performance metrics for your team - Communicate with and support various internal stakeholders and external audiences - Work with AWS products including Redshift, Airflow, QuickSight, Lambda, and GenAI products like AWS Bedrock A day in the life Our team's customers are internal Amazon teams responsible for managing Amazon's Last Mile global
using the below technologies: Python as our main programming language Databricks as our data lake platform Kubernetes for data services and task orchestration Terraform for infrastructure Streamlit for data applications Airflow purely for job scheduling and tracking CircleCI for continuous deployment Parquet and Delta file formats on S3 for data lake storage Spark for data processing DBT for data
Time series modelling (both machine-learning and econometric approaches) Familiarity with cloud platforms (AWS) and containerisation technologies (Docker) Familiarity with cloud-based ETL/ELT data pipelines and orchestrators (Airflow, Dagster, Prefect) Experience building backtests and deploying production ML Confident communicator with the ability to work across tech and commercial teams
centred around a software product, and have solid Python coding skills and expertise with cloud infrastructure (preferably AWS). Familiarity with containers and MLE tools such as MLflow and Airflow is essential, with any knowledge of AI SaaS or GenAI APIs being a bonus. But what truly matters is your passion for learning and advancing technology. In return
client and internal workflows using Azure or Vertex AI, with secure data handling. Implement ISO-aligned controls, monitor infrastructure, and respond to cloud security incidents. Work with tools like Airflow, Dataflow, Pinecone, and ElasticSearch to manage secure data flows. What You'll Need 3+ years in DevSecOps, DevOps, or Site Reliability Engineering, with a strong security background. Expertise in
tracking) Desirable Skills & Interests LangChain, Langflow, or similar frameworks for building AI agents LLMs or intelligent automation workflows High-availability, scalable systems (microservices, event-based architectures) Orchestration tools like Airflow, container orchestration (Kubernetes), or serverless frameworks You will be an enthusiastic Python Developer and a great communicator. The successful Python Developer should have strong problem-solving abilities, organisational skills
Telford, Shropshire, England, United Kingdom Hybrid / WFH Options
eTeam Inc
with cloud platforms (AWS, Azure) and container orchestration (Kubernetes, Docker). Familiarity with GitLab CI/CD, Infrastructure as Code (IaC), and automated deployment pipelines. Knowledge of scheduling tools - Airflow Strong documentation and communication skills. Ability to work collaboratively across multidisciplinary teams. The successful candidate will also need to be SC Vetted.
Telford, Shropshire, United Kingdom Hybrid / WFH Options
Experis - ManpowerGroup
execution. Ability to work under pressure and manage competing priorities. Desirable Qualifications: Familiarity with DevOps practices and cloud-hosted platforms (AWS, Azure, SAS PaaS). Knowledge of scheduling tools - Airflow Exposure to SAS Migration Factory Delivery Models. The successful candidate will also need to be SC Vetted. All profiles will be reviewed against the required skills and experience.
Telford, Shropshire, West Midlands, United Kingdom
LA International Computer Consultants Ltd
with cloud platforms (AWS, Azure) and container orchestration (Kubernetes, Docker). Familiarity with GitLab CI/CD, Infrastructure as Code (IaC), and automated deployment pipelines. Knowledge of scheduling tools - Airflow Strong documentation and communication skills. Ability to work collaboratively across multidisciplinary teams. Due to the nature and urgency of this post, candidates holding or who have held high level
in delta one, store of value, and/or FICC options trading Experience with Linux-based, concurrent, high-throughput, low-latency software systems Experience with pipeline orchestration frameworks (e.g. Airflow, Dagster) Experience with streaming platforms (e.g. Kafka), data lake platforms (e.g. Delta Lake, Apache Iceberg), and relational databases Have a Bachelor's or advanced degree in Computer Science, Mathematics
class-leading data and ML platform infrastructure, balancing maintenance with exciting greenfield projects. develop and maintain our real-time model serving infrastructure, utilising technologies such as Kafka, Python, Docker, Apache Flink, Airflow, and Databricks. Actively assist in model development and debugging using tools like PyTorch, Scikit-learn, MLFlow, and Pandas, working with models from gradient boosting classifiers to
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
options Hybrid working - 1 day a week in a central London office High-growth scale-up with a strong mission and serious funding Modern tech stack: Python, SQL, Snowflake, Apache Iceberg, AWS, Airflow, dbt, Spark Work cross-functionally with engineering, product, analytics, and data science leaders What You'll Be Doing Lead, mentor, and grow a high-impact
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
best practices around CI/CD, infrastructure-as-code, and modern data tooling Introduce and advocate for scalable, efficient data processes and platform enhancements Tech Environment: Python, SQL, Spark, Airflow, dbt, Snowflake, Postgres AWS (S3), Docker, Terraform Exposure to Apache Iceberg, streaming tools (Kafka, Kinesis), and ML pipelines is a bonus What We're Looking For: 5+ years
respond to suspicious activity. We own the end-to-end platform that powers our real-time and batch monitoring capabilities, including: A custom alerting and orchestration platform, built on Airflow, that enables scalable, auditable detection pipelines. Data pipelines in DBT and Snowflake that serve both ML models and rule-based logic. Backend services and APIs that handle case management … case management, and customer termination. Build and maintain robust, well-tested code with a focus on performance, reliability, and operational efficiency. Maintain and evolve the engineering infrastructure behind our Airflow-based alerting platform, enabling analysts to deploy and manage DAGs safely and effectively. Contribute to the development and maintenance of DBT models and data pipelines integrated with Snowflake to … software design principles. Proficiency in at least one of the following languages: Python, Golang, Java. Experience with multiple languages is a plus. Familiarity with data pipeline tooling such as Airflow and DBT, and cloud data warehouses like Snowflake. Understanding of testing strategies, including unit, integration, and system testing (TDD/BDD is a plus). Experience with CI/
build, and lead the development of scalable data pipelines and platforms on AWS. The ideal candidate will have deep expertise in PySpark, Glue, Athena, AWS LakeFormation, data modelling, DBT, Airflow, Docker and will be responsible for driving best practices in data engineering, governance, and DevOps. Key Responsibilities: • Lead the design and implementation of scalable, secure, and high-performance data … field. • 10+ years of experience in data engineering. • Strong hands-on experience with AWS services: S3, Glue, Lake Formation, Athena, Redshift, Lambda, IAM, CloudWatch. • Proficiency in PySpark, Python, DBT, Airflow, Docker and SQL. • Deep understanding of data modeling techniques and best practices. • Experience with CI/CD tools and version control systems like Git. • Familiarity with data governance, security