believe that no one is the finished article; however, some experience in the following is important for this role: Strong Python skills and experience with data pipelines (pandas, SQLAlchemy, Airflow) Deep understanding of modern AI (RAG, agentic AI, MCP, vector databases) Skilled in designing scalable data architectures and working with structured/unstructured data How We Work (This is …
Required Skills and Qualifications: Proven experience as an Apache Airflow SME or Lead Developer in a production-grade environment. Strong understanding of Airflow internals, including the scheduler, executor types (Celery, Kubernetes), and plugin development. Experience with workload orchestration and autoscaling using KEDA (Kubernetes-based Event Driven Autoscaler), and familiarity with Celery for distributed task execution and background job …
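The Airflow internals this listing names (the scheduler resolving task dependencies before handing work to an executor) come down to executing a DAG in topological order. A minimal pure-Python sketch of that idea — not Airflow's actual API, and the pipeline task names are illustrative:

```python
from collections import deque

def topological_order(deps):
    """Return a valid execution order for a task DAG.

    deps maps task -> set of upstream tasks it depends on
    (a simplified model of what Airflow's scheduler resolves).
    """
    # Count unmet upstream dependencies per task.
    indegree = {t: len(up) for t, up in deps.items()}
    # Map each task to its downstream dependents.
    downstream = {t: [] for t in deps}
    for task, ups in deps.items():
        for up in ups:
            downstream[up].append(task)
    ready = deque(t for t, d in indegree.items() if d == 0)
    order = []
    while ready:
        task = ready.popleft()
        order.append(task)
        for dep in downstream[task]:
            indegree[dep] -= 1
            if indegree[dep] == 0:
                ready.append(dep)
    if len(order) != len(deps):
        raise ValueError("cycle detected: not a DAG")
    return order

# A toy ETL pipeline: extract -> transform -> load, plus a report.
pipeline = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "report": {"transform", "load"},
}
print(topological_order(pipeline))  # → ['extract', 'transform', 'load', 'report']
```

In Airflow the same dependency graph would be declared with operators and `>>` chaining; an executor (Celery, Kubernetes) then runs the ready tasks concurrently rather than one at a time as this sketch does.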
techniques and tools Strong hands-on experience with cloud data platforms, specifically Google Cloud Platform - BigQuery Experience with leading ETL/ELT tools and data pipeline orchestration (e.g., Dataflow, Apache Airflow, Talend, Informatica) Advanced SQL skills and deep knowledge of various database technologies (relational, columnar, NoSQL) Practical experience in establishing data governance frameworks, data quality initiatives, and Master …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
datasets into actionable insights for the business. In this role, you will: Develop and maintain a cloud-based data warehouse (BigQuery, GCP) Create and optimise ETL processes (SSIS, Talend, Airflow) Collaborate with BI teams to deliver key performance metrics Ensure data quality, security, and cost-efficient storage What you'll bring: Strong SQL and data modelling skills Data warehouse …
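The "create and optimise ETL processes" responsibility above can be sketched as three small, testable steps. This is a stdlib-only illustration, not the stack the listing names (SSIS/Talend/Airflow would orchestrate steps like these); the schema and data are invented for the example:

```python
import csv
import io
import sqlite3

# Extract: read raw rows from a CSV source (in-memory here for the sketch).
RAW = "order_id,amount\n1,19.99\n2,5.00\n2,5.00\n3,\n"

def extract(source):
    return list(csv.DictReader(io.StringIO(source)))

def transform(rows):
    """Drop rows with missing amounts and de-duplicate on order_id."""
    seen, clean = set(), []
    for row in rows:
        if not row["amount"]:
            continue  # data quality: reject incomplete records
        if row["order_id"] in seen:
            continue  # data quality: reject duplicates
        seen.add(row["order_id"])
        clean.append((int(row["order_id"]), float(row["amount"])))
    return clean

def load(rows, conn):
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (order_id INTEGER PRIMARY KEY, amount REAL)"
    )
    conn.executemany("INSERT INTO orders VALUES (?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")  # stand-in for the warehouse (e.g. BigQuery)
load(transform(extract(RAW)), conn)
print(conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0])  # → 2
```

Keeping extract, transform, and load as separate functions is what makes a pipeline like this easy to schedule, retry, and test step by step in an orchestrator.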
prototypes, predictive models and proof of concepts. Technical Skills Required: SQL (5+ years) Python (5+ years) Data modelling and visualisation - Tableau, MicroStrategy, etc. - and integration tools: Azure Data Factory, Airflow, etc. (5+ years) Skills: Ability to work as part of a team, as well as work independently or with minimal direction. Excellent written, presentation, and verbal communication skills. Collaborate …
B2C environments Strong programming skills in Python, with experience using libraries like scikit-learn, XGBoost, and pandas Practical experience in MLOps or strong knowledge of model deployment (e.g. MLflow, Airflow, Docker, Kubernetes, model monitoring tools) Familiarity with cloud environments (AWS, GCP, or Azure) and data pipelines Excellent communication skills - able to explain technical work to non-technical stakeholders and …
Brighton, East Sussex, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
data capabilities to better understand and predict user behaviour Supporting a data function embedded within the tech team - directly influencing architecture and systems decisions Tech Environment: AWS, Snowflake, Python, Airflow, Docker, Fivetran, Looker Real-time streaming technologies are a bonus What We're Looking For: Proven experience managing or mentoring Engineers (either as a Lead or in a formal …
with an ability to work on multiple projects simultaneously Strong interpersonal and communication skills Quick, self-learning capabilities and creativity in problem-solving Preferred: Familiarity with Python Familiarity with Airflow, ETL tools, Snowflake and MSSQL Hands-on with VCS (Git) …
and integration projects. Deep understanding of data governance, consent management, and PII handling. Experience with: SQL, Python Power BI (or equivalent BI tools such as Looker, Tableau, Omni) dbt, Airflow, Docker (preferred) Twilio Segment (or other CDPs such as mParticle, Salesforce Data Cloud) Exceptional stakeholder management and communication skills - able to translate complex data topics into business impact. Experience …
leading technical decisions. You are proficient in Python, Java, Scala, and ML frameworks (e.g., TensorFlow, PyTorch), with experience in cloud platforms (AWS), big data (Spark), and deployment tools (Kubernetes, Airflow, Docker). Accommodation requests If you need assistance with any part of the application or recruiting process due to a disability, or other physical or mental health conditions, please …
similarly complex operational businesses Experience using LLMs or AI tools to structure and extract meaning from unstructured data Experience automating workflows and deploying model pipelines (e.g. MLflow, GCP Vertex, Airflow, dbt or similar) Exposure to business planning, pricing, or commercial decision-making Familiarity with geospatial data Experience in fast-scaling startups or operational teams We're flexible on experience …
software development and code quality. Vendor Collaboration: Work closely with third-party vendors to integrate their solutions, ensuring they meet our high standards for production environments. Workflow Automation: Utilize Airflow to automate and optimize workflows, ensuring efficient and reliable operations. Required: 5-7 years of experience in software development with a focus on production-grade code. Proficiency in Java …
PyTorch, and scikit-learn. Experience with cloud platforms (e.g., AWS), big data technologies (e.g., Spark) as well as other technologies used to deploy models to production (e.g., Kubernetes, GHA, Airflow, Docker, etc.). Accommodation requests If you need assistance with any part of the application or recruiting process due to a disability, or other physical or mental health conditions …
and Databricks AWS services (e.g. IAM, S3, Redis, ECS) Shell scripting and related developer tooling CI/CD tools and best practices Streaming and batch data systems (e.g. Kafka, Airflow, RabbitMQ) Additional Information Health + Mental Wellbeing PMI and cash plan healthcare access with Bupa Subsidised counselling and coaching with Self Space Cycle to Work scheme with options from …
DevOps tooling/automation - ArgoCD/Jenkins/GitHub Actions Scripting languages - Bash/Python/Groovy/Golang Provisioning software/frameworks (Elasticsearch/Spark/Hadoop/Airflow/PostgreSQL) Infrastructure management - CasC, IaC (Ansible, Terraform, Packer) Log and metric aggregation with Fluentd, Prometheus, Grafana, Alertmanager Public Cloud, primarily GCP and Azure, but also AWS What do …
scientists Ability to clearly communicate technical ideas through writing, visualisations, or presentations Strong organisational skills with experience in balancing multiple projects Familiarity with Posit Connect, workflow orchestration tools (e.g., Airflow), AWS services (e.g., SageMaker, Redshift), or distributed computing tools (e.g., Spark, Kafka) Experience in a media or newsroom environment Agile team experience Advanced degree in Maths, Statistics, or a …
London, South East, England, United Kingdom Hybrid / WFH Options
XPERT-CAREER LTD
AI agents Understanding of Large Language Models (LLMs) and intelligent automation workflows Experience building high-availability, scalable systems using microservices or event-driven architecture Knowledge of orchestration tools like Apache Airflow, Kubernetes, or serverless frameworks Qualifications: Bachelor’s degree in Computer Science, Engineering, or related field Experience working in Agile/Scrum environments Strong problem-solving skills and …
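The event-driven architecture the listing above asks for is, at its core, services reacting to published events without calling each other directly. A minimal in-process sketch of that decoupling — in production a broker such as Kafka or a serverless event source would sit in the middle, and the topic/handler names here are invented for illustration:

```python
from collections import defaultdict

class EventBus:
    """A minimal in-process pub/sub bus illustrating event-driven wiring.

    Real systems would put a broker (e.g. Kafka, SNS/SQS) here; this
    sketch keeps everything synchronous and in memory.
    """

    def __init__(self):
        self._handlers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._handlers[topic].append(handler)

    def publish(self, topic, event):
        # Fan the event out to every subscriber, in subscription order.
        for handler in self._handlers[topic]:
            handler(event)

bus = EventBus()
audit_log = []

# Two independent "services" react to the same event without knowing
# about each other - the key property of event-driven design.
bus.subscribe("order.created", lambda e: audit_log.append(f"audited {e['id']}"))
bus.subscribe("order.created", lambda e: audit_log.append(f"emailed {e['id']}"))
bus.publish("order.created", {"id": 42})
print(audit_log)  # → ['audited 42', 'emailed 42']
```

Because publishers never reference subscribers, new consumers can be added (or scaled independently) without touching the producing service — the availability and scalability property the role description is after.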
how product support and knowledge tooling gets measured The must-haves: 5+ years hands-on with SQL & Python Strong data modelling + visualisation (Tableau, MicroStrategy, etc.) Azure Data Factory, Airflow or similar integration tools Bonus: analytics mindset and experience working with customer support or product teams The Offer: Annual salary up to £86,000, depending on experience and location Contract until …
ideally with some prior management or lead responsibility. A real passion for coaching and developing engineers. Hands-on experience with their tech stack - any cloud, Snowflake (or equivalent), Python, Airflow, Docker Ability to juggle multiple products and effectively gather requirements. Experience with real-time data products is a big plus. Strong communication skills and a good academic background. HOW …
Lambda, Azure) Strong problem-solving skills and critical thinking Strategic planning abilities Additional Skills (desired): Experience with protein or DNA bioinformatics MLOps expertise Software engineering skills, data pipelining (e.g., Airflow), and cloud deployment experience Familiarity with Agile methodologies If you're interested in joining as our new Head of AI, we'd love to hear from you …
such as Lightdash, Looker or Tableau Comfortable with version control, testing frameworks, and CI/CD in a data context Familiarity with Python and orchestration tools like Dagster or Airflow is highly desirable Experience with Ecommerce and/or Subscription services is desirable The Interview Process Meet & Greet Call with a Talent Partner Call with the Hiring Manager and …
or dbt. Experience in data pipeline and data model development. Ability to build efficient, robust, and scalable solutions based on requirements. Knowledge of orchestration and version control tools like Airflow and GitHub. In-depth knowledge of data visualization tools such as Looker or Tableau. Collaborative working style with a desire to share knowledge and enable team growth. Positive, enthusiastic …