B2C environments
- Strong programming skills in Python, with experience using libraries like scikit-learn, XGBoost, and pandas
- Practical experience in MLOps or strong knowledge of model deployment (e.g. MLflow, Airflow, Docker, Kubernetes, model monitoring tools)
- Familiarity with cloud environments (AWS, GCP, or Azure) and data pipelines
- Excellent communication skills: able to explain technical work to non-technical stakeholders and …
in containerized environments (e.g., Docker, Kubernetes).
- Proficient with cloud-native tools or on-prem equivalents (e.g., logging, tracing, metrics).
- Knowledge of data processing frameworks (e.g., Pandas, Spark, Airflow) is a plus.
- Comfortable reading and working with Python-based ML code (scikit-learn, TensorFlow, PyTorch, etc.).
- Strong ownership mindset and a collaborative attitude.
Nice to Have
- Experience …
and Jenkins
- Proficient in Python and shell scripting
- Experience with Delta Lake table formats
- Strong data engineering background
- Proven experience working with large datasets
Nice to Have
- Familiarity with Airflow
- Background in full stack development
Team & Culture
- Join a collaborative team of 10 professionals
- Friendly, delivery-focused environment
- Replacing two outgoing contractors - the handover will ensure a smooth start …
pipelines.
- 1-2 years of hands-on experience with Azure services such as Data Factory, Databricks, Synapse (DWH), Azure Functions, and other data analytics tools, including streaming.
- Experience with Airflow and Kubernetes.
- Programming skills in Python (PySpark) and scripting languages like Bash.
- Knowledge of Git, CI/CD operations, and Docker.
- Basic Power BI knowledge is a plus.
- Experience deploying …
with an ability to work on multiple projects simultaneously
- Strong interpersonal and communication skills
- Quick, self-learning capabilities and creativity in problem-solving
Preferred
- Familiarity with Python
- Familiarity with Airflow, ETL tools, Snowflake and MSSQL
- Hands-on with VCS (Git)
and integration projects.
- Deep understanding of data governance, consent management, and PII handling.
- Experience with:
  - SQL, Python
  - Power BI (or equivalent BI tools such as Looker, Tableau, Omni)
  - dbt, Airflow, Docker (preferred)
  - Twilio Segment (or other CDPs such as mParticle, Salesforce Data Cloud)
- Exceptional stakeholder management and communication skills - able to translate complex data topics into business impact.
- Experience …
using the below technologies:
- Python as our main programming language
- Databricks as our data lake platform
- Kubernetes for data services and task orchestration
- Terraform for infrastructure
- Streamlit for data applications
- Airflow purely for job scheduling and tracking
- CircleCI for continuous deployment
- Parquet and Delta file formats on S3 for data lake storage
- Spark for data processing
- dbt for data …
especially in areas like compliance, payments, or SaaS metrics.
- Exposure to statistical analysis or experimentation (A/B testing, cohort analysis, etc.).
- Familiarity with data pipeline tools (e.g., Airflow) and modern data stack concepts.
Why join us
- Unique opportunity. Be part of a company with direct access to tens of thousands of new businesses in the UK.
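The A/B testing exposure listed above can be illustrated with a minimal sketch of the standard two-proportion z-test, written in pure-stdlib Python. The conversion counts are made-up illustrative numbers, not figures from the posting; in practice a library such as statsmodels would typically be used instead.

```python
import math

def two_proportion_ztest(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-sided z-test for a difference in conversion rates between variants A and B."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no difference)
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via math.erf)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical experiment: 2,400 users per variant
z, p = two_proportion_ztest(conv_a=120, n_a=2400, conv_b=150, n_b=2400)
print(f"z = {z:.2f}, p = {p:.4f}")
```

At these illustrative numbers the uplift is suggestive but not significant at the conventional 5% level, which is exactly the kind of judgment call the role description points at.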
City of London, London, United Kingdom Hybrid / WFH Options
1st Formations
similarly complex operational businesses
- Experience using LLMs or AI tools to structure and extract meaning from unstructured data
- Experience automating workflows and deploying model pipelines (e.g. MLflow, GCP Vertex, Airflow, dbt or similar)
- Exposure to business planning, pricing, or commercial decision-making
- Familiarity with geospatial data
- Experience in fast-scaling startups or operational teams
We're flexible on experience …
sensitive and global datasets.
- Hands-on experience integrating advanced AI/ML capabilities into operational and analytical data platforms.
- Extensive knowledge of modern data orchestration and workflow technologies (e.g., Airflow, Kubeflow), and infrastructure automation frameworks (Terraform, CloudFormation).
- Demonstrated leadership in managing technical product roadmaps, agile delivery practices, and stakeholder management in complex environments.
Boston Consulting Group is an …
software development and code quality.
- Vendor Collaboration: Work closely with third-party vendors to integrate their solutions, ensuring they meet our high standards for production environments.
- Workflow Automation: Utilize Airflow to automate and optimize workflows, ensuring efficient and reliable operations.
Required
- 5-7 years of experience in software development with a focus on production-grade code.
- Proficiency in Java …
.NET framework backend services and React frontends. You'll utilise tools such as Terraform for infrastructure-as-code (IaC), AWS (Lambda, EC2, EKS, Step Functions, VPC, etc.) for ETL, Airflow pipelines, Snowflake, and ensure architectural alignment with AI/ML initiatives and data-driven services. You will serve as the go-to engineer for:
- End-to-end data migration …
a strong grounding in evaluating NLP models using classification and ranking metrics, and experience running A/B or offline benchmarks.
- Proficient with MLOps and training infrastructure (MLflow, Kubeflow, Airflow), including CI/CD, hyperparameter tuning, and model versioning.
- Strong social media data extraction and scraping skills at scale (Twitter API v2, Reddit, Discord, Telegram, Scrapy, Playwright).
- Experience with …
PyTorch, and scikit-learn.
- Experience with cloud platforms (e.g., AWS), big data technologies (e.g., Spark), as well as other technologies used to deploy models to production (e.g., Kubernetes, GHA, Airflow, Docker, etc.).
Accommodation requests: If you need assistance with any part of the application or recruiting process due to a disability, or other physical or mental health conditions …
and Databricks
- AWS services (e.g. IAM, S3, Redis, ECS)
- Shell scripting and related developer tooling
- CI/CD tools and best practices
- Streaming and batch data systems (e.g. Kafka, Airflow, RabbitMQ)
Additional Information
Health + Mental Wellbeing
- PMI and cash plan healthcare access with Bupa
- Subsidised counselling and coaching with Self Space
- Cycle to Work scheme with options from …
scientists
- Ability to clearly communicate technical ideas through writing, visualisations, or presentations
- Strong organisational skills with experience in balancing multiple projects
- Familiarity with Posit Connect, workflow orchestration tools (e.g., Airflow), AWS services (e.g., SageMaker, Redshift), or distributed computing tools (e.g., Spark, Kafka)
- Experience in a media or newsroom environment
- Agile team experience
- Advanced degree in Maths, Statistics, or a …
Spring Boot, Python/FastAPI
- Frontend: TypeScript, React, Next.js. Headless CMS, design systems.
- CI/CD: ArgoCD, GitHub Actions
- Infrastructure: GCP, Kubernetes, Terraform, Grafana
- Data: Postgres, dbt, BigQuery, Airflow, Hex
What You'll Be Doing
Igniting Potential: Lead, mentor, and empower a high-performing team of 3-6 engineers, focusing relentlessly on outcomes that wow our members, not …
London, South East, England, United Kingdom Hybrid / WFH Options
XPERT-CAREER LTD
AI agents
- Understanding of Large Language Models (LLMs) and intelligent automation workflows
- Experience building high-availability, scalable systems using microservices or event-driven architecture
- Knowledge of orchestration tools like Apache Airflow, Kubernetes, or serverless frameworks
Qualifications:
- Bachelor’s degree in Computer Science, Engineering, or related field
- Experience working in Agile/Scrum environments
- Strong problem-solving skills and …
Basingstoke, Hampshire, South East, United Kingdom Hybrid / WFH Options
Anson Mccade
e.g., Archimate), and cloud platforms (AWS, Azure)
- Hands-on experience with DevSecOps tooling, automation platforms (Power Platform, UiPath), and secure software development practices
- Familiarity with data integration pipelines (Kafka, Apache Airflow), API management, and scripting (Python)
- Strong understanding of software design patterns including microservices, cloud-native, and OO design
- Eligible and willing to undergo high-level security clearance …
Southampton, South East England, United Kingdom Hybrid / WFH Options
Anson Mccade