London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
… and analysing A/B tests; strong storytelling and stakeholder-management skills; full UK work authorization.
Desirable: familiarity with cloud data platforms (Snowflake, Redshift, BigQuery) and ETL tools (dbt, Airflow); experience with loyalty-programme analytics or CRM platforms; knowledge of machine-learning frameworks (scikit-learn, TensorFlow) for customer scoring.
Technical Toolbox: Data & modelling: SQL, Python/R, pandas, scikit-learn. Dashboarding: Tableau or Power BI. ETL & warehousing: dbt, Airflow, Snowflake/Redshift/BigQuery. Experimentation: A/B testing platforms (Optimizely, VWO) …
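For context on the A/B-test analysis this listing asks for, here is a minimal sketch of a two-proportion significance test in Python; the conversion counts and sample sizes are illustrative assumptions, not figures from the role.

```python
# Minimal sketch: evaluating an A/B test on conversion rates.
# The counts below are hypothetical, not real campaign data.
from statsmodels.stats.proportion import proportions_ztest

conversions = [420, 480]        # successes in control vs. variant (hypothetical)
visitors = [10_000, 10_000]     # sample size per arm (hypothetical)

z_stat, p_value = proportions_ztest(count=conversions, nobs=visitors)
print(f"z = {z_stat:.2f}, p = {p_value:.4f}")
# A small p-value (e.g. < 0.05) suggests the variant's uplift is unlikely to be noise.
```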
… SQL, craft new features. Modelling sprint: run hyper-parameter sweeps or explore heuristic/greedy and MIP/SAT approaches. Deployment: ship a model as a container, update an Airflow (or Azure Data Factory) job. Review: inspect dashboards, compare control vs. treatment, plan the next experiment.
Tech stack: Python (pandas, NumPy, scikit-learn, PyTorch/TensorFlow), SQL (Redshift, Snowflake or similar), AWS SageMaker (Azure ML migration), with Docker, Git, Terraform, Airflow/ADF. Optional extras: Spark, Databricks, Kubernetes.
What you'll bring: 3-5+ years building optimisation or recommendation systems at scale. Strong grasp of mathematical optimisation (e.g., linear/integer programming, meta-heuristics) as well as ML. Hands-on cloud ML experience (AWS or Azure). Proven track record …
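As a rough illustration of the integer-programming formulations named in the modelling sprint above, here is a minimal sketch of a toy knapsack MIP using PuLP; the items, weights and capacity are hypothetical, and PuLP is just one possible modeller, not necessarily the team's.

```python
# Minimal sketch: a toy 0/1 knapsack solved as a mixed-integer programme.
# Item values, weights and the capacity are hypothetical placeholders.
from pulp import LpProblem, LpVariable, LpMaximize, lpSum, value

values = {"a": 10, "b": 13, "c": 7}
weights = {"a": 5, "b": 8, "c": 3}
capacity = 10

prob = LpProblem("toy_knapsack", LpMaximize)
pick = {i: LpVariable(f"pick_{i}", cat="Binary") for i in values}

prob += lpSum(values[i] * pick[i] for i in values)                 # objective: total value
prob += lpSum(weights[i] * pick[i] for i in weights) <= capacity   # knapsack constraint

prob.solve()  # uses PuLP's bundled CBC solver by default
print({i: int(pick[i].value()) for i in values}, "objective =", value(prob.objective))
```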
DATA ENGINEER - DBT/AIRFLOW/DATABRICKS | 4-MONTH CONTRACT | £450-550 PER DAY | OUTSIDE IR35
This is an exciting opportunity for a Data Engineer to join a leading media organisation working at the forefront of data innovation. You'll play a key role in designing and building the data infrastructure that supports cutting-edge machine learning and LLM … to accelerate delivery of critical pipelines and platform improvements.
THE ROLE: You'll join a skilled data team to lead the build and optimisation of scalable pipelines using dbt, Airflow, and Databricks. Working alongside data scientists and ML engineers, you'll support everything from raw ingestion to curated layers powering LLMs and advanced analytics. Your responsibilities will include: building and maintaining production-grade ETL/ELT workflows with dbt and Airflow; collaborating with AI/ML teams to support data readiness for experimentation and inference; writing clean, modular SQL and Python code for use in Databricks; contributing to architectural decisions around pipeline scalability and performance; supporting the integration of diverse data sources into the platform; ensuring data quality, observability …
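To make the dbt-plus-Airflow workflow above concrete, here is a minimal sketch of an Airflow DAG that runs and then tests a dbt project; the DAG id, schedule and project path are illustrative assumptions, and it presumes Airflow 2.x with the dbt CLI available on the worker.

```python
# Minimal sketch: an Airflow DAG orchestrating a dbt build-and-test cycle.
# dag_id, schedule and --project-dir are hypothetical placeholders.
from datetime import datetime
from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_daily_build",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",          # Airflow 2.4+; older 2.x versions use schedule_interval
    catchup=False,
) as dag:
    dbt_run = BashOperator(task_id="dbt_run", bash_command="dbt run --project-dir /opt/dbt")
    dbt_test = BashOperator(task_id="dbt_test", bash_command="dbt test --project-dir /opt/dbt")
    dbt_run >> dbt_test         # only test the models after they have built successfully
```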
… systems and APIs (RESTful/GraphQL), with solid experience in microservices and databases (SQL/NoSQL). You know your way around big data tools (Spark, Dask) and orchestration (Airflow, dbt). You understand NLP and have experience working with Large Language Models. You're cloud-savvy (AWS, GCP, or Azure) and comfortable with containerization (Docker, Kubernetes). …
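As a small illustration of the out-of-core data tooling mentioned in this listing, here is a minimal Dask sketch; the S3 path and column names are hypothetical, and reading from S3 assumes s3fs is installed.

```python
# Minimal sketch: lazy, partitioned processing with Dask.
# The bucket path and column names are hypothetical placeholders.
import dask.dataframe as dd

events = dd.read_parquet("s3://example-bucket/events/*.parquet")   # lazy read across partitions
daily_counts = events.groupby("event_date")["user_id"].count()     # still lazy at this point
print(daily_counts.compute().head())                               # .compute() triggers the work
```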
… and Jenkins; proficient in Python and shell scripting; experience with Delta Lake table formats; strong data engineering background; proven experience working with large datasets.
Nice to Have: familiarity with Airflow; background in full-stack development.
Team & Culture: a collaborative team of 10 professionals; friendly, delivery-focused environment; replacing two outgoing contractors - the handover will ensure a smooth start.
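Since Delta Lake table formats come up in the requirements, here is a minimal PySpark sketch of writing and re-reading a Delta table; the path and sample rows are hypothetical, and it assumes the delta-spark package and its jars are available to the Spark session.

```python
# Minimal sketch: writing and reading a Delta Lake table with PySpark.
# The output path and sample data are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("delta_sketch")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog", "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

df = spark.createDataFrame([(1, "2024-01-01"), (2, "2024-01-02")], ["id", "event_date"])
df.write.format("delta").mode("overwrite").save("/tmp/example_delta_table")  # write as Delta

back = spark.read.format("delta").load("/tmp/example_delta_table")           # read it back
back.show()
```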
Are you a problem-solver with a passion for data, performance, and smart engineering? This is your opportunity to join a fast-paced team working at the forefront of data platform innovation in the financial technology space. You'll tackle …
Technologies & Tools: Salesforce platform (Admin, Developer, and Deployment tools); Snowflake Data Cloud; Git, Bitbucket, GitHub; Jenkins, Azure DevOps, or GitLab CI/CD; Jira, Confluence; DataOps tools (e.g., dbt, Airflow) - desirable.
The Ideal Candidate: has previously built environments for large-scale IT transformation programmes; brings hands-on experience with automation and process optimisation; is highly proficient in JIRA; has …
London, South East, England, United Kingdom Hybrid / WFH Options
Opus Recruitment Solutions Ltd
… to our bespoke data pipeline and associated API services.
Key Requirements: 5+ years of Python experience with frameworks like Flask, FastAPI, and Django. Strong command of orchestration tools (e.g. Prefect, Airflow), Docker, and AWS infrastructure (CDK, Terraform). Solid understanding of API services and authentication methods (JWT, SSO), and clear, pragmatic communication skills. Maintain, upgrade, and improve existing systems and custom …
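To illustrate the API-service and JWT-authentication skills this role asks for, here is a minimal FastAPI sketch of a bearer-token-protected endpoint; the secret key, claim names and route are illustrative assumptions, not details of the actual service.

```python
# Minimal sketch: a FastAPI endpoint protected by a JWT bearer token.
# The secret, algorithm, claims and route are hypothetical placeholders.
import jwt  # PyJWT
from fastapi import Depends, FastAPI, HTTPException
from fastapi.security import HTTPAuthorizationCredentials, HTTPBearer

app = FastAPI()
bearer = HTTPBearer()
SECRET_KEY = "change-me"  # hypothetical; load from a secrets manager in practice

def current_user(creds: HTTPAuthorizationCredentials = Depends(bearer)) -> dict:
    try:
        # Verify the signature and expiry, then hand the claims to the route.
        return jwt.decode(creds.credentials, SECRET_KEY, algorithms=["HS256"])
    except jwt.PyJWTError:
        raise HTTPException(status_code=401, detail="Invalid or expired token")

@app.get("/pipeline/status")
def pipeline_status(user: dict = Depends(current_user)):
    return {"requested_by": user.get("sub"), "status": "ok"}
```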