London, South East, England, United Kingdom Hybrid / WFH Options
Interquest
… analysts, and client stakeholders to deliver reliable, automated, and high-performing data solutions end to end.
What We're Looking For:
- Strong experience with Python, Databricks, and tools like Airflow
- Confident working across cloud platforms (AWS, Azure, GCP)
- Great communication skills and the ability to work with both technical and non-technical teams
- Comfortable in a consultancy setting, balancing …
… tools.
- Knowledge of event streaming, API integration, and MLOps.
- Experience in regulated, high-volume industries (gaming, finance, or e-commerce).
- Proficiency with integration/orchestration tools like Airflow and dbt.
Why this role is a game-changer for you: You'll be making a huge impact on our products, player experience, and internal processes. This …
City of London, London, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
… with a passion for new technologies. Experience in startups or top-tier consultancies is a plus.
Nice to Have:
- Familiarity with dashboarding tools, TypeScript, and API development
- Exposure to Airflow, DBT, Databricks
- Experience with ERP (e.g. SAP, Oracle) and CRM systems
If you're interested, get in touch ASAP with a copy of your most recent and up-to …
… needs to work on Saturday and Sunday and can take off on Tuesday/Wednesday.
Minimum Qualifications:
- Technical proficiency in SQL databases; programming or scripting in Python or Java on AWS; Airflow for data access and manipulation
- Prior experience deploying, supporting, and maintaining cloud-based applications (i.e. AWS)
- Demonstrated ability to influence at all levels of an organization, whether with a peer …
… the Tech Department.
Tech stack: We primarily use Python (Pandas, PyTorch, Transformers), Azure, and PySpark. Our workflow uses Git for version control, Docker for packaging, Argo CD for deployment, Airflow for scheduling, and MLflow on Databricks for experiment tracking. For video processing, we leverage Argo Workflows on Kubernetes. Data sources include Snowflake and Azure Event Hub, with MongoDB for …
City of London, London, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
… with a passion for new technologies. Experience in startups or top-tier consultancies is a plus.
Nice to Have:
- Familiarity with dashboarding tools, TypeScript, and API development
- Exposure to Airflow, DBT, Databricks
- Experience with ERP (e.g. SAP, Oracle) and CRM systems
What's On Offer:
- Salary: £50,000-£75,000 + share options
- Hybrid working: 2-3 days per …
… reasoning, short- and long-term memory, tool use, semantic data layers.
- Expertise in ontology development, schema reconciliation, and knowledge graphs.
- Strong foundations in modern data infra (e.g., dbt, BigQuery, Airflow) and cloud (GCP or AWS).
- Background in symbolic AI, reasoning systems, or semantic data modeling is a huge plus.
- Experience at an early-stage startup is a …
… involve visual data (like images, videos) analysis.
5. Utilize knowledge of big data tools like AWS, PySpark, and SQL to perform data analysis on large datasets.
6. Utilize knowledge of Airflow and AWS Glue to automate the extraction of data from sources such as databases, APIs, files, and streaming platforms, and to automate machine learning workflows.
7. Utilize knowledge …
… following: configuration management, orchestration, CI/CD, API design and implementation, infrastructure monitoring and telemetry.
- Familiarity with one or more of the following technologies is preferred: Chef, SaltStack, Ansible, Airflow, Jenkins
- Linux kernel or networking knowledge is a major bonus
- Experience with Windows is useful but not required
Discover what makes Bloomberg unique - watch our … for an inside look …
… design, build, and troubleshoot data solutions end-to-end
- Experience working with large datasets and complex data ecosystems
- Familiarity with modern data platforms and tools (e.g., Snowflake, dbt, GCP, Airflow) is a plus, but not a requirement
- Strong analytical and problem-solving skills, with a keen eye for detail and data quality
- Excellent communication skills and the ability to …
South East London, London, United Kingdom Hybrid / WFH Options
Stepstone UK
… code.
- Implement TMS (Tealium iQ, Adobe Analytics, GTM, and Adobe Dynamic Tag Manager) changes.
- Integrate data sources via web and REST APIs.
- Data pipelining and modelling using SQL, DBT, Airflow, ETL, data warehousing, Redshift, and Python.
- Transfer knowledge of business processes and requirements to the development teams.
- Collaborate with Product, Marketing, and Development teams to collect business requirements …
Cutting-Edge Stack:
- Backend: Kotlin/Spring Boot, Python/FastAPI & Django
- Frontend: TypeScript, React Native
- CI/CD: Argo CD, GitHub Actions
- Infrastructure: GCP, Kubernetes, Terraform
- Data: dbt, BigQuery, Airflow
What You'll Be Doing:
- Collaborate with product managers, designers, and marketing teams to deliver meaningful solutions.
- Design and run experiments to quickly validate ideas using real user feedback. …
… ensure the smooth delivery of daily data loads across multiple systems.
Key Responsibilities:
- Lead daily data operations and ensure end-to-end completion of data loads.
- Monitor and debug Airflow workflows, resolving issues efficiently.
- Manage a team of offshore data engineers, providing technical direction and support.
- Collaborate with stakeholders to troubleshoot data discrepancies and root-cause issues in reports.
- … business and technical stakeholders to improve data reliability and transparency.
- Identify opportunities for automation and process optimisation once BAU stability is achieved.
Technical Environment:
- AWS (data storage and processing)
- Apache Airflow (workflow orchestration)
- Power BI (reporting and analytics)
What We're Looking For:
- Strong background in data engineering or data operations.
- Experience managing or mentoring offshore technical teams. …
… volume provisioning
- Solid understanding of RBAC, IAM, and secure networking (TLS, ingress/egress rules)
Preferred Qualifications:
- Familiarity with data science workflows and user personas
- Knowledge of Apache Airflow, Iceberg, or Spark on Kubernetes
- Familiarity with Open Data Hub, Kubeflow, or Red Hat AI/ML tooling
- Certifications: CKA, Red Hat Certified Specialist in OpenShift Administration …
… in Snowflake internals such as roles, dynamic tables, streams and tasks, policies, etc.
Mandatory Skills: Snowflake, ANSI SQL, dimensional data modeling, Snowpark Container Services, Snowflake data science, DBT, Airflow.
If you are interested, please contact hrajendran@redglobal.com or apply here.
… business to get closer to their customers, informing business strategy and decision-making.
- Help lead the development of automation solutions and workflow integrations, using platforms such as Databricks, Airflow, Power Apps, and Power Automate.
- Help define the right data flow architecture that delivers tangible results quickly.
- Lead and develop a small team of analysts.
- Develop internal relationships with key …
City of London, London, United Kingdom Hybrid / WFH Options
Sanderson Recruitment
… on development/engineering background; machine learning or data background
Technical Experience: PySpark, Python, SQL, Jupyter
Cloud: AWS, Azure (moving towards Azure)
Nice to Have: Astro/Airflow, notebooks
Reasonable Adjustments: Respect and equality are core values to us. We are proud of the diverse and inclusive community we have built, and we welcome applications from people …
… Data Engineering, and AI teams
SKILLS AND EXPERIENCE
Required:
- 5+ years' experience in BI operations or production support (AWS environment)
- Strong SQL and AWS BI stack expertise (Redshift, Glue, Airflow, S3, Step Functions)
- Proficiency with Power BI or similar reporting tools
- Experience with incident, change, and problem management (ITIL framework)
- Proven ability to optimise workflows and automate manual processes …
… CD for ML, and production monitoring.
- Experience building robust backend systems and APIs to serve ML models at scale.
- Strong understanding of big data technologies and data orchestration tools (Airflow, DBT).
- Familiarity with LLM integration and optimisation in production environments.
- Excellent problem-solving, analytical, and communication skills.
- Experience fine-tuning LLMs (e.g. Unsloth, cloud-based methods). …
… search, RAG, and feature engineering
- Implement secure access and governance controls, including RBAC, SSO, token policies, and pseudonymisation frameworks
- Support batch and streaming data flows using technologies like Kafka, Airflow, and Terraform
- Monitor and optimise cloud resource usage to ensure performance and cost efficiency
- Collaborate with cross-functional teams on architecture decisions, technical designs, and data governance standards
You …