gaming, food, health care). Knowledge of regulatory reporting and treasury operations in retail banking. Exposure to Python, Go, or similar languages. Experience working with orchestration frameworks such as Airflow or Luigi. Have previously used dbt, Dataform, or similar tooling. Used to Agile ways of working (Kanban, Scrum). The Interview Process: Our interview process involves 3 main stages.
Cardiff, South Glamorgan, United Kingdom Hybrid / WFH Options
Monzo
an understanding of their migration challenges. Familiarity with version control (Git) and modern deployment workflows. Proficiency in troubleshooting data pipeline failures and performance bottlenecks. Experience with data pipeline orchestration (Airflow or similar is a plus). Nice-to-have experience: exposure to other parts of the modern data stack (e.g., Fivetran, dbt Metrics Layer, Tableau); experience in CI/CD
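Troubleshooting pipeline failures of the kind mentioned above usually starts with retry logic for transient faults and clear error surfacing for permanent ones. A minimal sketch in plain Python (the function names are illustrative, not from any specific orchestrator):

```python
import time
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")

def run_with_retries(task, retries=3, backoff_s=1.0):
    """Run a zero-argument task, retrying transient failures with exponential backoff."""
    for attempt in range(1, retries + 1):
        try:
            return task()
        except Exception as exc:
            log.warning("attempt %d/%d failed: %s", attempt, retries, exc)
            if attempt == retries:
                raise  # surface the final failure to the orchestrator
            time.sleep(backoff_s * 2 ** (attempt - 1))

# Hypothetical flaky extract step that succeeds on the second attempt.
calls = {"n": 0}
def flaky_extract():
    calls["n"] += 1
    if calls["n"] < 2:
        raise ConnectionError("transient source outage")
    return ["row1", "row2"]

rows = run_with_retries(flaky_extract, backoff_s=0.01)
```

Orchestrators such as Airflow build this retry behaviour in per task; the sketch just shows the underlying pattern.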
Understands modern software delivery methodologies and project management tools and uses them to drive successful outcomes. Technical requirements: cloud data warehouse (BigQuery, Snowflake, Redshift, etc.), advanced SQL, dbt, Airflow (or a similar tool), ELT, Looker (or a similar tool). Perks of working at Viator: competitive compensation packages (routinely benchmarked against the latest industry data), including base salary and annual bonuses
As an Architect/Staff Software Engineer at Contiamo, you are a technical leader who shapes the direction and quality of our engineering work across multiple projects and clients. This role brings a systems-level perspective that enables architecting solutions
Senior Data Engineer 100% Remote B2B Contract Full-time position with flexible working hours (overlap with US required) We're looking for a Senior Data Engineer for a company that facilitates freelancing and remote work. Their platform provides a marketplace
Cloud usage; VMware usage; Technical Leadership & Design; DevSecOps tooling and practices; Application Security Testing; SAFe (Scaled Agile) processes. Data integration focused: data pipeline orchestration and ELT tooling such as Apache Airflow, Apache NiFi, Airbyte, and Singer; message brokers and streaming data processors like Apache Kafka; object storage solutions such as S3, MinIO, LakeFS; CI/CD
VMware general usage; Technical Leadership & Design; DevSecOps tooling and practices; Application Security Testing; SAFe (Scaled Agile) processes. Data integration focused: data pipeline orchestration and ELT tooling such as Apache Airflow, Apache Spark, NiFi, Airbyte, and Singer; message brokers and streaming data processors such as Apache Kafka; object storage such as S3, MinIO, LakeFS; CI/CD pipeline integration
with new methodologies to enhance the user experience. Key skills: senior Data Scientist experience; commercial experience in generative AI and recommender systems; strong Python and SQL experience; Spark/Apache Airflow; LLM experience; MLOps experience; AWS. Additional information: this role offers a strong salary of up to £95,000 (depending on experience/skill) with hybrid working
London, South East, England, United Kingdom Hybrid / WFH Options
Cathcart Technology
building and scaling Extract, Load, Transform (ELT) pipelines. Experience overseeing or managing projects involving data pipeline orchestration, data quality management, and performance optimization. Knowledge of common ELT tools like Apache Airflow, Informatica PowerCenter, or cloud-native data integration services is a plus. • Data Science/AI & Data Standards: Understanding of how data standards impact data science and AI
position? We are seeking a passionate DataOps Engineer who loves optimizing pipelines, automating workflows, and scaling cloud-based data infrastructure. Key Responsibilities: Design, build, and optimize data pipelines using Airflow, dbt, and Databricks. Monitor and improve pipeline performance to support real-time and batch processing. Manage and optimize AWS-based data infrastructure, including S3 and Lambda, as well as … experience supporting high-velocity data/development teams and designing and maintaining data infrastructure, pipelines, and automation frameworks. You should also have experience streamlining data workflows using tools like Airflow, dbt, Databricks, and Snowflake while maintaining data integrity, security, and performance. Bachelor's degree in Computer Science, Information Technology, or a related field, or equivalent work experience. Minimum of … years of experience in DataOps or similar. Proficiency in key technologies, including Airflow, Snowflake, and SageMaker. Certifications in AWS/Snowflake/other technologies a plus. Excellent communication and interpersonal skills. Ability to work in a fast-paced environment and manage multiple priorities effectively. What's in it for you? We offer our employees more than just competitive compensation.
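The pipeline design work described above centres on expressing tasks as a dependency graph that an orchestrator like Airflow executes in order. The core idea can be sketched in plain Python without the framework, using the stdlib `graphlib` for topological ordering (the task names and data are hypothetical):

```python
from graphlib import TopologicalSorter

# Hypothetical ELT steps; in Airflow each would be a task/operator.
def extract():
    return [{"amount": "10"}, {"amount": "5"}]

def transform(rows):
    return [{"amount": int(r["amount"])} for r in rows]

def load(rows):
    return sum(r["amount"] for r in rows)  # stand-in for a warehouse write

# Each key depends on the tasks in its set: load <- transform <- extract.
dag = {"transform": {"extract"}, "load": {"transform"}}
order = list(TopologicalSorter(dag).static_order())
# For this chain: ['extract', 'transform', 'load']

results = {}
for task in order:
    if task == "extract":
        results[task] = extract()
    elif task == "transform":
        results[task] = transform(results["extract"])
    elif task == "load":
        results[task] = load(results["transform"])
```

Airflow adds scheduling, retries, and monitoring on top of exactly this kind of dependency ordering.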
and maintaining critical data pipelines that support core trading, forecasting, risk, and PPA processes across all Octopus international regions. The role leverages a modern tech stack including SQL, Python, Airflow, Kubernetes, and various other cutting-edge technologies. You'll work with tools like dbt on Databricks, PySpark, Streamlit, and Django, ensuring robust data infrastructure that powers business-critical operations. … to ensure best practices and code standardization. Take ownership of data platform improvements. Share knowledge and upskill team members. Requirements for Data Engineer: strong aptitude with SQL, Python, and Airflow; experience in Kubernetes, Docker, Django, Spark, and related monitoring tools; experience with dbt for pipeline modelling; ability to shape needs into requirements and design scalable solutions; quick understanding of
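Business-critical pipelines like these generally need idempotent loads, so a rerun after a failure doesn't duplicate rows. A minimal SQL-in-Python sketch of the upsert pattern, using the stdlib `sqlite3` purely for illustration (the table and column names are invented; a production stack would target a warehouse instead):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (trade_id TEXT PRIMARY KEY, qty INTEGER)")

def load_batch(rows):
    # Upsert on the primary key keeps reruns idempotent:
    # re-loading the same batch overwrites rather than duplicates.
    conn.executemany(
        "INSERT INTO trades (trade_id, qty) VALUES (?, ?) "
        "ON CONFLICT(trade_id) DO UPDATE SET qty = excluded.qty",
        rows,
    )
    conn.commit()

load_batch([("t1", 100), ("t2", 50)])
load_batch([("t1", 100), ("t2", 50)])  # rerun: still two rows, not four
count = conn.execute("SELECT COUNT(*) FROM trades").fetchone()[0]
```

The same pattern appears in dbt as incremental models with a `unique_key`.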
work. 6-month contract, likely to extend. Hybrid: 2 days a week onsite in London (likely to go remote). Active SC. Tech stack: Python, SQL, Jinja, AWS or Azure, Airflow, GitHub Actions, Terraform. You'll be designing scalable pipelines, building CI/CD workflows, and collaborating with cross-functional teams. Apply now with your CV and availability.
of data warehousing concepts and SQL (Snowflake experience a plus). Experience with or willingness to learn CI/CD tooling (e.g. GitHub Actions), containerization (Docker), and workflow orchestration tools (Airflow/AstroCloud). Strong debugging and troubleshooting skills for data pipelines and ML systems. Experience writing tests (unit, integration) and implementing monitoring/alerting for production systems. Strong data skills
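The monitoring/alerting requirement above often reduces to simple invariants checked on every batch, e.g. a bound on missing values. An illustrative null-rate check in plain Python (the field name and threshold are invented for the example):

```python
def null_rate(rows, field):
    """Fraction of rows where `field` is missing or None."""
    if not rows:
        return 0.0
    missing = sum(1 for r in rows if r.get(field) is None)
    return missing / len(rows)

def check_batch(rows, field, max_null_rate=0.05):
    """Return (ok, rate); an orchestrator would alert or fail the task when not ok."""
    rate = null_rate(rows, field)
    return rate <= max_null_rate, rate

batch = [{"user_id": 1}, {"user_id": None}, {"user_id": 3}, {"user_id": 4}]
ok, rate = check_batch(batch, "user_id")
# rate is 0.25, above the 5% threshold, so ok is False
```

Checks like this are easy to cover with unit tests and to wire into a pipeline task that pages on failure.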