…software development and code quality. Vendor Collaboration: Work closely with third-party vendors to integrate their solutions, ensuring they meet our high standards for production environments. Workflow Automation: Utilize Airflow to automate and optimize workflows, ensuring efficient and reliable operations. Required: 5-7 years of experience in software development with a focus on production-grade code. Proficiency in Java …
Are you a problem-solver with a passion for data, performance, and smart engineering? This is your opportunity to join a fast-paced team working at the forefront of data platform innovation in the financial technology space. You'll tackle …
Job Title: Data Modeller Salary: £85,000 - £95,000 + Benefits and Bonus Location: London (3 days a week onsite) The Role: As a Data Modeller, you will be responsible for designing and implementing data models to support complex data …
…features. Rapid Prototyping: Create interactive AI demos and proofs-of-concept with Streamlit, Gradio, or Next.js for stakeholder feedback; MLOps & Deployment: Implement CI/CD pipelines (e.g., GitLab Actions, Apache Airflow), experiment tracking (MLflow), and model monitoring for reliable production workflows; Cross-Functional Collaboration: Participate in code reviews, architectural discussions, and sprint planning to deliver features end-to-end …
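As an illustration of the experiment-tracking element mentioned in this listing, a minimal MLflow sketch might look like the following; the model, parameter and metric names are assumptions for illustration, not taken from the listing.

import mlflow
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Toy data stands in for whatever the team actually trains on.
X, y = make_classification(n_samples=1000, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

with mlflow.start_run(run_name="baseline-rf"):
    model = RandomForestClassifier(n_estimators=100, random_state=42)
    model.fit(X_train, y_train)
    accuracy = accuracy_score(y_test, model.predict(X_test))
    # Parameters and metrics are recorded against the run for later comparison.
    mlflow.log_param("n_estimators", 100)
    mlflow.log_metric("accuracy", accuracy)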
Greater London, England, United Kingdom Hybrid / WFH Options
Harnham
…and resolve ETL failures or data issues. Collaborate with cross-functional and offshore teams, as well as suppliers. Hands-on support for tools like Power BI, AWS, SQL and Airflow. Staying ahead of emerging AI tech and research to propose exciting solutions. Proactively manage and escalate data issues. SKILLS AND EXPERIENCE Required: 5+ years industry experience (flexible depending on … quality of experience). Airflow (must-have), AWS (Redshift, S3, Glue), Power BI. Strong SQL, as well as Python. AWS ecosystem familiarity is essential. Both hands-on and management/leadership experience is required. Able to work in a fast-paced, dynamic environment. This includes a two-stage interview process! This role cannot offer sponsorship. Apply below.
…Lambda, Azure). Strong problem-solving skills and critical thinking. Strategic planning abilities. Additional Skills (desired): Experience with protein or DNA bioinformatics; MLOps expertise; software engineering skills, data pipelining (e.g., Airflow), and cloud deployment experience; familiarity with Agile methodologies. If you're interested in joining as our new Head of AI, we'd love to hear from you.
…problem-solving skills and critical thinking in AI research. Strategic planning abilities. Additional Skills (desired): Experience with protein or DNA bioinformatics; MLOps expertise; software engineering skills, data pipelining (e.g., Airflow), cloud deployment experience; knowledge of Agile methodologies. Interested in joining as our Head of AI? We look forward to hearing from you.
…production environments. Understanding of CI/CD and best practices for ML. Nice to Have: Exposure to energy trading or real-time data environments. Experience with tools like MLflow, Airflow, or Step Functions. Apply now for immediate review! Seniority level: Mid-Senior level. Employment type: Contract. Job function: Information Technology. Industries: Utilities.
…ideally in early-stage startups. Deep expertise in React Native and TypeScript, with the ability to dip into Swift or Objective-C. Experience navigating backend or infrastructure (Python, GCP, Airflow) to unblock yourself. A scrappy, resourceful mindset and comfort moving fast and iterating quickly. Excellent judgment on what good looks like in design, performance, and user value. Strong product …
…such as Lightdash, Looker or Tableau. Comfortable with version control, testing frameworks, and CI/CD in a data context. Familiarity with Python and orchestration tools like Dagster or Airflow is highly desirable. Experience with Ecommerce and/or Subscription services is desirable. The Interview Process: Meet & Greet call with a Talent Partner; call with the Hiring Manager and …
…System Reliability Engineer. Experience with building, maintaining and continuously enhancing the automations needed for scalability and efficiency in running the network infrastructure. Experience in infrastructure automation and orchestration frameworks, e.g. Ansible, Airflow, Terraform, Chef, Salt. Proven experience with object-oriented programming languages, preferably Python. A bachelor's or master's degree in Computer Science, Engineering, Mathematics, or a similar field of …
What we’d like to see from you: • Extensive experience designing and deploying ML systems in production • Deep technical expertise in Python and modern ML tooling (e.g. MLflow, TFX, Airflow, Kubeflow, SageMaker, Vertex AI) • Experience with infrastructure-as-code and CI/CD practices for ML (e.g. Terraform, GitHub Actions, ArgoCD) • Proven ability to build reusable tooling, scalable services …
…Data/Analytics Monthly Business Review (MBR) - Track performance metrics for your team - Communicate with and support various internal stakeholders and external audiences - Work with AWS products including Redshift, Airflow, QuickSight, Lambda, and GenAI products like AWS Bedrock. A day in the life: Our team's customers are internal Amazon teams responsible for managing Amazon's Last Mile global …
…experience designing and implementing robust, performant data pipelines from a variety of sources such as databases, APIs, SFTP, etc. Experience building ELT pipelines using tools such as dbt and Airflow. Proven experience managing software engineering teams, including mentoring juniors and seniors, and promoting professional development. Demonstrated experience in project management, with the ability to manage multiple projects and prioritize …
…ETL using big data tools (Hive/Presto, Redshift). Previous experience with web frameworks for Python such as Django/Flask is a plus. Experience writing data pipelines using Airflow. Fluency in Looker and/or Tableau. Strong understanding of data warehousing principles, pipelines, and APIs. Strong communication skills. The ability to work independently and across multiple time zones …
…Machine Moderation system using Python-based NLP models deployed on AWS (Lambda, ECS, SageMaker, SQS, SNS). Train, evaluate, and monitor machine learning models using orchestration tools (e.g. Flyte, Airflow). Manage ML pipelines on AWS with containerized services and CI/CD deployment via GitHub Actions. Implement streaming data processing using Kafka for real-time content moderation decisions. …
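As a hedged sketch only (task names, schedule and DAG id are invented, and this is not the employer's actual pipeline), the train/evaluate/monitor cycle this listing describes might look roughly like the following in Airflow 2.4+:

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def train_model(**_):
    # Placeholder: fit the NLP model and persist the artifact.
    print("training model")

def evaluate_model(**_):
    # Placeholder: score the candidate model against a held-out set.
    print("evaluating model")

def monitor_model(**_):
    # Placeholder: push evaluation and drift metrics to monitoring.
    print("publishing monitoring metrics")

with DAG(
    dag_id="moderation_model_pipeline",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    train = PythonOperator(task_id="train", python_callable=train_model)
    evaluate = PythonOperator(task_id="evaluate", python_callable=evaluate_model)
    monitor = PythonOperator(task_id="monitor", python_callable=monitor_model)
    train >> evaluate >> monitor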
Fitch Group, Inc., Fitch Ratings, Inc., Fitch Solutions Group
…AWS and Azure cloud services to provide the necessary infrastructure, resources, and interfaces for data loading and LLM workflows. Use Python and large-scale data workflow orchestration platforms (e.g. Airflow) to build software artifacts for ETL, integrating diverse data formats and storage technologies, and incorporate them into robust data workflows and dynamic systems. You May be a Good Fit …
…Science or a related field. Strong background in statistical analysis, including Marketing Mix Modelling and Predictive LTV modelling. Excellent knowledge of both Data Science (Python, SQL) and production tools (Airflow). Experience of shipping tested models which make predictions in batch or real time. Proven ability to understand stakeholder problems and build models that get used for decision-making …
Lumi Space is empowering the future prosperity of earth - making space scalable and sustainable using ground-based laser systems. We work with global companies and institutions to build products and services to precisely track satellites and remove the dangers of …
…standardized Spark SQL and PySpark code for data transformations, ensuring data integrity and accuracy across the pipeline. Automate pipeline orchestration using Databricks Workflows or integration with external tools (e.g., Apache Airflow, Azure Data Factory). Data Ingestion & Transformation: Build scalable data ingestion processes to handle structured, semi-structured, and unstructured data from various sources (APIs, databases, file systems …
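For context, a minimal PySpark transformation of the kind this listing describes might look like the sketch below; the source path, column names and output location are all assumptions for illustration, not details from the posting.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_orders_transform").getOrCreate()

# Hypothetical raw input; in practice this could be any of the sources listed above.
orders = spark.read.json("s3://example-bucket/raw/orders/")

# Basic cleansing: de-duplicate, normalise timestamps, drop invalid rows.
clean = (
    orders
    .dropDuplicates(["order_id"])
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .filter(F.col("amount") > 0)
)

# Aggregate to a curated daily table for downstream consumers.
daily = clean.groupBy(F.to_date("order_ts").alias("order_date")).agg(
    F.sum("amount").alias("total_amount"),
    F.countDistinct("order_id").alias("order_count"),
)

daily.write.mode("overwrite").parquet("s3://example-bucket/curated/daily_orders/")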
…years focused specifically on data, analytics platforms, or related technical products. You are deeply familiar with modern analytics technologies, especially SQL engines, Databricks, and tools such as Starburst, Airflow, Looker, or equivalent. You understand how generative AI experiences can be leveraged in analytics platforms. You have a proven track record of delivering analytics products that significantly enhance internal productivity …
London, South East, England, United Kingdom Hybrid / WFH Options
Opus Recruitment Solutions Ltd
…to our bespoke data pipeline and associated API services. Key Requirements: 5+ years of Python experience with frameworks like Flask, FastAPI, and Django. Strong command of orchestration tools (e.g. Prefect, Airflow), Docker, and AWS infrastructure (CDK, Terraform). Solid understanding of API services, authentication methods (JWT, SSO), and clear, pragmatic communication skills. Maintain, upgrade, and improve existing systems and custom …
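As a rough, hypothetical sketch of the JWT-protected API work this listing mentions (the route, secret handling and claims are assumptions, not the team's actual API), a minimal FastAPI endpoint might look like this:

import jwt  # PyJWT
from fastapi import Depends, FastAPI, HTTPException
from fastapi.security import HTTPAuthorizationCredentials, HTTPBearer

SECRET_KEY = "change-me"  # assumption: in practice loaded from config or a secrets manager
app = FastAPI()
bearer_scheme = HTTPBearer()

def current_user(credentials: HTTPAuthorizationCredentials = Depends(bearer_scheme)) -> dict:
    # Decode and verify the bearer token; reject anything invalid or expired.
    try:
        return jwt.decode(credentials.credentials, SECRET_KEY, algorithms=["HS256"])
    except jwt.PyJWTError:
        raise HTTPException(status_code=401, detail="Invalid or expired token")

@app.get("/pipeline/status")
def pipeline_status(user: dict = Depends(current_user)) -> dict:
    # Hypothetical endpoint: report pipeline health to an authenticated caller.
    return {"user": user.get("sub"), "status": "ok"}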