developing with Python and related ML libraries. Functional programming experience a plus. Willing to learn and develop in new technologies as required. Experience with MongoDB. Experience with MLflow or Airflow a plus. Other tools: Maven, Git, Linux. Location: Customer Site, Telework. Telework: 75% (in office at least one day a week) …
Science, or a related technical field required • 4+ years in data engineering, preferably within secure or classified environments • Strong proficiency in Python, Spark, SQL, and orchestration tools such as Airflow • Hands-on experience with classified data management, secure networking, and infrastructure performance tuning. Preferred Experience: • Familiarity with secure cloud environments (e.g., AWS GovCloud, C2S) • Strong troubleshooting and optimization skills …
of machine learning, statistics, and data modeling. Expert in Python (pandas, scikit-learn, etc.) and SQL. Experience with cloud platforms (Azure, AWS, or GCP). Familiarity with tools like MLflow, Airflow, Power BI. Strong stakeholder management and communication skills. Fluent in Dutch (mandatory) …
Preferred Qualifications: Experience in coding for automation (e.g. Scala, Python, Java, etc.). Familiarity with writing data pipelines using one or more big data technologies (e.g. Spark, Hive, Trino, Flink, Airflow, etc.). Familiarity with modern cloud technologies such as Kubernetes. Familiarity with software development lifecycle for data science and engineering. Familiarity with commerce, payments, and user experience datasets at online …
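Spark recurs across these postings as the batch-pipeline workhorse. As a rough point of reference only (not taken from any listing), a minimal PySpark aggregation might look like the sketch below; the table, column names, and values are invented:

```python
# A minimal PySpark sketch of a batch aggregation step; all data is made up.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example-pipeline").getOrCreate()

# Hypothetical order records; a real pipeline would read from storage instead.
orders = spark.createDataFrame(
    [("2024-01-01", "GBP", 120.0), ("2024-01-01", "USD", 80.0)],
    ["order_date", "currency", "amount"],
)

# Aggregate to daily totals per currency.
daily_totals = (
    orders.groupBy("order_date", "currency")
          .agg(F.sum("amount").alias("total_amount"))
)
daily_totals.show()
spark.stop()
```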
platforms (AWS, Azure, or GCP) and containerization (Docker, Kubernetes). Solid understanding of ML lifecycle: data handling, model development, deployment, and monitoring. Familiarity with MLOps tools such as MLflow, Airflow, DVC, or similar. Experience with version control (Git), CI/CD pipelines, and software engineering best practices. Fluent in English; knowledge of French or Dutch is a plus. Desirable …
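MLflow appears repeatedly in these requirements as the experiment-tracking piece of the ML lifecycle. A minimal sketch of a tracked training run, assuming scikit-learn and with hypothetical run name, parameter, and metric choices:

```python
# A minimal MLflow tracking sketch; the model, data, and names are hypothetical.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

with mlflow.start_run(run_name="demo"):  # hypothetical run name
    model = LogisticRegression(C=1.0, max_iter=200)
    model.fit(X_train, y_train)

    mlflow.log_param("C", 1.0)  # record the hyperparameter
    mlflow.log_metric("test_accuracy", model.score(X_test, y_test))
    mlflow.sklearn.log_model(model, "model")  # persist the trained model
```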
Engineering, etc. Software development experience in Python or Scala. An understanding of Big Data technologies such as Spark, messaging services like Kafka or RabbitMQ, and workflow management tools like Airflow. SQL & NoSQL expertise, ideally including Postgres, Redis, MongoDB, etc. Experience with AWS, and with tools like Docker & Kubernetes. As well as this, you will be someone willing to take …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
as Azure Data Factory (ADF) and Python. Ensuring the quality of raw datasets to empower Data Analysts in creating robust data models. Deploying and managing data tools on Kubernetes (Airflow, Superset, RStudio Connect). Supporting Data Analytics through the management of dbt, DevOps, and deployment rules. You will have the opportunity to work end-to-end, making meaningful contributions …
security standards. Required Experience: Active Secret clearance. 3-7 years in data engineering, preferably within secure, high-side environments. Proficiency in Python, Spark, SQL, and data orchestration tools (e.g., Airflow). Experience with classified data management, secure networking, and infrastructure optimization. Preferred Experience: Familiarity with IC standards (UDS, IC ITE) and secure cloud environments (AWS GovCloud, C2S). Strong …
… NoSQL (e.g., MongoDB and Firestore). SQL querying (e.g., BigQuery, Snowflake), including the ability to work with complex data structures and very large data volumes. Orchestration services (e.g., Airflow, Luigi, Cloud Composer). Proactive, independent, responsible and attentive to detail. Eager and able to learn, analyse, resolve problems, and improve the standard of BVGroup data infrastructure. Degree in …
systems and APIs (RESTful/GraphQL), with solid experience in microservices and databases (SQL/NoSQL). You know your way around big data tools (Spark, Dask) and orchestration (Airflow, dbt). You understand NLP and have experience working with Large Language Models. You're cloud-savvy (AWS, GCP, or Azure) and comfortable with containerization (Docker, Kubernetes). You …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
and familiarity with Git. Strong communicator, eager to learn, and naturally curious. Comfortable working across multiple business areas with varied responsibilities. Nice-to-Haves: Exposure to tools like Prefect, Airflow, or Dagster. Familiarity with Azure SQL, Snowflake, or dbt. Tech Stack/Tools: Python, SQL (on-prem + Azure SQL Data Warehouse), Git. Benefits: £35,000 - £40,000 starting …
from you. Key Responsibilities: - Design and build high-scale systems and services to support data infrastructure and production systems. - Develop and maintain data processing pipelines using technologies such as Airflow, PySpark and Databricks. - Implement dockerized high-performance microservices and manage their deployment. - Monitor and debug backend systems and data pipelines to identify and resolve bottlenecks and failures. - Work collaboratively …
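Since building Airflow pipelines is the core responsibility in the listing above, here is a minimal sketch of a two-task DAG against the Airflow 2.x API; the DAG id, schedule, and task logic are hypothetical placeholders:

```python
# A minimal Airflow 2.x DAG sketch: extract, then transform. All names and
# the toy logic are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder: pull raw records from a source system.
    return [{"id": 1, "value": 42}]


def transform(**context):
    # Read the upstream result via XCom and reshape it.
    rows = context["ti"].xcom_pull(task_ids="extract")
    return [{**row, "value_doubled": row["value"] * 2} for row in rows]


with DAG(
    dag_id="example_etl",  # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)

    extract_task >> transform_task  # extract runs before transform
```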
with remote stakeholders. Familiarity with AI development tools such as Cursor, GitHub Copilot, or Claude. BS degree in Computer Science or related engineering field. Nice to Have: Experience with Airflow, Celery, AWS and/or Azure, Postgres. Experience with API platform development. Experience with Go. Ready to be part of AI transformation at Abnormal AI? Apply Now! Once you …
adaptable to a fast-paced startup environment, comfortable with ambiguity and evolving responsibilities. Work Authorization: Must be eligible to work in the US or UK. Preferred Experience: Data orchestration tools (e.g., Airflow, Prefect). Experience deploying, monitoring, and maintaining ML models in production environments (MLOps). Familiarity with big data technologies (e.g., Spark, Hadoop). Background in time-series analysis and forecasting. Experience with data …
You have experience with RAG (Retrieval-Augmented Generation) systems, vector databases, and embedding models for knowledge extraction. You can architect complex workflows. You have experience with workflow orchestration tools (Airflow, Prefect, Temporal) or have built custom pipeline systems for multi-step autonomous processes. You bridge science and engineering. You are comfortable with scientific computing libraries (NumPy, SciPy, pandas) and …
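For the multi-step workflow orchestration this listing describes, a minimal Prefect 2.x flow might look like the following; the task names and toy logic merely stand in for real document-processing steps:

```python
# A minimal Prefect 2.x sketch of a multi-step pipeline; names and logic
# are hypothetical stand-ins for real fetching and embedding steps.
from prefect import flow, task


@task
def fetch_documents() -> list:
    # Placeholder: pull raw documents from a source.
    return ["doc one", "doc two"]


@task
def embed(doc: str) -> list:
    # Placeholder: stand-in for a real embedding-model call.
    return [float(len(doc))]


@flow
def knowledge_pipeline():
    docs = fetch_documents()
    return [embed(d) for d in docs]


if __name__ == "__main__":
    knowledge_pipeline()
```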
or in a similar role, with a focus on data management. Strong programming skills in languages such as Python, Shell Scripting, or similar. Experience with automation frameworks (e.g., Jenkins, Airflow, or similar) and version control tools (e.g., Git). Proficiency in working with databases (SQL, NoSQL) and cloud platforms (AWS, Azure, GCP) for data storage and processing. Solid understanding …
strategy. Expertise in causal inference methods and forecasting. Expertise in data querying languages (e.g. SQL) and scripting languages (e.g. Python, R). Experience with data architecture technologies such as Airflow, Databricks, and dbt. Preferred qualifications: Experience in technology, financial services and/or a high growth environment. Experience with Excel and Finance systems (e.g. Oracle). Equal opportunity: Airwallex …
warehouse provider, e.g. Databricks, GCP, Snowflake. The following would be nice to have: Experience in the following languages: Python. Experience with the following tools: GitHub, Lightdash, Elementary, CircleCI, Databricks, Airflow, Kubernetes, DuckDB, Spark. Data modelling techniques, e.g. Kimball, OBT. Why else you'll love it here: Wondering what the salary for this role is? Just ask us! On a …
AI delivery • Proven track record deploying ML systems in production at scale (batch and/or real-time) • Strong technical background in Python and ML engineering tooling (e.g. MLflow, Airflow, SageMaker, Vertex AI, Databricks) • Understanding of infrastructure-as-code and CI/CD for ML systems (e.g. Terraform, GitHub Actions, ArgoCD) • Ability to lead delivery in agile environments, balancing …
on data storage, transfer, and processing, ensuring data is readily available and secure. Proficient in cloud data services (e.g., AWS, Azure), scripting (e.g., Python, Bash), and orchestration tools (e.g., Airflow, Jenkins). Demonstrated experience with data pipeline development, ETL/ELT processes, and cloud-native data management. Strong understanding of network segmentation, data governance, and cross-domain data transfer …
Previous experience working in e-commerce, retail, or the travel industry. Experience designing and analysing large-scale A/B test experiments. Mastery of workflow orchestration technologies such as Airflow, Dagster or Prefect. Expert knowledge of technologies such as: Google Cloud Platform, particularly Vertex AI; Docker and Kubernetes; Infrastructure as Code. Experience establishing data science best practices across an …
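For the A/B test analysis skills this listing asks for, a minimal sketch of a two-proportion z-test using statsmodels; all traffic and conversion counts below are invented for illustration:

```python
# A hedged A/B-test analysis sketch: two-proportion z-test on conversion
# counts. The numbers are made up.
from statsmodels.stats.proportion import proportions_ztest

conversions = [420, 468]     # conversions in control / treatment (hypothetical)
visitors = [10_000, 10_000]  # visitors per arm (hypothetical)

stat, p_value = proportions_ztest(count=conversions, nobs=visitors)
print(f"z = {stat:.3f}, p = {p_value:.4f}")
# A small p-value (e.g. < 0.05) suggests the observed uplift is unlikely
# to be noise alone.
```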