robust way possible! Diverse training opportunities and social benefits (e.g. UK pension scheme) What do you offer? Strong hands-on experience working with modern Big Data technologies such as Apache Spark, Trino, Apache Kafka, Apache Hadoop, Apache HBase, Apache NiFi, Apache Airflow, OpenSearch Proficiency in cloud-native technologies such as containerization and Kubernetes …
London (City of London), South East England, United Kingdom
Vallum Associates
and contribute to technical roadmap planning Technical Skills: Great SQL skills with experience in complex query optimisation Strong Python programming skills with experience in data processing libraries (pandas, NumPy, Apache Spark) Hands-on experience building and maintaining data ingestion pipelines Proven track record of optimising queries, code, and system performance Experience with open-source data processing frameworks (Apache Spark, Apache Kafka, Apache Airflow) Knowledge of distributed computing concepts and big data technologies Experience with version control systems (Git) and CI/CD practices Experience with relational databases (PostgreSQL, MySQL or similar) Experience with containerization technologies (Docker, Kubernetes) Experience with data orchestration tools (Apache Airflow or Dagster) Understanding of data warehousing concepts and …
Note: This is a Sr. Level role! About the Role: We are looking for an experienced Senior Airflow Developer with over 5 years of experience to help transition our existing Windows Scheduler jobs to Apache Airflow DAGs. In this role, you’ll play a critical part in modernizing and optimizing our task automation processes by converting existing … into efficient, manageable, and scalable workflows in Airflow. You will also work on security hardening, implementing data pipelines, and designing ETL processes. Key Responsibilities: Convert Windows Scheduler Jobs to Airflow: Migrate existing Windows-based scheduled jobs into Airflow DAGs, ensuring smooth execution and reliability. Develop and Optimize DAGs: Author, schedule, and monitor DAGs (Directed Acyclic Graphs) to handle … data workflows, ETL tasks, and various automation processes. Programming and Scripting: Use Python as the primary language for Airflow DAGs and task logic, with experience in SQL for data manipulation. Set Up and Configure Airflow: Provide comprehensive instructions and configurations for setting up Airflow environments, including deployment, resource allocation, and high availability. Security Hardening: Implement security best …
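For a concrete picture of the migration this role describes, here is a minimal sketch of a nightly Windows Scheduler job re-expressed as an Airflow DAG. The DAG id, cron schedule, and script path are hypothetical placeholders, and the sketch assumes Airflow 2.4 or later.

```python
# Minimal sketch: a nightly Windows Scheduler job re-expressed as an Airflow DAG.
# The DAG id, schedule, and script path are hypothetical placeholders.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

default_args = {
    "owner": "data-engineering",
    "retries": 2,                       # mirror the old scheduler's retry-on-failure setting
    "retry_delay": timedelta(minutes=5),
}

with DAG(
    dag_id="nightly_sales_export",      # hypothetical job name
    start_date=datetime(2024, 1, 1),
    schedule="0 2 * * *",               # 02:00 daily: the old scheduler trigger as a cron expression
    catchup=False,                      # do not back-fill runs missed before the migration
    default_args=default_args,
) as dag:
    # The legacy .bat/.ps1 logic is typically ported or wrapped; a shell call stands in for it here.
    export_sales = BashOperator(
        task_id="export_sales",
        bash_command="python /opt/jobs/export_sales.py",
    )
```

A migration like this usually gains retry handling, alerting, and run history for free, which the Windows Task Scheduler only offers in rudimentary form.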
at least one cloud data platform (e.g., AWS, Azure, Google Cloud) and big data technologies (e.g., Hadoop, Spark). Strong knowledge of data workflow solutions like Azure Data Factory, Apache NiFi, Apache Airflow, etc. Good knowledge of stream and batch processing solutions like Apache Flink and Apache Kafka. Good knowledge of log management, monitoring, and analytics …
the success of the data department overall. TLA works with the modern data stack, utilising Snowflake for our data warehouse, dbt to transform data across our medallion architecture, and Apache Airflow for orchestration. Microsoft Azure is our choice of cloud provider for hosting infrastructure. Within the role you will be hands-on with all these exciting technologies. Many … Nice-to-Have Skills: Experience with both batch and near real-time data pipelines Familiarity with Infrastructure as Code (Terraform) Experience with dbt and medallion architecture patterns Knowledge of Apache Airflow or similar orchestration tools Azure cloud platform experience Why Join TLA? TLA is a fast-moving, innovative digital business that partners with some of the biggest automotive …
London, South East, England, United Kingdom Hybrid / WFH Options
Robert Half
monitor machine learning models for anomaly detection and failure prediction. Analyze sensor data and operational logs to support predictive maintenance strategies. Develop and maintain data pipelines using tools like Apache Airflow for efficient workflows. Use MLflow for experiment tracking, model versioning, and deployment management. Contribute to data cleaning, feature engineering, and model evaluation processes. Collaborate with engineers and … science libraries (Pandas, Scikit-learn, etc.). Solid understanding of machine learning concepts and algorithms. Interest in working with real-world industrial or sensor data. Exposure to Apache Airflow and/or MLflow (through coursework or experience) is a plus. A proactive, analytical mindset with a willingness to learn and collaborate. Why Join Us Work on …
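A minimal sketch of the MLflow workflow this role mentions: experiment tracking, metric logging, and model versioning for an anomaly-detection model. The experiment name, features, and model choice are illustrative assumptions rather than the team's actual setup.

```python
# Minimal sketch of MLflow experiment tracking for an anomaly-detection model.
# Experiment name, parameters, and data below are hypothetical placeholders.
import mlflow
import mlflow.sklearn
import numpy as np
from sklearn.ensemble import IsolationForest

mlflow.set_experiment("sensor-anomaly-detection")  # hypothetical experiment name

with mlflow.start_run():
    contamination = 0.05
    mlflow.log_param("contamination", contamination)

    X = np.random.randn(1000, 4)  # stand-in for engineered sensor features
    model = IsolationForest(contamination=contamination, random_state=42).fit(X)

    # IsolationForest labels anomalies as -1; track the observed anomaly rate
    anomaly_rate = float((model.predict(X) == -1).mean())
    mlflow.log_metric("anomaly_rate", anomaly_rate)

    # Log the fitted model as a versioned artifact for later registry promotion
    mlflow.sklearn.log_model(model, "model")
```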
delivering end-to-end AI/ML projects. Nice to Have: Exposure to LLMs (Large Language Models), generative AI, or transformer architectures. Experience with data engineering tools (Spark, Airflow, Snowflake). Prior experience in fintech, healthtech, or similar domains is a plus.
Snowflake, Databricks) Strong DevOps mindset with experience in CI/CD pipelines, monitoring, and observability tools (Grafana or equivalent). Exposure to analytics, reporting, and BI tools such as Apache Superset, Lightdash or OpenSearch Willingness to work across the stack by contributing to API development and, at times, UI components (Vue.js, Zoho, or similar). Excellent communication and collaboration …
in C# and Python with additional skills in Java, JavaScript/TypeScript, Angular, very strong SQL, Windows Server, UNIX, and .NET. Strong research skills. Strong experience of Terraform, AWS, Airflow, Docker, GitHub/GitHub Actions, Jenkins/TeamCity. Strong AWS-specific skills for Athena, Lambda, ECS, ECR, S3 and IAM Strong knowledge in industry best practices in development and …
move us towards our vision of scaling up through product led growth. This role will be focused on our backend system (Symfony, PHP) and our data products (BigQuery, dbt, Airflow), but there will be opportunities to work across the platform including agentic AI (Python, LangChain), frontend (React, TypeScript), the APIs (GraphQL, REST), our integration tool of choice (Tray.ai) and …
and Responsibilities While in this position your duties may include but are not limited to: Support the design, development, and maintenance of scalable data pipelines using tools such as Apache Airflow, dbt, or Azure Data Factory. Learn how to ingest, transform, and load data from a variety of sources, including APIs, databases, and flat files. Assist in the …
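A minimal sketch of the ingest, transform, and load pattern this role describes, pulling records from an API and writing them to a relational database. The endpoint, column names, and connection string are hypothetical placeholders.

```python
# Minimal ingest -> transform -> load sketch.
# The API URL, column names, and connection string are hypothetical placeholders.
import pandas as pd
import requests
from sqlalchemy import create_engine

# Ingest: pull JSON records from an API (hypothetical endpoint)
records = requests.get("https://api.example.com/v1/orders", timeout=30).json()
df = pd.DataFrame(records)

# Transform: enforce types and derive a simple column
df["order_date"] = pd.to_datetime(df["order_date"])
df["net_amount"] = df["gross_amount"] - df["tax"]

# Load: append to a relational target (hypothetical Postgres connection)
engine = create_engine("postgresql://user:pass@localhost:5432/warehouse")
df.to_sql("orders_clean", engine, if_exists="append", index=False)
```

In practice an orchestrator such as Airflow or Azure Data Factory would schedule each of these steps and handle retries and alerting.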
following engineering disciplines: Cloud Engineering, Data Engineering (not building pipelines but designing and building the framework), DevOps, MLOps/LLMOps Often work with the following technologies: Azure, AWS, GCP, Airflow, dbt, Databricks, Snowflake, etc. GitHub, Azure DevOps and related developer tooling and CI/CD platforms, Terraform or other Infra-as-Code MLflow, AzureML or similar for MLOps; LangSmith …
platforms. Proven experience in a leadership or technical lead role, with official line management responsibility. Strong experience with modern data stack technologies, including Python, Snowflake, AWS (S3, EC2, Terraform), Airflow, dbt, Apache Spark, Apache Iceberg, and Postgres. Skilled in balancing technical excellence with business priorities in a fast-paced environment. Strong communication and stakeholder management skills, able …
record in full stack data development (from ingestion to visualization). Strong expertise in Snowflake, including data modeling, warehousing, and performance optimization. Hands-on experience with ETL tools (e.g., Apache Airflow, dbt, Fivetran) and integrating data from ERP systems like NetSuite. Proficiency in SQL, Python, and/or other scripting languages for data processing and automation. Familiarity with …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
/BI role Advanced SQL skills with hands-on experience using dbt for data modeling Strong familiarity with modern data platforms like Snowflake, Looker, AWS/GCP, Tableau, or Airflow Experience with version control tools (e.g., Git) Ability to design, build, and document scalable, reliable data models Comfortable gathering business requirements and translating them into data architecture Strong problem …
inclusive and collaborative culture, encouraging peer to peer feedback and evolving healthy, curious and humble teams. Tech Stack: Python, Javascript/Typescript, React/React Native, AWS, GraphQL, Snowflake, Airflow, DDD. This is an incredible opportunity for a Senior Software Engineer to join a unique company as they embark on a period of significant growth to take their fantastic …
tuning. Experience with designing and programming relational databases such as MySQL, Redshift, Oracle SQL Server, or Postgres. Experience with AWS-based system architecture covering S3, EKS, EC2, Batch, or Airflow etc. Experience with caching and messaging technologies such as Redis, Hazelcast, MQ, or Kafka etc. Experience with programming within a CI/CD pipeline such as Git, Jenkins etc. Strong problem …
software development lifecycle, from conception to deployment. Capable of conceptualizing and implementing software architectures spanning multiple technologies and platforms. Technology stack: Python, Flask, Java, Spring, JavaScript, BigQuery, Redis, Elasticsearch, Airflow, Google Cloud Platform, Kubernetes, Docker. Voted "Best Places to Work," our culture is driven by self-starters, team players, and visionaries. Headquartered in Los Angeles, California, the company operates …
to non-technical stakeholders Familiarity with agile development practices. Nice to Have: Experience working with patent data, legal data, or other structured open data sources Exposure to tools like Airflow, dbt, or CI/CD pipelines for data workflows Understanding of data governance, quality frameworks, and observability tools Contributions to engineering documentation or internal knowledge sharing Why Join Us …