Birmingham, West Midlands, West Midlands (County), United Kingdom Hybrid / WFH Options
Akkodis
Data Engineer (AI-Driven SaaS platform) (Python, Snowflake, Data Modelling, ETL/ELT, Apache Airflow, Kafka, AWS) Large-scale data environment Up to £70,000 plus benefits FULLY REMOTE UK Are you a Data Engineering enthusiast who thrives on designing and implementing robust ETL processes, highly scalable data structures and data pipelines within a truly enterprise-scale data … platform integrates Python and Snowflake, and you'll need a deep understanding of SQL and NoSQL databases (MongoDB or similar). You'll also have exposure to streaming platforms like Apache Kafka, be able to develop and maintain ELT pipelines, and bring a solid understanding of data warehousing concepts and best practice. Essentially, a strong Data Engineer who is … a Snowflake enthusiast who can write solid SQL queries within Snowflake. You will understand Apache Kafka to a high standard, have solid knowledge of Apache Airflow and, from a cloud perspective, good AWS exposure. I'd love you to be an advocate of Agile too - these guys are massive …
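As a hedged illustration of the stack this role names (Kafka events landed in Snowflake with Python), here is a minimal sketch. The topic, credentials, and table are hypothetical placeholders, and a production pipeline would batch-load via COPY INTO rather than insert row by row.

```python
# Minimal sketch: consume a Kafka topic and land events in Snowflake.
# Topic, credentials, and table names are placeholders.
import json

import snowflake.connector
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "orders",                                   # hypothetical topic
    bootstrap_servers=["localhost:9092"],
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="...",  # placeholders
    warehouse="LOAD_WH", database="RAW", schema="EVENTS",
)

with conn.cursor() as cur:
    for message in consumer:
        event = message.value
        # PARSE_JSON is only valid in a SELECT, hence INSERT ... SELECT.
        cur.execute(
            "INSERT INTO orders_raw (id, payload) SELECT %s, PARSE_JSON(%s)",
            (event["id"], json.dumps(event)),
        )
```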
London (City of London), South East England, United Kingdom
HCLTech
analytical data models using dbt (data build tool). Develop and optimize complex SQL queries for data extraction, transformation, and loading in Snowflake. Orchestrate and schedule data workflows using Apache Airflow, including developing custom Python operators and DAGs. Write efficient and maintainable Python scripts for data processing, automation, and integration with various data sources and APIs. Ensure data … and cost-effectiveness. Qualifications: Strong proficiency in SQL, particularly with Snowflake's features and functionalities. Extensive experience with dbt for data modeling, transformations, testing, and documentation. Solid experience with Apache Airflow for workflow orchestration and scheduling. Proficiency in Python for data manipulation, scripting, and automation. Experience with cloud platforms (e.g., AWS, Azure, GCP) and relevant data services. Understanding …
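Since the listing calls out custom Airflow operators and DAGs alongside dbt, a minimal, hedged sketch follows. The DAG id, dbt command, and check logic are illustrative assumptions, not the employer's actual pipeline.

```python
# Sketch of an Airflow DAG wiring a dbt run to a custom Python operator.
from datetime import datetime

from airflow import DAG
from airflow.models.baseoperator import BaseOperator
from airflow.operators.bash import BashOperator


class SnowflakeRowCountCheck(BaseOperator):
    """Custom operator: verify a Snowflake table is non-empty after a run."""

    def __init__(self, table: str, **kwargs):
        super().__init__(**kwargs)
        self.table = table

    def execute(self, context):
        # Placeholder for a real check via SnowflakeHook.
        self.log.info("Checking row count for %s", self.table)


with DAG(
    dag_id="daily_dbt_refresh",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",          # schedule_interval on Airflow < 2.4
    catchup=False,
) as dag:
    run_models = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --profiles-dir /opt/dbt",  # placeholder path
    )
    check = SnowflakeRowCountCheck(
        task_id="row_count_check", table="ANALYTICS.FCT_ORDERS"
    )

    run_models >> check
```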
you'll have the opportunity to grow the data function and step into a leadership role. The Role: Designing and building robust ETL pipelines using tools like dbt or Apache Airflow Integrating data from APIs, databases, and SaaS platforms into BigQuery Structuring clean, queryable data models to support analytics and reporting Collaborating with analysts to deliver insightful dashboards … via Looker Establishing data governance and quality processes Requirements: GCP (BigQuery), but open to other cloud backgrounds ETL: dbt, Apache Airflow, or similar BI: Looker (preferred), or other BI tools Languages: SQL, Python, Java Experienced data engineer, with strong ETL and cloud data warehouse experience Proficiency in SQL and data modelling best practices Experience with BI tools and …
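For the BigQuery ingestion side of this role, a minimal sketch of loading exported JSON into a raw table follows; the project, bucket path, and table id are hypothetical.

```python
# Load newline-delimited JSON from GCS into BigQuery; names are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")      # placeholder project

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
    autodetect=True,                                # infer schema on load
    write_disposition="WRITE_APPEND",
)

load_job = client.load_table_from_uri(
    "gs://my-bucket/exports/*.json",                # placeholder GCS path
    "analytics.raw_events",                         # dataset.table
    job_config=job_config,
)
load_job.result()  # block until the load job completes
```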
West Midlands, United Kingdom Hybrid / WFH Options
Experis
data pipelines within enterprise-grade on-prem systems. Key Responsibilities: Design, develop, and maintain data pipelines using Hadoop technologies in an on-premises infrastructure. Build and optimise workflows using Apache Airflow and Spark Streaming for real-time data processing. Develop robust data engineering solutions using Python for automation and transformation. Collaborate with infrastructure and analytics teams to support … platform. Ensure compliance with enterprise security and data governance standards. Required Skills & Experience: Minimum 5 years of experience in Hadoop and data engineering. Strong hands-on experience with Python, Apache Airflow, and Spark Streaming. Deep understanding of Hadoop components (HDFS, Hive, HBase, YARN) in on-prem environments. Exposure to data analytics, preferably involving infrastructure or operational data. Experience …
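As a hedged sketch of the real-time processing this role describes, the snippet below reads a Kafka topic with Spark Structured Streaming and writes Parquet to HDFS. Broker, topic, and paths are placeholders, and it assumes the spark-sql-kafka connector is on the classpath.

```python
# Spark Structured Streaming: Kafka source -> Parquet sink on HDFS.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("events_stream").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")   # placeholder broker
    .option("subscribe", "events")                        # hypothetical topic
    .load()
    .select(col("value").cast("string").alias("payload"))
)

query = (
    events.writeStream.format("parquet")
    .option("path", "hdfs:///data/events")                # placeholder path
    .option("checkpointLocation", "hdfs:///checkpoints/events")
    .start()
)
query.awaitTermination()
```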
London (City of London), South East England, United Kingdom
HCLTech
data cataloging and metadata management using tools like AWS Glue Data Catalog. Demonstrated self-sufficiency in exploring new tools, troubleshooting issues, and continuously improving processes. Hands-on experience with Apache Airflow for orchestrating complex data workflows and ensuring reliable execution. Understanding of cloud security and governance practices including IAM, KMS, and data access policies. Experience with monitoring and …
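To illustrate the Glue Data Catalog work mentioned, a small boto3 sketch follows; the region and database name are assumptions.

```python
# Walk the Glue Data Catalog and print each table's storage location.
import boto3

glue = boto3.client("glue", region_name="eu-west-2")   # placeholder region

paginator = glue.get_paginator("get_tables")
for page in paginator.paginate(DatabaseName="analytics"):  # placeholder DB
    for table in page["TableList"]:
        location = table.get("StorageDescriptor", {}).get("Location", "n/a")
        print(f"{table['Name']}: {location}")
```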
Troubleshooting: Oversee pipeline performance, address issues promptly, and maintain comprehensive data documentation. What You'll Bring Technical Expertise: Proficiency in Python and SQL; experience with data processing frameworks such as Airflow, Spark, or TensorFlow. Data Engineering Fundamentals: Strong understanding of data architecture, data modelling, and scalable data solutions. Backend Development: Willingness to develop proficiency in backend technologies (e.g., Python with … Django) to support data pipeline integrations. Cloud Platforms: Familiarity with AWS or Azure, including tooling like Apache Airflow, Terraform, or SageMaker. Data Quality Management: Experience with data versioning and quality assurance practices. Automation and CI/CD: Knowledge of build and deployment automation processes. Experience within MLOps. A 1st class Data degree from one of the UK's top …
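On the pipeline-monitoring side, one common pattern is an Airflow failure callback that pushes an alert; a minimal sketch follows, with a placeholder webhook URL.

```python
# Airflow on_failure_callback that posts a chat alert when a task fails.
import requests


def alert_on_failure(context):
    ti = context["task_instance"]
    requests.post(
        "https://hooks.slack.com/services/...",   # placeholder webhook URL
        json={"text": f"Task {ti.task_id} failed in DAG {ti.dag_id}"},
        timeout=10,
    )

# Attach per task or via default_args, e.g.:
# default_args = {"on_failure_callback": alert_on_failure}
```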
Maintenance - Implement robust logging, alerting, and performance monitoring for integrations. Continuous Improvement - Champion enhancements to integration architectures and best practices. Skills & Experience Required Experience with workflow orchestration tools (e.g., Apache Airflow). Proven track record in backend development (e.g., Node.js, Python, Java). Strong knowledge of API design, integration methods (REST, Webhooks, GraphQL), and authentication protocols (OAuth2, JWT …
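As a hedged sketch of the authentication work named here, the snippet below performs an OAuth2 client-credentials exchange and calls a REST API with the bearer token; endpoints and credentials are hypothetical.

```python
# OAuth2 client-credentials flow followed by an authenticated REST call.
import requests

token_resp = requests.post(
    "https://auth.example.com/oauth/token",       # placeholder token endpoint
    data={
        "grant_type": "client_credentials",
        "client_id": "my-client",                 # placeholder credentials
        "client_secret": "...",
    },
    timeout=10,
)
token_resp.raise_for_status()
access_token = token_resp.json()["access_token"]

resp = requests.get(
    "https://api.example.com/v1/accounts",        # placeholder API
    headers={"Authorization": f"Bearer {access_token}"},
    timeout=10,
)
resp.raise_for_status()
print(resp.json())
```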
of real-time and analytical data pipelines, metadata, and cataloguing (e.g., Atlan) Strong communication, stakeholder management, and documentation skills Preferred (but not essential): AWS or Snowflake certifications Knowledge of Apache Airflow, dbt, GitHub Actions Experience with Iceberg tables and data product thinking Why Apply? Work on high-impact, high-scale client projects Join a technically elite team with …
Meta, Amazon, OpenAI) Proficiency with essential data science libraries including Pandas, NumPy, scikit-learn, Plotly/Matplotlib, and Jupyter Notebooks Knowledge of ML-adjacent technologies, including AWS SageMaker and Apache Airflow. Strong skills in data preprocessing, wrangling, and augmentation techniques Experience deploying scalable AI solutions on cloud platforms (AWS, Google Cloud, or Azure) with enthusiasm for MLOps tools and …
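To ground the preprocessing and wrangling skills listed, here is a minimal sketch using the named libraries; the columns and values are made up for illustration.

```python
# Pandas frame through a scikit-learn pipeline: impute, then standardise.
import pandas as pd
from sklearn.impute import SimpleImputer
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

df = pd.DataFrame({
    "age": [34, None, 52],                 # hypothetical feature columns
    "income": [42_000, 58_000, None],
})

preprocess = Pipeline([
    ("impute", SimpleImputer(strategy="median")),  # fill missing values
    ("scale", StandardScaler()),                   # zero mean, unit variance
])
features = preprocess.fit_transform(df[["age", "income"]])
print(features)
```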
Basingstoke, Hampshire, South East, United Kingdom
Anson Mccade
processes. Monitor integration health and implement alerting, logging, and performance tracking. Contribute to continuous improvement of integration architecture and practices. Key Skills & Experience Experience with workflow orchestration tools (e.g., Apache Airflow). Proven backend development skills using Node.js, Python, Java, or similar. Strong understanding of API design and integration techniques (REST, Webhooks, GraphQL). Familiarity with authentication protocols …
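For the integration-health monitoring this role covers, a hedged sketch of a logged, retrying API call follows; the endpoint and backoff policy are illustrative choices.

```python
# Logged API call with simple exponential backoff for integration health.
import logging
import time

import requests

log = logging.getLogger("integrations")
logging.basicConfig(level=logging.INFO)


def call_with_retries(url: str, attempts: int = 3) -> requests.Response:
    for attempt in range(1, attempts + 1):
        try:
            resp = requests.get(url, timeout=10)
            resp.raise_for_status()
            log.info("call ok url=%s attempt=%d", url, attempt)
            return resp
        except requests.RequestException as exc:
            log.warning("call failed url=%s attempt=%d err=%s", url, attempt, exc)
            time.sleep(2 ** attempt)   # back off before retrying
    raise RuntimeError(f"{url} unreachable after {attempts} attempts")
```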
including LoRA, QLoRA, and parameter-efficient methods Multi-modal AI systems combining text, image, and structured data Reinforcement Learning from Human Feedback (RLHF) for model alignment Production ML Systems: Apache Airflow/Dagster for ML workflow orchestration and ETL pipeline management Model versioning and experiment tracking (MLflow, Weights & Biases) Real-time model serving and edge deployment strategies A …
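As a hedged sketch of the parameter-efficient fine-tuning named here, the snippet below attaches LoRA adapters with the peft library; the base model and hyperparameters are assumptions for illustration.

```python
# LoRA via peft: wrap a base model so only adapter weights are trainable.
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("gpt2")   # placeholder base model

lora_config = LoraConfig(
    r=8,                       # low-rank dimension of the adapters
    lora_alpha=16,             # scaling factor
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()   # typically <1% of total parameters
```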
with Kimball methodology, dimensional modelling, and star schema design. Proven experience with Redshift or Snowflake. Strong background in cloud-based data environments (AWS preferred). Hands-on experience with Airflow for orchestration. (Nice-to-have) Python for data engineering tasks. (Nice-to-have) Optimisation for BI tools such as Power BI or Looker. Soft skills: Strong collaboration with both …
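To make the Kimball star-schema requirement concrete, here is a minimal sketch with illustrative fact and dimension DDL applied through a DB-API cursor; table and column names are hypothetical.

```python
# Star schema sketch: a fact table keyed to a conformed date dimension.
DIM_DATE = """
CREATE TABLE IF NOT EXISTS dim_date (
    date_key       INT PRIMARY KEY,   -- surrogate key, e.g. 20240131
    calendar_date  DATE,
    fiscal_quarter VARCHAR(6)
)
"""

FACT_SALES = """
CREATE TABLE IF NOT EXISTS fct_sales (
    date_key     INT NOT NULL,        -- FK to dim_date
    product_key  INT NOT NULL,        -- FK to a dim_product (not shown)
    quantity     INT,
    net_amount   NUMERIC(12, 2)       -- additive measure in GBP
)
"""


def build_schema(cursor):
    """Apply dimensions before facts so foreign keys resolve."""
    for ddl in (DIM_DATE, FACT_SALES):
        cursor.execute(ddl)
```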
Leeds, West Yorkshire, England, United Kingdom Hybrid / WFH Options
Robert Walters
services (e.g., S3, Glue, Lambda, Redshift, Athena, EMR). Experience designing and building data lakes and modern data platforms. Proficiency with Python, SQL, and data pipeline orchestration tools (e.g., Airflow, dbt). Strong understanding of data modelling, ETL/ELT processes, and distributed systems. Knowledge of data security, governance, and compliance best practices. Excellent leadership, communication, and stakeholder engagement …
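As a small sketch of how the listed AWS services fit together in a data lake, the handler below reacts to an S3 landing-zone event by starting a Glue job; the job and argument names are placeholders.

```python
# Lambda handler: kick off a Glue curation job for each new S3 object.
import boto3

glue = boto3.client("glue")


def handler(event, context):
    for record in event.get("Records", []):
        key = record["s3"]["object"]["key"]
        glue.start_job_run(
            JobName="curate-landing-zone",        # hypothetical Glue job
            Arguments={"--input_key": key},       # passed to the job script
        )
```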
in a commercial setting. Solid understanding of distributed systems (e.g. Hadoop, AWS, Kafka). Experience with SQL/NoSQL databases (e.g. PostgreSQL, Cassandra). Familiarity with orchestration tools (e.g. Airflow, Luigi) and cloud platforms (e.g. AWS, GCP). Passion for solving complex problems and mentoring others. Package: Salary from £(phone number removed) depending on experience Remote-first with flexible …
Reading, Berkshire, England, United Kingdom Hybrid / WFH Options
Hays Specialist Recruitment Limited
e.g., Git), and strong problem-solving skills are vital. The ability to communicate technical concepts clearly to non-technical stakeholders is also important. Experience with orchestration tools such as Airflow, knowledge of Python or other scripting languages, and familiarity with data governance and security best practices are desirable. Exposure to agile development environments, performance tuning of data queries, and …
of ingestion across the portfolio. Key Requirements: Strong proficiency in Python and PySpark Successful track record in a Data Engineering role including Database Management (ideally SQL), Data Orchestration (ideally Apache Airflow or Dagster), Containerisation (ideally Kubernetes and Docker), Data Pipelines (big data technologies and architectures) Experience in Financial Services (ideally Commodity) Trading. Bachelor's degree in Information Systems …
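Since Dagster is one of the orchestration tools named, a minimal hedged sketch of its asset model follows; the asset names and trade payload are made up for illustration.

```python
# Dagster assets: a raw extract feeding a derived, enriched dataset.
from dagster import Definitions, asset


@asset
def raw_trades() -> list[dict]:
    # Placeholder extract; a real asset would read from the trade store.
    return [{"id": 1, "qty": 100}]


@asset
def enriched_trades(raw_trades: list[dict]) -> list[dict]:
    # Derive a notional value per trade (illustrative price of 42.0).
    return [{**t, "notional": t["qty"] * 42.0} for t in raw_trades]


defs = Definitions(assets=[raw_trades, enriched_trades])
```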
Glasgow, Scotland, United Kingdom Hybrid / WFH Options
Venesky Brown
fine-tuning techniques including LoRA, QLoRA, and parameter efficient methods - Multi-modal AI systems combining text, image, and structured data - Reinforcement Learning from Human Feedback (RLHF) for model alignment - Apache Airflow/Dagster for ML workflow orchestration and ETL pipeline management - Model versioning and experiment tracking (MLflow, Weights & Biases) - Real-time model serving and edge deployment strategies - A …