production issues. Optimize applications for performance and responsiveness. Stay Up to Date with Technology: Keep yourself and the team updated on the latest Python technologies, frameworks, and tools like Apache Spark, Databricks, Apache Pulsar, Apache Airflow, Temporal, and Apache Flink, sharing knowledge and suggesting improvements. Documentation: Contribute to clear and concise documentation for software, processes … Experience with cloud platforms like AWS, GCP, or Azure. DevOps Tools: Familiarity with containerization (Docker) and infrastructure automation tools like Terraform or Ansible. Real-time Data Streaming: Experience with Apache Pulsar or similar systems for real-time messaging and stream processing is a plus. Data Engineering: Experience with Apache Spark, Databricks, or similar big data platforms for processing … large datasets, building data pipelines, and machine learning workflows. Workflow Orchestration: Familiarity with tools like Apache Airflow or Temporal for managing workflows and scheduling jobs in distributed systems. Stream Processing: Experience with Apache Flink or other stream processing frameworks is a plus. Desired Skills Asynchronous Programming: Familiarity with asynchronous programming tools like Celery or asyncio. Frontend Knowledge …
Birmingham, West Midlands, West Midlands (County), United Kingdom Hybrid / WFH Options
Akkodis
Data Engineer (AI-Driven SaaS platform) (Python, Snowflake, Data Modelling, ETL/ELT, Apache Airflow, Kafka, AWS) Large-scale data environment Up to £70,000 plus benefits FULLY REMOTE UK Are you a Data Engineering enthusiast who thrives on designing and implementing robust ETL processes, highly scalable data structures and data pipelines within a truly enterprise-scale data … platform integrates Python and Snowflake, and you'll need a deep understanding of SQL and NoSQL databases (MongoDB or similar!) You'll also have exposure to streaming platforms like Apache Kafka, be able to develop and maintain ELT pipelines, and bring a solid understanding of data warehousing concepts and best practice. Essentially, a strong Data Engineer who is … a Snowflake enthusiast who can write solid SQL queries, within Snowflake! You will understand Apache Kafka to a high standard and have solid knowledge of Apache Airflow. From a Cloud perspective, you'll naturally have a good understanding of AWS. I'd love you to be an advocate of Agile too - these guys are massive …
you'll have the opportunity to grow the data function and step into a leadership role. The Role: Designing and building robust ETL pipelines using tools like dbt or Apache Airflow Integrating data from APIs, databases, and SaaS platforms into BigQuery Structuring clean, queryable data models to support analytics and reporting Collaborating with analysts to deliver insightful dashboards … via Looker Establishing data governance and quality processes Requirements: GCP (BigQuery), but open to other cloud backgrounds ETL: dbt, Apache Airflow, or similar BI: Looker (preferred), or other BI tools Languages: SQL, Python, Java Experienced data engineer, with strong ETL and cloud data warehouse experience Proficiency in SQL and data modelling best practices Experience with BI tools and …
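What orchestration tools like the Apache Airflow, dbt, and Prefect mentioned in these listings fundamentally provide is dependency-ordered task execution. A minimal stdlib-only sketch of that idea — the task names are hypothetical and stand in for a real extract/transform/load pipeline:

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Hypothetical ETL dependency graph: each task maps to the tasks it
# depends on, mirroring how an Airflow DAG wires extract -> transform -> load.
dag = {
    "extract_api": [],
    "extract_db": [],
    "transform": ["extract_api", "extract_db"],
    "load_bigquery": ["transform"],
}

def run_order(graph):
    """Return one valid execution order for the task graph."""
    ts = TopologicalSorter({task: set(deps) for task, deps in graph.items()})
    return list(ts.static_order())

order = run_order(dag)
print(order)  # both extracts first, then transform, then load
```

Real orchestrators add scheduling, retries, and backfills on top, but the dependency-resolution core is the same.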
london (city of london), south east england, united kingdom
HCLTech
data cataloging and metadata management using tools like AWS Glue Data Catalog. Demonstrated self-sufficiency in exploring new tools, troubleshooting issues, and continuously improving processes. Hands-on experience with Apache Airflow for orchestrating complex data workflows and ensuring reliable execution. Understanding of cloud security and governance practices including IAM, KMS, and data access policies. Experience with monitoring and …
data engineering tasks. Experience building and maintaining web scraping pipelines. Strong SQL skills, with expertise in performance tuning. Strong proficiency with dbt for data transformations. Hands-on experience with Apache Airflow or Prefect. Proficiency with GitHub, GitHub Actions, and CI/CD pipelines. Nice to have: Experience with GCP (BigQuery, Dataflow, Composer, Pub/Sub) or AWS. Familiarity …
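On the SQL performance-tuning requirement above: the everyday move is reading the query plan and making selective filters index-backed. A sketch using stdlib sqlite3 as a stand-in for a real warehouse engine — the table, column, and index names are invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, event_type TEXT)")
# 1000 invented rows spread across 100 user_ids (10 rows per user).
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [(i % 100, "click" if i % 2 else "view") for i in range(1000)],
)

# Without an index this filter is a full table scan; an index on the
# filtered column lets the planner seek instead.
conn.execute("CREATE INDEX idx_events_user ON events(user_id)")

plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT COUNT(*) FROM events WHERE user_id = 7"
).fetchone()
count = conn.execute(
    "SELECT COUNT(*) FROM events WHERE user_id = 7"
).fetchone()[0]
print(plan[-1])  # the plan detail should name idx_events_user
print(count)
```

The same habit — inspect the plan, then index or rewrite — carries over to BigQuery, Snowflake, or Postgres, each with its own `EXPLAIN` dialect.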
Newbury, Berkshire, England, United Kingdom Hybrid / WFH Options
Intuita
Azure Storage, Medallion Architecture, and working with data formats such as JSON, CSV, and Parquet. • Strong understanding of IT concepts, including security, IAM, Key Vault, and networking. • Exposure to Apache Airflow and dbt is a bonus. • Familiarity with agile principles and practices. • Experience with Azure DevOps pipelines. The "Nice to Haves": • Certification in Azure or related technologies. • Experience …
experience in data engineering or a related field, with a focus on building scalable data systems and platforms. Expertise in modern data tools and frameworks such as Spark, dbt, Airflow, Kafka, Databricks, and cloud-native services (AWS, GCP, or Azure) Understanding of data modeling, distributed systems, ETL/ELT pipelines, and streaming architectures Proficiency in SQL and at least …
Amazon EKS, Amazon S3, AWS Glue, Amazon RDS, Amazon DynamoDB, Amazon Aurora, Amazon SageMaker, Amazon Bedrock (including LLM hosting and management). Expertise in workflow orchestration tools such as Apache Airflow. Experience implementing DataOps best practices and tooling, including DataOps.Live. Advanced skills in data storage and management platforms like Snowflake. Ability to deliver insightful analytics via business intelligence tools …
London, South East, England, United Kingdom Hybrid / WFH Options
Office Angels
data modelling techniques (star schema, data vault, dimensional modelling). Proficiency in Excel-based data workflows for various Agile Retail projects. Hands-on experience with data pipeline orchestration tools (Airflow, dbt, Prefect, or similar). Benefits: Unlimited holiday Annual Wellbeing Allowance Flexible work culture Monthly socials and events Complimentary snack bar Employer pension contribution If you're a data …
Meta, Amazon, OpenAI) Proficiency with essential data science libraries including Pandas, NumPy, scikit-learn, Plotly/Matplotlib, and Jupyter Notebooks Knowledge of ML-adjacent technologies, including AWS SageMaker and Apache Airflow. Strong skills in data preprocessing, wrangling, and augmentation techniques Experience deploying scalable AI solutions on cloud platforms (AWS, Google Cloud, or Azure) with enthusiasm for MLOps tools and …
Basingstoke, Hampshire, South East, United Kingdom
Anson Mccade
processes. Monitor integration health and implement alerting, logging, and performance tracking. Contribute to continuous improvement of integration architecture and practices. Key Skills & Experience Experience with workflow orchestration tools (e.g., Apache Airflow). Proven backend development skills using Node.js, Python, Java, or similar. Strong understanding of API design and integration techniques (REST, Webhooks, GraphQL). Familiarity with authentication protocols …
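One recurring piece of the webhook and authentication work described above is verifying the signature on inbound webhook payloads. A stdlib sketch of HMAC-SHA256 verification — the secret and payload are invented, and real providers each define their own header name and encoding:

```python
import hashlib
import hmac

def verify_webhook(secret: bytes, payload: bytes, signature_hex: str) -> bool:
    """Check an HMAC-SHA256 webhook signature in constant time."""
    expected = hmac.new(secret, payload, hashlib.sha256).hexdigest()
    # compare_digest avoids leaking the match position via timing.
    return hmac.compare_digest(expected, signature_hex)

secret = b"hypothetical-shared-secret"
payload = b'{"event": "order.created"}'
good_sig = hmac.new(secret, payload, hashlib.sha256).hexdigest()

print(verify_webhook(secret, payload, good_sig))   # True
print(verify_webhook(secret, payload, "deadbeef"))  # False
```

The constant-time comparison is the important design choice; a plain `==` on signatures is a classic integration bug.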
with Kimball methodology, dimensional modelling, and star schema design. Proven experience with Redshift or Snowflake. Strong background in cloud-based data environments (AWS preferred). Hands-on experience with Airflow for orchestration. (Nice-to-have) Python for data engineering tasks. (Nice-to-have) Optimisation for BI tools such as Power BI or Looker. Soft skills: Strong collaboration with both …
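The Kimball star-schema requirement above in miniature: a fact table joined to a dimension via a surrogate key, then aggregated. SQLite stands in for Redshift or Snowflake here, and the schema and sample data are invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Dimension table: one row per product, keyed by a surrogate key.
conn.execute("CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT)")
conn.executemany("INSERT INTO dim_product VALUES (?, ?)",
                 [(1, "widget"), (2, "gadget")])
# Fact table: one row per sale, referencing the dimension by key.
conn.execute("CREATE TABLE fact_sales (product_key INTEGER, amount REAL)")
conn.executemany("INSERT INTO fact_sales VALUES (?, ?)",
                 [(1, 10.0), (1, 5.0), (2, 7.5)])

# The classic star-schema query: join facts to the dimension, aggregate.
rows = conn.execute("""
    SELECT d.name, SUM(f.amount)
    FROM fact_sales f JOIN dim_product d USING (product_key)
    GROUP BY d.name ORDER BY d.name
""").fetchall()
print(rows)  # [('gadget', 7.5), ('widget', 15.0)]
```

The point of the shape is that BI tools can slice every fact by every dimension attribute with a single predictable join pattern.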
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
experience in data engineering, warehousing, or platform leadership. Proven track record of delivering large-scale data warehouse migrations (BigQuery experience strongly preferred). Hands-on expertise with SQL, Python, Airflow, DBT/Dataform, Terraform, and modern data architecture. Strong leadership and stakeholder management skills. Experience driving complex data projects in agile, cross-functional teams. Nice-to-haves: Background in …
deep expertise in ETL and data warehousing (especially BigQuery). Hands-on competence with Looker (LookML) or similar BI tools. Strong SQL skills and familiarity with orchestration tools like Airflow (or similar). A thoughtful approach to data modeling, query optimization, and performance tuning. Excellent analytical problem-solving skills and the ability to drive projects in a fast-paced …
Sheffield, South Yorkshire, Yorkshire, United Kingdom Hybrid / WFH Options
My Finance Club
charts. BI tools. Desirable PowerBI A/B testing. Split principles, test planning, logging. Desirable hard skills Python. Data analysis, pandas, numpy, scipy, statsmodels, seaborn, matplotlib etc. ETL processes. Airflow, Pentaho etc. IT understanding. Client-server processes. API. Back-end and front-end. Parsing. Algorithms and data structures. Management. Sprints, AGILE. Waterfall. Jira etc. Other skills Required fluent English. …
Reigate, Surrey, England, United Kingdom Hybrid / WFH Options
esure Group
improvement and operational excellence. Deep expertise in data compliance frameworks, cost management, and platform optimisation. Strong hands-on experience with modern cloud data warehouses (Databricks, Snowflake, AWS), SQL, Spark, Airflow, Terraform. Advanced Python skills with orchestration tooling; solid experience in CI/CD (Git, Jenkins). Proven track record in data modelling, batch/real-time integration, and large …
Glasgow, Scotland, United Kingdom Hybrid / WFH Options
Venesky Brown
fine-tuning techniques including LoRA, QLoRA, and parameter efficient methods - Multi-modal AI systems combining text, image, and structured data - Reinforcement Learning from Human Feedback (RLHF) for model alignment - Apache Airflow/Dagster for ML workflow orchestration and ETL pipeline management - Model versioning and experiment tracking (MLflow, Weights & Biases) - Real-time model serving and edge deployment strategies - A …