control, code review), and with project management tools (JIRA or similar) in an agile environment. Familiarity with other MLOps tools (Kubeflow, MLflow, etc.) or big data processing frameworks (Spark) can be an added advantage. Rewards and Benefits: We believe in supporting our employees in both their professional and personal lives. As part of our commitment to your well…
Doncaster, South Yorkshire, UK Hybrid/Remote Options
Williams Lea
Wakefield, West Yorkshire, UK Hybrid/Remote Options
Williams Lea
Sheffield, South Yorkshire, England, United Kingdom Hybrid/Remote Options
DCS Recruitment
principles. Experience working with cloud platforms such as AWS, Azure, or GCP. Exposure to modern data tools such as Snowflake, Databricks, or BigQuery. Familiarity with streaming technologies (e.g., Kafka, Spark Streaming, Flink) is an advantage. Experience with orchestration and infrastructure tools such as Airflow, dbt, Prefect, CI/CD pipelines, and Terraform. What you get in return: Up to…
quality software features. Strong communication, organisational, and interpersonal skills. Ability to manage multiple priorities in a fast-paced environment. Experience with SQL, NoSQL, and big data platforms (e.g., Hadoop, Spark). Knowledge of cloud security (AWS, Azure, GCP) and data access controls. Proficiency in scripting languages (e.g., Python, Bash) for automation. Certifications such as OSCP, CEH, CISSP, or GIAC are…
able to work across the full data cycle.
- Proven experience working with AWS data technologies (S3, Redshift, Glue, Lambda, Lake Formation, CloudFormation), GitHub, CI/CD
- Coding experience in Apache Spark, Iceberg or Python (Pandas)
- Experience in change and release management.
- Experience in data warehouse design and data modelling
- Experience managing data migration projects.
- Cloud data platform development … the AWS services like Redshift, Lambda, S3, Step Functions, Batch, CloudFormation, Lake Formation, CodeBuild, CI/CD, GitHub, IAM, SQS, SNS, Aurora DB
- Good experience with DBT, Apache Iceberg, Docker, Microsoft BI stack (nice to have)
- Experience in data warehouse design (Kimball and lakehouse, medallion and data vault) is a definite preference, as is knowledge of … other data tools and programming languages such as Python & Spark, and strong SQL experience.
- Experience in building data lakes and building CI/CD data pipelines
- A candidate is expected to understand, and be able to demonstrate experience across, the delivery lifecycle, and to understand both Agile and Waterfall methods and when to apply these.
Experience: This position requires several years of…
Leeds, West Yorkshire, Yorkshire, United Kingdom Hybrid/Remote Options
Fruition Group
Airflow, and Git for version control. Excellent collaboration and communication skills, with strong attention to detail and data quality. Desirable: Exposure to AI/ML data preparation, Python or Spark, Data Vault 2.0, data governance, GDPR, and experience working in mobility, logistics, financial services, or tech-enabled environments. What's in it for me? Hybrid and remote working flexibility. …
Leeds, West Yorkshire, Yorkshire, United Kingdom Hybrid/Remote Options
Fruition Group
best practices for data security and compliance. Collaborate with stakeholders and external partners. Skills & Experience: Strong experience with AWS data technologies (e.g., S3, Redshift, Lambda). Proficient in Python, Apache Spark, and SQL. Experience in data warehouse design and data migration projects. Cloud data platform development and deployment. Expertise across data warehouse and ETL/ELT development in…
Experience: 10+ years' experience in Data Engineering, with a minimum of 3 years of hands-on Azure Databricks experience delivering production-grade solutions. Strong programming proficiency in Python and Spark (PySpark) or Scala, with the ability to build scalable and efficient data processing applications. Advanced understanding of data warehousing concepts, including dimensional modelling, ETL/ELT patterns, and modern…
familiarity with data science libraries and frameworks. Strong expertise in statistical modelling, predictive analytics, and machine learning. In-depth understanding of data architecture and big data technologies (e.g., Hadoop, Spark, AWS). Exceptional problem-solving skills and ability to think strategically. Outstanding communication abilities to convey technical concepts to non-technical stakeholders. Experience in managing cross-departmental or large…
Doncaster, South Yorkshire, UK Hybrid/Remote Options
Lucid Support Services Ltd
in public cloud (AWS) and our dev/test workloads are self-orchestrated using clustered Proxmox as a private cloud. Our analytics workloads are distributed on bare metal using Spark, HBase and HDFS. We make extensive use of Terraform and Ansible for IaC, our CI/CD uses GitHub Actions, our observability is provided via Datadog, and we increasingly…
Wakefield, West Yorkshire, UK Hybrid/Remote Options
Ripjar
Doncaster, South Yorkshire, UK Hybrid/Remote Options
Ripjar