collaboration, and high performance. Hold team members accountable for their performance and professional growth. Technical Expertise: Bring a deep understanding of data technologies, including batch processing, stream processing, storage, analytics, machine learning, and AI. Collaborate with Engineering teams to ensure the data platform meets technical requirements and … data-focused environment. Strong technical background and a degree in Computer Science, Engineering, or a related field. Hands-on experience with data technologies, including batch processing, stream processing, data warehousing, and analytics platforms. Proven track record of successfully leading product strategies that drive significant business outcomes. Experience … experience in mentoring and developing Product Managers. Ability to inspire and lead cross-functional teams toward common goals. Deep understanding of data architectures, data processing frameworks, and data modelling. Familiarity with technologies such as AWS and other cloud-based data services. Knowledge of data privacy laws and compliance requirements …
metadata-forward, CI/CD-driven platform represents and enables the entire application and analysis lifecycle, including interactive development and exploration (notebooks), large-scale batch processing, observability, and production application deployments. The optimization team's focus is on maximizing scale and performance of all aspects of the platforms. … for delivery of scalable solutions to the Compute and AIML Platforms that support the entire application lifecycle (interactive development and exploration/analysis, scalable batch processing, application deployment), with particular focus on performance at scale. Partner with both AIML and Compute platform teams as well as scientific users …
Watford, Hertfordshire, United Kingdom Hybrid / WFH Options
Digital Gaming Corp
data from sources like Facebook, Google Analytics, and payment providers. Using tools like AWS Redshift, S3, and Kafka, you'll optimize data models for batch and real-time processing. Collaborating with stakeholders, you'll deliver actionable insights on player behavior and gaming analytics, enhancing experiences and driving revenue with … robust ETL pipelines to integrate data from diverse sources, including APIs like Facebook, Google Analytics, and payment providers. Develop and optimize data models for batch processing and real-time streaming using tools like AWS Redshift, S3, and Kafka. Lead efforts in acquiring, storing, processing, and provisioning data …
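The listing above names both batch and real-time processing of payment events but includes no code. As a hedged, library-free sketch of that distinction (all field names, records, and amounts are hypothetical, not taken from the posting):

```python
# Hypothetical illustration: the same event transformation applied
# per-event (streaming style) and over a whole collection (batch style).

def transform(event):
    """Normalise one raw payment event into a reporting row."""
    return {
        "player_id": event["uid"],
        "amount_gbp": round(event["amount_pence"] / 100, 2),
        "source": event.get("source", "unknown"),
    }

def process_stream(events):
    """Streaming style: yield each transformed event as it arrives."""
    for event in events:
        yield transform(event)

def process_batch(events):
    """Batch style: transform everything, then aggregate in one pass."""
    rows = [transform(e) for e in events]
    total = sum(r["amount_gbp"] for r in rows)
    return rows, total

raw = [
    {"uid": "p1", "amount_pence": 499, "source": "facebook"},
    {"uid": "p2", "amount_pence": 1250},
]

streamed = list(process_stream(raw))
rows, total = process_batch(raw)
print(round(total, 2))  # 17.49
```

In a real deployment the streaming path would consume from Kafka and the batch path would run against Redshift or S3; this sketch only shows the shape of the two styles.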
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Smart DCC
the data environment. What will you be doing? Design and implement efficient ETL processes for data extraction, transformation, and loading. Build real-time data processing pipelines using platforms like Apache Kafka or cloud-native tools. Optimize batch processing workflows with tools like Apache Spark and Flink for … What are we looking for? Advanced proficiency with databases (SQL Server, Oracle, MySQL, PostgreSQL). Expertise in building and managing data pipelines and data processing workflows. Strong understanding of data warehousing concepts, schema design, and data modelling. Hands-on experience with cloud platforms (AWS, Azure, Google Cloud) for scalable …
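This posting names Kafka, Spark, and Flink without showing any code. As a hedged, plain-Python stand-in for the kind of windowed stream aggregation those tools perform (the window size, event shape, and keys are all assumptions for illustration):

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds=60):
    """Group timestamped events into fixed (tumbling) windows and count
    occurrences per key -- a simplified, in-memory stand-in for what a
    Flink or Kafka Streams job would do continuously over a stream."""
    counts = defaultdict(int)
    for ts, key in events:
        # Align each event to the start of its window.
        window_start = (ts // window_seconds) * window_seconds
        counts[(window_start, key)] += 1
    return dict(counts)

events = [
    (5, "login"), (42, "login"), (61, "bet"), (65, "login"), (130, "bet"),
]
print(tumbling_window_counts(events))
# {(0, 'login'): 2, (60, 'bet'): 1, (60, 'login'): 1, (120, 'bet'): 1}
```

A production pipeline would add event-time watermarks, state backends, and fault tolerance; this only sketches the windowing idea itself.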
experts, stakeholders, and senior management across the NHSCFA and our wider stakeholders. The primary responsibilities will include developing, evaluating, maintaining, and managing data pipelines, processing and preparing data suitable for analysis, and developing and maintaining data models. You will support the design of analytical services, products, and models that meet … Lakehouses. Build data pipelines that clean, transform, and present granular and aggregate data from disparate sources. Design, build, test, automate, and maintain scalable architectures and processing workflows for analytics and business intelligence (BI) systems. Plan, develop and evaluate methods and processes for gathering, extracting, transforming, and cleaning … data warehouse, infrastructure and ETL solutions covering multiple sources of data. Knowledge of techniques for optimising datasets and queries, including batch processing, partitioning, and indexing. Ability to build API integrations with various external NHS data sources and to optimise code and data pipelines. …
e.g., Hadoop, Spark). · Strong knowledge of data workflow solutions like Azure Data Factory, Apache NiFi, Apache Airflow, etc. · Good knowledge of stream and batch processing solutions like Apache Flink and Apache Kafka. · Good knowledge of log management, monitoring, and analytics solutions like Splunk, Elastic Stack, New Relic …
business area. Comfortable making presentations covering business, technical or sales. Detailed knowledge of: Project phasing and reporting. On-site delivery. Systems support. Operational procedures. Batch processing. System sizing and capacity planning. System backup, contingency and disaster recovery. Regional roll outs. Education & Preferred Qualifications: Third level qualification essential, ideally …
for end-to-end ETL, ELT & reporting solutions using key components like Spark SQL & Spark Streaming. Strong knowledge of multi-threading and high-volume batch processing. Proficiency in performance tuning for Java/Python and Spark. Deep knowledge of AWS products/services and Kubernetes/container technologies and …
join wider cross-functional teams. You’ll have significant experience building distributed systems both on-prem and in public cloud, CI/CD pipelines, batch compute tooling and developer productivity tooling. You will be excited and committed to providing an excellent developer experience, with a constant eye on continuous … Software Engineering teams, as well as providing the mechanism for discovering and consuming shared components. Including but not limited to: K8s configuration and usability, batch processing and HPC interactions, and hybrid cloud deployments. Staying informed on open source and 3rd party tooling that we should consider adopting rather … Exposure to the following would be beneficial: Developer portals and software catalogues. Infrastructure-as-code tools such as Ansible or Terraform. Experience in using batch compute frameworks and HPC tooling. Company Description: For almost 50 years, Williams Racing has been at the forefront of one of the fastest sports on …
join wider cross-functional teams. You’ll have some experience building distributed systems both on-prem and in public cloud, CI/CD pipelines, batch compute tooling and developer productivity tooling. You will be excited and committed to providing an excellent developer experience, with a constant eye on continuous … Software Engineering teams, as well as providing the mechanism for discovering and consuming shared components. Including but not limited to: K8s configuration and usability, batch processing and HPC interactions, and hybrid cloud deployments. Stay informed on open source and 3rd party tooling that we should consider adopting rather … Exposure to the following would be beneficial: Developer portals and software catalogues. Infrastructure-as-code tools such as Ansible or Terraform. Experience in using batch compute frameworks and HPC tooling. Company Description: For almost 50 years, Williams Racing has been at the forefront of one of the fastest sports on …
/ML-driven analytics, improving anomaly detection, predictive insights, and automated reporting in SAP environments. Collaborate with cloud architects to enhance real-time and batch processing pipelines, ensuring efficient SAP data workflows. Translate technical innovations into customer-centric solutions, ensuring business users can derive actionable insights from SAP … of SAP data structures, tables, and APIs, ensuring seamless data extraction and analysis. Experience in data warehousing, cloud data lakes, and real-time data processing, optimizing SAP data workflows. Knowledge of enterprise BI tools (Power BI, Tableau, SAP Analytics Cloud) and how they integrate with SAP environments. Familiarity with …
Skills: Experience with Spark-based frameworks for ETL, ELT, and reporting, including Spark SQL & Spark Streaming. Strong knowledge of multi-threading and high-volume batch processing. Proficiency in performance tuning for Java/Python and Spark. Deep understanding of AWS services and container technologies like Kubernetes. Experience deploying applications …
to support LLM-based querying, semantic search, and metadata retrieval. Integrate structured (SQL-based) and unstructured (documents, reports) data sources for real-time and batch processing. Maintain and troubleshoot Airflow pipelines for embedding extraction and document processing. Ensure data governance, security, and compliance across all applications. Manage Vector Database …