Manchester, North West, United Kingdom Hybrid / WFH Options
INFUSED SOLUTIONS LIMITED
culture. Key Responsibilities Design, build, and maintain scalable data solutions to support business objectives. Work with Microsoft Fabric to develop robust data pipelines. Utilise Apache Spark and the Spark API to handle large-scale data processing. Contribute to data strategy, governance, and architecture best practices. Identify and … approaches. Collaborate with cross-functional teams to deliver projects on time. Key Requirements ✅ Hands-on experience with Microsoft Fabric. ✅ Strong expertise in Apache Spark and the Spark API. ✅ Knowledge of data architecture, engineering best practices, and governance. ✅ DP-600 & DP-700 certifications are highly …
Warrington, Cheshire, North West England, United Kingdom Hybrid / WFH Options
Infused Solutions
culture. Key Responsibilities Design, build, and maintain scalable data solutions to support business objectives. Work with Microsoft Fabric to develop robust data pipelines. Utilise Apache Spark and the Spark API to handle large-scale data processing. Contribute to data strategy, governance, and architecture best practices. Identify and … approaches. Collaborate with cross-functional teams to deliver projects on time. Key Requirements ✅ Hands-on experience with Microsoft Fabric. ✅ Strong expertise in Apache Spark and the Spark API. ✅ Knowledge of data architecture, engineering best practices, and governance. ✅ DP-600 & DP-700 certifications are highly …
Bolton, Greater Manchester, North West England, United Kingdom Hybrid / WFH Options
Infused Solutions
culture. Key Responsibilities Design, build, and maintain scalable data solutions to support business objectives. Work with Microsoft Fabric to develop robust data pipelines. Utilise Apache Spark and the Spark API to handle large-scale data processing. Contribute to data strategy, governance, and architecture best practices. Identify and … approaches. Collaborate with cross-functional teams to deliver projects on time. Key Requirements ✅ Hands-on experience with Microsoft Fabric. ✅ Strong expertise in Apache Spark and the Spark API. ✅ Knowledge of data architecture, engineering best practices, and governance. ✅ DP-600 & DP-700 certifications are highly …
Skills: Experience working within the public sector. Knowledge of cloud platforms (e.g., IBM Cloud, AWS, Azure). Familiarity with big data processing frameworks (e.g., Apache Spark, Hadoop). Understanding of data warehousing concepts and experience with tools like IBM Cognos or Tableau. Certifications: While not required, the following … beneficial. ABOUT BUSINESS UNIT IBM Consulting is …
Bradford, Yorkshire and the Humber, United Kingdom
Pyramid Consulting, Inc
We are seeking an experienced Kafka Real-Time Architect to design and implement scalable, high-performance real-time data processing systems leveraging Apache Kafka. In this role, you will be responsible for architecting and managing Kafka clusters, ensuring system scalability and availability, and integrating Kafka with various data processing … approach to addressing business data needs and ensuring optimal system performance. Key Responsibilities: Design & Architecture: Architect and design scalable, real-time streaming systems using Apache Kafka, ensuring they are robust, highly available, and meet business requirements for data ingestion, processing, and real-time analytics. Kafka Cluster Management: Configure, deploy … and troubleshoot issues to maintain smooth operations. Integration & Data Processing: Integrate Kafka with key data processing tools and platforms, including Kafka Streams, Kafka Connect, Apache Spark Streaming, Apache Flink, Apache Beam, and Schema Registry. This integration will facilitate data stream processing, event-driven architectures, and …
Leeds, West Yorkshire, Yorkshire and the Humber, United Kingdom
Pyramid Consulting, Inc
We are seeking an experienced Kafka Real-Time Architect to design and implement scalable, high-performance real-time data processing systems leveraging Apache Kafka. In this role, you will be responsible for architecting and managing Kafka clusters, ensuring system scalability and availability, and integrating Kafka with various data processing … approach to addressing business data needs and ensuring optimal system performance. Key Responsibilities: Design & Architecture: Architect and design scalable, real-time streaming systems using Apache Kafka, ensuring they are robust, highly available, and meet business requirements for data ingestion, processing, and real-time analytics. Kafka Cluster Management: Configure, deploy … and troubleshoot issues to maintain smooth operations. Integration & Data Processing: Integrate Kafka with key data processing tools and platforms, including Kafka Streams, Kafka Connect, Apache Spark Streaming, Apache Flink, Apache Beam, and Schema Registry. This integration will facilitate data stream processing, event-driven architectures, and …
years of hands-on experience with big data tools and frameworks. Technical Skills: Proficiency in SQL, Python, and data pipeline tools such as Apache Kafka, Apache Spark, or AWS Glue. Problem-Solving: Strong analytical skills with the ability to troubleshoot and resolve data issues. Communication: Excellent communication …
Databricks. Solid understanding of ETL processes, data modeling, and data warehousing. Familiarity with SQL and relational databases. Knowledge of big data technologies, such as Spark, Hadoop, or Kafka, is a plus. Strong problem-solving skills and the ability to work in a collaborative team environment. Excellent verbal and written …
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Smart DCC
you be doing? Design and implement efficient ETL processes for data extraction, transformation, and loading. Build real-time data processing pipelines using platforms like Apache Kafka or cloud-native tools. Optimize batch processing workflows with tools like Apache Spark and Flink for scalable performance. Infrastructure Automation: Implement … Integrate cloud-based data services with data lakes and warehouses. Build and automate CI/CD pipelines with Jenkins, GitLab CI/CD, or Apache Airflow. Develop automated test suites for data pipelines, ensuring data quality and transformation integrity. Monitoring & Performance Optimization: Monitor data pipelines with tools like Prometheus …
platform management roles, with 5+ years in leadership positions. Expertise in modern data platforms (e.g., Azure, AWS, Google Cloud) and big data technologies (e.g., Spark, Kafka, Hadoop). Strong knowledge of data governance frameworks, regulatory compliance (e.g., GDPR, CCPA), and data security best practices. Proven experience in enterprise-level …
experience in data engineering or a related field, with expertise in designing scalable data solutions. Familiarity with big data technologies like Hadoop, Kafka, or Spark for processing large-scale data. Experience with data visualization tools such as Tableau, Power BI, or similar platforms for building reports and dashboards. Hands …
engineering or a related field, with a focus on building scalable data systems and platforms. Expertise in modern data tools and frameworks such as Spark, dbt, Airflow, Kafka, Databricks, and cloud-native services (AWS, GCP, or Azure). Understanding of data modeling, distributed systems, ETL/ELT pipelines, and streaming …
Blackpool, North West England, United Kingdom Hybrid / WFH Options
Perch Group
problem-solving skills Strong teamwork, interpersonal and collaboration skills with colleagues and clients Desirable: Experience with Cloud ETL tools such as Databricks/Snowflake, Spark and Kafka Experience using source control tools such as GitHub or Azure DevOps Experience with Azure DevOps for CI/CD pipeline development and …
Preston, Lancashire, North West England, United Kingdom Hybrid / WFH Options
Perch Group
problem-solving skills Strong teamwork, interpersonal and collaboration skills with colleagues and clients Desirable: Experience with Cloud ETL tools such as Databricks/Snowflake, Spark and Kafka Experience using source control tools such as GitHub or Azure DevOps Experience with Azure DevOps for CI/CD pipeline development and …
Purview, or Informatica, including projects around lineage, cataloging, and quality rules. Strong hands-on development experience in SQL and Python, with working knowledge of Spark or other distributed data processing frameworks. Design, development and implementation of distributed data solutions using API and microservice-based architecture. Deep understanding of ETL …
or a related field, with a focus on building scalable data systems and platforms. Strong expertise with modern data tools and frameworks such as Spark, dbt, Airflow or Kafka, Databricks, and cloud-native services (AWS, GCP, or Azure). Deep understanding of data modeling, distributed systems, streaming architectures, and …
Intelligence platform, either on Azure or AWS. Good working knowledge of Databricks components: Delta Lake, Unity Catalog, MLflow, etc. Expertise in SQL, Python and Spark (Scala or Python). Experience working with relational SQL databases either on premises or in the cloud. Experience delivering multiple solutions using key techniques …
Thornton-Cleveleys, Lancashire, North West, United Kingdom
Victrex Manufacturing Ltd
architecture, data modelling, ETL/ELT processes and data pipeline development. Competency with cloud platforms (e.g., AWS, Azure, GCP) and big data technologies (e.g., Hadoop, Spark, Kafka). Excellent communication and leadership skills, with the ability to engage and influence stakeholders at all levels. Insightful problem-solving skills and …
London, South East England, United Kingdom Hybrid / WFH Options
Aventum Group
Python, SQL, T-SQL, SSIS DB: Azure SQL Database, Cosmos DB, NoSQL Methodologies: Agile, DevOps Must-have Concepts: ELT/ETL, DWH, APIs (RESTful), Spark APIs, FTP protocols, SSL, SFTP, PKI (Public Key Infrastructure) and Integration testing Management Duties: Yes We are an equal opportunity employer, and we are …
Slough, South East England, United Kingdom Hybrid / WFH Options
Aventum Group
Python, SQL, T-SQL, SSIS DB: Azure SQL Database, Cosmos DB, NoSQL Methodologies: Agile, DevOps Must-have Concepts: ELT/ETL, DWH, APIs (RESTful), Spark APIs, FTP protocols, SSL, SFTP, PKI (Public Key Infrastructure) and Integration testing Management Duties: Yes We are an equal opportunity employer, and we are …