Cardiff, South Glamorgan, United Kingdom Hybrid / WFH Options
Hamilton Barnes
Pub/Sub, Dataflow, and BigQuery. Key Responsibilities: Develop Scalable Solutions: Lead the creation of scalable and dependable data streaming solutions on GCP using Apache Kafka and associated technologies. Optimize Kafka Setup: Tune Kafka brokers, topics, partitions, and replication to guarantee the performance and reliability of data streams. … Connectors: Apply your expertise to set up Kafka connectors for batch processing, managing both source and sink connectors to integrate data seamlessly. Python and Apache Beam Proficiency: Use Python and Apache Beam to craft tailored data processing logic and transformations within pipelines, enabling swift and effective data analysis. … Bring: Hands-On Kafka Configuration: Proven expertise in configuring Kafka connectors for batch processing, optimizing their number for improved performance. Python and Dataflow/Apache Beam Proficiency: Skilled in Python and Dataflow/Apache Beam, adept at developing custom data processing logic within pipelines. Streaming Data Management: Demonstrated …
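The "tailored data processing logic" this role describes typically lives in plain Python functions that a Beam pipeline wraps in Map/Filter transforms. A minimal sketch, assuming a hypothetical event schema (`user_id`, `amount`); the Beam wiring is shown only in comments so the logic stays testable without a Dataflow runner:

```python
import json

def parse_event(raw: bytes) -> dict:
    """Decode a Kafka/Pub/Sub message payload into a dict (hypothetical schema)."""
    return json.loads(raw.decode("utf-8"))

def is_valid(event: dict) -> bool:
    """Keep only events that carry the fields downstream steps need."""
    return "user_id" in event and "amount" in event

def enrich(event: dict) -> dict:
    """Example transformation: derive a fee field from the amount."""
    return {**event, "fee": round(event["amount"] * 0.01, 2)}

# In an Apache Beam pipeline these functions would be wired up roughly as:
#   (p
#    | beam.io.ReadFromPubSub(...)           # or ReadFromKafka
#    | beam.Map(parse_event)
#    | beam.Filter(is_valid)
#    | beam.Map(enrich)
#    | beam.io.WriteToBigQuery(...))

if __name__ == "__main__":
    raw = json.dumps({"user_id": 1, "amount": 20.0}).encode()
    print(enrich(parse_event(raw)))
```

Keeping the transforms as ordinary functions, rather than inlining lambdas in the pipeline, makes them unit-testable outside Beam and reusable across batch and streaming jobs.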
Newport, Gwent, Wales, United Kingdom Hybrid / WFH Options
Maclean Moore Ltd
Developer. ROLE: GCP DATA ENGINEER LOCATION: NEWPORT OR CARDIFF (HYBRID) IR35 STATUS: INSIDE LENGTH: 6 MONTHS Required experience: Expertise in Python and Dataflow/Apache Beam. Experience in handling streaming data. Strong experience in database replication using message-based CDC. Experience in using Kafka implementations in a secured cloud …
development and deployment of large-scale data streaming pipelines in GCP. Work on a data streaming POC. Experience required: Expertise in Python and Dataflow/Apache Beam. Experience in handling streaming data. Strong experience in database replication using message-based CDC. Experience in using Kafka implementations in a secured cloud …
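The message-based CDC experience asked for above boils down to replaying an ordered stream of change events (insert/update/delete) against a replica keyed by primary key. A minimal sketch in plain Python; the message shape is an assumption (a Debezium-style record reduced to `op`/`key`/`row`), and a real pipeline would consume these records from Kafka:

```python
def apply_cdc(replica: dict, message: dict) -> dict:
    """Apply one change-data-capture message to an in-memory replica.

    `message` is assumed (hypothetically) to look like:
      {"op": "insert" | "update" | "delete", "key": <pk>, "row": {...}}
    """
    op = message["op"]
    if op in ("insert", "update"):
        # Upsert: inserts and updates both overwrite the row for that key.
        replica[message["key"]] = message["row"]
    elif op == "delete":
        # Tolerate deletes for keys the replica never saw (out-of-order bootstrap).
        replica.pop(message["key"], None)
    else:
        raise ValueError(f"unknown CDC op: {op}")
    return replica

if __name__ == "__main__":
    replica = {}
    for msg in [
        {"op": "insert", "key": 1, "row": {"name": "a"}},
        {"op": "update", "key": 1, "row": {"name": "b"}},
        {"op": "insert", "key": 2, "row": {"name": "c"}},
        {"op": "delete", "key": 2, "row": None},
    ]:
        apply_cdc(replica, msg)
    print(replica)  # {1: {'name': 'b'}}
```

Ordering per key is what makes this correct, which is why CDC topics are normally partitioned by primary key in Kafka.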
Swansea, Wales, United Kingdom Hybrid / WFH Options
CPS Group (UK) Limited
my client will train you): Knowledge of Microsoft SQL Server and packaged BI tools (SSAS and SSIS). Docker, Kubernetes, and cloud computing technologies. Apache Kafka and data streaming. Familiarity with Apache Spark or similar data processing tools. Experience developing and maintaining CI/CD pipelines, particularly Azure DevOps or …