e.g., Hadoop, Spark).
· Strong knowledge of data workflow solutions like Azure Data Factory, Apache NiFi, Apache Airflow, etc.
· Good knowledge of stream and batch processing solutions like Apache Flink, Apache Kafka, etc.
· Good knowledge of log management, monitoring, and analytics solutions like Splunk, Elastic Stack, New Relic, etc.
Given that this is just a short snapshot of the …
Sheffield, South Yorkshire, England, United Kingdom Hybrid/Remote Options
DCS Recruitment
working with cloud platforms such as AWS, Azure, or GCP. Exposure to modern data tools such as Snowflake, Databricks, or BigQuery. Familiarity with streaming technologies (e.g., Kafka, Spark Streaming, Flink) is an advantage. Experience with orchestration and infrastructure tools such as Airflow, dbt, Prefect, CI/CD pipelines, and Terraform. What you get in return: Up to …
Hands-on experience with SQL, Data Pipelines, Data Orchestration and Integration Tools. Experience in data platforms on premises/cloud using technologies such as: Hadoop, Kafka, Apache Spark, Apache Flink, object, relational and NoSQL data stores. Hands-on experience with big data application development and cloud data warehousing (e.g. Hadoop, Spark, Redshift, Snowflake, GCP BigQuery). Expertise in building data …
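For illustration, a minimal sketch of the kind of batch aggregation such Spark-based pipelines are built from, using the PySpark DataFrame API; the app name, sample rows, and column names are hypothetical, not taken from the listing.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hypothetical in-memory data; a real pipeline would read from
# HDFS, S3, a warehouse table, etc.
spark = SparkSession.builder.appName("pipeline_sketch").getOrCreate()
df = spark.createDataFrame(
    [("orders", 120), ("returns", 8), ("orders", 95)],
    ["event_type", "count"],
)

# Aggregate per event type: the core shape of many batch pipeline stages.
df.groupBy("event_type").agg(F.sum("count").alias("total")).show()

spark.stop()
```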
delivering under tight deadlines without compromising quality. Your Qualifications 12+ years of software engineering experience, ideally in platform, infrastructure, or data-centric product development. Expertise in Apache Kafka, Apache Flink, and/or Apache Pulsar. Deep understanding of event-driven architectures, data lakes, and streaming pipelines. Strong experience integrating AI/ML models into production systems, including prompt engineering …
about complex problems at high scale. Ability to work collaboratively in a team environment and communicate effectively with other teams across Cloudflare. Experience with data streaming technologies (e.g., Kafka, Flink) is a strong plus. Experience with various logging platforms or SIEMs (e.g., Splunk, Datadog, Sumo Logic) and storage destinations (e.g., S3, R2, GCS) is a plus. Experience with Infrastructure …
to Date with Technology: Keep yourself and the team updated on the latest Python technologies, frameworks, and tools like Apache Spark, Databricks, Apache Pulsar, Apache Airflow, Temporal, and Apache Flink, sharing knowledge and suggesting improvements. Documentation: Contribute to clear and concise documentation for software, processes, and systems to ensure team alignment and knowledge sharing. Your Qualifications Experience: Professional experience … pipelines, and machine learning workflows. Workflow Orchestration: Familiarity with tools like Apache Airflow or Temporal for managing workflows and scheduling jobs in distributed systems. Stream Processing: Experience with Apache Flink or other stream processing frameworks is a plus. Desired Skills Asynchronous Programming: Familiarity with asynchronous programming tools like Celery or asyncio. Frontend Knowledge: Exposure to frontend frameworks like React …
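As a rough illustration of the Airflow-style workflow orchestration this listing mentions, a minimal DAG sketch; the DAG id, schedule, and task bodies below are assumptions for illustration only.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    # Placeholder: a real task would pull data from a source system.
    print("extracting")

def transform():
    # Placeholder: a real task would clean and reshape the extracted data.
    print("transforming")

# Hypothetical daily pipeline: extract runs before transform.
with DAG(
    dag_id="example_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_extract >> t_transform
```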
Java, data structures and concurrency, rather than relying on frameworks such as Spring. You have built event-driven applications using Kafka and solutions with event-streaming frameworks at scale (Flink/Kafka Streams/Spark) that go beyond basic ETL pipelines. You know how to orchestrate the deployment of applications on Kubernetes, including defining services, deployments, stateful sets etc. …
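To illustrate the event-driven, beyond-basic-ETL pattern this role describes, a consume-transform-produce loop, sketched here with the confluent-kafka Python client rather than Java for brevity; the broker address, group id, and topic names are hypothetical.

```python
from confluent_kafka import Consumer, Producer

# Hypothetical broker and topics.
conf = {"bootstrap.servers": "localhost:9092"}
consumer = Consumer({**conf, "group.id": "enricher", "auto.offset.reset": "earliest"})
producer = Producer(conf)

consumer.subscribe(["orders"])
try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None or msg.error():
            continue
        # Transform each event and emit it to a downstream topic:
        # the consume-transform-produce loop at the heart of event-driven apps.
        producer.produce("orders-enriched", msg.value())
        producer.poll(0)  # serve delivery callbacks
finally:
    consumer.close()
    producer.flush()
```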
required in the role; we are happy to support your learning on the job, but prior experience is a plus: Experience with large-scale data processing frameworks (e.g., Spark, Flink). Experience with time series analysis, anomaly detection, or graph analytics in a security context. Proficiency in data visualization tools and techniques to effectively communicate complex findings. A basic …
to cross-functional teams, ensuring best practices in data architecture, security and cloud computing. Proficiency in data modelling, ETL processes, data warehousing, distributed systems and metadata systems. Utilise Apache Flink and other streaming technologies to build real-time data processing systems that handle large-scale, high-throughput data. Ensure all data solutions comply with industry standards and government regulations … not limited to EC2, S3, RDS, Lambda and Redshift. Experience with other cloud providers (e.g., Azure, GCP) is a plus. In-depth knowledge and hands-on experience with Apache Flink for real-time data processing. Proven experience in mentoring and managing teams, with a focus on developing talent and fostering a collaborative work environment. Strong ability to engage with …
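A minimal sketch of the keyed, real-time aggregation this listing alludes to, using the PyFlink DataStream API; the sample events, key field, and job name are assumptions.

```python
from pyflink.datastream import StreamExecutionEnvironment

env = StreamExecutionEnvironment.get_execution_environment()

# Hypothetical events: (sensor_id, reading). A production job would consume
# from a source such as Kafka instead of an in-memory collection.
events = [("s1", 3.0), ("s2", 1.5), ("s1", 4.5)]
stream = env.from_collection(events)

# Key by sensor id and keep a running sum per key: the basic shape of a
# high-throughput streaming aggregation.
(stream
    .key_by(lambda e: e[0])
    .reduce(lambda a, b: (a[0], a[1] + b[1]))
    .print())

env.execute("sensor_totals")
```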
Leeds, West Yorkshire, Yorkshire and the Humber, United Kingdom Hybrid/Remote Options
Axiom Software Solutions Limited
and maintaining Kafka clusters to ensure high availability and scalability.
• Integrating Kafka with other data processing tools and platforms such as Kafka Streams, Kafka Connect, Spark Streaming, Schema Registry, Flink and Beam.
• Collaborating with cross-functional teams to understand data requirements and design solutions that meet business needs.
• Implementing security measures to protect Kafka clusters and data streams (see the sketch after this listing).
• Monitoring … like Spark
Required Skills & Experience
• Extensive experience with Apache Kafka and real-time architecture, including event-driven frameworks.
• Strong knowledge of Kafka Streams, Kafka Connect, Spark Streaming, Schema Registry, Flink and Beam.
• Experience with cloud platforms such as GCP Pub/Sub.
• Excellent problem-solving skills.
Knowledge & Experience/Qualifications
• Knowledge of Kafka data pipelines and messaging solutions to …
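As a sketch of the security-measures responsibility above, a producer configured for an authenticated, encrypted connection via the confluent-kafka Python client; every connection detail shown (broker, mechanism, credentials, topic) is a placeholder, not a value from the listing.

```python
from confluent_kafka import Producer

# Placeholder SASL_SSL settings; real values come from the cluster's
# security configuration and a secrets store, never hard-coded.
producer = Producer({
    "bootstrap.servers": "broker1:9093",
    "security.protocol": "SASL_SSL",
    "sasl.mechanism": "SCRAM-SHA-512",
    "sasl.username": "pipeline-user",
    "sasl.password": "change-me",
})

producer.produce("events", key=b"k1", value=b'{"status": "ok"}')
producer.flush()
```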