Maidenhead, Berkshire, United Kingdom Hybrid / WFH Options
Wireless Logic Group
opportunity to leave your mark on something transformative. Design, develop and maintain user interfaces and web applications (Angular, React) Create and maintain RESTful APIs and contribute to event-driven architectures (Node, Kafka) Implement authentication, authorization and security measures Translate UX designs or wireframes into efficient, reusable code Actively participate in CI/CD strategy with the wider team and colleagues Propose and More ❯
London, England, United Kingdom Hybrid / WFH Options
Degreed Inc
data mapping, and middleware solutions. Proven architecture leadership for major global talent, skills, or learning‐platform transformations 3+ years of hands-on technical expertise with REST APIs, event buses (Kafka/SNS/SQS), OAuth 2.0/OIDC/SAML, webhooks, and bulk or streaming ETL tools (MuleSoft, Boomi, SnapLogic, Azure Data Factory) and integrations with data warehouses or More ❯
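The integration requirements above (REST APIs, OAuth 2.0, event buses, webhooks) are illustrated by the minimal Python sketch below: a generic OAuth 2.0 client-credentials flow followed by a call to a protected REST endpoint. The token URL, client credentials and API path are placeholders invented for the example, not details of Degreed's platform.

```python
# Illustrative only: a generic OAuth 2.0 client-credentials flow against a
# hypothetical integration API. The token URL, client ID/secret and endpoint
# are placeholders, not details taken from the role description.
import requests

TOKEN_URL = "https://auth.example.com/oauth2/token"    # hypothetical
API_URL = "https://api.example.com/v1/skills/records"  # hypothetical

def get_access_token(client_id: str, client_secret: str) -> str:
    """Exchange client credentials for a bearer token (RFC 6749, section 4.4)."""
    resp = requests.post(
        TOKEN_URL,
        data={"grant_type": "client_credentials"},
        auth=(client_id, client_secret),
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["access_token"]

def fetch_records(token: str):
    """Call a protected REST endpoint with the bearer token."""
    resp = requests.get(
        API_URL,
        headers={"Authorization": f"Bearer {token}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    token = get_access_token("my-client-id", "my-client-secret")
    print(fetch_records(token))
```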
Data Engineer | Software and Analytics Scale-Up | Bristol/Hybrid | £60,000-£65,000 + bonus (DOE) iO Associates are delighted to be partnering with an exciting software and analytics scale-up, who are looking for a Data Engineer to join More ❯
Security Architect – Defence & National Security (DV Cleared), Outside IR35 £600-700 per day, initial 3-month contract Location: Farnborough or Cheltenham | Clearance: DV Cleared Outside IR35 contract role, starting with an initial 3 months and the opportunity for a longer-term contract. Looking to More ❯
Job Title: Backend Developer (Mandarin Speaking) Job Type: Full-time Location: On-site (United Kingdom) Salary: £28,000–£45,000 per year Schedule: Monday to Friday Responsibilities: Design, develop, and maintain scalable and reliable API systems. Build and manage microservices More ❯
Belfast, Northern Ireland, United Kingdom Hybrid / WFH Options
Citigroup Inc
About the Team: Data Solutions Technology strives to provide measurable competitive advantage to our business by delivering high-quality, innovative and cost-effective reference data technology and operational solutions in order to meet the needs of the business, our clients More ❯
Cheltenham, Gloucestershire, United Kingdom Hybrid / WFH Options
The Talent Locker Ltd
Security Architect - Defence & National Security (DV Cleared), Outside IR35 £600-700 per day, initial 3-month contract Location: Farnborough or Cheltenham | Clearance: DV Cleared Outside IR35 contract role, starting with an initial 3 months and the opportunity for a longer-term contract. Looking to More ❯
Farnborough, Hampshire, United Kingdom Hybrid / WFH Options
The Talent Locker Ltd
Security Architect - Defence & National Security (DV Cleared) Location: Farnborough/Hybrid/Remote (UK-based) | Clearance: DV Cleared | Salary: £90k + excellent benefits Looking to use your security expertise to make a real difference on high-impact defence and national More ❯
On behalf of DWP we are looking for a Java Developer for a 12-month (Inside IR35) contract. Hybrid working with 2-3 days per week required in either Manchester, Leeds, Birmingham or Newcastle. The Department for Work and Pensions More ❯
Farnborough, Hampshire, South East, United Kingdom Hybrid / WFH Options
Talent Locker
Security Architect - Defence & National Security (DV Cleared) Location: Farnborough/Hybrid/Remote (UK-based) | Clearance: DV Cleared | Salary: £90k + excellent benefits Looking to use your security expertise to make a real difference on high-impact defence and national More ❯
London, England, United Kingdom Hybrid / WFH Options
Deutsche Bank
Maven, Bower, and SBT Experience in architecting and deploying big data applications using Apache Hadoop in cloud scenarios Expertise in Hadoop ecosystem components like HDFS, Hive, HBase, Spark, Ranger, Kafka, YARN Knowledge of Kubernetes, Docker, and security practices such as Kerberos, LDAP, SSL Ability to read and modify open-source Java repositories How we'll support you Culture of More ❯
London, England, United Kingdom Hybrid / WFH Options
Deutsche Bank
building, and deploying big data applications using the Apache Hadoop ecosystem in hybrid cloud and private cloud scenarios. Expert knowledge of the Apache Hadoop ecosystem (HDFS, Hive, HBase, Spark, Ranger, Kafka, YARN, etc.). Proficiency in Kubernetes & Docker. Strong understanding of security practices within big data environments such as Kerberos, LDAP, POSIX and SSL. Ability to read and modify open More ❯
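As a rough illustration of the Hadoop-ecosystem work both Deutsche Bank listings describe, the PySpark sketch below reads a Hive table and writes an aggregated result back to HDFS as Parquet. The database, table and output path are invented for the example, and the Kerberos/LDAP/SSL configuration the role mentions is deliberately omitted.

```python
# A minimal PySpark sketch of the kind of Hadoop-ecosystem work described
# above: reading a Hive table and writing the result back to HDFS as Parquet.
# Database, table and path names are made up for illustration.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("hive-to-hdfs-example")
    .enableHiveSupport()          # requires a cluster with Hive configured
    .getOrCreate()
)

# Read from a (hypothetical) Hive table
trades = spark.table("reference_db.trades")

# Simple aggregation with the DataFrame API
daily_counts = (
    trades
    .groupBy("trade_date")
    .agg(F.count("*").alias("trade_count"))
)

# Write the result to HDFS in Parquet format
daily_counts.write.mode("overwrite").parquet("hdfs:///data/out/daily_trade_counts")

spark.stop()
```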
Leeds, England, United Kingdom Hybrid / WFH Options
Axiom Software Solutions Limited
week) Type: Contract Client: Wipro Job description As part of the CTO Data Ingestion Service, the incumbent will be responsible for: Designing and architecting scalable, real-time systems in Kafka, with a focus on on-premises Cloudera open-source Kafka and disaster recovery aspects. Configuring, deploying, and maintaining Kafka clusters to ensure high availability, resiliency, and scalability … including understanding and explaining features like KRaft. Integrating Kafka with other data processing tools and platforms such as Kafka Streams, Kafka Connect, Spark Streaming, Schema Registry, Flink, and Beam. Collaborating with cross-functional teams to understand data requirements and design solutions that meet business needs. Implementing security measures to protect Kafka clusters and data streams. Monitoring … on disaster recovery aspects Knowledge of Kafka resiliency and new features like KRaft Experience with real-time technologies such as Spark Required Skills & Experience Extensive experience with Apache Kafka and real-time architecture including event-driven frameworks. Strong knowledge of Kafka Streams, Kafka Connect, Spark Streaming, Schema Registry, Flink, and Beam. Experience with cloud platforms such More ❯
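For orientation, the snippet below is a minimal Python sketch (using the confluent-kafka client) of the produce/consume pattern that underpins the ingestion work described above. The broker address, topic and consumer group are placeholders, and cluster security (SASL/SSL) and KRaft configuration are left out.

```python
# Minimal confluent-kafka producer/consumer sketch for the kind of real-time
# ingestion work described above. Broker address, topic and group id are
# placeholders; cluster security (SASL/SSL) and KRaft setup are omitted.
import json
from confluent_kafka import Producer, Consumer

BROKERS = "localhost:9092"   # placeholder
TOPIC = "ingest.events"      # placeholder

def produce_event(event: dict) -> None:
    """Serialise an event as JSON and publish it to the topic."""
    producer = Producer({"bootstrap.servers": BROKERS})
    producer.produce(TOPIC, value=json.dumps(event).encode("utf-8"))
    producer.flush()         # block until delivery is confirmed

def consume_events() -> None:
    """Poll the topic and print each message; Ctrl-C to stop."""
    consumer = Consumer({
        "bootstrap.servers": BROKERS,
        "group.id": "ingest-demo",
        "auto.offset.reset": "earliest",
    })
    consumer.subscribe([TOPIC])
    try:
        while True:
            msg = consumer.poll(1.0)
            if msg is None:
                continue
            if msg.error():
                print("consumer error:", msg.error())
                continue
            print(json.loads(msg.value().decode("utf-8")))
    finally:
        consumer.close()

if __name__ == "__main__":
    produce_event({"source": "demo", "payload": 42})
    consume_events()
```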
and advanced knowledge of machine learning. Experience in building and maintaining an enterprise data model Experience in implementing data pipelines using ETL and ELT technologies such as Apache Kafka and Apache NiFi Experience with data architecture and management tools such as ER/Studio, Alation, and DataHub Experience with data modeling, data warehousing, and data analytics Experience with More ❯
high-quality data solutions aligned with company goals. Requirements: 5+ years of hands-on experience in data engineering, including expertise in Python, Scala, or Java. Deep understanding of Apache Kafka for stream processing workflows (required) Proficiency in managing and optimizing databases such as PostgreSQL, MySQL, MSSQL. Familiarity with analytical databases. Familiarity with both cloud solutions (AWS preferably) and on … the implementation of scalable, efficient data pipelines and architectures, with a strong focus on stream processing. Develop and maintain robust data storage and processing solutions, leveraging tools like Apache Kafka, Redis, and ClickHouse. Guide the migration of selected cloud-based solutions to on-premises tools, optimizing costs while maintaining performance and reliability. Collaborate with stakeholders to gather requirements, propose More ❯
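The stream-processing pattern this listing emphasises (Kafka feeding low-latency stores such as Redis) might look roughly like the sketch below: a consumer that keeps the latest event per device key in Redis with a short TTL. The broker, topic and key names are assumptions made for the example, not details from the role.

```python
# A rough sketch of the stream-processing pattern the listing describes:
# consuming events from Apache Kafka and caching the latest value per key in
# Redis. Broker, topic and key names are invented for illustration, and error
# handling is kept to a minimum.
import json
import redis
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",   # placeholder broker
    "group.id": "metrics-cache",
    "auto.offset.reset": "latest",
})
consumer.subscribe(["metrics.raw"])           # placeholder topic

cache = redis.Redis(host="localhost", port=6379, db=0)

try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None or msg.error():
            continue
        event = json.loads(msg.value())
        # Keep only the most recent reading per device, with a 1-hour TTL
        cache.set(f"latest:{event['device_id']}", json.dumps(event), ex=3600)
finally:
    consumer.close()
```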