steadfast commitment to security. As a FinTech Cloud Engineer, you'll be instrumental in defining the future of finance." Skills: Strong Golang experience; Postgres, Kafka and Kubernetes. If you're passionate about FinTech, cloud engineering, and pioneering the future of finance, I want to hear from you! Email Benjamin.smith more »
Manchester Area, United Kingdom Hybrid / WFH Options
Maxwell Bond
clouds. The successful Lead Data Engineer will have: Experience leading a Data Engineering team. Extensive working experience with GCP, SQL and DBT. Proficient in: Kafka, Dataform, Airflow, Tableau, PowerBI, Redshift, Snowflake, Terraform and BigQuery. What's in it for the successful Lead Data Engineer: Hybrid working for a better more »
Head of Data Platform/Delivery (Contract) Snowflake, Kafka, AWS. 6 month contract, Rate up to £900 per day (Outside IR35) Hybrid work into Central London I am working with a leading insurance business as they undergo a group wide digital and data transformation programme. The business has been more »
or Prime Brokerage would be good) Experience integrating with 3rd-party APIs Experience with AWS for writing systems on, or tooling Experience working with Kafka and Snowflake (nice to have) Prior experience working with a mid-early stage fintech start up (nice to have) Experience with standard tooling for more »
for someone to bring depth and experience to the testing team. Required Skills: Strong Java code experience; API testing; Java, Spring framework; OPC/Kafka; microservice testing at scale; working on complex Greenfield projects; able to challenge the team. If this might be of interest, I'd love to more »
relational database technology, particularly Postgres Knowledgeable in distributed systems architectures, failure modes, and mitigations Experience with distributed message and event streaming systems, such as Kafka Familiarity with various cloud-native technologies, including observability tooling and SLOs/SLAs management Adherence to industry-accepted cybersecurity best practices, including secure-by more »
to create exceptional product experiences. You do not need to have fintech experience for this role, though such knowledge would be beneficial. Experience in Databricks or Kafka is advantageous. Equal Opportunities: We are committed to promoting equality of opportunity for all employees and job applicants. In line with the Equality Act more »
with PHP and React (with React Native or Next.js framework) Mastery of RESTful API development. In-depth understanding of MySQL and Redis. Familiarity with Kafka Acquaintance with TypeScript Nice-to-Haves: Node.js proficiency, especially with websockets. Experience working with unit and integration testing frameworks. NPM package management (Private) PHP more »
Demonstrate in-depth knowledge of large-scale data platforms (Databricks, Snowflake) and cloud-native tools (Azure Synapse, Redshift) Experience of analytics technologies (Spark, Hadoop, Kafka) Have familiarity with Data Lakehouse architecture, SQL Server, DataOps, and data lineage concepts more »
Cognizant is looking for an AWS DevOps Engineer (Kafka). This is a full-time role that comes with a generous salary and benefits package. As an AWS DevOps Engineer (Kafka), you will be responsible for building and maintaining a high-performing Confluent Kafka platform on-premise … and in the AWS Cloud. In your first week in this AWS DevOps Engineer (Kafka) role, you can expect to: Build and maintain scalable, reliable, and high-performance Kafka clusters in both cloud and on-premise environments Develop and maintain Kafka connectors to integrate with various data … for this AWS DevOps Engineer (Kafka) role, your soft skills, expertise and experience should include: Experience with Terraform and Ansible Experience with Apache Kafka and the Confluent Platform In-depth knowledge of Kafka architecture In-depth experience with AWS Cloud Platform core components - EC2, EBS, S3, IAM, KMS more »
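As a rough illustration of the Terraform side of a role like the one above, here is a minimal sketch provisioning a single EC2 broker node with a dedicated EBS data volume. All names, the AMI ID, and the sizing are hypothetical placeholders, not anything specified by the listing:

```hcl
# Hypothetical sketch: one EC2 node for a Kafka broker plus a data volume.
resource "aws_instance" "kafka_broker" {
  ami           = "ami-0123456789abcdef0" # placeholder AMI ID
  instance_type = "m5.xlarge"             # hypothetical sizing

  tags = {
    Role = "kafka-broker"
  }
}

resource "aws_ebs_volume" "kafka_data" {
  availability_zone = aws_instance.kafka_broker.availability_zone
  size              = 500 # GiB, hypothetical sizing
  type              = "gp3"
}

resource "aws_volume_attachment" "kafka_data_attach" {
  device_name = "/dev/xvdf"
  volume_id   = aws_ebs_volume.kafka_data.id
  instance_id = aws_instance.kafka_broker.id
}
```

A real deployment would also cover security groups, IAM roles, KMS-encrypted volumes, and multiple brokers across availability zones, which the listing's mention of EC2, EBS, S3, IAM and KMS hints at.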
within the business. SKILLS AND EXPERIENCE Commercial experience building Data pipelines using Python. Strong knowledge of SQL. Experience working within GCP. An understanding of Kafka, or other streaming technologies. THE BENEFITS Share options. Generous Pension scheme. Private healthcare. Cycle to work scheme. Monthly fitness fund. HOW TO APPLY Please more »
Proficient at programming in Python Experience using front-office pricing libraries Comfortable with relational and timeseries databases Exposure to distributed systems and messaging (e.g. Kafka) Comfortable with Numpy, Pandas and Jupyter Highly self-motivated, willing to take initiative and make technical decisions more »
Management, Incident Response, and Capacity Planning for a number of their core services. Some of their key technologies are: AWS Cloud, Dynatrace, Terraform, Biztalk, Kafka Key Responsibilities: Part of the Platform Operations team with a heavy focus on platform and system operations in Production. Collaborate with Client Services, Application more »
management fundamentals. Experience in fast-paced, dynamic environments. Ability to work effectively between delivery teams and platform capabilities. Familiarity with technologies like Kubernetes, Golang, Kafka, Terraform, Spinnaker, and Jenkins is advantageous. Background in engineering or DevOps is required. Ability to manage a large team of engineers who are all more »
Stoke-On-Trent, England, United Kingdom Hybrid / WFH Options
bet365
also ensuring that our processes and technology are efficient and effective. The team works with a number of core technologies including Python, Golang, JavaScript, Kafka, New Relic, Splunk, Influx, Grafana and Ansible. This role is eligible for inclusion in the Company’s hybrid working from home policy. Preferred Skills more »
Havant, England, United Kingdom Hybrid / WFH Options
Lockheed Martin
AIX Power series hardware and AIX OS is a plus Familiarity with disk systems like SSA and SAN technologies, HACMP, HMC, CSM, NIM, VI, Kafka architecture, and Zabbix monitoring platform. Performance Tuning: Skilled in performance monitoring and tuning of operating systems and hardware. Networking: Solid understanding of networking (TCP more »
Stanmore, England, United Kingdom Hybrid / WFH Options
Sky
delivery teams, an understanding of iterative development and agile ways of working. Specifically, backend delivery teams using technologies including APIs, queues, and databases. (e.g. Kafka and Cassandra). Strong communication and interpersonal skills, able to assimilate information and present to audiences from various backgrounds and levels of understanding, spanning more »
City Of London, England, United Kingdom Hybrid / WFH Options
Cititec Talent
London office Experience Required: - Commodities industry experience Weather forecasting Airflow (or equivalent data orchestration platforms) Data Modeling (Star Schema) Data Warehousing Data Pipeline Orchestration (Kafka) On-Prem SQL more »
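For context on the star-schema data modelling mentioned above, here is a minimal self-contained sketch using Python's built-in sqlite3 module. The table and column names are invented for illustration and do not come from the listing:

```python
import sqlite3

# Minimal star schema: one fact table referencing two dimension tables.
# All table/column names are hypothetical illustrations.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
CREATE TABLE dim_commodity (commodity_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE dim_date (date_id INTEGER PRIMARY KEY, day TEXT);
CREATE TABLE fact_trades (
    trade_id INTEGER PRIMARY KEY,
    commodity_id INTEGER REFERENCES dim_commodity(commodity_id),
    date_id INTEGER REFERENCES dim_date(date_id),
    volume REAL
);
INSERT INTO dim_commodity VALUES (1, 'gas'), (2, 'power');
INSERT INTO dim_date VALUES (10, '2024-01-01'), (11, '2024-01-02');
INSERT INTO fact_trades VALUES
    (100, 1, 10, 5.0), (101, 1, 11, 7.5), (102, 2, 10, 3.0);
""")

# Typical star-schema query: join the fact table to a dimension and aggregate.
cur.execute("""
SELECT c.name, SUM(f.volume)
FROM fact_trades f
JOIN dim_commodity c ON f.commodity_id = c.commodity_id
GROUP BY c.name
ORDER BY c.name
""")
rows = cur.fetchall()
print(rows)  # [('gas', 12.5), ('power', 3.0)]
```

The pattern scales to a warehouse setting: facts stay narrow and numeric, dimensions carry the descriptive attributes, and queries join out from the fact table.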
experiences. You have exceptional communication and stakeholder engagement. You do not need to have fintech experience for this role. Experience in Databricks or Kafka is advantageous. Equal Opportunities: We are committed to promoting equality of opportunity for all employees and job applicants. In line with the Equality Act more »
find below the job description. Primary skills: Java back-end Good to have: Agile, Basic Golang Java Backend Engineer Primary skillset: Java, Spring Boot, Microservices, Kafka, SQL (Postgres preferred), NoSQL databases (Couchbase), CI/CD tools and Agile. Responsibilities: Building software to expected quality and standards using distributed enterprise frameworks Participating … and Java 17 is a plus Must have good working knowledge with Spring Boot for service development Must have good working knowledge with Kafka and its integrations Must have good working knowledge in both SQL and NoSQL databases like Oracle, PostgreSQL, Couchbase, Cassandra etc. 5 years of more »
Senior Full-Stack Software Developer (Python) - £70-75k - Canary Wharf - Hybrid We are currently looking for a skilled, passionate and experienced Full-Stack Developer to come and join an exceptional team based in Canary Wharf. Working for a passionate more »
Burgess Hill, England, United Kingdom Hybrid / WFH Options
Tata Consultancy Services
Role: Backend Senior Engineer Job Type: Permanent Location: Burgess Hill, UK (Hybrid) Are you a highly skilled Java engineer who is comfortable with back-end programming? Are you passionate about all aspects of software development, including design, implementation, and deployment more »
Scientists and Service Engineering teams Experience with design, development and operations that leverages deep knowledge in the use of services like Amazon Kinesis, Apache Kafka, Apache Spark, Amazon SageMaker, Amazon EMR, NoSQL technologies and other 3rd parties Develop and define key business questions and build data sets that … Mathematics or a related field Experience of Data platform implementation, including 3+ years of hands-on experience in implementation and performance tuning Kinesis/Kafka/Spark/Storm implementations Experience with analytic solutions applied to the Marketing or Risk needs of enterprises Basic understanding of machine learning fundamentals … models and implement them as part of data pipeline IT platform implementation experience Experience with one or more relevant tools (Flink, Spark, Sqoop, Flume, Kafka, Amazon Kinesis) Experience developing software code in one or more programming languages (Java, JavaScript, Python, etc) Current hands-on implementation experience required Preferred Qualifications more »
data engineer with a proven ability to provide effective data solutions in a production setting. Knowledge of developing real-time data stream systems (ideally Kafka). Proven track record in developing data systems using PySpark and Apache Spark for batch processing. Capable of managing data intake from various sources … proper data governance. Utilise and improve our current AWS-based data platform. Work with our tech stack, which includes dbt/DuckDB for transformation, Kafka/RabbitMQ as a streaming platform, Delta Lake as a data format, Dagster for managing data assets, and Terraform, Kubernetes, and ArgoCD as the underlying more »