performance, reliability, and security.
- Implement event-driven architectures using Kafka for real-time data processing and communication between microservices.
- Utilize Big Data technologies (e.g., Apache Spark, Hadoop) to process and analyze large volumes of data, extracting valuable insights to drive decision-making.
- Design and optimize data pipelines for … principles and best practices.
- Experience with Kafka for building event-driven architectures and real-time data processing.
- Familiarity with Big Data technologies such as Apache Spark, Hadoop, or similar frameworks.
- Proven track record of delivering scalable and reliable software solutions in a fast-paced environment.
- Excellent communication skills …
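As a flavour of the event-driven pattern this role describes, here is a minimal in-process sketch in Python. It is not Kafka itself (which needs a running broker); the `EventBus` class and the "orders" topic are purely illustrative stand-ins for the publish/subscribe decoupling between microservices.

```python
from collections import defaultdict

class EventBus:
    """Toy in-process stand-in for a Kafka topic: producers publish
    events, and every subscriber to that topic receives them."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, event):
        for handler in self._subscribers[topic]:
            handler(event)

# Two "microservices" that never call each other directly --
# they communicate only via events published on a shared topic.
bus = EventBus()
audit_log = []
bus.subscribe("orders", lambda e: audit_log.append(e["id"]))   # audit service
bus.subscribe("orders", lambda e: print("shipping", e["id"]))  # shipping service

bus.publish("orders", {"id": 42, "item": "widget"})
```

With a real Kafka deployment the bus would be durable and distributed, and consumers would read asynchronously from partitioned topics, but the decoupling idea is the same.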
Manchester, North West, United Kingdom Hybrid / WFH Options
Dupen Ltd
Linux, APIs, infrastructure design – load balancing, VMs, PostgreSQL, vector databases. Machine Learning Engineer – desirable skills: version control (Git), computer vision libraries, Big Data (Hadoop, Spark), Cloud – AWS, Google Cloud, Azure, and knowledge of secure coding techniques – PCI-DSS, PA-DSS, ISO 27001. Note: as there are actually two roles …
Employment Type: Permanent
Salary: £50000 - £60000/annum To £60,000 + range of benefits
THG Icon Offices 1, 7-9 Sunbank Lane, Altrincham, England
THG
3rd party sources and APIs). You will automate repetitive data tasks and infrastructure using CI/CD combined with tools like Python/Apache Spark/AWS Glue/Bash/YAML/Terraform. Possible options include: Infrastructure Engineer, Cloud Engineer, Full Stack Developer, Solution Architect …
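The "automate repetitive data tasks" part of this role can be sketched with stdlib Python alone. This is an illustrative example (the `clean_rows` function and the feed format are invented for the sketch); in the role itself, jobs like this would typically run at scale via AWS Glue or Spark on a CI/CD-managed schedule.

```python
import csv
import io

def clean_rows(raw_csv: str):
    """Normalise one repetitive feed: lower-case and strip the headers,
    strip whitespace from values, and drop rows with no id."""
    reader = csv.DictReader(io.StringIO(raw_csv))
    rows = []
    for row in reader:
        cleaned = {k.strip().lower(): (v or "").strip() for k, v in row.items()}
        if cleaned.get("id"):  # skip rows missing the key field
            rows.append(cleaned)
    return rows

# A messy third-party feed: inconsistent spacing, one row missing its id.
feed = "ID , Name \n 1 , Ada \n , Grace \n"
print(clean_rows(feed))
```

Wrapping scripts like this in version control plus a pipeline definition (YAML) and provisioning (Terraform) is what turns a one-off fix into repeatable infrastructure.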
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Alexander Mann Solutions - Public Sector Resourcing
abreast of market and industry trends. You'll have relevant experience in: strong experience of programming languages and tooling: Python, PySpark, SQL (Hive SQL/Spark SQL), Bash, Hue, GitHub/GitLab (version control), Azkaban (or similar scheduling tools), CI/CD (Concourse or similar), RStudio, JupyterLab, Kafka …
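As a flavour of the SQL-over-data work this role lists, a minimal sketch using Python's built-in sqlite3 module: the `orders` table and `totals` query are invented for illustration, standing in for Hive SQL/Spark SQL, which need a cluster, but the SQL itself has the same shape.

```python
import sqlite3

# In-memory toy table; in the role this would be a Hive/Spark table
# over cluster-scale data, queried with the same GROUP BY pattern.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("north", 10.0), ("north", 5.0), ("south", 7.5)],
)

# Aggregate revenue per region.
totals = dict(conn.execute(
    "SELECT region, SUM(amount) FROM orders GROUP BY region"))
print(totals)
```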
Manchester, North West, United Kingdom Hybrid / WFH Options
N Brown Group
and practices and tools like Jira and Confluence. What technical skills you will have: Experience with general Cloud products (Cloud SQL, BigQuery, Redshift, Snowflake, Apache Beam, Spark) or similar products. Experience with open-source data-stack tools such as Airflow, Airbyte, dbt, Kafka etc. Awareness of data visualisation …
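The orchestration idea behind a tool like Airflow can be sketched in a few lines of plain Python: tasks form a DAG, and each task runs only after its upstream dependencies. This toy runner (the `run_dag` function and the extract/transform/load task names are invented for the sketch) omits everything a real scheduler adds, such as retries, scheduling intervals, and cycle detection.

```python
def run_dag(tasks, deps):
    """tasks: name -> callable; deps: name -> list of upstream names.
    Runs every task after its dependencies (assumes the graph is acyclic)."""
    done = set()

    def run(name):
        if name in done:
            return
        for upstream in deps.get(name, []):
            run(upstream)  # recurse into dependencies first
        tasks[name]()
        done.add(name)

    for name in tasks:
        run(name)

log = []
tasks = {
    "load":      lambda: log.append("load"),
    "extract":   lambda: log.append("extract"),
    "transform": lambda: log.append("transform"),
}
deps = {"transform": ["extract"], "load": ["transform"]}
run_dag(tasks, deps)
print(log)
```

In Airflow the same dependencies would be declared with operators and `>>`, and the scheduler, not a recursive call, would decide when each task is ready.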
Manchester Area, United Kingdom Hybrid / WFH Options
Version 1
REMOTE BASED WITH VERY OCCASIONAL TRAVEL TO CLIENT SITES AND OFFICE. Would you like the opportunity to expand your skillset across Java, Python, Spark, Hadoop, Trino & Airflow across the Banking & Financial Services industries? How about if you worked with an Innovation Partner of the Year Winner (2023 Oracle … date with the latest trends and best practices, and share knowledge with the team. Qualifications You will have expertise in the following: Java, Python, Spark, Hadoop (essential); Trino, Airflow (desirable). Architecture and capabilities. Designing and implementing complex solutions with a focus on scalability and security. Excellent communication and collaboration …