experience working as a Software Engineer on large software applications. Proficient in many of the following technologies: Python, REST, PyTorch, TensorFlow, Docker, FastAPI, Selenium, React, TypeScript, Redux, GraphQL, Kafka, Apache Spark. Experience working with one or more of the following database systems: DynamoDB, DocumentDB, MongoDB. Demonstrated expertise in unit testing and tools: JUnit, Mockito, PyTest, Selenium. Strong working knowledge …
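The unit-testing requirement above (PyTest in particular) can be illustrated with a minimal sketch. The `slugify` function and its tests are invented for illustration, not taken from any listing:

```python
import re

def slugify(title: str) -> str:
    """Lower-case a title and replace runs of non-alphanumerics with hyphens."""
    slug = re.sub(r"[^a-z0-9]+", "-", title.lower())
    return slug.strip("-")

# PyTest discovers functions named test_*; plain asserts are enough.
def test_slugify_basic():
    assert slugify("Senior Software Engineer") == "senior-software-engineer"

def test_slugify_strips_punctuation():
    assert slugify("C++ / Python (Remote!)") == "c-python-remote"
```

Running `pytest` against a file containing this would discover and execute both test functions.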
or MS degree in Computer Science or equivalent. Experience in developing Finance or HR related applications. Working experience with Tableau. Working experience with Terraform. Experience in creating workflows for Apache Airflow and Jenkins. Benefits: Roku is committed to offering a diverse range of benefits as part of our compensation package to support our employees and their families. Our comprehensive …
North West London, London, United Kingdom Hybrid / WFH Options
Anson Mccade
knowledge of Kafka, Confluent, and event-driven architecture. Hands-on experience with Databricks, Unity Catalog, and Lakehouse architectures. Strong architectural understanding across AWS, Azure, GCP, and Snowflake. Familiarity with Apache Spark, SQL/NoSQL databases, and programming (Python, R, Java). Knowledge of data visualisation, DevOps principles, and ML/AI integration into data architectures. Strong grasp of data governance …
Liverpool, Lancashire, United Kingdom Hybrid / WFH Options
Intuita - Vacancies
Azure Storage, Medallion Architecture, and working with data formats such as JSON, CSV, and Parquet. • Strong understanding of IT concepts, including security, IAM, Key Vault, and networking. • Exposure to Apache Airflow and DBT is a bonus. • Familiarity with agile principles and practices. • Experience with Azure DevOps pipelines. The "Nice to Haves": • Certification in Azure or related technologies. • Experience with …
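The JSON/CSV handling mentioned above can be sketched with the standard library alone. The record fields are invented; real pipelines would read from Azure Storage rather than an inline string:

```python
import csv
import io
import json

def json_to_csv(json_text: str) -> str:
    """Convert a JSON array of flat objects into CSV text (header taken from the first record)."""
    records = json.loads(json_text)
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=list(records[0]))
    writer.writeheader()
    writer.writerows(records)
    return buf.getvalue()

raw = '[{"id": 1, "city": "Liverpool"}, {"id": 2, "city": "Crewe"}]'
csv_text = json_to_csv(raw)
```

In practice Parquet would come in via a library such as pyarrow; this sketch keeps to the stdlib so the shape of the transformation is visible.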
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Made Tech Limited
Strong experience in Infrastructure as Code (IaC) and deploying infrastructure across environments. Managing cloud infrastructure with a DevOps approach. Handling and transforming various data types (JSON, CSV, etc.) using Apache Spark, Databricks, or Hadoop. Understanding modern data system architectures (Data Warehouse, Data Lakes, Data Meshes) and their use cases. Creating data pipelines on cloud platforms with error handling and …
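The "data pipelines with error handling" requirement often comes down to a dead-letter pattern: keep valid records, capture failures with a reason. A pure-Python sketch (the record format is hypothetical, and Spark/Databricks would replace the plain loop at scale):

```python
import json

def run_pipeline(raw_lines):
    """Parse JSON lines, keep valid records, and route failures to a dead-letter list."""
    good, dead_letter = [], []
    for line in raw_lines:
        try:
            record = json.loads(line)
            record["amount"] = float(record["amount"])  # normalise the field we care about
            good.append(record)
        except (json.JSONDecodeError, KeyError, ValueError) as exc:
            dead_letter.append({"line": line, "error": str(exc)})
    return good, dead_letter

lines = ['{"amount": "12.5"}', 'not json', '{"missing": 1}']
good, bad = run_pipeline(lines)
# good holds the one normalised record; bad captures both failures with reasons
```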
with Broadcom SiteMinder on Linux. Solid understanding of Microsoft Windows Server and IIS 7.x+ administration. Experience with: CA Federation, CA Directory, Oracle DSEE/ODU, Oracle DB, LDAP, JBoss, Apache, iPlanet Web Server, IBM WebSphere Application Server. DevOps tools: GIT/STASH, Jenkins, Nolio, Nexus, Shell scripting, Groovy. Monitoring tools: CA APM, Wily, AppDynamics, Splunk. Automation tools: CHEF (including …
GIT/STASH, Nolio. Collaborate with cross-functional teams to maintain system performance. Essential Skills: Strong hands-on experience with SiteMinder on Linux. Knowledge of Windows Server, IIS, WebSphere, Apache, JBoss. Proficiency with DevOps and automation tools (Docker, OpenShift, Chef). Familiarity with IAM protocols: SAML, OAuth, OpenID Connect. Desirable: AWS Certification, Financial Services experience. If this role is of …
with Broadcom SiteMinder on Linux. 2. Solid understanding of Microsoft Windows Server and IIS 7.x administration. 3. CA Federation, CA Directory, Oracle DSEE/ODU, Oracle DB, LDAP, JBoss, Apache, iPlanet Web Server, IBM WebSphere Application Server. 4. Monitoring tools: CA APM, Wily, AppDynamics, Splunk. Desirable Skills/Experience: 1. AWS Practitioner or Associate certification. 2. Exposure to enterprise …
Manchester, Lancashire, England, United Kingdom Hybrid / WFH Options
DCS Recruitment
includes shifts and on-call duties after training; no two weeks are ever the same. Essential Skills: Solid Unix/Linux skills. Experience with Bash, SQL, PHP. Comfortable with Apache/Nginx, load balancers (HAProxy), and monitoring tools (Nagios, Grafana, Prometheus). Knowledge of log management (Graylog, Elasticsearch). Familiar with Ansible and GitLab CI/CD. Experience using Git/…
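The log-management side of this role can be illustrated by parsing common-format access log lines of the kind Apache or Nginx emit, before shipping them to something like Graylog or Elasticsearch. The sample log line is made up:

```python
import re

# Common log format: host, identity, user, timestamp, request, status, response size
LOG_PATTERN = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] "(?P<request>[^"]*)" (?P<status>\d{3}) (?P<size>\d+|-)'
)

def parse_line(line: str):
    """Return a dict of fields for one access-log line, or None if it doesn't match."""
    m = LOG_PATTERN.match(line)
    return m.groupdict() if m else None

sample = '203.0.113.9 - - [10/Oct/2024:13:55:36 +0000] "GET /health HTTP/1.1" 200 512'
fields = parse_line(sample)
# fields["status"] is "200"; unparseable lines come back as None
```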
Crewe, Cheshire, England, United Kingdom Hybrid / WFH Options
DCS Recruitment
knowledge of application security and working within dev teams. Hands-on experience with EDR/NDR technologies. Familiar with standards like Cyber Essentials, ISO27001. Working knowledge of Linux, Ubuntu, Apache, MySQL, PHP, Git, PostgreSQL. Cloud security skills, ideally in AWS. Understanding of open-source risk management and enterprise tooling. Exposure to ITIL service management disciplines. Familiarity with MITRE ATT…
detection and response (NDR) technologies. Detailed knowledge of Information Security standards including Cyber Essentials, Cyber Essentials Plus and ISO27001. Good understanding of Linux and database technologies such as Ubuntu, Apache, PHP, MySQL, PostgreSQL, Nginx, Mercurial and Git. Good understanding of cyber security practices in relation to cloud hosting, preferably with experience of AWS. Good understanding of open-source risk …
speed and sustainability, delivering under tight deadlines without compromising quality. Your Qualifications: 12+ years of software engineering experience, ideally in platform, infrastructure, or data-centric product development. Expertise in Apache Kafka, Apache Flink, and/or Apache Pulsar. Deep understanding of event-driven architectures, data lakes, and streaming pipelines. Strong experience integrating AI/ML models into …
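The event-driven architecture named above can be sketched with an in-memory topic bus. This illustrates the pattern only; it is not Kafka, Flink, or Pulsar API code, and the topic names and event shapes are invented:

```python
from collections import defaultdict

class EventBus:
    """Minimal in-process pub/sub: producers publish to topics, subscribers receive callbacks."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, event):
        for handler in self._subscribers[topic]:
            handler(event)

bus = EventBus()
seen = []
bus.subscribe("orders", seen.append)
bus.publish("orders", {"order_id": 1, "total": 9.99})
bus.publish("payments", {"ignored": True})  # no subscriber on this topic
# seen now holds just the one "orders" event
```

Real brokers add durability, partitioning, and consumer groups on top of this same publish/subscribe shape.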
Manchester, Lancashire, England, United Kingdom Hybrid / WFH Options
Akkodis
Data Engineer (AI-Driven SaaS platform) (Python, Snowflake, Data Modelling, ETL/ELT, Apache Airflow, Kafka, AWS). Large-scale data environment. Up to £70,000 plus benefits. FULLY REMOTE, UK. Are you a Data Engineering enthusiast who thrives on designing and implementing robust ETL processes, highly scalable data structures and data pipelines within a truly enterprise-scale data processing environment? Do … platform integrates Python and Snowflake and you'll need a deep understanding of SQL and NoSQL databases (MongoDB or similar!) You'll also have exposure to streaming platforms like Apache Kafka and be able to develop and maintain ELT pipelines, and essentially bring a solid understanding of data warehousing concepts and best practice. Essentially, a strong Data Engineer who is … a Snowflake enthusiast who can write solid SQL queries within Snowflake! You will understand Apache Kafka to a high standard and have solid knowledge of Apache Airflow; from a Cloud perspective, good AWS exposure. Naturally you will have a good understanding of AWS. I'd love you to be an advocate of Agile too; these guys are massive on …
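The "solid SQL queries within Snowflake" requirement boils down to aggregation SQL like the following. The schema is invented, and sqlite3 stands in for Snowflake purely so the query is runnable; the SQL itself is plain ANSI that would also run in a Snowflake worksheet:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [(1, 10.0), (1, 5.0), (2, 7.5)],
)

# Per-user spend, largest first
rows = conn.execute(
    "SELECT user_id, SUM(amount) AS total FROM events GROUP BY user_id ORDER BY total DESC"
).fetchall()
# rows == [(1, 15.0), (2, 7.5)]
```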
of a forward-thinking company where data is central to strategic decision-making. We’re looking for someone who brings hands-on experience in streaming data architectures, particularly with Apache Kafka and Confluent Cloud, and is eager to shape the future of scalable, real-time data pipelines. You’ll work closely with both the core Data Engineering team and … the Data Science function, bridging the gap between model development and production-grade data infrastructure. What You’ll Do: Design, build, and maintain real-time data streaming pipelines using Apache Kafka and Confluent Cloud. Architect and implement robust, scalable data ingestion frameworks for batch and streaming use cases. Collaborate with stakeholders to deliver high-quality, reliable datasets to live … experience in a Data Engineering or related role. Strong experience with streaming technologies such as Kafka, Kafka Streams, and/or Confluent Cloud (must-have). Solid knowledge of Apache Spark and Databricks. Proficiency in Python for data processing and automation. Familiarity with NoSQL technologies (e.g., MongoDB, Cassandra, or DynamoDB). Exposure to machine learning pipelines or close collaboration …
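Streaming ingestion of the kind described here often reduces to windowed aggregation. A pure-Python sketch of tumbling-window counts, independent of the actual Kafka/Confluent APIs (the timestamps and window size are invented):

```python
from collections import Counter

def tumbling_window_counts(events, window_seconds):
    """Count events per fixed (tumbling) time window, keyed by window start."""
    counts = Counter()
    for ts in events:
        window_start = ts - (ts % window_seconds)
        counts[window_start] += 1
    return dict(counts)

# Event timestamps in seconds; 60-second windows.
events = [0, 10, 59, 60, 61, 125]
counts = tumbling_window_counts(events, 60)
# counts == {0: 3, 60: 2, 120: 1}
```

Kafka Streams and Spark Structured Streaming expose this same tumbling-window idea as first-class operators, with watermarking to handle late events.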
Manchester, North West, United Kingdom Hybrid / WFH Options
IO Associates
focused data team responsible for building and optimising scalable, production-grade data pipelines and infrastructure. Key Responsibilities: Design and implement robust, scalable ETL/ELT pipelines using Databricks and Apache Spark. Ingest, transform, and manage large volumes of data from diverse sources. Collaborate with analysts, data scientists, and business stakeholders to deliver clean, accessible datasets. Ensure high performance and … practices. Work with cloud-native tools and services (preferably Azure). Required Skills & Experience: Proven experience as a Data Engineer on cloud-based projects. Strong hands-on skills with Databricks, Apache Spark, and Python or Scala. Proficient in SQL and working with large-scale data environments. Experience with Delta Lake, Azure Data Lake, or similar technologies. Familiarity with version control …
evolution of our technical stack through the implementation and adoption of new technologies. You will report to the leadership within the Persistence Infrastructure Team. Your Impact: Provisioning and maintaining Apache Pulsar infrastructure on Kubernetes for event-driven architecture. Developing and deploying software and tools for managing the lifecycle of persistence services, such as Kubernetes operators, configuration management tools, shell … hardening activities. Developing automation to remove manual tasks. Developing and maintaining observability dashboards and alerting. Collaborating with Software Engineers and users across the business. Your Qualifications: Operational experience with Apache Pulsar or Kafka. Experience working with Kubernetes. Experience in Linux system administration. Familiarity with CI/CD pipeline tooling. Comfortable with scripting for automation. Preferred Skills: Software development skills …
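The "automation to remove manual tasks" theme can be illustrated with a small retry-with-backoff helper of the kind used when provisioning or health-checking services. The helper and the flaky operation are invented for illustration:

```python
import time

def retry(operation, attempts=3, base_delay=0.01):
    """Run operation(), retrying with exponential backoff; re-raise after the final attempt."""
    for attempt in range(attempts):
        try:
            return operation()
        except Exception:
            if attempt == attempts - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))  # 1x, 2x, 4x, ... the base delay

calls = {"n": 0}

def flaky():
    """Simulated operation that fails twice before succeeding."""
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient failure")
    return "ok"

result = retry(flaky)
# result == "ok" after two transient failures
```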