workplace where each employee's privacy and personal dignity are respected and protected from offensive or threatening behaviour, including violence and sexual harassment. Role: Apache Spark Application Developer Skills Required: Hands-on experience as a software engineer in a globally distributed team working with the Scala and Java programming languages (preferably more »
ends (React, Redux, NodeJS, Webpack) • Strong understanding of AWS services such as Lambda, Step Functions, and ECS. • Experience with data stack technologies such as Apache Iceberg and dbt. Preferred Skills • Experience with an RDBMS such as PostgreSQL would be a plus. Exposure to Apache Airflow, Prefect, or Dagster would be beneficial. • Experience more »
Skilled in applying SCD, CDC, and DQ/DV frameworks. Familiar with JIRA & Confluence. Must have exposure to technologies such as dbt, Apache Airflow, and Snowflake. Desire to continually keep up with advancements in data engineering practices. Knowledge of AWS cloud and Python is a plus. … Requirements 5+ years of IT experience with a major focus on data warehouse/database-related projects Must have exposure to technologies such as dbt, Apache Airflow, and Snowflake. Experience in data platforms: Snowflake, Oracle, SQL Server, MDM, etc. Expertise in writing SQL and database objects: stored procedures, functions, and views. … Hands-on experience in ETL/ELT and data security, SQL performance optimization, and job orchestration tools and technologies, e.g., dbt, APIs, Apache Airflow, etc. Experience in data modelling and relational database design Well-versed in applying SCD, CDC, and DQ/DV frameworks. Demonstrated ability to write new more »
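Several of the listings above ask for hands-on SCD/CDC experience. As a rough illustration only (all names hypothetical, plain Python rather than any specific warehouse tool), a minimal Type 2 slowly-changing-dimension update can be sketched as: expire the current row when a tracked attribute changes, then insert a new current version.

```python
from datetime import date

def scd2_upsert(dim_rows, incoming, today=None):
    """Apply one source record to a Type 2 dimension (illustrative sketch).

    dim_rows: list of dicts with keys key, value, valid_from, valid_to, current.
    If the tracked value changed, the open row is closed (valid_to set,
    current flag cleared) and a new current row is appended.
    A no-op when the value is unchanged.
    """
    today = today or date.today()
    for row in dim_rows:
        if row["key"] == incoming["key"] and row["current"]:
            if row["value"] == incoming["value"]:
                return dim_rows  # unchanged: keep history as-is
            row["valid_to"] = today   # expire the old version
            row["current"] = False
            break
    dim_rows.append({
        "key": incoming["key"],
        "value": incoming["value"],
        "valid_from": today,
        "valid_to": None,
        "current": True,
    })
    return dim_rows
```

In a real warehouse stack this logic would typically live in a dbt snapshot or a MERGE statement rather than application code; the sketch only shows the row-versioning idea the ads refer to.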
of the company's data infrastructure. You will work with some of the most innovative tools in the market, including Snowflake, AWS (Glue, S3), Apache Spark, Apache Airflow, and dbt! The role is hybrid, with 2 days in the office in central London, and the company is offering more »
system. · Significant experience with Python, and experience with modern software development and release engineering practices (e.g. TDD, CI/CD). · Significant experience with Apache Spark or any other distributed data programming frameworks (e.g. Flink, Arrow, MapR). · Significant experience with SQL – comfortable writing efficient SQL. · Experience using enterprise … scheduling tools (e.g. Apache Airflow, Spring DataFlow, Control-M) · Experience with Linux and containerisation What you’ll get in return ·Competitive base salary ·Up to 20% bonus ·25 days holiday ·BAYE, SAYE & Performance share schemes ·7% pension ·Life Insurance ·Work Away Scheme ·Flexible benefits package ·Excellent staff travel benefits more »
features from idea to production unattended. Also actively manages and escalates risk and customer-impacting issues. Responsibilities Install and maintain JBoss application server and Apache platforms End-to-end setup of virtual machines/servers with prerequisites like file systems, backups, logging, monitoring, etc. required for the application … if you have: Experience using containerized platforms including Kubernetes, Docker and OpenShift Experience in JBoss 7.x/8.x, Red Hat Linux, Red Hat OpenJDK, Oracle Java, Apache 2.x Experience in Java-based applications Experience in Recovery Collection Applications, including Debt Management and Recovery Possess technical knowledge of AWS and GCP cloud more »
Python Scala Kotlin Spark Google Pub/Sub Elasticsearch, BigQuery, PostgreSQL Kubernetes, Docker, Airflow Key Responsibilities Designing and implementing scalable data pipelines using tools such as Apache Spark, Google Pub/Sub, etc. Optimising data storage and retrieval systems for maximum performance using both relational and NoSQL databases. Continuously monitoring and improving the … Data Infrastructure projects, as well as designing and building data-intensive applications and services. Experience with data processing and distributed computing frameworks such as Apache Spark Expert knowledge in one or more of the following languages: Python, Scala, Java, Kotlin Deep knowledge of data modelling, data access, and data more »
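The pipeline responsibilities above revolve around partitioning data across workers, the core idea behind a Spark shuffle. As a hedged sketch in plain Python (not Spark's actual implementation), hash partitioning routes every record with the same key to the same partition:

```python
def hash_partition(records, key_fn, num_partitions):
    """Assign each record to a partition by hashing its key.

    This mirrors the idea behind a hash-based shuffle: records sharing
    a key always land in the same partition, so per-key work (joins,
    aggregations) can run locally within one partition.
    """
    partitions = [[] for _ in range(num_partitions)]
    for rec in records:
        idx = hash(key_fn(rec)) % num_partitions
        partitions[idx].append(rec)
    return partitions
```

The trade-off this illustrates: partition count bounds parallelism, while skewed keys produce unbalanced partitions, which is why frameworks expose repartitioning controls.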
data- and software engineering, and how to combine them to build data products. * Experience with DevOps methods. * Experience with middleware, ETL/ELT, SQL, Apache Kafka, StreamSets, dbt, Apache Airflow, Snowflake, or a similar tooling stack. * Experience building cloud solutions (AWS/Azure, serverless, cost engineering, etc.) * Strong more »
Birmingham, England, United Kingdom Hybrid / WFH Options
Xpertise Recruitment
CD, and model monitoring. Proficiency in Python and relevant data manipulation and analysis libraries (e.g., pandas, NumPy). Experience with distributed computing frameworks like Apache Spark is a plus; Apache Airflow would be a bonus. Role overview: If you're looking to work with a team more »
Newcastle Upon Tyne, England, United Kingdom Hybrid / WFH Options
Xpertise Recruitment
CD, and model monitoring. Proficiency in Python and relevant data manipulation and analysis libraries (e.g., pandas, NumPy). Experience with distributed computing frameworks like Apache Spark is a plus; Apache Airflow would be a bonus. Role overview: If you're looking to work with a team more »
Ability to design and implement data warehousing solutions using Azure Synapse Analytics. Azure Databricks: Proficiency in using Azure Databricks for data processing and analytics. Apache Spark: Deep understanding of Apache Spark for large-scale data processing. Azure Blob Storage and Azure Data Lake Storage: Expertise in setting up more »
development (ideally AWS) and container technologies Strong communication and interpersonal skills Experience managing projects and working with external third-party teams Ideally experience with Apache Spark or Apache Flink (but not essential) Please note, this role is unable to provide sponsorship. If this role sounds of interest and more »
service incidents. Responsible for the technical design, development, installation, monitoring, and ongoing support and maintenance of a diverse set of middleware technologies including WebSphere, Apache, Tomcat, and JBoss. The role is a technical, hands-on opportunity with a heavy focus on automation, resilient design, and deployment of middleware ready … impacting issues within the day-to-day role to management. Responsibilities Strong engineering experience in installation and maintenance of WebSphere application server, Tomcat, and Apache platforms Implement DevOps practices through a GitOps framework Implement configuration management and infrastructure as code (e.g. Terraform, Python, Chef, Ansible, and Bash) Achieves product commitments more »
performance, reliability, and security. - Implement event-driven architectures using Kafka for real-time data processing and communication between microservices. - Utilize Big Data technologies (e.g., Apache Spark, Hadoop) to process and analyze large volumes of data, extracting valuable insights to drive decision-making. - Design and optimize data pipelines for ingesting … principles and best practices. - Experience with Kafka for building event-driven architectures and real-time data processing. - Familiarity with Big Data technologies such as Apache Spark, Hadoop, or similar frameworks. - Proven track record of delivering scalable and reliable software solutions in a fast-paced environment. - Excellent communication skills and more »
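The Kafka responsibilities above rest on the publish/subscribe pattern: producers emit events to a topic without knowing which services consume them. A toy in-memory broker (no real Kafka client involved; class and topic names are hypothetical) illustrates that decoupling:

```python
from collections import defaultdict

class ToyBroker:
    """Minimal in-memory pub/sub sketch of the event-driven idea.

    Producers publish events to a named topic; every callback
    subscribed to that topic receives each event. Producers and
    consumers never reference each other directly.
    """
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subs[topic].append(callback)

    def publish(self, topic, event):
        for cb in self._subs[topic]:
            cb(event)
```

A real Kafka deployment adds durable logs, partitions, consumer groups, and offset tracking on top of this basic fan-out; the sketch only shows why microservices stay decoupled.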
OpenShift (OpenShift Clusters - 1 Hosting Production, 1 Hosting Dev/Test environments) Red Hat CoreOS Enterprise Service Bus (ESB) Common Object Request Broker Architecture (CORBA) Apache Kafka Broker Red Hat AMQ Broker Linux WAF based on Apache and ModSecurity Kubernetes and Docker Containers Terraform Apache ActiveMQ Artemis … ServiceNow API Integration PostgreSQL Apache Camel Bitbucket & Atlassian Bamboo Additional Information Location: This role can be delivered in a hybrid nature from one of these offices: Belfast, Birmingham, Manchester, Edinburgh, London or Newcastle. At Version 1, we believe in providing our employees with a comprehensive benefits package that prioritises more »
Software Engineer for this role, you will collaborate with the founding team to expand the integration of our Big Data processing acceleration technology with Apache Spark, driving new optimizations and broader SQL operation coverage. Your contributions to our core solution will directly impact data infrastructure processing 10s of … as batch processing code, data parsing, shuffling, and data partitioning algorithms. Keep the solution up to date and compatible with a variety of supported Apache Spark runtimes. Independently and diligently write, test, and deploy production code driven by modern software engineering practices. Work with the internals of leading open more »
to ensure efficient and accurate data delivery. Optimize data workflows for performance, scalability, and cost-effectiveness. Technical Expertise: Demonstrate in-depth expertise in Databricks, Apache Spark, and related big data technologies. Stay informed about the latest industry trends and advancements in data engineering. Quality Assurance: Conduct thorough testing and … projects. Qualifications: Bachelor's degree in Computer Science, Engineering, or a related field. Proven experience in data engineering with a focus on Databricks and Apache Spark. Strong programming skills, preferably in Python or Scala. Familiarity with cloud platforms (e.g., AWS, Azure, GCP) and associated data services. Excellent communication skills more »
Data Engineer 6 Month Contract Inside IR35 £450/day Hiring Immediately Job Description (Apache Iceberg, Spark, Big Data) Job Details Overview: Overall IT experience of 5+ years with strong programming skills Excellent skills in Apache Iceberg, Spark, and Big Data 3+ years of Big Data … project development experience Hands-on experience in areas like Apache Iceberg & Spark, Hadoop, Hive Must have knowledge of a database, e.g. Postgres, Oracle, MongoDB Strong grasp of SDLC processes and DevOps knowledge (Jira, Jenkins pipeline) Working in an Agile POD and with team collaboration Ability to participate in deep technical more »