features from idea to production unattended. Also actively manages and escalates risk and customer-impacting issues. Responsibilities Install and maintain JBoss application server and Apache platforms End-to-end setup of Virtual Machines/servers with prerequisites like file systems, backups, logging, monitoring, etc. required for the application … if you have: Experience using containerized platforms including Kubernetes, Docker and OpenShift Experience in JBoss 7.x/8.x, Red Hat Linux, Red Hat OpenJDK, Oracle Java, Apache 2.x Experience in Java-based applications Experience in Recovery Collection Applications, including Debt Management and Recovery Possess technical knowledge of AWS and GCP cloud …
cloud-based data storage technologies such as Google BigQuery, Amazon S3, and Redshift. Hands-on experience with data processing frameworks and tools such as Apache Spark, Apache Beam, and TensorFlow. Proficiency in programming languages such as Python, Java, or Scala. Solid understanding of data modeling concepts and database …
Birmingham, England, United Kingdom Hybrid / WFH Options
Xpertise Recruitment
CD, and model monitoring. Proficiency in Python and relevant data manipulation and analysis libraries (e.g. pandas, NumPy). Experience with distributed computing frameworks such as Apache Spark is a plus, and Airflow would be a bonus. Role overview: If you're looking to work with a team …
Newcastle Upon Tyne, England, United Kingdom Hybrid / WFH Options
Xpertise Recruitment
CD, and model monitoring. Proficiency in Python and relevant data manipulation and analysis libraries (e.g. pandas, NumPy). Experience with distributed computing frameworks such as Apache Spark is a plus, and Airflow would be a bonus. Role overview: If you're looking to work with a team …
Ability to design and implement data warehousing solutions using Azure Synapse Analytics. Azure Databricks: Proficiency in using Azure Databricks for data processing and analytics. Apache Spark: Deep understanding of Apache Spark for large-scale data processing. Azure Blob Storage and Azure Data Lake Storage: Expertise in setting up …
Engineering, providing DevOps support, and/or RHEL administration for mission-critical platforms, ideally Kafka. 4+ years of combined experience with Kafka (Confluent Kafka, Apache Kafka, Amazon MSK) 4+ years of experience with Ansible automation Must be able to obtain and maintain a Public Trust clearance (a contract requirement). Selected candidate … Solid experience using version control software such as Git/Bitbucket, including peer-reviewing Ansible playbooks Hands-on experience administering the Kafka platform (Confluent Kafka, Apache Kafka, Amazon MSK) via Ansible playbooks or other automation. Understanding of Kafka architecture, including partition strategy, replication, transactions, tiered storage, and disaster recovery strategies. … STAND OUT FROM THE CROWD (Desired Skills) Showcase your knowledge of modern development through the following experience or skills: Preferred Confluent Certified Administrator for Apache Kafka (CCAAK) or Confluent Certified Developer for Apache Kafka (CCDAK) Practical experience with event-driven applications and at least one event processing framework …
development (ideally AWS) and container technologies Strong communication and interpersonal skills Experience managing projects and working with external third-party teams Ideally experience with Apache Spark or Apache Flink (but not essential) Please note, this role is unable to provide sponsorship. If this role sounds of interest and …
system. · Significant experience with Python, and experience with modern software development and release engineering practices (e.g. TDD, CI/CD). · Significant experience with Apache Spark or any other distributed data programming framework (e.g. Flink, Arrow, MapR). · Significant experience with SQL – comfortable writing efficient SQL. · Experience using enterprise scheduling tools (e.g. Apache Airflow, Spring DataFlow, Control-M) · Experience with Linux and containerisation What you'll get in return · Competitive base salary · Up to 20% bonus · 25 days holiday · BAYE, SAYE & Performance share schemes · 7% pension · Life Insurance · Work Away Scheme · Flexible benefits package · Excellent staff travel benefits …
service incidents. Responsible for the technical design, development, installation, monitoring and ongoing support and maintenance of a diverse set of middleware technologies including WebSphere, Apache, Tomcat, and JBoss. The role is a technical, hands-on opportunity with a heavy focus on automation, resilient design and deployment of middleware ready … impacting issues within the day-to-day role to management. Responsibilities Strong engineering experience in installation and maintenance of WebSphere application server, Tomcat, and Apache platforms Implement DevOps practices through a GitOps framework Implement Configuration Management and Infrastructure as Code (e.g. Terraform, Python, Chef, Ansible, and Bash) Achieves product commitments …
Python Scala Kotlin Spark Google Pub/Sub Elasticsearch, BigQuery, PostgreSQL Kubernetes, Docker, Airflow Key Responsibilities Designing and implementing scalable data pipelines using tools such as Apache Spark, Google Pub/Sub, etc. Optimising data storage and retrieval systems for maximum performance using both relational and NoSQL databases. Continuously monitoring and improving the … Data Infrastructure projects, as well as designing and building data-intensive applications and services. Experience with data processing and distributed computing frameworks such as Apache Spark Expert knowledge in one or more of the following languages – Python, Scala, Java, Kotlin Deep knowledge of data modelling, data access, and data …
improvements. Specifically, tasks include: •Coordinate and gather requirements, design, develop, and document pre- and post-processing of customer data, implementing an established framework built on the Apache NiFi (NiagaraFiles) application software in support of existing and new customer data flows. •Assist in providing architecture and design work as well as … Prior experience in NiFi data flow design and development OR experience with similar ETL tools •Familiarity with web services SOAP/REST •Experience with Apache web server configuration and performance tuning. •Familiarity with XML/JSON •Experience with web front-end development skills: HTML/JS/TypeScript/…
so any knowledge of cross-domain solutions or air-gapped systems is a plus. AWS as initial hosting provider; Containerised apps using Docker and Kubernetes; Apache Jena; Elastic; PostGIS; Kafka; Apache NiFi; AWS Cognito; HTTP REST, GraphQL, SPARQL interfaces; Web apps based on HTML/CSS/JavaScript frameworks …
knowledge of cross-domain solutions or air-gapped systems is a plus. AWS as initial hosting provider. Containerised apps using Docker and Kubernetes. Apache Jena. Elastic. PostGIS. Kafka. Apache NiFi. AWS Cognito. HTTP REST, GraphQL, SPARQL interfaces. Web apps based …
Experienced creating data pipelines in a cloud (preferably AWS) environment CI/CD experience Containerization experience (Docker, Kubernetes, etc.) Experience with SQS/SNS, Apache Kafka, RabbitMQ Other interesting/bonus skills – Airflow, Trino, Apache Iceberg, Postgres, MongoDB You *must* be eligible to work in your chosen country …
Software Engineer for this role, you will collaborate with the founding team to expand the integration of our Big Data processing acceleration technology with Apache Spark to drive new optimizations and broader SQL operation coverage. Your contributions to our core solution will directly impact data infrastructure processing 10s of … as batch processing code, data parsing, shuffling and data partitioning algorithms. Keep the solution up to date and compatible with a variety of supported Apache Spark runtimes. Independently and diligently write, test and deploy production code driven by modern software engineering practices. Work with the internals of leading open …
to ensure efficient and accurate data delivery. Optimize data workflows for performance, scalability, and cost-effectiveness. Technical Expertise: Demonstrate in-depth expertise in Databricks, Apache Spark, and related big data technologies. Stay informed about the latest industry trends and advancements in data engineering. Quality Assurance: Conduct thorough testing and … projects. Qualifications: Bachelor's degree in Computer Science, Engineering, or a related field. Proven experience in data engineering with a focus on Databricks and Apache Spark. Strong programming skills, preferably in Python or Scala. Familiarity with cloud platforms (e.g., AWS, Azure, GCP) and associated data services. Excellent communication skills …
Swansea, Wales, United Kingdom Hybrid / WFH Options
CPS Group (UK) Limited
my client will train you): Knowledge of Microsoft SQL Server and packaged BI tools (SSAS and SSIS). Docker, Kubernetes and cloud computing technologies. Apache Kafka and data streaming. Familiarity with Apache Spark or similar data processing tools. Experience developing and maintaining CI/CD pipelines, particularly Azure DevOps or …
Nottingham, Nottinghamshire, East Midlands, United Kingdom
Microlise
data practices Possess strong knowledge of data tools, data management tools, and various data and information technologies, e.g. DAMA DMBOK, Microsoft SQL Server, Couchbase, Apache Druid, Spark, Kafka, Airflow, etc. In-depth understanding of modern data principles, methodologies, and tools Excellent communication and collaboration skills, with the ability to … native computing concepts and experience working with hybrid or private cloud platforms is a plus. Demonstrable technical experience working with a Microsoft, Red Hat, and Apache data and software engineering environment. A team-oriented individual with a passion for engineering excellence and the ability to lead and motivate a team …
managers, to understand data requirements and deliver high-quality solutions as well as architecting data ingestion, transformation, and storage processes using tools such as Apache Spark, Azure Data Factory, and other similar technologies. Other core duties include optimizing data pipeline performance, ensuring data accuracy, reliability, and timely delivery. Requirements … Services Certifications in relevant technologies, such as Azure Data Engineer or Databricks Certified Developer Experience with real-time data processing and streaming technologies like Apache Kafka or Azure Event Hubs Knowledge of data visualization tools, such as Power BI or Tableau Contributions to open-source projects or active participation …
Manchester, North West, United Kingdom Hybrid / WFH Options
Searchability NS&D Ltd
existing architectural components including Data Ingest, Data Stores and REST APIs. THE DEVOPS ENGINEER SHOULD HAVE…. You must have an active eDV Clearance. Apache NiFi Flink Java Ansible Docker Kubernetes ELK stack Linux Sys Admin for deployed clusters (10s of servers) Jenkins pipeline development Integration/debugging … DEVOPS ENGINEERS - MANCHESTER KEY SKILLS: SOFTWARE DEVELOPER/SOFTWARE ENGINEER/SENIOR SOFTWARE DEVELOPER/SENIOR SOFTWARE ENGINEER/DEVOPS ENGINEER/DEVOPS/APACHE NIFI/FLINK/JAVA/ANSIBLE/DOCKER/KUBERNETES/ELK STACK/TERRAFORM/LINUX/GIT …
syntax • Network basics (nslookup, ping, netcat, traceroute, tcpdump, etc.) • Coordinate and collaborate well with other team members and external partners Desired Experience • Familiarity with Apache Tomcat and Apache HTTP Server • Familiarity with Cisco Splunk querying • Familiarity with Genesys configuration manager • Some understanding of Session Initiation Protocol (SIP) …
Data Engineer 6 Month Contract Inside IR35 £450/day Hiring Immediately Job Description (Apache Iceberg, Spark, Big Data) Job Details Overview: Overall IT experience of 5+ years with strong programming skills Excellent skills in Apache Iceberg, Spark and Big Data 3+ years of Big Data … project development experience Hands-on experience in areas like Apache Iceberg & Spark, Hadoop, Hive Must have knowledge of a database, e.g. Postgres, Oracle, MongoDB Excellent knowledge of SDLC processes and DevOps (Jira, Jenkins pipelines) Working in an Agile POD with team collaboration Ability to participate in deep technical …