…practices, and troubleshooting methodologies. Familiarity with integration tools (ESB, Web Services, WebSphere, SOAP). Knowledge of infrastructure: Windows Server/Client, RDS, IIS/Apache/WebSphere, web browsers. Experience in system upgrades, migrations, and CI/CD (Microsoft DevOps). Proficient in SQL, XML, and basic scripting (Python …
Watford, Hertfordshire, South East, United Kingdom
ECS
…practices, and troubleshooting methodologies. Familiarity with integration tools (ESB, Web Services, WebSphere, SOAP). Knowledge of infrastructure: Windows Server/Client, RDS, IIS/Apache/WebSphere, web browsers. Experience in system upgrades, migrations, and CI/CD (Microsoft DevOps). Proficient in SQL, XML, and basic scripting (Python …
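For context on the "SQL, XML, and basic scripting (Python)" combination this role asks for, here is a minimal sketch using only the Python standard library; the XML layout and table schema are invented purely for illustration.

```python
import sqlite3
import xml.etree.ElementTree as ET

# Illustrative only: parse a small XML export and load it into a SQL table.
xml_payload = """
<orders>
  <order id="1001"><customer>Acme</customer><total>250.00</total></order>
  <order id="1002"><customer>Globex</customer><total>99.50</total></order>
</orders>
"""

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, total REAL)")

root = ET.fromstring(xml_payload)
for order in root.findall("order"):
    conn.execute(
        "INSERT INTO orders VALUES (?, ?, ?)",
        (int(order.get("id")), order.findtext("customer"), float(order.findtext("total"))),
    )

# Query the loaded data back out with plain SQL.
for row in conn.execute("SELECT customer, total FROM orders ORDER BY total DESC"):
    print(row)
```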
Basingstoke, Hampshire, South East, United Kingdom
Experis
…fundamentals and IP packet structure. Strong hands-on experience using PowerShell and Python for scripting, automation and test development. Background in web development with Apache and PHP. Familiarity with Active Directory, PKI, VMware virtualisation, Windows Server 2019 and gold image creation. Exposure to Agile methodologies for the delivery of …
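As a rough illustration of the "Python for scripting and test development" and "IP packet structure" requirements above, the following sketch unpacks the fixed 20-byte IPv4 header with Python's struct module; the sample packet bytes are fabricated for the example.

```python
import struct

def parse_ipv4_header(raw: bytes) -> dict:
    """Unpack the fixed 20-byte IPv4 header (RFC 791) from raw packet bytes."""
    version_ihl, tos, total_len, ident, flags_frag, ttl, proto, checksum, src, dst = \
        struct.unpack("!BBHHHBBH4s4s", raw[:20])
    return {
        "version": version_ihl >> 4,
        "ihl": version_ihl & 0x0F,           # header length in 32-bit words
        "total_length": total_len,
        "ttl": ttl,
        "protocol": proto,                    # 6 = TCP, 17 = UDP
        "src": ".".join(str(b) for b in src),
        "dst": ".".join(str(b) for b in dst),
    }

# Hand-built sample header for a TCP packet from 192.168.0.10 to 10.0.0.5.
sample = struct.pack("!BBHHHBBH4s4s", 0x45, 0, 40, 1, 0, 64, 6, 0,
                     bytes([192, 168, 0, 10]), bytes([10, 0, 0, 5]))
print(parse_ipv4_header(sample))
```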
Languages: Python, Bash, Go. Network Modelling: YANG. API Protocols: RESTCONF, NETCONF. Platforms: ServiceNow, GitHub, Azure, AWS. Data: XML, JSON. Other: Azure DevOps, Git, Linux, Apache, MySQL. Ideal Candidate: Strong experience in automation and systems integration. Proficient in Python and automation using Ansible. Familiarity with ServiceNow, GitHub workflows, and network …
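To give a flavour of the RESTCONF/YANG automation this role describes, below is a hedged Python sketch that queries the standard ietf-interfaces model over RESTCONF using the requests library; the device address, credentials and RESTCONF root path are placeholders and vary by platform.

```python
import requests

# Illustrative device details; replace with a real RESTCONF-enabled device.
DEVICE = "https://192.0.2.1:443"
AUTH = ("admin", "admin")
HEADERS = {"Accept": "application/yang-data+json"}

# Fetch the interfaces container defined by the ietf-interfaces YANG model.
url = f"{DEVICE}/restconf/data/ietf-interfaces:interfaces"
resp = requests.get(url, auth=AUTH, headers=HEADERS, verify=False, timeout=10)
resp.raise_for_status()

for intf in resp.json()["ietf-interfaces:interfaces"]["interface"]:
    print(intf["name"], intf.get("enabled"))
```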
Cheltenham, Gloucestershire, South West, United Kingdom
Defence
…data engineering and backend development within the defence and security sector. Technical proficiency in: Spring Boot, Java Enterprise development, React/VueJS/AngularJS, Apache NiFi, Flink. Desired technical skills (at least 3 of the following): Ansible, Docker, Kubernetes, Grafana/Prometheus, Linux sys admin for deployed clusters …
…technical specifications; assured code/systems against standards; championed data standards. Metadata Management: Designed/developed data catalogues or metadata repositories; used tools like Apache Atlas/Hive Metastore/AWS Glue/AWS DataZone. Data Design: Designed data lakes/data warehouses/data lakehouses/data pipelines …
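As an illustration of the metadata-management tooling named above, here is a small, assumed sketch that walks the AWS Glue Data Catalog with boto3; it presumes AWS credentials are already configured and uses a placeholder region.

```python
import boto3

# List databases and tables registered in the Glue Data Catalog.
glue = boto3.client("glue", region_name="eu-west-2")

for db in glue.get_databases()["DatabaseList"]:
    print("database:", db["Name"])
    for tbl in glue.get_tables(DatabaseName=db["Name"])["TableList"]:
        cols = [c["Name"] for c in tbl.get("StorageDescriptor", {}).get("Columns", [])]
        print("  table:", tbl["Name"], "columns:", cols)
```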
Tools: GitLab CI, Terraform, Ansible, Helm Charts, Python, PowerShell, REST APIs. Kubernetes: Experience building and managing Kubernetes clusters and application delivery. Applications: Familiarity with Apache NiFi, Elastic ECK, Artifactory. Secret Management: Expertise in using HashiCorp Vault. Operating Systems: Solid experience with Red Hat and Windows environments. Apply today via …
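For the secret-management requirement above, a minimal sketch of reading a KV v2 secret from HashiCorp Vault with the hvac client; the Vault address, token and secret path are placeholders, not values from the role.

```python
import hvac

# Connect to Vault and read one secret from the KV v2 engine.
client = hvac.Client(url="https://vault.example.internal:8200", token="s.example-token")
assert client.is_authenticated()

secret = client.secrets.kv.v2.read_secret_version(path="platform/registry-creds")
print(secret["data"]["data"])   # the key/value pairs stored at that path
```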
…architects to design scalable solutions. Ensure adherence to best practices in Databricks data engineering. Required Skills & Experience: Hands-on experience with Microsoft Databricks and Apache Spark. Strong knowledge of Git, DevOps integration, and CI/CD pipelines. Expertise in Unity Catalog for data governance and security. Proven ability to …
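To illustrate the Databricks/Apache Spark requirement, a small PySpark aggregation of the kind that would normally run inside a Databricks job; here it uses a local SparkSession, and the Unity Catalog table name in the trailing comment is hypothetical.

```python
from pyspark.sql import SparkSession, functions as F

# Local stand-in for a Databricks job: on Databricks the session (and Unity
# Catalog's catalog.schema.table namespace) is provided for you.
spark = SparkSession.builder.appName("orders-demo").getOrCreate()

orders = spark.createDataFrame(
    [("2024-01-01", "widgets", 120.0), ("2024-01-01", "gadgets", 80.0),
     ("2024-01-02", "widgets", 95.5)],
    ["order_date", "product", "amount"],
)

daily = (orders.groupBy("order_date")
               .agg(F.sum("amount").alias("total_amount"),
                    F.count("*").alias("order_count")))
daily.show()
# On Databricks this might be persisted with a (hypothetical) three-level name:
# daily.write.mode("overwrite").saveAsTable("main.sales.daily_totals")
```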
Basingstoke, Hampshire, South East, United Kingdom
Experis
…of Exchange, mail protocols, MTA, and SMTP. Strong understanding of Border Sync. Expertise in Directory Services. Experience with web technologies such as IIS and Apache. Proven ability to build and configure servers from the ground up. Solid understanding of Active Directory and Group Policy. Deep knowledge of DNS and …
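As a simple illustration of the SMTP/MTA knowledge called for above, a short Python sketch that submits a message over SMTP with STARTTLS; the host, port, addresses and credentials are placeholders.

```python
import smtplib
from email.message import EmailMessage

# Build a minimal test message.
msg = EmailMessage()
msg["From"] = "monitor@example.org"
msg["To"] = "ops@example.org"
msg["Subject"] = "SMTP relay check"
msg.set_content("Test message confirming the MTA accepts mail for relay.")

# Submit it on port 587, upgrading to TLS before authenticating.
with smtplib.SMTP("mail.example.org", 587, timeout=10) as smtp:
    smtp.starttls()
    smtp.login("monitor@example.org", "app-password")
    smtp.send_message(msg)
```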
…understanding of networking and IP packet structure. Experience working in a DevOps team designing, developing and supporting solutions. Experience in website development using Apache and PHP. Working Pattern: Monday-Friday. Brief Overview of role/project: Do you want to make an impact and change the way the …
…including OAuth, JWT, and data encryption. Fluent in English with strong communication and collaboration skills. Preferred Qualifications: Experience with big data processing frameworks like Apache Flink or Spark. Familiarity with machine learning models and AI-driven analytics. Understanding of front-end and mobile app interactions with backend services. Expertise …
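For the OAuth/JWT requirement, a brief PyJWT sketch showing a token being issued and then validated; the shared secret and claims are illustrative, and a production service would more likely verify tokens signed with an asymmetric key (e.g. RS256).

```python
import datetime
import jwt  # PyJWT

SECRET = "change-me"  # illustrative shared secret only

# Issue a short-lived token with a subject claim and an expiry.
token = jwt.encode(
    {"sub": "user-123",
     "exp": datetime.datetime.now(datetime.timezone.utc) + datetime.timedelta(minutes=15)},
    SECRET,
    algorithm="HS256",
)

# Validate the signature and expiry; raises jwt.ExpiredSignatureError once it lapses.
claims = jwt.decode(token, SECRET, algorithms=["HS256"])
print(claims["sub"])
```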
…develop, and optimize data pipelines using Azure Databricks, PySpark, and Prophesy. Implement and maintain ETL/ELT pipelines using Azure Data Factory (ADF) and Apache Airflow for orchestration. Develop and optimize complex SQL queries and Python-based data transformation logic. Work with version control systems (GitHub, Azure DevOps) to … Azure Databricks, PySpark, and SQL. Hands-on experience with Prophesy for data pipeline development. Proficiency in Python for data processing and transformation. Experience with Apache Airflow for workflow orchestration. Strong expertise in Azure Data Factory (ADF) for building and managing ETL processes. Familiarity with GitHub and Azure DevOps for …
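To make the Apache Airflow orchestration requirement concrete, a minimal DAG sketch for recent Airflow 2.x with placeholder extract/transform tasks; the DAG id, schedule and task bodies are assumptions, not taken from the role.

```python
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

# Placeholder task bodies standing in for the real ADF/Databricks-backed steps.
def extract():
    print("pulling source data")

def transform():
    print("applying PySpark/SQL transformations")

with DAG(
    dag_id="daily_sales_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    extract_task >> transform_task   # run extract before transform
```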
…to obtain DV-level security clearance. Responsibilities: Design, develop, and maintain secure and scalable data pipelines using the Elastic Stack (Elasticsearch, Logstash, Kibana) and Apache NiFi. Implement data ingestion, transformation, and integration processes, ensuring data quality and security. Collaborate with data architects and security teams to ensure compliance with … Engineer in secure or regulated environments. Expertise in the Elastic Stack (Elasticsearch, Logstash, Kibana) for data ingestion, transformation, indexing, and visualization. Strong experience with Apache NiFi for building and managing complex data flows and integration processes. Familiarity with containerization and orchestration tools such as Docker and Kubernetes. Experience with …
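As a sketch of the Elastic Stack ingestion and search work described above, a short example using the elasticsearch-py 8.x client; the endpoint, credentials and index name are placeholders, and in the pipelines this role describes documents would typically arrive via Logstash or NiFi rather than a script.

```python
from elasticsearch import Elasticsearch

# Connect to a (placeholder) local cluster with basic auth.
es = Elasticsearch("https://localhost:9200",
                   basic_auth=("elastic", "changeme"),
                   verify_certs=False)

# Index one document, then make it visible to search.
es.index(index="audit-events", document={
    "timestamp": "2024-05-01T12:00:00Z",
    "user": "svc-ingest",
    "action": "file_transfer",
    "status": "success",
})
es.indices.refresh(index="audit-events")

# Query it back with a simple match query.
hits = es.search(index="audit-events", query={"match": {"action": "file_transfer"}})
print(hits["hits"]["total"]["value"])
```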