and coding environments. Bonus Skills: Python/PHP/TypeScript/ReactJS, AI/ML models and usage, ETL pipelines in AWS (Glue/Apache Spark), API load testing. If you would like more information on the role or would like to apply, please send your CV to more »
science or other related engineering fields Pluses: • Experience with React • Experience with MongoDB • Experience working on streaming technologies like Kafka and distributed technologies like Apache Ignite • Experience working on AWS, GCP, Kubernetes, IaC • Experience working with C# • Financial industry experience • Experience with cloud platforms like AWS, GCP, Azure (ideally GCP more »
Azure Data Lake, Azure Databricks or GCP Cloud Dataproc. Familiarity with big data technologies and distributed computing frameworks, such as Hadoop, Spark, or Apache Flink. Experience scaling an “API-Ecosystem”, designing and implementing “API-First” integration patterns. Experience working with authentication and authorisation protocols/patterns. Other Information more »
Glasgow, Scotland, United Kingdom Hybrid / WFH Options
First Derivative
strategy must include data model designs, database development standards, and implementation and management of data warehouses and data analytics systems. What experience will you need? Apache Spark, Azure Databricks, ETL, Snowflake, BigQuery. What's in it for you? You will embark upon a career with life-long learning at more »
engineers of varying levels of experience. Flexibility and willingness to adapt to new software and techniques. Nice to Have Experience working with projects in Apache Spark, Databricks or similar. Expert cloud platform knowledge, e.g. Azure What will be your key responsibilities? A technical expert and leader on the Petcare more »
Leeds, England, United Kingdom Hybrid / WFH Options
Harvey Nash
websites and web apps using HTML, PHP, JavaScript Full stack development, Bootstrap, SQL Best practice PHP with an emphasis on secure development practices Linux, Apache/Nginx, PostgreSQL/MySQL, Bootstrap stack Creating scalable, clean and resilient solutions through code Version control through Git to manage the codebase efficiently more »
London, England, United Kingdom Hybrid / WFH Options
Austin Fraser
a plus: Cutting-Edge Tech: Experience with containerisation, Kubernetes, and observability platforms. Workflow Wizardry: Familiarity with data orchestration tools like Airflow and ETL with Apache Beam. Data Visionary: Knowledge of DataVault (DV2) and data management concepts. Location: Our opportunities are available in London Victoria and Bracknell. Choose the work more »
need to showcase a good understanding of modern AWS/Azure Well-Architected Frameworks along with demonstrable experience with SQL, Linux and web servers (Apache, Nginx), as well as knowledge of containerisation and serverless paradigms. The right candidate will have great attention to detail and strong analytical skills, with the more »
more of the following key technology skills: Systems integration, APIs – REST, SOAP etc Informatica SQL Server Integration Services (SSIS) Azure Data Factory Databricks/Apache Spark Amazon RedShift Azure Synapse SQL Server Oracle Database Oracle Data Integrator Oracle Integration Cloud Business Objects Data Services (BODS) Equivalent tech (useful similar more »
Kubernetes/Docker or other container technologies. Scripting skills including Python and Bash. Good understanding of Linux, web server and database technologies such as Ubuntu, Apache, PHP, MySQL, PostgreSQL and Nginx, and version control with Mercurial and Git. more »
CMake Proficiency in developing cross-platform SDKs for Windows, macOS, Linux, WebAssembly and Embedded Platforms Knowledge of machine learning frameworks such as ONNX Runtime or Apache TVM Experience deploying and optimising real-time embedded audio algorithms Familiarity with audio codecs, audio formats and audio streaming protocols is preferred Willingness to more »
and dealing with ad-hoc requests About You We’re looking for an experienced Data Engineer with excellent knowledge of Snowflake, AWS, Python, and Apache Airflow who is ready to lead by example and is used to rolling up their sleeves to get things done. The successful candidate must … 3NF and dimensional modelling, Kimball, DV 2.0 etc.) Strong experience in building robust and scalable ELT/ETL data pipelines Proficient coding in Python and Apache Spark, expert knowledge of SQL, and good experience with shell-scripting languages Working knowledge of orchestration tools, e.g. Apache Airflow Experience of … or consumer finance IaC such as Terraform or AWS CloudFormation Knowledge of visualisation tools, e.g. Tableau, Looker, Power BI, AWS QuickSight Exposure to streaming: Apache Kafka, AWS MSK Docker Understanding of SCRUM and Agile principles and collaboration tools like JIRA software and Confluence. What's in it for you more »
ends (React, Redux, NodeJS, Webpack) • Strong understanding of AWS ecosystems like Lambdas, step functions and ECS services. • Experience with data stack technologies, such as Apache Iceberg & DBT. Preferred Skills • Experience on RDBMS like PostgreSQL would be a plus. Exposure to Apache Airflow, Prefect, Dagster would be beneficial. • Experience more »
development (ideally AWS) and container technologies Strong communication and interpersonal skills Experience managing projects and working with external third party teams Ideally experience with Apache Spark or Apache Flink (but not essential) Please note, this role is unable to provide sponsorship. If this role sounds of interest and more »
Job Description: Primary responsibility: The Lead Kafka Architect will design, perform POCs where needed, and develop the enterprise’s Apache Kafka Distributed Messaging and Integration Ecosystem. Solid experience and knowledge in the deployment of Kafka (Apache/Confluent), physical deployment across multiple environments, and optimisation and tuning based on performance metrics more »
features from idea to production unattended. Also, actively manages and escalates risk and customer-impacting issues. Responsibilities Install and maintain JBoss application server and Apache platforms End-to-end setup of Virtual Machines/servers with pre-requisites like file systems, backups, logging, monitoring, etc. required for the application … if you have: Experience using containerised platforms including Kubernetes, Docker and OpenShift Experience in JBoss 7.x/8.x, Red Hat Linux, Red Hat OpenJDK, Oracle Java, Apache 2.x Experience in Java-based applications Experience in Recovery Collection Applications, including Debt Management and Recovery Possess technical knowledge on AWS and GCP cloud more »
service incidents. Responsible for the technical design, development, installation, monitoring and ongoing support and maintenance of a diverse set of middleware technologies including WebSphere, Apache, Tomcat, and JBoss. The role is a technical, hands-on opportunity with a heavy focus on automation, resilient design and deployment of middleware ready … impacting issues within the day-to-day role to management. Responsibilities Strong engineering experience in installation and maintenance of WebSphere application server, Tomcat, and Apache platforms Implement DevOps practices through GitOps framework Implement Configuration Management and Infrastructure as Code (e.g. Terraform, Python, Chef, Ansible, and Bash) Achieves product commitments more »
required) Experience with distributed message brokers using Kafka (required) Experience with high-speed distributed computing frameworks such as AWS EMR, Hadoop, HDFS, S3, MapReduce, Apache Spark, Apache Hive, Kafka Streams, Apache Flink etc. (required) Working knowledge of DevOps tools, e.g. Terraform, Ansible, Jenkins, Kubernetes, Helm and CI more »