cloud platforms, particularly AWS. Strong knowledge of data systems, including relational and non-relational data stores. Experience with big data tools and frameworks (e.g., Apache Spark, Kafka, Flink or Snowflake). Nice to have: Experience building or implementing data products, schema management, data contracts, data privacy regulations (e.g., GDPR) …
to Have: Familiarity with cloud platforms (AWS, GCP, Azure) and cloud-based database services (Snowflake). Knowledge of data warehousing, orchestration and pipeline technologies (Apache Airflow, Kafka, Azure Data Factory, etc.). Experience with dbt for modelling. Server administration and networking fundamentals …
City of London, London, United Kingdom Hybrid / WFH Options
Kolayo
needed: Distributed or large-scale systems MySQL/SQL database design, query optimization, and administration Web development using HTML, CSS, JavaScript, Vue/React Apache web server and related modules Cloud platforms such as AWS, Google Cloud, Azure CI/CD pipeline setup, testing, and administration Networking and firewall …
NICE FOR THE SENIOR SOFTWARE ENGINEER TO HAVE: Cloud-based experience Microservice architecture or serverless architecture Big Data/Messaging technologies such as Apache NiFi/MiNiFi/Kafka TO BE CONSIDERED: Please either apply by clicking online or email me directly at dominic.barbet@searchability.com. For further information please …
front-end technologies like HTML and Vue.js. Proficiency in scalable database design (SQL, NoSQL, Graph databases) such as SQL Server, MongoDB, Cassandra, Redis, and Apache Druid. Experience with REST APIs, GraphQL, and gRPC. Hands-on experience with version control (GitHub/GitLab) and testing frameworks like SonarQube, xUnit, Postman …
in Next.js Experience with testing frameworks like Jest, Cypress, or React Testing Library. Experience with authentication strategies using OAuth, JWT, or Cognito Familiarity with Apache Spark/Flink for real-time data processing is an advantage. Hands-on experience with CI/CD tools Commercial awareness and knowledge of …
usage, and data governance considerations, promoting transparency and responsible AI use. 7. Automate ETL pipeline orchestration and data processing workflows: Leverage orchestration tools like Apache Airflow or Prefect to schedule, automate, and manage ETL jobs, reducing manual intervention and improving operational reliability. 8. Implement monitoring, alerting, and troubleshooting for data …
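The ETL orchestration responsibility described above can be illustrated with a plain-Python sketch of the extract-transform-load steps that a tool like Apache Airflow or Prefect would schedule, retry, and monitor; the function names and sample data here are hypothetical, not taken from any listing.

```python
def extract():
    # A real extract step would pull from a source system (API, database, files).
    # Hypothetical sample rows for illustration.
    return [{"id": 1, "amount": "10.5"}, {"id": 2, "amount": "3.2"}, {"id": 3}]

def transform(rows):
    # Cast fields to the target types and drop invalid records.
    cleaned = []
    for row in rows:
        try:
            cleaned.append({"id": row["id"], "amount": float(row["amount"])})
        except (KeyError, ValueError):
            continue  # an orchestrator would typically log/alert on bad rows
    return cleaned

def load(rows):
    # A real load step would write to a warehouse; here we just return a count.
    return len(rows)

def run_pipeline():
    # An orchestrator wires these steps into a DAG, scheduling them,
    # retrying failures, and alerting on errors instead of this direct call.
    return load(transform(extract()))
```

In Airflow or Prefect each function would become a task in a DAG or flow, so scheduling, retries, and alerting come from the platform rather than hand-written glue code.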
Strong hands-on experience in performant and scalable database design in SQL, NoSQL and graph databases such as SQL Server/PostgreSQL, MongoDB, Cassandra, Redis, Apache Druid • Solid experience in REST APIs, GraphQL & gRPC • Strong hands-on experience in GitHub/GitLab and testing tools/frameworks such as SonarQube …
London, England, United Kingdom Hybrid / WFH Options
Workato
data movement, databases (Oracle, SQL Server, PostgreSQL), and cloud analytics platforms (Snowflake, Databricks, BigQuery). Familiarity with emerging data technologies like open table formats (e.g., Apache Iceberg) and their impact on enterprise data strategies. Hands-on experience with data virtualization and analytics platforms (Denodo, Domo) to enable seamless self-service …
SQL Server or PostgreSQL Familiarity with platforms like Databricks and Snowflake for data engineering and analytics Experience working with Big Data technologies (e.g., Hadoop, Apache Spark) Familiarity with NoSQL databases (e.g., columnar or graph databases like Cassandra, Neo4j) Research experience with peer-reviewed publications Certifications in cloud-based machine …
intellectually curious, and team-oriented. Strong communication skills. Experience with options trading or options data is a strong plus. Experience with technologies like KDB, Apache Iceberg, and Lake Formation will be a meaningful differentiator.
systems Strong knowledge of Kubernetes and Kafka Experience with Git and deployment pipelines Having worked with at least one of the following stacks: Hadoop, Apache Spark, Presto, AWS Redshift, Azure Synapse or Google BigQuery Experience profiling performance issues in database systems Ability to learn and/or adapt quickly …
City Of London, England, United Kingdom Hybrid / WFH Options
McGregor Boyall
focus on public cloud onboarding. The platform is a greenfield build using modern technologies such as Java, Spring Boot, Kubernetes, Kafka, MongoDB, RabbitMQ, Solace and Apache Ignite. The platform runs in a hybrid mode, both on-premises and in AWS, utilising technologies such as EKS, S3 and FSx. Objectives Steering platform …
London, England, United Kingdom Hybrid / WFH Options
ZipRecruiter
concepts to non-technical stakeholders. Preferred Skills: Experience with insurance platforms such as Guidewire, Duck Creek, or legacy PAS systems. Knowledge of Delta Lake, Apache Spark, and data pipeline orchestration tools. Exposure to Agile delivery methodologies and tools like JIRA, Confluence, or Azure DevOps. Understanding of regulatory data requirements …
London, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
Salisbury, Wiltshire, South West, United Kingdom Hybrid / WFH Options
Anson Mccade
architectures Strong grasp of cloud data platforms (AWS, Azure, GCP, Snowflake) Understanding of Data Mesh, Data Fabric, and data product-centric approaches Familiarity with Apache Spark, Python, and ETL/ELT pipelines Strong knowledge of data governance, lifecycle management, and compliance (e.g. GDPR) Consulting experience delivering custom data solutions …
London, England, United Kingdom Hybrid / WFH Options
Experis - ManpowerGroup
Cloud Data Lake activities. The candidate should have industry experience (preferably in Financial Services) navigating enterprise Cloud applications using distributed computing frameworks such as Apache Spark, Hadoop and Hive. Working knowledge of optimizing database performance and scalability, and of ensuring data security and compliance. Education & Preferred Qualifications Bachelor’s/Master's Degree in a …
adopt emerging technologies, and enhance analytics capabilities. Requirements: Technical Proficiency: Hands-on experience building ETL/ELT pipelines with Python, SQL, or tools like Apache Airflow, and expertise in visualisation tools (Power BI, Tableau, or Looker). Cloud Expertise: Familiarity with cloud platforms like Snowflake, Databricks, or AWS/ …