Leeds, West Yorkshire, Yorkshire, United Kingdom Hybrid / WFH Options
Fruition Group
best practices for data security and compliance. Collaborate with stakeholders and external partners. Skills & Experience: Strong experience with AWS data technologies (e.g., S3, Redshift, Lambda). Proficient in Python, Apache Spark, and SQL. Experience in data warehouse design and data migration projects. Cloud data platform development and deployment. Expertise across data warehouse and ETL/ELT development in AWS …
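As context for the kind of AWS ETL/ELT work this listing describes, here is a minimal PySpark sketch that reads raw CSV from S3, applies a basic clean-up, and writes partitioned Parquet back to S3. The bucket paths and column names are illustrative assumptions, not details from the advert.

```python
from pyspark.sql import SparkSession, functions as F

# Minimal ETL sketch: raw CSV in S3 -> cleaned, partitioned Parquet in S3.
# Bucket paths and column names below are hypothetical placeholders.
spark = SparkSession.builder.appName("s3-etl-sketch").getOrCreate()

raw = (
    spark.read
    .option("header", "true")
    .csv("s3://example-raw-bucket/orders/")          # assumed input location
)

cleaned = (
    raw.dropDuplicates(["order_id"])                  # assumed key column
       .withColumn("order_date", F.to_date("order_date"))
       .filter(F.col("amount").cast("double") > 0)    # basic quality rule
)

(
    cleaned.write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3://example-curated-bucket/orders/")   # assumed output location
)
```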
Amazon EKS, Amazon S3, AWS Glue, Amazon RDS, Amazon DynamoDB, Amazon Aurora, Amazon SageMaker, Amazon Bedrock (including LLM hosting and management). Expertise in workflow orchestration tools such as Apache Airflow. Experience implementing DataOps best practices and tooling, including DataOps.Live. Advanced skills in data storage and management platforms like Snowflake. Ability to deliver insightful analytics via business intelligence tools …
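As a rough illustration of the Apache Airflow orchestration called for above, the sketch below defines a two-task DAG (extract then load) using the standard PythonOperator. It assumes Airflow 2.4+, and the DAG id, schedule, and task bodies are placeholders.

```python
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

# Hypothetical two-step pipeline: extract then load. All names are placeholders.
def extract():
    print("extracting source data")

def load():
    print("loading into the warehouse")

with DAG(
    dag_id="example_daily_pipeline",     # assumed DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                   # Airflow 2.4+ style schedule argument
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task            # load runs only after extract succeeds
```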
pipeline development. • Working knowledge of Databricks, NiFi, AWS Redshift, and Elasticsearch. Preferred Skills • Databricks Workflows, Unity Catalog, Autoloader, Delta Live Tables/Delta Lake, Workspaces/Notebooks, and MLflow. • Apache Spark expertise. • Familiarity with AWS Professional certifications (Solutions Architect, Developer, Data Engineer). • Knowledge of CISA best practices, software supply chain security, and NIST SP 800-218. • Experience …
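To make the Databricks Auto Loader and Delta Lake items above concrete, here is a minimal streaming-ingest sketch intended for a Databricks notebook where a `spark` session already exists; the paths, schema location, and target table are hypothetical.

```python
# Minimal Auto Loader sketch for a Databricks notebook (where `spark` exists).
# All paths, schema locations, and table names are hypothetical placeholders.
stream = (
    spark.readStream
    .format("cloudFiles")                              # Auto Loader source
    .option("cloudFiles.format", "json")               # assumed raw file format
    .option("cloudFiles.schemaLocation", "/mnt/schemas/events")
    .load("/mnt/raw/events")                           # assumed landing path
)

(
    stream.writeStream
    .option("checkpointLocation", "/mnt/checkpoints/events")
    .trigger(availableNow=True)                        # incremental batch-style run
    .toTable("bronze.events")                          # assumed Delta target table
)
```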
Knowledge of MiFID II and Dodd-Frank regulations and controls. Knowledge/experience of FIX flows and RFQ workflows (TradeWeb, RFQ Hub, BlackRock, Bloomberg). Additional technology experience: React JS, Apache NiFi, Mongo, DBaaS, SaaS, Tibco/Solace or similar messaging middleware. Education: Bachelor's degree or equivalent experience operating in a similar role. This job description provides a high …
and managing cloud infrastructure as code Proficiency in programming languages such as Python, Spark, SQL Strong experience with SQL databases Expertise in data pipeline and workflow management tools (e.g., Apache Airflow, ADF) Experience with cloud platforms (Azure preferred) and related data services Excellent problem-solving skills and attention to detail Inclusive and curious, continuously seeks to build knowledge and …
and data pipelines for relational, dimensional, data lakehouse (medallion architecture), data warehouse, data mart, SQL and NoSQL data stores. • Utilize Microsoft Azure services including Azure Data Factory, Synapse Pipelines, Apache Spark Notebooks, Python, SQL, and stored procedures to develop high-performing data pipelines. • Redevelop existing SSIS extract, transform, and load scripts into Azure Data Factory and Synapse Pipelines. • Identify, create, prepare …
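For a flavour of the medallion-style (bronze to silver) pipelines this listing mentions, the PySpark sketch below could run in a Synapse or Databricks Spark notebook; the storage paths, business key, and quality rule are invented for illustration.

```python
from pyspark.sql import SparkSession, functions as F

# Bronze -> silver sketch in a medallion lakehouse layout.
# Storage paths and column names are hypothetical placeholders.
spark = SparkSession.builder.appName("medallion-sketch").getOrCreate()

bronze = spark.read.format("delta").load(
    "abfss://lake@exampleaccount.dfs.core.windows.net/bronze/sales"   # assumed path
)

silver = (
    bronze.dropDuplicates(["sale_id"])                       # assumed business key
          .withColumn("ingested_at", F.current_timestamp())  # audit column
          .filter(F.col("quantity") > 0)                     # basic quality rule
)

(
    silver.write
    .format("delta")
    .mode("overwrite")
    .save("abfss://lake@exampleaccount.dfs.core.windows.net/silver/sales")
)
```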
of automation IT WOULD BE NICE FOR THE SENIOR SOFTWARE ENGINEER TO HAVE.... Cloud-based experience Microservice architecture or serverless architecture Big Data/Messaging technologies such as Apache NiFi/MiNiFi/Kafka TO BE CONSIDERED.... Please either apply by clicking online or emailing me directly to . For further information, please call me on . I can …
Cheltenham, Gloucestershire, South West, United Kingdom
LM RECRUITMENT SOLUTIONS LTD
Software Engineers Cheltenham Permanent OR Contract Must hold Green Badge clearance Excellent salaries dependent on experience Experience with the below essential: Python, Java, OpenShift, Apache NiFi Bonus: CI/CD, ML, DevOps, AWS experience/cloud engineering (DESIRABLE …
to solve complex client challenges Strong software engineering foundation in Python, JavaScript/TypeScript, SQL, and cloud platforms such as AWS, GCP, or Azure Familiarity with data technologies like Apache Spark or Databricks, and a structured, analytical approach to problem-solving If you're passionate about building AI-powered applications that positively impact millions of people and businesses, and …
applicable. Essential Skills & Experience Proven experience owning and operating production data platforms within AWS. Strong understanding of AWS core services: EventBridge, Lambda, EC2, S3, and MWAA (Managed Workflows for Apache Airflow). Experience with infrastructure reliability, observability tooling, and platform automation. Solid experience with CI/CD pipelines, preferably Bitbucket Pipelines. Familiarity with Snowflake administration and deployment practices. Comfortable …
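To illustrate the EventBridge/Lambda/S3 combination named above, here is a minimal Lambda handler sketch that could be targeted by an EventBridge rule and archive the triggering event to S3; the bucket name and key prefix are assumptions.

```python
import json
import boto3

s3 = boto3.client("s3")

def handler(event, context):
    """Minimal sketch of a Lambda invoked by an EventBridge rule.

    Writes the incoming event to S3 as an audit record. The bucket name
    and key prefix are hypothetical placeholders.
    """
    key = f"events/{event.get('id', 'unknown')}.json"
    s3.put_object(
        Bucket="example-platform-audit-bucket",   # assumed bucket
        Key=key,
        Body=json.dumps(event).encode("utf-8"),
    )
    return {"status": "stored", "key": key}
```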
Oxford, England, United Kingdom Hybrid / WFH Options
Akrivia Health
cloud technologies and modern engineering practices. ● Experience with the following technologies: o Cloud Provider: AWS o Languages: Python, PHP, Rust & SQL o Hosting: Kubernetes o Tooling & Analytics: Airflow, RabbitMQ, Apache Spark, Power BI ● Proven ability to complete projects according to outlined scope, budget, and timeline ● Experience with industry-standard tools such as Microsoft products, Jira, Confluence, and project management tools ● Ability …
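Since RabbitMQ appears in the tooling list above, here is a minimal publish sketch using the pika client; the host, queue name, and message body are placeholders.

```python
import pika

# Minimal sketch: publish one persistent message to a RabbitMQ queue with pika.
# Host, queue name, and message body are hypothetical placeholders.
connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
channel = connection.channel()

channel.queue_declare(queue="ingest-tasks", durable=True)
channel.basic_publish(
    exchange="",
    routing_key="ingest-tasks",
    body=b'{"job": "example"}',
    properties=pika.BasicProperties(delivery_mode=2),  # persist the message to disk
)
connection.close()
```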
Banbury, South East England, United Kingdom Hybrid / WFH Options
Akrivia Health
cloud technologies and modern engineering practices. ● Experience with the following technologies: o Cloud Provider: AWS o Languages: Python, PHP, Rust & SQL o Hosting: Kubernetes o Tooling & Analytics: Airflow, RabbitMQ, Apache Spark, Power BI ● Proven ability to complete projects according to outlined scope, budget, and timeline ● Experience with industry-standard tools such as Microsoft products, Jira, Confluence, and project management tools ● Ability …
A minimum of 6 years of experience in data engineering. Education: A degree in Computer Science or an equivalent qualification. Skills: Proficiency in SQL Server, Azure Cosmos DB, and Apache Spark. Communication: Strong communication and organisational skills. Advantageous Skills: The following skills and experiences are considered a strong advantage: Experience in Azure Databricks is a strong advantage. Experience in Azure …
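As a small illustration of the Azure Cosmos DB skill listed above, the sketch below upserts and queries a document through the azure-cosmos SQL API client; the account endpoint, key, database, container, and fields are all placeholders.

```python
from azure.cosmos import CosmosClient

# Minimal sketch: upsert and query a document in Azure Cosmos DB (SQL API).
# Endpoint, key, and item names are hypothetical placeholders.
client = CosmosClient(
    "https://example-account.documents.azure.com:443/",  # assumed account endpoint
    credential="<account-key>",                          # assumed key or token
)
database = client.get_database_client("appdb")
container = database.get_container_client("orders")

container.upsert_item({"id": "order-1", "customer": "c-42", "amount": 19.99})

results = container.query_items(
    query="SELECT c.id, c.amount FROM c WHERE c.customer = @customer",
    parameters=[{"name": "@customer", "value": "c-42"}],
    enable_cross_partition_query=True,
)
for item in results:
    print(item)
```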
Newcastle Upon Tyne, Tyne and Wear, England, United Kingdom
Norton Rose Fulbright LLP
Azure/Microsoft Fabric/Data Factory) and modern data warehouse technologies (Databricks, Snowflake) Experience with database technologies such as RDBMS (SQL Server, Oracle) or NoSQL (MongoDB) Knowledge of Apache technologies such as Spark, Kafka, and Airflow to build scalable and efficient data pipelines Ability to design, build, and deploy data solutions that explore, capture, transform, and utilize data …
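To make the Spark-plus-Kafka pipeline point above concrete, here is a minimal Structured Streaming sketch that reads a Kafka topic and writes to the console; the broker address and topic are assumptions, and the spark-sql-kafka connector package must be on the classpath.

```python
from pyspark.sql import SparkSession, functions as F

# Minimal Kafka -> console Structured Streaming sketch.
# Broker and topic names are hypothetical placeholders; running this also
# requires the spark-sql-kafka connector package to be available.
spark = SparkSession.builder.appName("kafka-stream-sketch").getOrCreate()

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")  # assumed broker
    .option("subscribe", "orders")                        # assumed topic
    .load()
    .select(F.col("key").cast("string"), F.col("value").cast("string"))
)

query = (
    events.writeStream
    .format("console")
    .option("checkpointLocation", "/tmp/checkpoints/orders")
    .start()
)
query.awaitTermination()
```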
data quality, security, and best practices Collaborate with cross-functional teams Implement and manage MLOps capabilities Essential Skills: Advanced Python programming skills Expertise in data engineering tools and frameworks (Apache Flink) Hands-on AWS experience (Serverless, CloudFormation, CDK) Strong understanding of containerization, CI/CD, and DevOps Modern data storage knowledge Sound like you? Please get your CV over …
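As one way to illustrate the Apache Flink expertise mentioned above from Python, the PyFlink Table API sketch below builds a tiny in-memory table and filters it; the data, column names, and threshold are invented.

```python
from pyflink.table import EnvironmentSettings, TableEnvironment
from pyflink.table.expressions import col

# Tiny PyFlink Table API sketch: build an in-memory table and filter it.
# Data values and column names are hypothetical placeholders.
env_settings = EnvironmentSettings.in_batch_mode()
t_env = TableEnvironment.create(env_settings)

orders = t_env.from_elements(
    [(1, "widget", 20.0), (2, "gadget", 5.0)],
    ["order_id", "product", "amount"],
)

large_orders = orders.filter(col("amount") > 10.0)   # assumed threshold
large_orders.execute().print()
```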
CLI Preferred Experience with software frameworks for big data such as Splunk and Elastic Stack Experience with IaC (Infrastructure as Code) principles and automation tools including Ansible Experience using Apache NiFi, REST APIs Experience with testing frameworks or automation tools …
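To illustrate the Apache NiFi and REST API experience mentioned here, this small sketch queries a NiFi instance's flow status endpoint with the requests library; the base URL assumes an unsecured local development instance, and the response key shown is an assumption about the typical payload shape.

```python
import requests

# Minimal sketch: query a NiFi instance's flow status over its REST API.
# The base URL assumes a local, unsecured dev instance; secured clusters
# would also need token- or certificate-based authentication.
NIFI_API = "http://localhost:8080/nifi-api"   # assumed base URL

response = requests.get(f"{NIFI_API}/flow/status", timeout=10)
response.raise_for_status()

status = response.json()
print(status.get("controllerStatus", {}))     # assumed key, e.g. queued flowfile counts
```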
or another language such as Python Good knowledge of developing in a Linux environment Working knowledge of Git version control and GitLab CI/CD pipelines Experience working with Apache NiFi Some exposure to front-end elements like JavaScript, TypeScript or React Some data interrogation with Elasticsearch and Kibana Exposure to working with Atlassian products Looking for a role …
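For the Elasticsearch data-interrogation point above, here is a minimal match-query sketch against a local index using requests; the index name, field, and search term are placeholders.

```python
import requests

# Minimal sketch: run a match query against a local Elasticsearch index.
# Host, index, and field names are hypothetical placeholders.
ES_URL = "http://localhost:9200"

query = {"query": {"match": {"message": "error"}}, "size": 5}
response = requests.post(f"{ES_URL}/logs/_search", json=query, timeout=10)
response.raise_for_status()

for hit in response.json()["hits"]["hits"]:
    print(hit["_id"], hit["_source"].get("message"))
```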
and Data Practice. You will have the following experience: 8+ years of experience in data engineering or cloud development. Strong hands-on experience with AWS services Proficiency in Databricks, Apache Spark, SQL, and Python. Experience with data modeling, data warehousing, and DevOps practices. Familiarity with Delta Lake, Unity Catalog, and Databricks REST APIs. Excellent problem-solving and communication skills.
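As a small example of the Databricks REST API familiarity listed above, the sketch below lists workspace clusters via the Clusters API; the workspace URL and token are read from the environment and are placeholders.

```python
import os
import requests

# Minimal sketch: list clusters in a Databricks workspace via its REST API.
# Workspace URL and token come from the environment; the values are placeholders.
host = os.environ.get("DATABRICKS_HOST", "https://example.cloud.databricks.com")
token = os.environ["DATABRICKS_TOKEN"]

response = requests.get(
    f"{host}/api/2.0/clusters/list",
    headers={"Authorization": f"Bearer {token}"},
    timeout=30,
)
response.raise_for_status()

for cluster in response.json().get("clusters", []):
    print(cluster.get("cluster_name"), cluster.get("state"))
```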