mocking frameworks, and Git version control. Experienced in Agile (SCRUM) development, relational (Oracle) and NoSQL (Mongo) databases. Familiar with GitLab CI/CD Pipelines, Apache NiFi, and Atlassian Suite (Bitbucket, Jira, Confluence). Knowledge of JavaScript/TypeScript, React, Elasticsearch, Kibana, Hibernate, JSF (PrimeFaces), and AWS cloud services. Skilled …
Infra tooling using Terraform, Ansible and Jenkins, whilst automating everything with Python. Tech (experience in any listed is advantageous): Python; Cloud: AWS; Lakehouse: Apache Spark or AWS Glue; Cloud-native storage: Iceberg, RDS, Redshift, Kafka; IaC: Terraform, Ansible; CI/CD: Jenkins, GitLab; other platforms such as Databricks …
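As a rough illustration of the "automating everything with Python" requirement above, the sketch below uses boto3 to trigger an AWS Glue job and poll it to completion. The job name, region and polling interval are hypothetical placeholders; the job itself would typically be provisioned separately, for example with Terraform.

```python
"""Minimal sketch: trigger an AWS Glue job from Python and wait for it to finish.

Assumes a Glue job named "nightly-ingest" already exists and that AWS
credentials are available in the environment. Names are illustrative only.
"""
import time
import boto3

glue = boto3.client("glue", region_name="eu-west-2")

run_id = glue.start_job_run(JobName="nightly-ingest")["JobRunId"]

# Poll the run until Glue reports a terminal state.
while True:
    state = glue.get_job_run(JobName="nightly-ingest", RunId=run_id)["JobRun"]["JobRunState"]
    if state in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT"):
        break
    time.sleep(30)

print(f"Glue job run {run_id} finished with state {state}")
```

In practice this kind of script would usually be wrapped in a Jenkins or GitLab CI job rather than run by hand.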
Agile (Scrum) methodologies; database experience with Oracle and/or MongoDB; experience using the Atlassian suite: Bitbucket, Jira, and Confluence. Desirable Skills: knowledge of Apache NiFi; front-end development with React (JavaScript/TypeScript); working knowledge of Elasticsearch and Kibana; experience developing for cloud environments, particularly AWS (EC2, EKS …
of this team, you will be working on a plethora of services such as Glue (ETL service), Athena (interactive query service), Managed Workflows for Apache Airflow (MWAA), etc. Understanding of ETL (Extract, Transform, Load); creation of ETL pipelines to extract and ingest data into a data lake/warehouse with simple …
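To make the ETL-pipeline point concrete, here is a minimal Airflow DAG sketch of the kind that might run on Managed Workflows for Apache Airflow. The schedule, task bodies and data are invented placeholders; a real pipeline would pull from source systems and write to the lake or warehouse.

```python
"""Minimal extract-transform-load DAG sketch (e.g. for MWAA).
Function bodies, schedule and names are illustrative placeholders only."""
from datetime import datetime
from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def simple_etl():
    @task
    def extract() -> list[dict]:
        # In practice: pull from an API, S3 object, or source database.
        return [{"id": 1, "amount": 10.0}, {"id": 2, "amount": 25.5}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # Simple cleaning step; real jobs would validate and enrich here.
        return [r for r in rows if r["amount"] > 0]

    @task
    def load(rows: list[dict]) -> None:
        # In practice: write to the data lake/warehouse (S3, Redshift, an Athena table).
        print(f"Loading {len(rows)} rows")

    load(transform(extract()))


simple_etl()
```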
Java development with strong Spring Boot expertise. Solid background in building cloud-native applications, particularly using AWS services such as S3, SQS, Kinesis, or Apache Flink. Strong knowledge of stream processing, low-latency systems, and microservice architectures. Demonstrated leadership experience, including mentoring and guiding junior developers. Proficient in designing …
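As a hedged illustration of the stream-processing side of this role, the sketch below polls one shard of a hypothetical Kinesis stream with boto3. Production consumers would normally use Apache Flink or the Kinesis Client Library rather than raw polling; the stream name and throttle are assumptions.

```python
"""Minimal sketch: read records from one shard of a Kinesis stream with boto3.
The stream name "orders-stream" is a hypothetical placeholder."""
import time
import boto3

kinesis = boto3.client("kinesis", region_name="eu-west-2")

shard_id = kinesis.describe_stream(StreamName="orders-stream")["StreamDescription"]["Shards"][0]["ShardId"]
iterator = kinesis.get_shard_iterator(
    StreamName="orders-stream", ShardId=shard_id, ShardIteratorType="LATEST"
)["ShardIterator"]

while True:
    resp = kinesis.get_records(ShardIterator=iterator, Limit=100)
    for record in resp["Records"]:
        print(record["Data"])  # raw bytes; decode/deserialise as needed
    iterator = resp["NextShardIterator"]
    time.sleep(1)  # simple throttle; not a low-latency pattern
```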
Greater Bristol Area, United Kingdom Hybrid / WFH Options
Peaple Talent
and familiar with setting up CI/CD workflows using platforms like Azure DevOps or similar tools. Hands-on experience with orchestration tools like Apache Airflow for managing complex data workflows. Practical familiarity with low-code or no-code platforms such as Talend and SnapLogic for streamlined pipeline development.
Snowflake. Understanding of cloud platform infrastructure and its impact on data architecture. Data Technology Skills: a solid understanding of big data technologies such as Apache Spark, and knowledge of Hadoop ecosystems. Knowledge of programming languages such as Python, R, or Java is beneficial. Exposure to ETL/ELT processes …
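As a small example of the Spark skills referenced in listings like the one above, the PySpark snippet below performs a basic ETL-style aggregation. The bucket paths, column names and schema are invented for illustration.

```python
"""Minimal PySpark sketch: read raw CSV, aggregate, and write partitioned Parquet.
Paths and column names are hypothetical placeholders."""
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("example-etl").getOrCreate()

# Read raw order records from the landing area of a data lake.
orders = spark.read.csv("s3a://raw-bucket/orders/", header=True, inferSchema=True)

# Aggregate revenue and order counts per day.
daily_revenue = (
    orders
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("order_date")
    .agg(F.sum("amount").alias("revenue"), F.count("*").alias("order_count"))
)

# Write the curated result back to the lake, partitioned by date.
daily_revenue.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3a://curated-bucket/daily_revenue/"
)
```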
Nottinghamshire, United Kingdom Hybrid / WFH Options
Akkodis
Data Engineer (Python, Data Modelling, ETL/ELT, Apache Airflow, dbt, AWS). Enterprise-scale tech firm. Up to £70,000 plus benefits. FULLY REMOTE (UK). Are you a Data Engineering enthusiast who thrives on designing and implementing robust ETL processes, highly scalable data structures and data pipelines within a …
Python. Experience in data modelling and design patterns; in-depth knowledge of relational databases (PostgreSQL) and familiarity with data lakehouse formats (storage formats, e.g. Apache Parquet, Delta tables). Experience with Spark, Databricks, data lakes/lakehouses. Experience working with external data suppliers (defining requirements for suppliers, defining Service …
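For the storage-format point above, a columnar file format like Apache Parquet can be exercised without a cluster at all. The pyarrow sketch below (hypothetical column names and values) shows the basic round trip that Spark and Databricks lakehouse tables build on.

```python
"""Minimal sketch: write and read an Apache Parquet file with pyarrow.
Column names and values are illustrative only."""
import pyarrow as pa
import pyarrow.parquet as pq

table = pa.table({
    "trade_id": [1, 2, 3],
    "symbol": ["ABC", "ABC", "XYZ"],
    "price": [101.5, 102.0, 55.25],
})

pq.write_table(table, "trades.parquet")  # columnar, compressed on disk

# Reading back only the columns you need is what makes the format efficient.
prices = pq.read_table("trades.parquet", columns=["symbol", "price"])
print(prices.to_pydict())
```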
working with hierarchical reference data models. Proven expertise in handling high-throughput, real-time market data streams. Familiarity with distributed computing frameworks such as Apache Spark. Operational experience supporting real-time systems. Equal Opportunity Workplace: We are proud to be an equal opportunity workplace. We do not discriminate based …
e.g., Refinitiv, Bloomberg). Data Platforms: Warehouses: Snowflake, Google BigQuery, or Amazon Redshift. Analytics: Tableau, Power BI, or Looker for client reporting. Big Data: Apache Spark or Hadoop for large-scale processing. AI/ML: TensorFlow or Databricks for predictive analytics. Integration Technologies: API Management: Apigee, AWS API Gateway …
Leeds, Yorkshire, United Kingdom Hybrid / WFH Options
Low Carbon Contracts Company
design principles for usability, maintainability and extensibility; experience working with Git in a version-controlled environment; good knowledge of parallel computing techniques (Python multiprocessing, Apache Spark) and performance profiling and optimisation; good understanding of data structures and algorithms; an enthusiastic problem-solving mindset with a desire to solve technical …
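As a hedged illustration of the multiprocessing and profiling requirement above, this sketch parallelises a CPU-bound function across a process pool and times it against the serial version. The workload is artificial and the speed-up will depend on the number of cores available.

```python
"""Minimal sketch: parallelise a CPU-bound task with multiprocessing and compare
wall-clock time against the serial version. The workload is artificial."""
import time
from multiprocessing import Pool


def slow_square(n: int) -> int:
    # Deliberately wasteful loop to simulate CPU-bound work.
    total = 0
    for _ in range(1_000_000):
        total += n * n
    return total


if __name__ == "__main__":
    inputs = list(range(32))

    start = time.perf_counter()
    serial = [slow_square(n) for n in inputs]
    print(f"serial:   {time.perf_counter() - start:.2f}s")

    start = time.perf_counter()
    with Pool() as pool:  # one worker per CPU core by default
        parallel = pool.map(slow_square, inputs)
    print(f"parallel: {time.perf_counter() - start:.2f}s")

    assert serial == parallel
```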
West Midlands, England, United Kingdom Hybrid / WFH Options
Aubay UK
migration projects, particularly large-scale migrations to distributed database platforms. Hands-on experience with big data processing technologies, including Spark (PySpark and Spark Scala) and Apache Airflow. Expertise in distributed databases and computing environments. Familiarity with Enterprise Architecture methodologies, ideally TOGAF. Strong leadership experience, including managing technology teams and delivering …
availability. Very strong technical background leading application development, with experience in some or all of the following technologies: Python, Java, Spring Boot, TensorFlow, PyTorch, Apache Spark, Kafka, Jenkins, Git/Bitbucket, Terraform, Docker, ECS/EKS, IntelliJ, JIRA, Confluence, React/TypeScript, Selenium, Redux, JUnit, Cucumber/Gherkin. About …
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Duel
Snowflake. You understand event-driven architectures and real-time data processing. You have experience implementing and maintaining scalable data pipelines using tools like dbt, Apache Airflow, or similar. You have no issue working with either structured or semi-structured data. You are comfortable working with data engineering and scripting …
GitHub integration and automation). Experience with scripting languages such as Python or R. Working knowledge of message queuing and stream processing. Experience with Apache Spark or similar technologies. Experience with Agile and Scrum methodologies. Familiarity with dbt and Airflow is an advantage. Experience working in a start-up …
or all of the services below would put you at the top of our list: Google Cloud Storage, Google Data Transfer Service, Google Dataflow (Apache Beam), Google Pub/Sub, Google Cloud Run, BigQuery or any RDBMS, Python, Debezium/Kafka, dbt (Data Build Tool). Interview process: Interviewing is a two-way …
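To ground the Dataflow/Apache Beam item above, here is a minimal Beam pipeline in Python that runs locally with the DirectRunner; on Dataflow the same pipeline shape would simply take different pipeline options. The input data and keys are invented for illustration.

```python
"""Minimal Apache Beam sketch (runs locally with the DirectRunner).
Input data is invented; real jobs would read from Pub/Sub or GCS and write to
BigQuery rather than printing."""
import apache_beam as beam

with beam.Pipeline() as pipeline:
    (
        pipeline
        | "Create" >> beam.Create([("clicks", 3), ("clicks", 5), ("views", 7)])
        | "SumPerKey" >> beam.CombinePerKey(sum)
        | "Print" >> beam.Map(print)
    )
```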