Nottinghamshire, United Kingdom Hybrid / WFH Options
Akkodis
Data Engineer (Python, Data Modelling, ETL/ELT, Apache Airflow, DBT, AWS) Enterprise-scale tech firm Up to £70,000 plus benefits - FULLY REMOTE UK Are you a Data Engineering enthusiast who thrives on designing and implementing robust ETL processes, highly scalable data structures and data pipelines within a More ❯
Python. Experience in data modelling and design patterns; in-depth knowledge of relational databases (PostgreSQL) and familiarity with data lakehouse formats (storage formats, e.g. Apache Parquet, Delta tables). Experience with Spark, Databricks, data lakes/lakehouses. Experience working with external data suppliers (defining requirements for suppliers, defining Service More ❯
working with hierarchical reference data models. Proven expertise in handling high-throughput, real-time market data streams. Familiarity with distributed computing frameworks such as Apache Spark. Operational experience supporting real-time systems. Equal Opportunity Workplace We are proud to be an equal opportunity workplace. We do not discriminate based More ❯
understanding of financial markets. Experience working with hierarchical reference data models. Proven expertise in handling high-throughput, real-time market data streams. Familiarity with distributed computing frameworks such as Apache Spark. Operational experience supporting real-time systems. Equal Opportunity Workplace We are proud to be an equal opportunity workplace. We do not discriminate based upon More ❯
availability. Very strong technical background leading application development - with experience in some or all of the following technologies: Python, Java, Spring Boot, TensorFlow, PyTorch, Apache Spark, Kafka, Jenkins, Git/Bitbucket, Terraform, Docker, ECS/EKS, IntelliJ, JIRA, Confluence, React/TypeScript, Selenium, Redux, JUnit, Cucumber/Gherkin. About More ❯
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Duel
Snowflake. You understand event-driven architectures and real-time data processing. You have experience implementing and maintaining scalable data pipelines using tools like dbt, Apache Airflow, or similar. You have no issue working with either structured or semi-structured data. You are comfortable working with data engineering and scripting More ❯
GitHub integration and automation). Experience with scripting languages such as Python or R. Working knowledge of message queuing and stream processing. Experience with Apache Spark or similar technologies. Experience with Agile and Scrum methodologies. Familiarity with dbt and Airflow is an advantage. Experience working in a start-up More ❯
or all of the services below would put you at the top of our list: Google Cloud Storage. Google Data Transfer Service. Google Dataflow (Apache Beam). Google PubSub. Google CloudRun. BigQuery or any RDBMS. Python. Debezium/Kafka. dbt (Data Build tool). Interview process Interviewing is a two-way More ❯
services experience is desired but not essential. API development (FastAPI, Flask) Tech stack: Azure, Python, Databricks, Azure DevOps, ChatGPT, Groq, Cursor AI, JavaScript, SQL, Apache Spark, Kafka, Airflow, Azure ML, Docker, Kubernetes and many more. Role Overview: We are looking for someone who is as comfortable developing AI/ More ❯
london, south east england, united kingdom Hybrid / WFH Options
Aventis Solutions
services experience is desired but not essential. API development (FastAPI, Flask) Tech stack: Azure, Python, Databricks, Azure DevOps, ChatGPT, Groq, Cursor AI, JavaScript, SQL, Apache Spark, Kafka, Airflow, Azure ML, Docker, Kubernetes and many more. Role Overview: We are looking for someone who is as comfortable developing AI/ More ❯
Newcastle Upon Tyne, England, United Kingdom Hybrid / WFH Options
Noir
in a data-focused role, with a strong passion for working with data and delivering value to stakeholders. Strong proficiency in SQL, Python, and Apache Spark, with hands-on experience using these technologies in a production environment. Experience with Databricks and Microsoft Azure is highly desirable. Financial Services experience More ❯
to adopt in order to enhance our platform. What you'll do: Develop across our evolving technology stack - we're using Python, Java, Kubernetes, Apache Spark, Postgres, ArgoCD, Argo Workflows, Seldon, MLflow and more. We are migrating into AWS cloud and adopting many services that are available in that More ❯
or a similar language. Proficiency in database technologies (SQL, NoSQL, time-series databases) and data modelling. Strong understanding of data pipeline orchestration tools (e.g., Apache Airflow, Kubernetes). You thrive when working as part of a team. Comfortable in a fast-paced environment. Have excellent written and verbal English More ❯
levels. Prior experience contributing to open-source projects or standards bodies (e.g., JCP). Some familiarity with the Hazelcast platform or similar technologies (e.g., Apache Ignite, Redis, AWS ElastiCache, Oracle Coherence, Kafka, etc.). Experience writing technical whitepapers or benchmark reports. BENEFITS 25 days annual leave + Bank holidays More ❯
Saffron Walden, Essex, South East, United Kingdom Hybrid / WFH Options
EMBL-EBI
Experience in developing web infrastructure (Solr, Kubernetes) Experience in Git and basic Unix commands You may also have Experience with large data processing technologies (Apache Spark) Other helpful information: The team works in a hybrid working pattern and spends 2 days per week in the office Apply now! Benefits and More ❯
to Have: AWS Certified Data Engineer, or AWS Certified Data Analytics, or AWS Certified Solutions Architect Experience with big data tools and technologies like Apache Spark, Hadoop, and Kafka Knowledge of CI/CD pipelines and automation tools such as Jenkins or GitLab CI About Adastra For more than More ❯
or all of the services below would put you at the top of our list: Google Cloud Storage. Google Data Transfer Service. Google Dataflow (Apache Beam). Google PubSub. Google CloudRun. BigQuery or any RDBMS. Python. Debezium/Kafka. dbt (Data Build tool). Interview process Interviewing is a More ❯
Warwick, Warwickshire, United Kingdom Hybrid / WFH Options
ICEO
English (B2 level or higher). Nice to Have: Experience with Argo CD and Argo Rollouts. Familiarity with technologies such as Kafka, Redis, Nginx, Apache HTTP Server, OpenVPN, and NATS. Knowledge of logging tools (Kibana, FluentD, Elasticsearch). Expertise in configuring, managing, and optimizing large PostgreSQL databases. Understanding of More ❯
Modelling: Designing dimensional and relational models. Document data lineage and recommend improvements for data ownership and stewardship. Qualifications Programming: Python, SQL, Scala, Java. Big Data: Apache Spark, Hadoop, Databricks, Snowflake, etc. Cloud: AWS (Glue, Redshift), Azure (Synapse, Data Factory More ❯
FOR THE SENIOR SOFTWARE ENGINEER TO HAVE. Cloud-based experience Microservice architecture or serverless architecture Big Data/Messaging technologies such as Apache NiFi/MiNiFi/Kafka TO BE CONSIDERED. Please either apply by clicking online or emailing me directly to For further information please More ❯
Compute Engine, Kubernetes Engine (GKE), Cloud Storage, BigQuery, and IAM. Terraform for IaC Docker Kubernetes Linux (sysadmin including firewalls and hardening) Web Server Config (Apache, Nginx) Database management (MongoDB & MySQL) for high availability and backups Git for version control Programming/Scripting languages like Node/TypeScript, Python Serverless More ❯
with cloud data platforms (Azure, AWS, GCP). Familiarity with machine learning and AI concepts. Experience with data platform migrations and integration tools (e.g., Apache Airflow). Knowledge of Python or other programming languages. Certification in relevant data engineering or cloud platforms. Personal Attributes: High level of accuracy and More ❯