Newport, Midlands, United Kingdom Hybrid / WFH Options
Undisclosed
of these areas - Scala, Docker, Puppet, IntelliJ, IDE, Sbt Knowledge of - Java Development, WebLogic, webMethods, Java Scripting, J2SE, J2EE, Spring, EJB, HTML, HTML5, Unix, Eclipse, SOAP, XML, REST, JBoss, Apache, Tomcat, SQL, Hibernate, JUnit, Selenium (Automation), Git Ability to work in all areas of the project life cycle. Strong working knowledge of Agile approach & methodologies Good interpersonal and communication More ❯
pipelines and ETL processes. Proficiency in Python. Experience with cloud platforms (AWS, Azure, or GCP). Knowledge of data modelling, warehousing, and optimisation. Familiarity with big data frameworks (e.g. Apache Spark, Hadoop). Understanding of data governance, security, and compliance best practices. Strong problem-solving skills and experience working in agile environments. Desirable: Experience with Docker/Kubernetes, streaming More ❯
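The ETL pipeline skills this listing asks for can be sketched with a minimal extract-transform-load flow in pure Python (illustrative only; the data, column names, and "warehouse" dict are invented for the example — a real pipeline would use Spark, Airflow, or a cloud service):

```python
# Minimal ETL sketch: extract CSV rows, transform them, load into an
# in-memory "warehouse" aggregate. Purely illustrative data and names.
import csv
import io

RAW = """id,city,temp_c
1,Leeds,12.5
2,York,11.0
3,Leeds,14.0
"""

def extract(text):
    # Extract: parse raw CSV text into dict rows.
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    # Transform: cast types and derive a Fahrenheit column.
    for r in rows:
        r["temp_c"] = float(r["temp_c"])
        r["temp_f"] = r["temp_c"] * 9 / 5 + 32
    return rows

def load(rows):
    # Load: toy "warehouse" holding average Celsius temperature per city.
    agg = {}
    for r in rows:
        agg.setdefault(r["city"], []).append(r["temp_c"])
    return {city: sum(vals) / len(vals) for city, vals in agg.items()}

warehouse = load(transform(extract(RAW)))
print(warehouse)  # → {'Leeds': 13.25, 'York': 11.0}
```

The same extract/transform/load split scales up directly: in Spark the `transform` step becomes DataFrame operations and `load` a write to a warehouse table.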
Science, Computer Science, or a related field. 5+ years of experience in data engineering and data quality. Strong proficiency in Python/Java, SQL, and data processing frameworks including Apache Spark. Knowledge of machine learning and its data requirements. Attention to detail and a strong commitment to data integrity. Excellent problem-solving skills and ability to work in a More ❯
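The "data quality" and "data integrity" requirements above usually mean automated validation checks. A minimal sketch in plain Python, with invented records and rules (null checks, range checks, uniqueness):

```python
# Illustrative data-quality checks: nulls, value ranges, duplicate keys.
# Records and thresholds are invented for the example.
records = [
    {"id": 1, "age": 34},
    {"id": 2, "age": None},   # null value
    {"id": 3, "age": 210},    # out of plausible range
    {"id": 3, "age": 28},     # duplicate id
]

def check_quality(rows):
    issues = []
    seen_ids = set()
    for r in rows:
        if r["age"] is None:
            issues.append((r["id"], "null age"))
        elif not 0 <= r["age"] <= 130:
            issues.append((r["id"], "age out of range"))
        if r["id"] in seen_ids:
            issues.append((r["id"], "duplicate id"))
        seen_ids.add(r["id"])
    return issues

problems = check_quality(records)
print(problems)  # → [(2, 'null age'), (3, 'age out of range'), (3, 'duplicate id')]
```

In production the same rules would typically live in a framework (e.g. Spark jobs or a validation library) and run as part of the pipeline rather than ad hoc.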
London (City of London), South East England, United Kingdom
Humanoid
Amazon EKS, Amazon S3, AWS Glue, Amazon RDS, Amazon DynamoDB, Amazon Aurora, Amazon SageMaker, Amazon Bedrock (including LLM hosting and management). Expertise in workflow orchestration tools such as Apache Airflow Experience implementing DataOps best practices and tooling, including DataOps.Live Advanced skills in data storage and management platforms like Snowflake Ability to deliver insightful analytics via business intelligence tools More ❯
Newcastle Upon Tyne, Tyne and Wear, England, United Kingdom
Norton Rose Fulbright LLP
Azure/Microsoft Fabric/Data Factory) and modern data warehouse technologies (Databricks, Snowflake) Experience with database technologies such as RDBMS (SQL Server, Oracle) or NoSQL (MongoDB) Knowledge in Apache technologies such as Spark, Kafka and Airflow to build scalable and efficient data pipelines Ability to design, build, and deploy data solutions that explore, capture, transform, and utilize data More ❯
scalable pipelines, data platforms, and integrations, while ensuring solutions meet regulatory standards and align with architectural best practices. Key Responsibilities: Build and optimise scalable data pipelines using Databricks and Apache Spark (PySpark). Ensure performance, scalability, and compliance (GxP and other standards). Collaborate on requirements, design, and backlog refinement. Promote engineering best practices including CI/CD, code More ❯
London (City of London), South East England, United Kingdom
Fimador
Better Placed Ltd - A Sunday Times Top 10 Employer!
across multiple sources. Integrate services via RESTful APIs and manage structured/unstructured data formats. Data Visualisation & Automation Build interactive dashboards in BI tools such as Power BI or Apache Superset. Automate KPI tracking and reporting to streamline workflows. Partner with teams to identify opportunities for process optimisation. Apply best visualisation principles for clarity and impact. Ensure dashboard performance More ❯
the wider business. Skills & Experience: Strong experience with Typescript and React, developing modern, scalable, and high-performance web applications. Proficient in PHP, with experience in a LAMP (Linux/Apache/MariaDB/PHP) environment considered advantageous. Experience in migrating legacy systems to modern architectures and frameworks. Solid understanding of RESTful API development, web application design patterns, and secure More ❯
external suppliers, with annual budgets ranging from £1M to £2M+. Essential Skills Proven experience as a Data Engineer (or similar/related role) Experience with Azure Data Factory, Databricks, or Apache Spark, following modern ETL/ELT principles. Experience of using programming languages such as Python, Scala and SQL. Demonstrable knowledge of data modelling and data warehousing within platforms such More ❯
Stroud, England, United Kingdom Hybrid / WFH Options
Ecotricity
for best practice and technical excellence and be a person who actively looks for continual improvement opportunities. Knowledge and skills Experience as a Data Engineer or Analyst Databricks/Apache Spark SQL/Python Bitbucket/GitHub. Advantageous: dbt, AWS, Azure DevOps, Terraform, Atlassian (Jira, Confluence) About Us What's in it for you... Healthcare plan, life assurance and More ❯
or another language such as Python Good knowledge of developing in a Linux environment Working knowledge of Git version control and GitLabs CI/CD pipelines Experience working with Apache NiFi Some exposure to front-end elements like JavaScript, TypeScript or React Some data interrogation with ElasticSearch and Kibana Exposure to working with Atlassian products Looking for a role More ❯
London (City of London), South East England, United Kingdom Hybrid / WFH Options
Singular Recruitment
Advanced-level Python for data applications and high proficiency in SQL (query tuning, complex joins) Hands-on experience designing and deploying ETL/ELT pipelines using Google Cloud Dataflow (Apache Beam) or similar tools Proficiency in data architecture, data modeling, and scalable storage design Solid engineering practices: Git and CI/CD for data systems Highly Desirable Skills GCP More ❯
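The "complex joins" and query skills this listing names can be illustrated with Python's built-in `sqlite3`, so the example stays self-contained (tables, names, and figures are invented; a real system would target a warehouse like BigQuery or Postgres):

```python
# Sketch of a join + aggregation of the kind the listing mentions,
# using in-memory SQLite. All data is invented for the example.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE customers(id INTEGER, name TEXT);
CREATE TABLE orders(id INTEGER, customer_id INTEGER, total REAL);
INSERT INTO customers VALUES (1, 'Acme'), (2, 'Globex');
INSERT INTO orders VALUES (10, 1, 99.0), (11, 1, 1.0), (12, 2, 50.0);
""")

# LEFT JOIN keeps customers with no orders; GROUP BY rolls up revenue.
rows = con.execute("""
    SELECT c.name, SUM(o.total) AS revenue
    FROM customers AS c
    LEFT JOIN orders AS o ON o.customer_id = c.id
    GROUP BY c.name
    ORDER BY revenue DESC
""").fetchall()
print(rows)  # → [('Acme', 100.0), ('Globex', 50.0)]
```

Query tuning in practice then comes down to indexing the join keys (here `orders.customer_id`) and reading the query plan.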
tools, particularly Terraform. Experience with network design, administration, and troubleshooting. Knowledge of programming languages (e.g., JavaScript, Node.js, PHP). Experience with version control systems, ideally Git. Web server configuration (Apache, Nginx). Database management (MySQL, MongoDB), including high availability and backup solutions. Hands-on experience managing cloud providers, with significant experience in AWS and Google Cloud Platform (GCP). More ❯
Winchester, Hampshire, England, United Kingdom Hybrid / WFH Options
Ada Meher
days a week, based on business need. To Be Considered: Demonstrable expertise and experience working on large-scale Data Engineering projects Strong experience in Python/PySpark, Databricks & Apache Spark Hands-on experience with both batch & streaming pipelines Strong experience in AWS and associated tooling (e.g. S3, Glue, Redshift, Lambda, Terraform etc) Experience designing Data Engineering platforms from scratch Alongside More ❯
teams to build scalable data pipelines and contribute to digital transformation initiatives across government departments. Key Responsibilities Design, develop and maintain robust data pipelines using PostgreSQL and Airflow or Apache Spark Collaborate with frontend/backend developers using Node.js or React Implement best practices in data modelling, ETL processes and performance optimisation Contribute to containerised deployments (Docker/Kubernetes More ❯
of containerisation and orchestration (e.g., Docker , Kubernetes , OpenShift ). Experience with CI/CD pipelines (e.g., Jenkins, TeamCity, Concourse). Familiarity with web/application servers such as NGINX, Apache, or JBoss. Exposure to monitoring and logging tools (ELK, Nagios, Splunk, DataDog, New Relic, etc.). Understanding of security and identity management (OAuth2, SSO, ADFS, Keycloak, etc.). Experience More ❯
AWS) to join a contract till April 2026. Inside IR35 SC cleared Weekly travel to Newcastle Around £400 per day Contract till April 2026 Skills: - Python - AWS Services - Terraform - Apache Spark - Airflow - Docker More ❯
Newcastle Upon Tyne, Tyne and Wear, England, United Kingdom
Opus Recruitment Solutions Ltd