Science, Computer Science, or a related field. 5+ years of experience in data engineering and data quality. Strong proficiency in Python/Java, SQL, and data processing frameworks including Apache Spark. Knowledge of machine learning and its data requirements. Attention to detail and a strong commitment to data integrity. Excellent problem-solving skills and ability to work in a …
London (City of London), South East England, United Kingdom
Humanoid
Science, Computer Science, or a related field. 5+ years of experience in data engineering and data quality. Strong proficiency in Python/Java, SQL, and data processing frameworks including Apache Spark. Knowledge of machine learning and its data requirements. Attention to detail and a strong commitment to data integrity. Excellent problem-solving skills and ability to work in a …
record in full stack data development (from ingestion to visualization). Strong expertise in Snowflake, including data modeling, warehousing, and performance optimization. Hands-on experience with ETL tools (e.g., Apache Airflow, dbt, Fivetran) and integrating data from ERP systems like NetSuite. Proficiency in SQL, Python, and/or other scripting languages for data processing and automation. Familiarity with LLM …
Southwark, London, United Kingdom Hybrid / WFH Options
Involved Productions Ltd
candidate for this role will likely have: a solid foundation in Python and JavaScript, ideally with proficiency in other programming languages. experience designing and implementing ETL pipelines, specifically using Apache Airflow (Astronomer). hands-on experience with ETL frameworks, particularly dbt (data build tool). SQL and various database management system skills. a good understanding of different database types …
coding principles JavaScript, TypeScript, Vue.JS, SingleSPA. Experience - UI Development, UI testing (e.g. Selenium), Data Integration, CI/CD Hands-on experience in web services (REST, SOAP, WSDL etc.), using Apache Commons Suite & Maven, SQL Database such as Oracle MySQL, PostgreSQL etc. Hands-on experience in utilizing Spring Framework (Core, MVC, Integration and Data) Experience with Big Data/Hadoop …
Newcastle Upon Tyne, Tyne and Wear, England, United Kingdom
Norton Rose Fulbright LLP
Azure/Microsoft Fabric/Data Factory) and modern data warehouse technologies (Databricks, Snowflake). Experience with database technologies such as RDBMS (SQL Server, Oracle) or NoSQL (MongoDB). Knowledge of Apache technologies such as Spark, Kafka and Airflow to build scalable and efficient data pipelines. Ability to design, build, and deploy data solutions that explore, capture, transform, and utilize data …
Candidates must hold active SC Clearance. £75,000 - £85,000 Remote - with occasional client visits Skills: Python and SQL. Apache Airflow for DAG orchestration and monitoring. Docker for containerisation. AWS data services: Redshift, OpenSearch, Lambda, Glue, Step Functions, Batch. CI/CD pipelines and YAML-based configuration …
scalable pipelines, data platforms, and integrations, while ensuring solutions meet regulatory standards and align with architectural best practices. Key Responsibilities: Build and optimise scalable data pipelines using Databricks and Apache Spark (PySpark). Ensure performance, scalability, and compliance (GxP and other standards). Collaborate on requirements, design, and backlog refinement. Promote engineering best practices including CI/CD, code …
London (City of London), South East England, United Kingdom
Fimador
scalable pipelines, data platforms, and integrations, while ensuring solutions meet regulatory standards and align with architectural best practices. Key Responsibilities: Build and optimise scalable data pipelines using Databricks and Apache Spark (PySpark). Ensure performance, scalability, and compliance (GxP and other standards). Collaborate on requirements, design, and backlog refinement. Promote engineering best practices including CI/CD, code …
and architecture. Skills & Experience Required: 2-5 years of software development experience. Strong hands-on expertise in Scala (mandatory), plus Python and Java. Experience with Big Data frameworks; Apache Spark experience is an advantage. Solid understanding of software engineering principles, data structures, and algorithms. Strong problem-solving skills and ability to work in an Agile environment. Educational Criteria …
REST APIs and integration techniques Familiarity with data visualization tools and libraries (e.g., Power BI) Background in database administration or performance tuning Familiarity with data orchestration tools, such as Apache Airflow Previous exposure to big data technologies (e.g., Hadoop, Spark) for large data processing Strong analytical skills, including a thorough understanding of how to interpret customer business requirements and …
Glasgow, Scotland, United Kingdom Hybrid / WFH Options
NLB Services
REST APIs and integration techniques · Familiarity with data visualization tools and libraries (e.g., Power BI) · Background in database administration or performance tuning · Familiarity with data orchestration tools, such as Apache Airflow · Previous exposure to big data technologies (e.g., Hadoop, Spark) for large data processing · Strong analytical skills, including a thorough understanding of how to interpret customer business requirements and …
Better Placed Ltd - A Sunday Times Top 10 Employer!
across multiple sources. Integrate services via RESTful APIs and manage structured/unstructured data formats. Data Visualisation & Automation Build interactive dashboards in BI tools such as Power BI or Apache Superset. Automate KPI tracking and reporting to streamline workflows. Partner with teams to identify opportunities for process optimisation. Apply best visualisation principles for clarity and impact. Ensure dashboard performance …
the wider business. Skills & Experience: Strong experience with Typescript and React, developing modern, scalable, and high-performance web applications. Proficient in PHP, with experience in a LAMP (Linux/Apache/MariaDB/PHP) environment considered advantageous. Experience in migrating legacy systems to modern architectures and frameworks. Solid understanding of RESTful API development, web application design patterns, and secure …