London, England, United Kingdom Hybrid / WFH Options
Rein-Ton
standards. Test and validate solutions for quality assurance. Qualifications: Proven experience as a Data Engineer, especially with data pipelines. Proficiency in Python, Java, or Scala; experience with Hadoop, Spark, Kafka. Experience with Databricks, Azure AI Services, and cloud platforms (AWS, Google Cloud, Azure). Strong SQL and NoSQL database skills. More ❯
AWS or equivalent cloud technologies. Knowledge of serverless technologies, frameworks and best practices. Experience using AWS CloudFormation or Terraform for infrastructure automation. Knowledge of Scala or an object-oriented language such as Java or C#. SQL or Python development experience. High-quality coding and testing practices. Willingness to learn new technologies and methodologies. More ❯
. Hands-on experience with designing and building end-to-end software solutions. Proficiency in at least one programming language such as Java, Python, Scala, or C#. Strong communication skills, high emotional intelligence and the ability to present ideas effectively. Experience working in an Agile environment. We expect that More ❯
platform, either on Azure or AWS. Good working knowledge of Databricks components: Delta Lake, Unity Catalog, MLflow, etc. Expertise in SQL, Python and Spark (Scala or Python). Experience working with relational SQL databases either on premises or in the cloud. Experience delivering multiple solutions using key techniques such as More ❯
have experience with: CI/CD Pipelines - GitLab; Cloud Technologies such as AWS, Iceberg, Snowflake, Databricks; Containerization/Orchestration - Docker/Kubernetes; Framework Components - Spark/Scala/Kafka; Unix - Scripting and Config. Other Highly Valued Skills Include: Automation - Python/Bash Scripting; Database - Teradata, Oracle; Workflow Management - Apache Airflow. You may More ❯
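Several of these roles pair Spark and Scala with Kafka ingestion. As a rough illustration of what that combination typically involves, here is a minimal Scala sketch of a Spark Structured Streaming job reading a Kafka topic; the broker address, topic name, and checkpoint path are placeholders rather than details from any listing, and the spark-sql-kafka connector is assumed to be on the classpath.

```scala
import org.apache.spark.sql.SparkSession

object KafkaIngestSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("kafka-ingest-sketch")
      .master("local[*]") // for a local test run; drop when submitting to a cluster
      .getOrCreate()

    // Read a Kafka topic as a streaming DataFrame (placeholder broker and topic names).
    val events = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092")
      .option("subscribe", "events")
      .load()
      .selectExpr("CAST(key AS STRING) AS key", "CAST(value AS STRING) AS value")

    // Console sink for illustration only; a real pipeline would target Delta, Iceberg,
    // Snowflake, or a warehouse table instead.
    val query = events.writeStream
      .format("console")
      .option("checkpointLocation", "/tmp/checkpoints/kafka-ingest")
      .start()

    query.awaitTermination()
  }
}
```

A job like this would normally be scheduled and monitored by an orchestrator such as Apache Airflow, which is why workflow management appears alongside Spark and Kafka in the listing above.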
tools such as Amazon Redshift, Google BigQuery, or Apache Airflow. Proficiency in Python and at least one other programming language such as Java or Scala. Willingness to mentor more junior members of the team. Strong analytical and problem-solving skills with the ability to work independently and in a team More ❯
London, England, United Kingdom Hybrid / WFH Options
Endava
measures (RBAC, encryption) and ensure regulatory compliance (GDPR). Document data lineage and recommend improvements for data ownership and stewardship. Qualifications - Programming: Python, SQL, Scala, Java. Big Data: Apache Spark, Hadoop, Databricks, Snowflake, etc. Cloud: AWS (Glue, Redshift), Azure (Synapse, Data Factory, Fabric), GCP (BigQuery, Dataflow). Data Modelling & Storage More ❯
will too... WHAT WE NEED YOU TO HAVE EXPERIENCE IN: Coding - coding/scripting experience developed in a commercial/industry setting (Python, Java, Scala or Go, and SQL). Databases & frameworks - strong experience working with Kafka technologies; working experience with operational data stores, data warehouses, big data technologies and data More ❯
Understanding of data modeling, distributed systems, ETL/ELT pipelines, and streaming architectures. Proficiency in SQL and at least one programming language (e.g., Python, Scala, or Java). Demonstrated experience owning complex technical systems end-to-end, from design through production. Excellent communication skills with the ability to explain technical concepts More ❯
Newcastle upon Tyne, England, United Kingdom Hybrid / WFH Options
Somerset Bridge
Delta Lake for schema evolution, ACID transactions, and time travel in Databricks. Strong Python (PySpark) skills for big data processing and automation. Experience with Scala (optional but preferred for advanced Spark applications). Experience working with Databricks Workflows & Jobs for data orchestration. Strong knowledge of feature engineering and feature stores More ❯
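Delta Lake's schema evolution, ACID transactions, and time travel come up repeatedly in these listings. Below is a minimal Scala Spark sketch of all three, assuming open-source Delta Lake outside Databricks (on Databricks the session is already configured); the table path, columns, and values are invented purely for illustration.

```scala
import org.apache.spark.sql.SparkSession

object DeltaFeaturesSketch {
  def main(args: Array[String]): Unit = {
    // The two config lines enable Delta Lake on open-source Spark; Databricks sets them for you.
    val spark = SparkSession.builder()
      .appName("delta-features-sketch")
      .master("local[*]")
      .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
      .config("spark.sql.catalog.spark_catalog", "org.apache.spark.sql.delta.catalog.DeltaCatalog")
      .getOrCreate()
    import spark.implicits._

    val path = "/tmp/delta/quotes" // placeholder table location

    // ACID write: the initial overwrite becomes version 0 of the table.
    Seq((1L, "motor"), (2L, "home")).toDF("quote_id", "product")
      .write.format("delta").mode("overwrite").save(path)

    // Schema evolution: mergeSchema appends a new premium column in version 1.
    Seq((3L, "pet", 120.0)).toDF("quote_id", "product", "premium")
      .write.format("delta").mode("append")
      .option("mergeSchema", "true").save(path)

    // Time travel: read the table as it looked before the schema change.
    spark.read.format("delta").option("versionAsOf", 0).load(path).show()
  }
}
```

The same operations are exposed through PySpark with the same options, which is consistent with listings that ask for strong PySpark skills and treat Scala as optional.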
of data modeling, distributed systems, streaming architectures, and ETL/ELT pipelines. Proficiency in SQL and at least one programming language such as Python, Scala, or Java. Demonstrated experience owning and delivering complex systems from architecture through implementation. Excellent communication skills with the ability to explain technical concepts to both More ❯
Edinburgh, Scotland, United Kingdom Hybrid / WFH Options
JR United Kingdom
data modeling, distributed systems, streaming architectures, and ETL/ELT pipelines. Proficiency in SQL and at least one programming language such as Python, Scala, or Java. Demonstrated experience owning and delivering complex systems from architecture through implementation. Excellent communication skills with the ability to explain technical concepts to More ❯
City of London, England, United Kingdom Hybrid / WFH Options
Staging It
Expertise in cloud security, data governance, and compliance (GDPR, HIPAA). Strong SQL skills and proficiency in at least one programming language (Python, Java, Scala). Excellent problem-solving, communication, and project management skills. Experience with DevOps, CI/CD pipelines, and infrastructure as code (Terraform, CloudFormation). Ability to More ❯
Strong expertise in Databricks, Apache Spark, and Delta Lake. Experience with Hive Metastore and Unity Catalog for data governance. Proficiency in Python, SQL, Scala, or other relevant languages. Familiarity with structured streaming, event-driven architectures, and stateful processing. Ability to design, schedule, and optimize Databricks workflows. Knowledge More ❯
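Structured streaming with stateful processing, mentioned in the listing above, usually means watermarked, windowed aggregations maintained across micro-batches. A hedged Scala sketch follows, using Spark's built-in rate source in place of a real event stream and writing to a Delta path; every name and path here is a placeholder.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object StatefulStreamingSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("stateful-streaming-sketch")
      .master("local[*]")
      .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
      .config("spark.sql.catalog.spark_catalog", "org.apache.spark.sql.delta.catalog.DeltaCatalog")
      .getOrCreate()

    // The rate source emits (timestamp, value) rows and stands in for Kafka or another event feed.
    val events = spark.readStream.format("rate").option("rowsPerSecond", "10").load()

    // Stateful processing: the watermark bounds how much state Spark keeps while it
    // counts events per one-minute window.
    val counts = events
      .withWatermark("timestamp", "2 minutes")
      .groupBy(window(col("timestamp"), "1 minute"))
      .count()

    // Append finalized windows to a Delta table (placeholder path); on Databricks this
    // could be a Unity Catalog table and be scheduled as a Databricks workflow.
    val query = counts.writeStream
      .format("delta")
      .outputMode("append")
      .option("checkpointLocation", "/tmp/checkpoints/event-counts")
      .start("/tmp/delta/event_counts")

    query.awaitTermination()
  }
}
```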
City of London, London, United Kingdom Hybrid / WFH Options
Robert Half
cloud security, data governance, and compliance (e.g., GDPR, HIPAA). Strong SQL skills and proficiency in at least one programming language (e.g., Python, Java, Scala). Excellent problem-solving, communication, and project management skills. Experience with DevOps, CI/CD pipelines, and infrastructure as code (e.g., Terraform, CloudFormation). Ability More ❯
of performing architectural assessments, examining architectural alternatives, and choosing the best solution in collaboration with both IT and business stakeholders. Fluent in Python, Java, Scala, or similar object-oriented programming languages. Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL), as well as working familiarity with More ❯
and Data Warehousing concepts. Experience with enterprise ETL tools such as Informatica, Talend, DataStage or Alteryx. Project experience using the following technologies: Hadoop, Spark, Scala, Oracle, Pega, Salesforce. Cross- and multi-platform experience. Team building and leadership skills. You must be: Willing to work on client sites, potentially for extended More ❯
for distributed data processing and scalable ETL workflows in data engineering pipelines. Polyglot Collaboration: Integrate with backend services or data processors developed in Java, Scala, or other enterprise technologies. Required Skills & Qualifications: Bachelor's or Master's in Computer Science, Software Engineering, or a related technical field. 6+ years in More ❯
grids, SCADA systems). Hands-on experience in building and managing data pipelines. Proficiency in Python, SQL, and at least one other language (e.g., Scala, Java). Experience with relational (e.g., PostgreSQL) and NoSQL databases (e.g., MongoDB, Cassandra). Familiarity with cloud platforms like AWS, Azure, or GCP. Experience with More ❯
applicable field such as Computer Science, Statistics, Maths or a similar science or engineering discipline. Strong Python and other programming skills (Java and/or Scala desirable). Strong SQL background. Some exposure to big data technologies (Hadoop, Spark, Presto, etc.). NICE TO HAVES OR EXCITED TO LEARN: Some experience designing, building More ❯
have opportunities from junior to senior levels and seek data engineers with skills including: Proficiency in at least one programming language (Python, Java, or Scala); Experience with cloud platforms (AWS, GCP, or Azure); Experience with data warehousing and lake architectures; ETL/ELT pipeline development; SQL and NoSQL databases; Distributed More ❯