Ability to write clean, scalable, maintainable code in Python with a good understanding of software engineering concepts and patterns. Proficiency in other languages such as Scala, Java, C#, or C++ is an advantage. Proven record of building and maintaining data pipelines deployed in at least one of the big 3 cloud ML …
City of Westminster, England, United Kingdom (Hybrid / WFH Options)
nudge Global Ltd
big data processing. Strong knowledge of open banking frameworks, protocols, and APIs (e.g. Open Banking UK, PSD2). Proficiency in programming languages such as Python, Scala, or Java. Experience with cloud data platforms such as GCP (BigQuery, Dataflow) or Azure (Data Factory, Synapse). Expert in SQL, MongoDB, and distributed data systems …
using Python or similar programming languages. Background in software engineering is a plus. DESIRABLE LANGUAGES/TOOLS: Proficiency in programming languages such as Python, Java, Scala, or SQL for data manipulation and scripting. Strong understanding of data modelling concepts and techniques, including relational and dimensional modelling. Experience in big data technologies …
Experience with cloud-based data platforms (AWS, Azure, GCP). Proficiency in data orchestration tools (Airflow, Prefect, Dagster, or similar). Solid programming skills in Python, Scala, or Java. Experience integrating ML workflows into production data systems. Strong understanding of data modeling, ETL processes, and database design. Demonstrated ability to architect solutions …
tools. Deep understanding of DevOps principles, data warehousing, data modeling, and data integration patterns. Proficient in one or more programming languages such as Python, Scala, or Java, alongside SQL and NoSQL database experience. Knowledge of data quality assurance techniques and governance frameworks. Excellent analytical, problem-solving, and communication skills. Desirable …
London (City of London), South East England, United Kingdom
Mastek
and cost-effectiveness. Implement data quality checks and validation rules within data pipelines. Data Transformation & Processing: Implement complex data transformations using Spark (PySpark or Scala) and other relevant technologies. Develop and maintain data processing logic for cleaning, enriching, and aggregating data. Ensure data consistency and accuracy throughout the data lifecycle. …
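To illustrate the kind of Spark-based data quality and transformation work this listing describes, below is a minimal PySpark sketch, not the employer's actual pipeline; the dataset path, column names, and validation rule are hypothetical assumptions.

```python
# Hypothetical PySpark sketch: a data-quality gate followed by a cleaning and
# aggregation step. Paths, columns, and rules are illustrative assumptions only.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("quality-check-sketch").getOrCreate()

# Assumed raw schema: transaction_id, amount, currency, event_timestamp.
raw = spark.read.parquet("s3://example-bucket/raw/transactions/")

# Data quality check: rows must have a key and a non-negative amount.
bad_rows = F.col("transaction_id").isNull() | (F.col("amount") < 0)
rejected = raw.filter(bad_rows)
valid = raw.filter(~bad_rows)

# Transformation: derive a date column, then aggregate to daily totals per currency.
daily_totals = (
    valid.withColumn("transaction_date", F.to_date("event_timestamp"))
    .groupBy("transaction_date", "currency")
    .agg(F.sum("amount").alias("total_amount"), F.count("*").alias("txn_count"))
)

# Persist the curated output and quarantine the rejected rows for inspection.
daily_totals.write.mode("overwrite").parquet("s3://example-bucket/curated/daily_totals/")
rejected.write.mode("overwrite").parquet("s3://example-bucket/quarantine/transactions/")
```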
and principles. Solid understanding of data warehousing, data modeling, and data integration principles. Proficiency in at least one scripting/programming language (e.g. Python, Scala, Java). Experience with SQL and NoSQL databases. Familiarity with data quality and data governance best practices. Strong analytical and problem-solving skills. Excellent communication …
London, England, United Kingdom (Hybrid / WFH Options)
Lloyds Banking Group
will too... WHAT WE NEED YOU TO HAVE EXPERIENCE IN: Coding: coding/scripting experience developed in a commercial/industry setting (Python, Java, Scala or Go, and SQL). Databases & frameworks: strong experience working with Kafka technologies. Working experience with operational data stores, data warehouses, big data technologies, and …
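For the Kafka experience called out above, here is a small hedged sketch using the kafka-python client; the broker address, topic name, and message fields are assumptions for illustration, not details from the role.

```python
# Hypothetical kafka-python sketch: produce a JSON event, then consume it.
# Broker, topic, and payload fields are illustrative assumptions.
import json

from kafka import KafkaConsumer, KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send("payments", {"transaction_id": "t-123", "amount": 42.5})
producer.flush()

consumer = KafkaConsumer(
    "payments",
    bootstrap_servers="localhost:9092",
    auto_offset_reset="earliest",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)
for message in consumer:
    # In a real pipeline this is where the event would land in an
    # operational data store or warehouse.
    print(message.value)
    break
```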
especially when facing unclear or incomplete requirements. 3+ years of experience designing and implementing Snowflake solutions. Expert-level SQL and strong proficiency in Python, Scala, or Java. Solid experience with ETL/ELT pipelines using dbt. Hands-on experience with cloud platforms (AWS, Azure, or GCP). Familiarity with orchestration tools …
with modern data stack components (e.g., dbt, Snowflake, Airbyte). Experience working in an Agile/Scrum environment. Knowledge of Python or Java/Scala for data engineering. Experience with version control systems (e.g., Git, CVS). Applicants will be required to obtain SC and therefore 5 years' UK …
bonus. Experience with designing efficient physical data models/schemas and developing ETL/ELT scripts. Strong Python and other programming skills (Spark/Scala desirable). Experience both using and building APIs. Strong SQL background. Some exposure to big data technologies (Hadoop, Spark, Presto, etc.). Works well collaboratively …
experience with big data technology, systems, and tools such as AWS, Hadoop, Hive, and Snowflake. Expertise with common software engineering languages such as Python, Scala, Java, and SQL, and a proven ability to learn new programming languages. Experience with workflow orchestration tools such as Airflow. Detailed problem-solving approach, coupled with …
London, England, United Kingdom (Hybrid / WFH Options)
Rein-Ton
standards. Test and validate solutions for quality assurance. Qualifications: Proven experience as a Data Engineer, especially with data pipelines. Proficiency in Python, Java, or Scala; experience with Hadoop, Spark, and Kafka. Experience with Databricks, Azure AI Services, and cloud platforms (AWS, Google Cloud, Azure). Strong SQL and NoSQL database skills. …
Hands-on experience with designing and building end-to-end software solutions. Proficiency in at least one programming language such as Java, Python, Scala, or C#. Strong communication skills, high emotional intelligence, and the ability to present ideas effectively. Experience working in an Agile environment. We expect that …
tools such as Amazon Redshift, Google BigQuery, or Apache Airflow. Proficiency in Python and at least one other programming language such as Java or Scala. Willingness to mentor more junior members of the team. Strong analytical and problem-solving skills, with the ability to work independently and in a team …
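As a sketch of the workflow orchestration mentioned above, the example below defines a minimal Apache Airflow DAG in Python (assuming Airflow 2.4 or later); the DAG id, schedule, and task callables are hypothetical placeholders rather than anything specified by the role.

```python
# Hypothetical Airflow DAG sketch (Airflow 2.4+): a two-step extract/load pipeline.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: pull data from a source system.
    print("extracting source data")


def load():
    # Placeholder: load transformed data into a warehouse such as Redshift or BigQuery.
    print("loading into the warehouse")


with DAG(
    dag_id="example_daily_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task
```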
London, England, United Kingdom (Hybrid / WFH Options)
Endava
measures (RBAC, encryption) and ensure regulatory compliance (GDPR). Document data lineage and recommend improvements for data ownership and stewardship. Qualifications: Programming: Python, SQL, Scala, Java. Big Data: Apache Spark, Hadoop, Databricks, Snowflake, etc. Cloud: AWS (Glue, Redshift), Azure (Synapse, Data Factory, Fabric), GCP (BigQuery, Dataflow). Data Modelling & Storage …
following technical skills and knowledge. Programming Languages: Proficient in languages commonly used in data engineering, such as Python and SQL. Familiarity with languages like Scala or Go can also be beneficial. Database Management: Strong knowledge of relational databases (e.g., MySQL, PostgreSQL) and NoSQL databases (e.g., MongoDB, Cassandra) for efficient data …
City of London, England, United Kingdom (Hybrid / WFH Options)
Staging It
Expertise in cloud security, data governance, and compliance (GDPR, HIPAA). Strong SQL skills and proficiency in at least one programming language (Python, Java, Scala). Excellent problem-solving, communication, and project management skills. Experience with DevOps, CI/CD pipelines, and infrastructure as code (Terraform, CloudFormation). Ability to …
City of London, London, United Kingdom (Hybrid / WFH Options)
Robert Half
cloud security, data governance, and compliance (e.g., GDPR, HIPAA). Strong SQL skills and proficiency in at least one programming language (e.g., Python, Java, Scala). Excellent problem-solving, communication, and project management skills. Experience with DevOps, CI/CD pipelines, and infrastructure as code (e.g., Terraform, CloudFormation). Ability …
of performing architectural assessments, examining architectural alternatives, and choosing the best solution in collaboration with both IT and business stakeholders. Fluent in Python, Java, Scala, or similar object-oriented programming languages. Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL), as well as working familiarity with …