platforms supporting both batch and real-time processing architectures. Deep understanding of data warehousing, ETL/ELT pipelines, and analytics engineering principles. Proficient in programming languages such as Python, Scala, or Java, and experienced with cloud platforms (AWS, GCP, or Azure). Experience working with privacy-sensitive data and implementing comprehensive observability and governance solutions. Strong technical foundation with a …
products on Databricks, Snowflake, GCP Big Data, Hadoop, Spark, etc. Excellent problem-solving, analytical, and troubleshooting skills. Strong communication and team collaboration abilities. Programming skills in Python (PySpark preferred), Scala, or SQL. Experience designing and implementing enterprise-level data pipelines, with knowledge of ETL processes. Ability to write production-grade, automated testing code. Experience deploying via CI/CD platforms …
the 'why' behind their work. Have practical experience as a senior engineer in motivated teams, collaborating closely and taking pride in their work. Possess advanced knowledge of programming languages (Scala, Python, Java, C#) and an understanding of domain modelling and application programming. Have experience with data management platforms (SQL, NoSQL, Spark/Databricks). Be familiar with modern engineering tools (Git, …)
London, England, United Kingdom Hybrid / WFH Options
DEPOP
scale, including both batch processing and real-time streaming architectures. Deep understanding of data warehousing concepts, ETL/ELT processes, and analytics engineering. Strong programming skills, particularly in Python, Scala or Java, and a solid understanding of cloud platforms (AWS, GCP, or Azure). Experience working with privacy-sensitive datasets and implementing robust data observability and governance. A strong technical …
focus on cloud-based data pipelines and architectures
- Strong expertise in Microsoft Fabric and Databricks, including data pipeline development, data warehousing, and data lake management
- Proficiency in Python, SQL, Scala, or Java
- Experience with data processing frameworks such as Apache Spark, Apache Beam, or Azure Data Factory
- Strong understanding of data architecture principles, data modelling, and data governance
- Experience with …
collaborating with stakeholders to define optimal solutions
- Strong SQL skills with experience across relational and non-relational databases
- Proficiency in a modern programming language such as Python, Java or Scala
- Hands-on experience using DBT for pipeline development and transformation
- Familiarity with cloud platforms such as AWS, Azure or GCP
- Knowledge of orchestration tooling (e.g., Airflow, Dagster, Azure Data Factory, …)
in a leadership or managerial role. Strong background in financial services, particularly in market risk, counterparty credit risk, or risk analytics. Proficiency in modern programming languages (e.g., Java, Python, Scala) and frameworks. Experience with cloud platforms (AWS, Azure, or GCP). Deep understanding of the software development lifecycle (SDLC), agile methodologies, and DevOps practices. Preferred Skills: Strong communication and stakeholder management …
it? Expertise in data engineering/analytics/architecture/security using native technologies of at least one cloud platform (AWS, Azure, GCP). Expertise in prominent languages such as Python, Scala, Spark, SQL. Experience working with database technologies from an application programming perspective (Oracle, MySQL, MongoDB, etc.). Expertise in leading the design, build and maintenance of data pipelines and …
the table:
- 4+ years of experience in the Data field, with a proven track record of technical ability
- Previous experience using event-driven architecture with Kafka
- Experience with Scala for data pipelines
- Experience with Python and SQL for data pipelines
- Experience with modern cloud data warehouses (like AWS Redshift, GCP BigQuery, Azure Synapse or Snowflake)
- Strong communication skills and …
large-scale data processing systems with data tooling such as Spark, Kafka, Airflow, dbt, Snowflake, Databricks, or similar
- Strong programming skills in languages such as SQL, Python, Go or Scala
- Demonstrable and effective use of AI tooling in your development process
- A growth mindset and eagerness to work in a fast-paced, mission-driven environment
- Good …
in motivated teams, collaborating effectively and taking pride in your work. Strong problem-solving skills, viewing technology as a means to solve challenges. Proficiency in a programming language (e.g., Scala, Python, Java, C#) with understanding of domain modelling and application development. Knowledge of data management platforms (SQL, NoSQL, Spark/Databricks). Experience with modern engineering tools (Git, CI/…
engineering using Snowflake and dbt, with a solid understanding of DataOps and CI/CD practices. Proficiency in writing clean, maintainable code in SQL or Python. Experience with Scala, Java, or similar languages is a plus. Hands-on experience with data pipeline orchestration tools such as Apache Airflow and Azure DevOps. Strong knowledge of cloud-based data engineering …
Waltham Cross, England, United Kingdom Hybrid / WFH Options
Simple Machines
schemas and queries to meet business requirements. A passion and proven background in picking up and adopting new technologies on the fly. Backend server experience using Kotlin. Exposure to Scala, or functional programming generally. Experience with highly concurrent, asynchronous backend technologies, such as Ktor, http4k, http4s, Play, RxJava, etc. Experience with DynamoDB or similar NoSQL databases, such as Cassandra, HBase …
you: 5+ years of experience in software development, with a strong focus on backend technologies and building distributed services. Proficiency in one or more programming languages including Java, Python, Scala or Golang. Experience with columnar, analytical cloud data warehouses (e.g., BigQuery, Snowflake, Redshift) and data processing frameworks like Apache Spark is essential. Experience with cloud platforms like AWS, Azure, or …
London, England, United Kingdom Hybrid / WFH Options
Citi
education) in a STEM discipline. Proven experience in software engineering and development, and a strong understanding of computer systems and how they operate. Hands-on experience in Java, Spark, and Scala. Production-scale, hands-on experience writing data pipelines using Spark or another distributed real-time/batch processing framework. Strong skill set in SQL/databases. Strong …
of AWS services including S3, Redshift, EMR, Kinesis and RDS.
- Experience with open-source data technologies (Hadoop, Hive, HBase, Pig, Spark, etc.)
- Ability to write code in Python, Ruby, Scala or other platform-related big data technology
- Knowledge of professional software engineering practices and best practices for the full software development lifecycle, including coding standards, code reviews, source control management, build …
concepts to a range of audiences
- Able to provide coaching and training to less experienced members of the team
Essential skills:
- Programming languages such as Spark, Java, Python, PySpark, Scala, etc. (minimum 2)
- Extensive Big Data hands-on experience (coding/configuration/automation/monitoring/security/etc.) is a must
- Significant AWS or Azure hands-on experience …
Monitor, SQL Database, SQL Managed Instance, Stream Analytics, Cosmos DB, Storage Services, ADLS, Azure Functions, Log Analytics, Serverless Architecture, ARM Templates. Strong proficiency in Spark, SQL, and Python/Scala/Java. Experience building Lakehouse architecture using open-source table formats like Delta and Parquet, and tools like Jupyter notebooks. Strong notions of security best practices (e.g., using Azure Key …
have a solid understanding of computer science and engineering fundamentals. You are proficient in one or more of the following programming languages: C#, Java, C, C++, Python, SQL, or Scala. You have a Bachelor’s/Master’s level degree in computer science or a relevant engineering-related field, or equivalent experience. Not everyone has the same level of access to …
the System Delivery Life Cycle.
- Experience of using agile delivery tools such as JIRA, Pivotal, Collab, Confluence
- Experience of engineering based on the likes of SQL, SSIS, Python, Java, Scala, XML/FpML and Power BI
- Solution architecture (Business, Functional, Technical)
- Data architecture, data lineage and all aspects of AI including, but not limited to, NLP, ML, deep learning and …