- Hands-on knowledge of tools such as Apache Spark, Kafka, Databricks, dbt or similar
- Familiarity with data warehousing, ETL/ELT processes, and analytics engineering
- Programming proficiency in Python, Scala or Java
- Experience operating in a cloud-native environment (e.g. AWS, GCP, or Azure)
- Excellent stakeholder management and communication skills
- A strategic mindset, with a practical approach to delivery and prioritisation
and adopt new Azure capabilities. Requirements:
- Strong experience with Azure Databricks
- Proficient with Azure Data Factory, Synapse Analytics, Data Lake Storage, Stream Analytics, and Event Hubs
- Skilled in Python, Scala, C#, .NET, and advanced SQL (T-SQL)
- Experience with CI/CD pipelines using Azure DevOps and infrastructure as code (Terraform, Bicep, ARM)
- Solid understanding of data engineering …
West Midlands, England, United Kingdom Hybrid / WFH Options
Amtis - Digital, Technology, Transformation
platforms supporting both batch and real-time processing architectures
- Deep understanding of data warehousing, ETL/ELT pipelines, and analytics engineering principles
- Proficient in programming languages such as Python, Scala, or Java, and experienced with cloud platforms (AWS, GCP, or Azure)
- Experience working with privacy-sensitive data and implementing comprehensive observability and governance solutions
- Strong technical foundation with a …
Leeds, England, United Kingdom Hybrid / WFH Options
ZipRecruiter
technical standards and contribute to solution development. Requirements:
- Expertise in data engineering, analytics, architecture, or security using at least one cloud platform (AWS, Azure, GCP)
- Proficiency in Python, Scala, Spark, SQL
- Experience with database technologies such as Oracle, MySQL, MongoDB from an application development perspective
- Leadership in designing, building, and maintaining data pipelines and infrastructure
- Strong problem-solving skills with …
understanding of data modelling, ETL processes, and data warehousing concepts
5. Proficiency in statistical analysis, data mining, and machine learning techniques
6. Proficiency in programming languages such as Python, R, or Scala for data analysis and modelling
7. Experience with SQL and NoSQL databases, data visualisation tools, and statistical packages; strong analytical, problem-solving, and critical thinking skills
8. Experience with social media analytics …
architecture roles, with deep experience designing solutions on Databricks and Apache Spark
- Strong grasp of Delta Lake, Lakehouse architecture, and Unity Catalog governance
- Expertise in Python, SQL, and optionally Scala; strong familiarity with dbt and modern ELT practices
- Proven experience integrating Databricks with Azure services (e.g., Data Lake, Synapse, Event Hubs)
- Hands-on knowledge of CI/CD, GitOps …
London, England, United Kingdom Hybrid / WFH Options
DEPOP
scale, including both batch processing and real-time streaming architectures
- Deep understanding of data warehousing concepts, ETL/ELT processes, and analytics engineering
- Strong programming skills, particularly in Python, Scala or Java, and a solid understanding of cloud platforms (AWS, GCP, or Azure)
- Experience working with privacy-sensitive datasets and implementing robust data observability and governance
- A strong technical …
collaborating with stakeholders to define optimal solutions
- Strong SQL skills with experience across relational and non-relational databases
- Proficiency in a modern programming language such as Python, Java or Scala
- Hands-on experience using dbt for pipeline development and transformation
- Familiarity with cloud platforms such as AWS, Azure or GCP
- Knowledge of orchestration tooling (e.g., Airflow, Dagster, Azure Data Factory …
focus on cloud-based data pipelines and architectures
- Strong expertise in Microsoft Fabric and Databricks, including data pipeline development, data warehousing, and data lake management
- Proficiency in Python, SQL, Scala, or Java
- Experience with data processing frameworks such as Apache Spark, Apache Beam, or Azure Data Factory
- Strong understanding of data architecture principles, data modelling, and data governance
- Experience with …
in a leadership or managerial role
- Strong background in financial services, particularly in market risk, counterparty credit risk, or risk analytics
- Proficiency in modern programming languages (e.g., Java, Python, Scala) and frameworks
- Experience with cloud platforms (AWS, Azure, or GCP)
- Deep understanding of the software development lifecycle (SDLC), agile methodologies, and DevOps practices
Preferred Skills: Strong communication and stakeholder management …
it?
- Expertise in data engineering/analytics/architecture/security using native technologies of at least one cloud platform (AWS, Azure, GCP)
- Expertise in prominent languages such as Python, Scala, Spark, SQL
- Experience working with database technologies (Oracle, MySQL, MongoDB, etc.) from an application programming perspective
- Expertise in leading the design, build and maintenance of data pipelines and …
the table:
- 4+ years of experience in the data field, with a proven track record of technical ability
- Previous experience using event-driven architecture with Kafka
- Experience with Scala for data pipelines
- Experience with Python and SQL for data pipelines
- Experience with modern cloud data warehouses (such as AWS Redshift, GCP BigQuery, Azure Synapse or Snowflake)
- Strong communication skills and …
Bradford, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
large-scale data processing systems with data tooling such as Spark, Kafka, Airflow, dbt, Snowflake, Databricks, or similar
- Strong programming skills in languages such as SQL, Python, Go or Scala
- Demonstrable understanding and effective use of AI tooling in your development process
- A growth mindset and eagerness to work in a fast-paced, mission-driven environment
- Good …
in motivated teams, collaborating effectively and taking pride in your work
- Strong problem-solving skills, viewing technology as a means to solve challenges
- Proficiency in a programming language (e.g., Scala, Python, Java, C#) with understanding of domain modelling and application development
- Knowledge of data management platforms (SQL, NoSQL, Spark/Databricks)
- Experience with modern engineering tools (Git, CI/ …
engineering using Snowflake and dbt, with a solid understanding of DataOps and CI/CD practices
- Proficiency in writing clean, maintainable code in SQL or Python; experience with Scala, Java, or similar languages is a plus
- Hands-on experience with data pipeline orchestration tools such as Apache Airflow and Azure DevOps
- Strong knowledge of cloud-based data engineering …
Waltham Cross, England, United Kingdom Hybrid / WFH Options
Simple Machines
schemas and queries to meet business requirements
- A passion for, and proven background in, picking up and adopting new technologies on the fly
- Backend server experience using Kotlin
- Exposure to Scala, or functional programming generally
- Experience with highly concurrent, asynchronous backend technologies, such as Ktor, http4k, http4s, Play, RxJava, etc.
- Experience with DynamoDB or similar NoSQL databases, such as Cassandra, HBase …
you:
- 5+ years of experience in software development, with a strong focus on backend technologies and building distributed services
- Proficiency in one or more programming languages including Java, Python, Scala or Golang
- Experience with columnar, analytical cloud data warehouses (e.g., BigQuery, Snowflake, Redshift) and data processing frameworks like Apache Spark is essential
- Experience with cloud platforms like AWS, Azure, or …
London, England, United Kingdom Hybrid / WFH Options
Citi
education) in a STEM discipline
- Proven experience in software engineering and development, and a strong understanding of computer systems and how they operate
- Hands-on experience in Spark and Scala (or Java)
- Production-scale, hands-on experience writing data pipelines using Spark or other distributed real-time/batch processing frameworks
- Strong skill set in SQL/databases
- Strong …
of AWS services including S3, Redshift, EMR, Kinesis and RDS
- Experience with open-source data technologies (Hadoop, Hive, HBase, Pig, Spark, etc.)
- Ability to write code in Python, Ruby, Scala or another platform-related big data technology
- Knowledge of professional software engineering practices and best practices for the full software development lifecycle, including coding standards, code reviews, source control management, build …