define requirements and deliver innovative solutions Support internal capability development by sharing your expertise and experience What We're Looking For Strong hands-on experience with Python, Java, or Scala Proficiency in cloud environments (AWS, Azure, or GCP) and big data tech (Spark, Hadoop, Airflow) Solid understanding of SQL, ETL/ELT approaches, and data modelling techniques Experience building CI …
serving layers Experience with data lakehouse architecture, schema design, and GDPR-compliant solutions Working knowledge of DevOps tools and CI/CD processes Bonus Points For Development experience in Scala or Java Familiarity with the Cloudera, Hadoop, HIVE, and Spark ecosystem Understanding of data privacy regulations, including GDPR, and experience working with sensitive data Ability to learn and adapt to new technologies …
collaborating with stakeholders to define optimal solutions - Strong SQL skills with experience across relational and non-relational databases - Proficiency in a modern programming language such as Python, Java or Scala - Hands-on experience using DBT for pipeline development and transformation - Familiarity with cloud platforms such as AWS, Azure or GCP - Knowledge of orchestration tooling (e.g., Airflow, Dagster, Azure Data Factory …
driven APIs, and designing database schemas and queries to meet business requirements. A passion for, and proven background in, picking up and adopting new technologies on the fly. Exposure to Scala, or functional programming generally. Exposure to highly concurrent, asynchronous backend technologies, such as Ktor, http4k, http4s, Play, RxJava, etc. Exposure to DynamoDB or similar NoSQL databases, such as Cassandra, HBase …
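As a rough illustration of the asynchronous, functional Scala backend work listings like the one above describe, here is a minimal http4s route sketch. The GreetingRoutes object, the /greet path, and the assumed http4s 0.23 / cats-effect 3 APIs are illustrative only and are not taken from any listing.

```scala
// Minimal sketch only: a single asynchronous GET endpoint exercised in-memory.
// Assumes http4s 0.23.x (core + dsl) and cats-effect 3 are on the classpath.
import cats.effect.{IO, IOApp}
import org.http4s.{HttpRoutes, Method, Request}
import org.http4s.dsl.io._
import org.http4s.implicits._

object GreetingRoutes extends IOApp.Simple {
  // Route definitions are pure values; IO defers all side effects until run.
  val routes: HttpRoutes[IO] = HttpRoutes.of[IO] {
    case GET -> Root / "greet" / name => Ok(s"Hello, $name")
  }

  // Drive the route directly, without binding a server, and print the response body.
  def run: IO[Unit] =
    routes.orNotFound
      .run(Request[IO](Method.GET, uri"/greet/world"))
      .flatMap(_.as[String])
      .flatMap(body => IO.println(body)) // prints "Hello, world"
}
```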
large-scale data processing systems with data tooling such as Spark, Kafka, Airflow, dbt, Snowflake, Databricks, or similar Strong programming skills in languages such as SQL, Python, Go or Scala Demonstrable and effective use of AI tooling in your development process A growth mindset and eagerness to work in a fast-paced, mission-driven environment Good …
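Several of these roles centre on batch pipelines built with Spark in Scala. A minimal sketch of that kind of job follows; the S3 paths and the event_ts/event_type column names are invented for illustration and do not come from any listing.

```scala
// Minimal sketch only: a small Spark batch aggregation in Scala.
// The S3 paths and column names are hypothetical.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, count, to_date}

object DailyEventCounts {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("daily-event-counts").getOrCreate()

    spark.read.parquet("s3://example-bucket/raw/events")         // raw event data (hypothetical path)
      .withColumn("event_date", to_date(col("event_ts")))        // derive a partition column
      .groupBy("event_date", "event_type")
      .agg(count("*").as("event_count"))
      .write
      .mode("overwrite")
      .partitionBy("event_date")
      .parquet("s3://example-bucket/curated/daily_event_counts") // curated output (hypothetical path)

    spark.stop()
  }
}
```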
of emerging technology and trends. Provides out-of-hours support for applications to ensure the shop stays open and fully functional. Essential knowledge and skills Proficient in Python or Scala Familiarity with Java Experience with a marketing technology stack and 3rd-party tools Broad experience of working within AWS, including infrastructure (VPC, EC2, security groups, S3, etc.) to AWS data …
such as Hadoop, Spark, and Kafka, with a strong emphasis on Java development. Proficiency in data modeling, ETL processes, and data warehousing concepts. Experience with data processing languages like Scala, Python, or SQL. Familiarity with containerization technologies (Docker) and orchestration tools (Kubernetes). Strong knowledge of software development principles, including object-oriented design, design patterns, and clean code practices. Excellent …
Azure Event Hubs) Solid understanding of SQL, data modelling, and lakehouse architecture Experience deploying via CI/CD tools (e.g., Azure DevOps, GitHub Actions) Nice to Have: Knowledge of Scala/Java Understanding of GDPR and handling sensitive data This is a contract role (UK-based) offering the chance to work on high-impact projects shaping the future of finance …
City of London, London, United Kingdom Hybrid / WFH Options
Eden Smith Group
as Actuarial and Finance, to help with their data presentation requirements and help deliver data for visualisation solutions, such as Power BI Key skills required: Expertise in Python, Java, Scala, or other programming languages used to build data pipelines, implement data transformations, and automate data workflows. Strong knowledge of Power BI Strong SQL experience. Familiarity with technical data structures, data …
London, South East, England, United Kingdom Hybrid / WFH Options
Noir
Data Engineer - Investment Banking - London/Hybrid (Data Engineer, SQL Data Engineer, Java, Python, Spark, Scala, SQL, Snowflake, OO programming, Databricks, Data Fabric, design patterns, SOLID principles, ETL, Unit testing, NUnit, MSTest, Junit, Microservices Architecture, Continuous Integration, Azure DevOps, AWS, Jenkins, Agile, Data Engineer, SQL Data Engineer) We have several fantastic new roles including a Data Engineer position to …
and working in a fast-paced and ever-changing environment. Ideally, you are also experienced with at least one programming language such as Java, C++, Spark/Scala, Python, etc. Major Responsibilities: - Work with a team of product and program managers, engineering leaders, and business leaders to build data architectures and platforms to support the business - Design, develop, and …
Cambridge, Cambridgeshire, United Kingdom Hybrid / WFH Options
Arm Limited
re solving for reliability, compliance, performance, and speed - at once. You'll be key to making it work. Required Skills: Knowledge of one or more programming languages (Java/Scala, TypeScript, Python). Validated experience operating distributed systems at scale in production. Cloud: AWS (primary), Kubernetes (future), Docker (current), Terraform. Excellent debugging skills across network, systems, and data stack. Observability …
platforms, preferably with GCP expertise. Deep knowledge of cloud architecture, data engineering, pipelines, and big data technologies (e.g., BigQuery, Dataflow, Pub/Sub). Proficiency in Python, Java, or Scala; familiarity with microservices, Docker, Kubernetes, CI/CD tools (e.g., Jenkins, GitLab CI), and cloud monitoring. Proven experience in digital transformation and Agile environments. Preferred understanding of banking risk management …
in motivated teams, collaborating effectively and taking pride in your work. Strong problem-solving skills, viewing technology as a means to solve challenges. Proficiency in a programming language (e.g., Scala, Python, Java, C#) with understanding of domain modelling and application development. Knowledge of data management platforms (SQL, NoSQL, Spark/Databricks). Experience with modern engineering tools (Git, CI/ …
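Where listings like the one above ask for domain modelling in a language such as Scala, they usually mean modelling the domain as algebraic data types so that invalid states are hard to represent. The Payment example below is a minimal sketch invented purely for illustration.

```scala
// Minimal sketch only: domain modelling with algebraic data types in Scala.
// The payment domain, field names, and wording are invented for illustration.
sealed trait PaymentStatus
object PaymentStatus {
  case object Pending                     extends PaymentStatus
  case object Settled                     extends PaymentStatus
  final case class Failed(reason: String) extends PaymentStatus
}

final case class Payment(id: String, amountPence: Long, status: PaymentStatus)

object PaymentOps {
  // Exhaustive pattern match: adding a new status without handling it here
  // produces a compiler warning, which keeps the model and the logic in step.
  def describe(p: Payment): String = p.status match {
    case PaymentStatus.Pending        => s"Payment ${p.id} is awaiting settlement"
    case PaymentStatus.Settled        => s"Payment ${p.id} settled for ${p.amountPence}p"
    case PaymentStatus.Failed(reason) => s"Payment ${p.id} failed: $reason"
  }
}
```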
Wandsworth, Greater London, UK Hybrid / WFH Options
Datatech
control, task tracking). Demonstrable experience writing ETL scripts and code to make sure the ETL processes perform optimally. Experience in other programming languages for data manipulation (e.g., Python, Scala). Extensive experience of data engineering and the development of data ingest and transformation routines and services using modern, cloud-based approaches and technologies. Understanding of the principles of data …
understanding of various data engineering technologies including Apache Spark, Databricks and Hadoop Strong understanding of agile ways of working Up-to-date understanding of various programming languages including Python, Scala, R and SQL Up-to-date understanding of various databases and cloud-based datastores including SQL and NoSQL Up-to-date understanding of cloud platforms including AWS and/or …
Whetstone, Greater London, UK Hybrid / WFH Options
Baringa
You are great at problem-solving and see all technologies and engineering as a means to that end. You have advanced working knowledge of a general programming language (e.g. Scala, Python, Java, C#, etc.) and understand both domain modelling and application programming. You have working knowledge of data management platforms (SQL, NoSQL, Spark/Databricks, etc.). You have working knowledge …
Position Level: Junior/Mid Position Type: Full-time (Onsite) About Intelmatix: Intelmatix is a deep-tech artificial intelligence (AI) company founded in July 2021 by a group of MIT scientists with the vision of transforming enterprises to become cognitive.
security). Working knowledge of AWS core services, including S3, EC2/EMR, IAM, Athena, Glue or Redshift. Hands-on experience with Databricks Spark on large datasets, using PySpark, Scala, or SQL. Familiarity with Delta Lake, Unity Catalog or similar data lakehouse technologies. Proficient in Linux environments, including experience with shell scripting, basic system operations, and navigating file systems. Deep …
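For the Databricks and Delta Lake requirements above, the day-to-day work often looks like incremental upserts into a lakehouse table. Below is a minimal Scala sketch using the Delta Lake merge API; the table path, the users data, and the Spark configuration are assumptions for the example, and it presumes a Delta table already exists at that path.

```scala
// Minimal sketch only: an idempotent upsert (merge) into an existing Delta table.
// Assumes the delta-spark library is on the classpath; the path and data are hypothetical.
import io.delta.tables.DeltaTable
import org.apache.spark.sql.SparkSession

object UpsertUsers {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("upsert-users")
      // Needed on open-source Spark; Databricks sets these automatically.
      .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
      .config("spark.sql.catalog.spark_catalog", "org.apache.spark.sql.delta.catalog.DeltaCatalog")
      .getOrCreate()
    import spark.implicits._

    // Incoming batch of user records (hypothetical data).
    val updates = Seq((1L, "alice"), (2L, "bob")).toDF("id", "name")

    DeltaTable.forPath(spark, "s3://example-bucket/lake/users") // hypothetical table path
      .as("target")
      .merge(updates.as("source"), "target.id = source.id")
      .whenMatched().updateAll()    // update rows that already exist
      .whenNotMatched().insertAll() // insert rows that are new
      .execute()

    spark.stop()
  }
}
```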
data engineering experience - Experience with SQL - Experience with data modeling, warehousing and building ETL pipelines - Experience in at least one modern scripting or programming language, such as Python, Java, Scala, or NodeJS - Experience as a data engineer or related specialty (e.g., software engineer, business intelligence engineer, data scientist) with a track record of manipulating, processing, and extracting value from large …
culture of innovation and challenge. We have over 300 tech experts across our teams, all using the latest tools and technologies including Astro, Cloudflare, Docker, Kubernetes, AWS, Kafka, Java, Scala, Python, .Net Core, Node.js and MongoDB. There's something for everyone. We're a place of opportunity. You'll have the tools and autonomy to drive your own career, supported by a …