Understanding of data modeling, distributed systems, ETL/ELT pipelines, and streaming architectures. Proficiency in SQL and at least one programming language (e.g., Python, Scala, or Java). Demonstrated experience owning complex technical systems end-to-end, from design through production. Excellent communication skills, with the ability to explain technical concepts…
platform, either on Azure or AWS. Good working knowledge of Databricks components: Delta Lake, Unity Catalog, MLflow, etc. Expertise in SQL, Python and Spark (Scala or Python). Experience working with relational SQL databases, either on premises or in the cloud. Experience delivering multiple solutions using key techniques such as…
experience in the following skills: Relevant work experience in data science, machine learning, and business analytics. Practical experience in coding languages, e.g. Python, R, Scala (Python preferred). Proficiency in database technologies, e.g. SQL, ETL, NoSQL, DW, and big-data technologies, e.g. PySpark, Hive. Experienced working with structured…
City, Edinburgh, United Kingdom Hybrid / WFH Options
CreateFuture
of working with other languages and associated frameworks/libraries, such as: JavaScript and Node.js/Express, or Java and Spring/Spring Boot; TypeScript, Scala, Kotlin or Python; Django or Play. Enthusiastic and experienced when it comes to using engineering best practices, clean code and unit testing. Experience working as…
document and deliver large-scale, highly distributed, real-time management systems that are core to effectively managing the supply chain business. - Use Java, Scala, object-oriented (OO) design patterns, NoSQL databases, and data modeling techniques. - Design PB-scale big-data processing solutions leveraging the latest AWS solutions. - Gather and analyze…
Continuous Integration pipelines and tools. Experience of TDD and BDD. You may have: knowledge of the NestJS framework; experience working with an ORM; knowledge of Scala and functional programming; knowledge of AWS services and infrastructure monitoring; experience working with AWS Lambda functions; experience working with microservices/micro-frontends. Experience working…
Edinburgh, Central Scotland, United Kingdom Hybrid / WFH Options
Opus Recruitment Solutions
backend. Their stack: Python, React/TypeScript, Node.js, AWS, PostgreSQL, GenAI, plus experience with another backend language (Ruby, C++, Elixir, Haskell, Go, Clojure or Scala). Private projects to showcase, plus a deep interest in and passion for software engineering, are also a plus! They are also working heavily within AI and working…
and supervisor to junior colleagues. What you'll do: Utilise R and/or Python, and any of Perl, C, C++, Java, JavaScript or Scala for programming. Work with cloud platforms (predominantly GCP and AWS). Implement and manage data ingestion modules to download and manage publicly available genomic and molecular…
Edinburgh, Central Scotland, United Kingdom Hybrid / WFH Options
Harnham
Collaborate with various squads within the data team on project-based work. Develop and optimize data models, warehouse solutions, and ETL processes. Work with Scala, Spark, and Java to handle large-scale data processing. Contribute to manual Databricks-like data processing solutions. Requirements: Minimum of 4 years of experience with Scala, Spark, and Java. Strong technical skills and a passion for working with data. A STEM degree or equivalent experience. Excellent communication skills. Experience with data modeling and data warehouse solutions. Nice to have: experience with AWS and Python. Interview process: CV run-through, take-home test, panel interview.