Leeds, West Yorkshire, Yorkshire, United Kingdom Hybrid / WFH Options
Damia Group Ltd
Converted code is causing failures/performance issues. Your responsibilities: As a Spark Scala Engineer you will be working for the GDT (Global Data Technology) Team, and you will be responsible for: Designing, building, and maintaining data pipelines using Apache Spark and Scala Working on an Enterprise … Cloud Services in one of the Clouds (GCP). Mandatory Skills: At least 8 years of IT experience with designing, building, and maintaining data pipelines. At least 4 years of experience with designing, building, and maintaining data pipelines using Apache Spark and Scala. Programming languages … Proficiency in Scala and Spark is essential. Familiarity with Python and SQL is often a plus. Big Data technologies: Understanding of HDFS, Kafka, Hive, and cloud platforms is valuable. Data engineering concepts: Knowledge of data warehousing, data pipelines, data modeling …
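The pipeline work described in this ad follows the usual extract → transform → load shape. As a language-agnostic illustration only (plain Python standard library, not Spark itself, and all names hypothetical), the three stages might be sketched like this:

```python
# Minimal ETL sketch: in practice each stage would be a Spark DataFrame
# operation written in Scala; this stand-in just shows the shape of the flow.

def extract(lines):
    """'Read' stage: parse raw CSV-like lines into records."""
    for line in lines:
        user, amount = line.split(",")
        yield {"user": user, "amount": float(amount)}

def transform(records):
    """Filter and aggregate (what a Spark filter/groupBy would do at scale)."""
    totals = {}
    for rec in records:
        if rec["amount"] > 0:  # drop refunds / invalid rows
            totals[rec["user"]] = totals.get(rec["user"], 0.0) + rec["amount"]
    return totals

def load(totals):
    """'Write' stage: here we simply return sorted results."""
    return sorted(totals.items())

raw = ["alice,10.0", "bob,5.5", "alice,-3.0", "bob,4.5"]
print(load(transform(extract(raw))))  # [('alice', 10.0), ('bob', 10.0)]
```

The same staged structure carries over directly to Spark, where each stage becomes a lazy DataFrame transformation executed across a cluster.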
Leeds, West Yorkshire, Yorkshire, United Kingdom Hybrid / WFH Options
Viqu Limited
Senior Data Engineer (Databricks) – Remote – Leeds – 6 Months Contract (Initial) – Outside IR35 We are seeking 4 Senior Data Engineers (Databricks) on a contract basis to assist our client as they migrate from legacy systems to AWS. The main focus will be to centralise the organisation's data warehouses and to consolidate a number of BI tools. Senior Data Engineers should have extensive experience working with Databricks on recent projects and exposure to the latest features and toolsets of Databricks. Role Responsibilities: Design and develop scalable data pipelines using Databricks. … Implement data processing workflows in Python and SQL. Manage and optimise data storage solutions on AWS. Ensure data quality and integrity across all stages of data processing. Develop and maintain documentation for data engineering processes. Utilise Databricks features, including Delta …
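One of the responsibilities above, ensuring data quality and integrity across all stages, is commonly implemented as a validation gate run between pipeline stages. A minimal sketch in plain Python (illustrative only; on Databricks this check would typically run over DataFrames, and all names here are hypothetical):

```python
# Validation gate: reject a batch if any record is missing required fields
# or carries a null identifier, before it is passed to the next stage.

def check_quality(records, required=("id", "amount")):
    """Return a list of human-readable issues; an empty list means the batch passes."""
    errors = []
    for i, rec in enumerate(records):
        for field in required:
            if field not in rec or rec[field] is None:
                errors.append(f"row {i}: missing or null '{field}'")
    return errors

batch = [{"id": 1, "amount": 9.99}, {"id": None, "amount": 4.50}]
print(check_quality(batch))  # ["row 1: missing or null 'id'"]
```

Running such a gate after each stage, rather than only at the end, is what makes it possible to attribute a data-quality failure to the stage that introduced it.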
Leeds, England, United Kingdom Hybrid / WFH Options
BJSS
… our team with people just like you! We are DataOps advocates and use software engineering best practices to build scalable and re-usable data solutions to help clients use their data to gain insights, drive decisions and deliver business value. Clients don’t engage BJSS to … get to work with a wide range of tools and technologies and there are always new things to learn. About the Role BJSS data engineers are specialist software engineers who build, optimise and maintain data applications, systems and services. This role combines the discipline of software engineering with the knowledge and experience of building data solutions in order to deliver business value. As a BJSS data engineer you’ll help our clients deploy data pipelines and processes in a production-safe manner, using the latest technologies and with a DataOps …