Manchester, North West, United Kingdom Hybrid / WFH Options
IO Associates
focused data team responsible for building and optimising scalable, production-grade data pipelines and infrastructure.

Key Responsibilities:
- Design and implement robust, scalable ETL/ELT pipelines using Databricks and Apache Spark
- Ingest, transform, and manage large volumes of data from diverse sources
- Collaborate with analysts, data scientists, and business stakeholders to deliver clean, accessible datasets
- Ensure high performance … practices
- Work with cloud-native tools and services (preferably Azure)

Required Skills & Experience:
- Proven experience as a Data Engineer on cloud-based projects
- Strong hands-on skills with Databricks, Apache Spark, and Python or Scala
- Proficient in SQL and working with large-scale data environments
- Experience with Delta Lake, Azure Data Lake, or similar technologies
- Familiarity with version …
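For context, a minimal sketch of the kind of Databricks/Spark ETL step this listing describes, assuming a SparkSession is available and using hypothetical paths and table names:

```python
# Minimal ETL sketch: read raw CSV, clean it, write a Delta table.
# Paths, columns and the table name are illustrative placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: raw files landed by an upstream source
raw = (spark.read
       .option("header", "true")
       .option("inferSchema", "true")
       .csv("/mnt/raw/orders/"))

# Transform: basic de-duplication, typing and filtering
clean = (raw
         .dropDuplicates(["order_id"])
         .withColumn("order_date", F.to_date("order_date"))
         .filter(F.col("amount") > 0))

# Load: persist as a Delta table for downstream analysts
(clean.write
 .format("delta")
 .mode("overwrite")
 .saveAsTable("analytics.orders_clean"))
```

Writing to Delta Lake rather than plain Parquet is one common choice here, since it gives ACID writes and time travel on top of the data lake.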
North West London, London, United Kingdom Hybrid / WFH Options
Anson Mccade
knowledge of Kafka, Confluent, and event-driven architecture
- Hands-on experience with Databricks, Unity Catalog, and Lakehouse architectures
- Strong architectural understanding across AWS, Azure, GCP, and Snowflake
- Familiarity with Apache Spark, SQL/NoSQL databases, and programming (Python, R, Java)
- Knowledge of data visualisation, DevOps principles, and ML/AI integration into data architectures
- Strong grasp of data …
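A minimal sketch of the event-driven pattern referenced above, using the Confluent Kafka Python client; the broker address and topic name are assumptions:

```python
# Produce and consume a single JSON event with confluent-kafka.
# Broker address, topic and consumer group are hypothetical.
import json
from confluent_kafka import Producer, Consumer

producer = Producer({"bootstrap.servers": "localhost:9092"})
producer.produce("customer-events",
                 value=json.dumps({"id": 1, "action": "signup"}).encode())
producer.flush()

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "demo-consumers",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["customer-events"])
msg = consumer.poll(timeout=10.0)  # None if nothing arrives in time
if msg is not None and msg.error() is None:
    print(json.loads(msg.value()))
consumer.close()
```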
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Made Tech Limited
- Strong experience in Infrastructure as Code (IaC) and deploying infrastructure across environments
- Managing cloud infrastructure with a DevOps approach
- Handling and transforming various data types (JSON, CSV, etc.) using Apache Spark, Databricks, or Hadoop
- Understanding modern data system architectures (Data Warehouse, Data Lakes, Data Meshes) and their use cases
- Creating data pipelines on cloud platforms with error handling …
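A minimal sketch of handling mixed JSON and CSV inputs with PySpark, as this listing mentions; the paths and column names are hypothetical:

```python
# Normalise JSON and CSV landings onto one schema before writing curated data.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("mixed_ingest").getOrCreate()

events_json = spark.read.json("/data/landing/events/*.json")
events_csv = (spark.read
              .option("header", "true")
              .csv("/data/landing/events_legacy/*.csv"))

# Align both sources on a common set of columns, then combine
common_cols = ["event_id", "event_type", "occurred_at"]
combined = (events_json.select(*common_cols)
            .unionByName(events_csv.select(*common_cols))
            .withColumn("occurred_at", F.to_timestamp("occurred_at")))

combined.write.mode("append").parquet("/data/curated/events/")
```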
Manchester, North West, United Kingdom Hybrid / WFH Options
Talent Hero Ltd
to support data needs
- Optimise data storage and retrieval for performance
- Work with batch and real-time processing frameworks
- Implement and manage ETL processes
- Use tools like Python, SQL, Spark, Airflow, Kafka, dbt, Snowflake, Redshift, BigQuery

Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field
- 1+ year in a Data Engineer or similar role
- Proficiency in SQL …
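A minimal sketch of a daily batch ETL orchestrated with Airflow's TaskFlow API, touching the kind of tooling listed above; all task logic, names and the warehouse target are illustrative stubs (Airflow 2.4+ assumed):

```python
# Daily extract -> transform -> load pipeline as an Airflow DAG.
from datetime import datetime
from airflow.decorators import dag, task

@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def daily_sales_etl():
    @task
    def extract() -> list[dict]:
        # In practice: pull from an API or operational database
        return [{"sale_id": 1, "amount": 42.0}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        return [r for r in rows if r["amount"] > 0]

    @task
    def load(rows: list[dict]) -> None:
        # In practice: load into Snowflake, Redshift or BigQuery
        print(f"loaded {len(rows)} rows")

    load(transform(extract()))

daily_sales_etl()
```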
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
First Central Services
control providers, preferably Git
- Proven to be an excellent communicator with a clear passion for data & analytics.
- Deep engineering and database skills in SQL Server, Azure Synapse, Data Factory, Spark compute and Databricks technologies.
- Experience of developing coding standards and delivery methods that others follow.
- Experience of testing techniques, tools and approaches.
- Extensive experience of the full data lifecycle. …
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Northrop Grumman Corp. (JP)
developing & deploying scalable backend systems. Familiarity with CICD, containerisation, deployment technologies & cloud platforms (Jenkins, Kubernetes, Docker, AWS) or Familiarity with Big Data and Machine Learning technologies (NumPy, PyTorch, TensorFlow, Spark). Excellent communication, collaboration & problem solving skills, ideally with some experience in agile ways of working. Security clearance: You must be able to gain and maintain the highest level More ❯
Salford, Manchester, United Kingdom Hybrid / WFH Options
Manchester Digital
the ability to pivot strategies in response to innovative technologies, insights, or regulatory developments.
- Experience with cloud platforms (e.g., AWS, Azure, Google Cloud) and big data technologies (e.g., Snowflake, Spark).
- Strong communication skills, with the ability to distill complex data concepts into clear messages for non-technical stakeholders.
- Excellent stakeholder management and cross-functional collaboration skills, with the …
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Manchester Digital
key abilities or experience:
- Deep technical expertise in at least one data ecosystem (e.g. Oracle, SQL Server, Cloud Data platforms) plus real-time data streaming/processing technologies (Kafka, Spark).
- Data modelling and governance proficiency, covering conceptual, logical, and physical architectures, metadata management.
- Cloud data technologies: experience architecting solutions on AWS, Azure, or GCP, optimising cost, performance, and …
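A minimal sketch of the real-time streaming side mentioned above: Spark Structured Streaming consuming a Kafka topic and writing to Delta. The broker, topic and paths are hypothetical, and the Kafka connector package is assumed to be available on the cluster:

```python
# Stream a Kafka topic into a Delta path with Spark Structured Streaming.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_stream").getOrCreate()

stream = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")
          .option("subscribe", "orders")
          .load())

# Kafka delivers key/value as binary; cast the value to a string for parsing
parsed = stream.select(F.col("value").cast("string").alias("payload"),
                       F.col("timestamp"))

query = (parsed.writeStream
         .format("delta")
         .option("checkpointLocation", "/chk/orders_stream")
         .start("/data/streams/orders"))

query.awaitTermination()
```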
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
WorksHub
that help us achieve our objectives. So each team leverages the technology that fits their needs best. You'll see us working with data processing/streaming like Kinesis, Spark and Flink; application technologies like PostgreSQL, Redis & DynamoDB; and breaking things using in-house chaos principles and tools such as Gatling to drive load, all deployed and hosted on …