stakeholders can utilise. Key duties as PowerBI Developer: Design and development of a PowerBI platform for stakeholder reporting, visualisations, and dashboards Extracting raw data and setting up automated reporting functions Requirements gathering from stakeholders Developing relationships with finance, HR, IT, and Operations teams Requirements to be successful as … PowerBI Developer: Degree or similar level qualification in Computer Science, Data Analysis, or IT Subject Matter Expert in PowerBI Confident working with multiple data sources and types Data engineering experience, building data pipelines and automation Technical skills with SQL, DAX, and cloud more »
with one of the leading insurance companies, in collaboration with a prestigious bank in London. We're actively seeking a talented Senior Data Engineer to enrich our dynamic team. Responsibilities: Experience designing data models and developing industrialized data pipelines. Strong knowledge of database … and data lake systems, with proficiency in Python and SQL (any flavor). Comfortable with shell scripting (e.g., Bash) and provisioning new infrastructure in leading cloud providers, preferably GCP or AWS. Experience creating DataOps pipelines. Adept at working in an Agile environment, actively participating in approaches such as more »
Data Engineer Leeds - on site 2x per week up to £55,000 Looking to join a dynamic team in the financial sector? Fruition IT are looking for an experienced Data Engineer to join a leading company specializing in financial solutions. Working closely with the Senior Data … whilst maintaining a core SAS user base. Requirements: Demonstrable proficiency with SAS platforms Strong commercial experience with SQL Hands-on experience with Azure Data Factory and the build of data pipelines Proficiency in Python for data manipulation, transformation and analysis Experience with Azure Synapse more »
do things differently. We’re focused on our core values; using these we’ve seen significant growth across our practices and our Digital, Data and Cloud team is preparing for the next phase of expansion. This creates new opportunities for driven and skilled individuals to join one of … implementing cloud solutions for different applications, automating cloud operations and cloud deployments Assume responsibility for the architecture of solutions that leverage advanced analytics, data pipelines, web portals, statistical models, Machine Learning and AI, through our technology platform Draw on the strength and experience of a wider team of more »
Collaborate with cross-functional teams, dive into research, quickly learn relevant background information and contribute to the methodology development * Develop models, algorithms and data pipelines that leverage data sets relevant for a specific supply chain risk problem Requirements and Skills: * 2-5 years of professional experience … in Advanced Analytics/Data Science/Machine Learning/Statistical Modelling * Coding experience (preferably Python) to write robust and high standard code; experience with version control (preferably Git) * Ability to rapidly acquire new technical skills * Good understanding of mathematical foundations of Machine Learning models Desirable Skills: * Experience more »
The ideal candidate will have a strong background in MLOps, with experience in deploying models on multiple GPUs, scaling, load balancing, and managing data pipelines. This role is critical in ensuring the high performance and reliability of our AI-driven solutions. Responsibilities: Deploy LLMs using platforms like Huggingface … or similar. Manage deployment on multiple GPUs, ensuring optimal performance. Implement scaling and load balancing to handle varying loads efficiently. Design and maintain data pipelines for model training, preferably using Kubeflow. Establish continuous integration and delivery pipelines using Docker, Kubernetes, or AWS counterparts (e.g., ECR). Visualise fine-grained MLOps data using industry-standard tools and platforms. Ensure system security through triaging, log analysis, and infrastructure debugging. Implement Infrastructure as Code using tools like Terraform or CloudFormation. Requirements: Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field or equivalent experience. Solid more »
Kubernetes) and Workflow orchestration (Temporal). Strong skills in distributed architecture – local and cross-site resiliency and failover. Achieving scalability on Apache NiFi-based data pipelines to run several concurrent data movement feeds. Being able to visualize and integrate the platform and application architecture. Own and manage more »
appoint a talented Automation Test Engineer on a Contract basis. The role will work on creating test automation frameworks, developing test pipelines for data pipelines, and contributing to both manual testing and test management activities. About the role: Based in Greater London (Hybrid): Create test automation frameworks. Develop … test pipelines for data pipelines. Contribute to manual testing efforts. Assist in requirements and scenario definition and elicitation. Participate in test management and test planning activities. Create and manage CI/CD pipelines. About you: You will have the following experiences: Proficiency in test automation and knowledge of … test architecture. Experience with data and database testing. Strong test management and planning skills. Ability to create use cases, requirements, and scenarios. Local Authority/Public Sector experience is highly desired. What's on offer: Salary: £450+ per day (Inside IR35) *negotiable based on experience *please submit your more »
Central London, Belgravia, Greater London, United Kingdom
CPR
and reliability of their products through innovative testing practices. Key Responsibilities: Create and maintain test automation frameworks. Develop and manage test pipelines for data pipelines. Contribute to manual testing efforts and overall test management activities. Assist in requirements and scenario definition and elicitation. Participate in test planning and … and manage Continuous Integration/Continuous Deployment (CI/CD) pipelines. Skills and Qualifications: Proficiency in test automation and related tools. Experience with data and database testing. Strong skills in test management and planning. Ability to create use cases, requirements, and scenarios. Knowledge of test architecture … and best practices. About You: You are a detail-oriented and innovative professional with a strong background in test automation and data pipeline testing. Your ability to create robust test frameworks and manage test activities will contribute significantly to our product's success. Your experience with CI more »
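Several of these listings pair test automation with data-pipeline work. As a minimal illustration of what a data-pipeline unit test can look like, here is a self-contained Python sketch using only the standard library; the transform function and its expectations are invented for the example, not taken from any of the roles above.

```python
import unittest

def transform(rows):
    """Hypothetical pipeline step: keep complete records, normalise names."""
    return [
        {"name": r["name"].strip().lower(), "value": r["value"]}
        for r in rows
        if r.get("name") and r.get("value") is not None
    ]

class TransformTests(unittest.TestCase):
    """Data-pipeline checks: row counts and output schema stay stable."""

    def test_drops_incomplete_rows(self):
        rows = [
            {"name": "A", "value": 1},
            {"name": "", "value": 2},      # missing name -> dropped
            {"name": "B", "value": None},  # missing value -> dropped
        ]
        self.assertEqual(len(transform(rows)), 1)

    def test_schema_and_normalisation(self):
        out = transform([{"name": " Alice ", "value": 3}])
        self.assertEqual(set(out[0]), {"name", "value"})
        self.assertEqual(out[0]["name"], "alice")

suite = unittest.TestLoader().loadTestsFromTestCase(TransformTests)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

In a real pipeline these assertions would run inside a CI/CD stage so that a schema or row-count regression fails the build before deployment.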
office on a hybrid basis. As a Contract Senior Golang Developer, you will play a crucial role in assessing, analysing, and enhancing our data ingestion, transformation, and storage layers. Your primary focus will be on Go Programming and DBT, with secondary skills in Google Cloud Services. The successful … candidate will bring 5 to 10 years of experience in developing robust data pipelines, conducting unit testing, and providing support for production deployments. Key Responsibilities: Assess and analyse data ingestion, transformation, and storage layers, including RAW, Exploratory, Curated, and application layers. Understand existing codebase and make … alterations or fixes as per design requirements. Develop and optimize data ingestion pipelines using Go Programming and DBT. Conduct unit testing and validation of the developed code to ensure reliability and performance. Collaborate with cross-functional teams to support production deployments and resolve any issues that may arise. more »
Manchester, North West, United Kingdom Hybrid / WFH Options
Be Technology
Data Engineer - Manchester - £35,000 - Hybrid Working My client is hiring a full-time Data Engineer to contribute to their expanding data science and analytics initiatives. Your role involves designing, building, and maintaining data pipelines, ensuring seamless data flow into … their central Microsoft Azure repository. Collaborate with Data Analysts and Scientists to optimise data quality, reliability, security, and automation. Skills & Responsibilities: Configure and troubleshoot Microsoft Azure, manage data ingestion Develop ETL scripts using Python, handle web scraping, APIs, and relational databases Familiarity with Power … will not have to use Power BI, but being aware of how to collaborate with people that do is important) Work closely with data analysts and scientists, contribute to data models and algorithms, and participate in data events Working knowledge of Git If you more »
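The ETL scripting mentioned in this listing can be illustrated with a minimal, self-contained Python sketch: extract rows from CSV text, transform them, and load them into SQLite (standing in here for the Azure repository). All data, table, and column names are invented for the example.

```python
import csv
import io
import sqlite3

# Invented source data standing in for an API response or web-scrape output.
RAW_CSV = """sensor,reading,unit
a1,20.5,C
a2,,C
a3,19.0,C
"""

def extract(text):
    """Extract: parse CSV text into a list of dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: drop rows with missing readings, cast values to float."""
    return [
        {"sensor": r["sensor"], "reading": float(r["reading"])}
        for r in rows
        if r["reading"]
    ]

def load(rows, conn):
    """Load: write transformed rows into a table (SQLite stands in for Azure)."""
    conn.execute("CREATE TABLE IF NOT EXISTS readings (sensor TEXT, reading REAL)")
    conn.executemany("INSERT INTO readings VALUES (:sensor, :reading)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
count = conn.execute("SELECT COUNT(*) FROM readings").fetchone()[0]  # 2 clean rows
```

A production version would swap the in-memory pieces for real sources and sinks, but the extract/transform/load split stays the same.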
Leeds, West Yorkshire, Yorkshire, United Kingdom Hybrid / WFH Options
Damia Group Ltd
Converted code is causing failures/performance issues. Your responsibilities; As a Spark Scala Engineer you will be working for the GDT (Global Data Technology) Team, where you will be responsible for: Designing, building, and maintaining data pipelines using Apache Spark and Scala Working on an Enterprise … Cloud Services in one of the Clouds (GCP). Mandatory Skills; At least 8+ Years of IT Experience with designing, building, and maintaining data pipelines. At least 4+ Years of experience with designing, building, and maintaining data pipelines using Apache Spark and Scala. Programming languages … Proficiency in Scala and Spark is essential. Familiarity with Python and SQL is often a plus. Big Data technologies: Understanding of HDFS, Kafka, Hive, and cloud platforms is valuable. Data engineering concepts: Knowledge of data warehousing, data pipelines, data modeling more »
Swansea, Neath Port Talbot, Wales, United Kingdom Hybrid / WFH Options
Inspire People
new services? We are partnering exclusively with a large organisation to bring you an exciting opportunity for a highly skilled and experienced Lead Data Engineer to be responsible for leading data engineering projects, ensuring the efficient processing and management of data across the organisation. … other benefits. Flexible, hybrid working from Swansea with options for condensed hours. You will get to lead a team working on a modern data platform, collaborating with engineering product teams delivering services and working closely with reporting teams and data scientists to help understand and optimise … their digital services. This role will be instrumental in shaping their data infrastructure, ensuring it supports the strategic objectives. You will leverage cutting-edge technologies, primarily focusing on Microsoft Azure and Databricks, to develop and maintain robust, scalable data solutions. Responsibilities: Leadership and Strategy: Lead the more »
already hold active DV or SC clearance to be considered for this position. Role Outline: Assist with the continued development and maintenance of data pipelines using NiFi and signature updates using Elastic/Kibana. System administration on specific cyber defence applications and systems to include installation, configuration more »
Newbury, Berkshire, South East, United Kingdom Hybrid / WFH Options
Spectrum IT Recruitment
Review clients' requirements Working closely with key business stakeholders Experience: Around 3 years' experience developing with SQL MSSQL, T-SQL Troubleshooting and resolving data quality issues Documentation Advantageous: Telecoms billing systems or processes Knowledge of Java or Kotlin NoSQL Databases Experience of CRM systems Azure Data Lake/Data Pipelines This role offers a salary of up to £40,000 with generous benefits including a company bonus, remote working, private healthcare, and much more! Hit apply, email or call 02380 765 301 Spectrum IT Recruitment (South) Limited is acting as an Employment more »
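Troubleshooting data quality with SQL, as this listing asks for, often starts with duplicate and NULL checks. A minimal sketch, run against SQLite for portability rather than the MSSQL/T-SQL named above; the billing table and its contents are invented for the example.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE billing (account_id TEXT, amount REAL);
    INSERT INTO billing VALUES ('A1', 10.0), ('A1', 10.0), ('A2', NULL), ('A3', 5.0);
""")

# Check 1: exact duplicate rows, which would inflate billing totals downstream.
duplicates = conn.execute("""
    SELECT account_id, COUNT(*) AS n
    FROM billing
    GROUP BY account_id, amount
    HAVING COUNT(*) > 1
""").fetchall()

# Check 2: NULL amounts, which silently vanish from SUM() aggregates.
nulls = conn.execute(
    "SELECT account_id FROM billing WHERE amount IS NULL"
).fetchall()
```

The same two queries translate almost verbatim to T-SQL; only the connection setup changes.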
Truth Builds Trust. If our mission, values, and purpose align with your own, we would love to hear from you! Your opportunity The Data Engineer is a crucial member of the Investments Data, Technology team, tasked with developing, managing, and optimizing data pipelines within … our data ecosystem. Reporting to the Investments Data Lead, this role is instrumental in implementing the data strategy that supports front office stakeholders, systems, and clients. The Data Engineer will leverage cutting-edge technology in Snowflake, Python, SQL, and Azure to enhance … within our subsidized onsite canteen Must have skills Proven experience as a Data Engineer, with a strong background in data pipeline construction, data architecture, and data warehousing. Expertise in Snowflake, Python, SQL, and cloud-native ETL/ELT tools. Familiarity with more »
Data Engineer Remote working (1 day per month on-site) Competitive salary plus an industry leading bonus and pension scheme My client is looking for a Data Engineer to work on innovative projects that balance essential services, optimise cutting-edge technologies, and safeguard valuable assets. As … a Data Engineer, you will develop future-proof pipelines and uncover deep insights from diverse datasets, collaborating with technical teams and clients to design, build, and leverage data architectures for ambitious projects. Why You Should Apply Work on impactful and diverse projects. Flexible working hours and … Architecture, Data Insights, Cloud Platforms, Python Programming, Linux, Windows, Version Control, Git, Unit Testing, Integration Testing, Relational Databases, SQL, ETL Pipelines, Data Cleaning, Data Merging, Data Visualisation, Exploratory Data Analysis, Technology Readiness, Proofs-Of-Concept, Data-Driven more »
Greater London, England, United Kingdom Hybrid / WFH Options
Cera
This is an exciting opportunity to join one of Europe's fastest growing health-tech startups, working in a Data team that has made huge steps in moving our innovative platform forwards. This role is Hybrid but only requires 1-2 days a month in our London office … the UK leader in revolutionising affordable care for the better, for our service users and society and the perfect coordination across care practice, data, science and technology is how we’ll get there. About Data at Cera Data has a critical role to play … in two distinct areas. First, data about our operations, covering everything from hiring and retaining carers to the delivery of care, helps us to observe and understand how we are performing, what is working and what we could do to better deliver care. Second, data about more »
Truth Builds Trust. If our mission, values, and purpose align with your own, we would love to hear from you! Your opportunity The Data Engineer is a crucial member of the Finance Analytics technology team, tasked with developing, managing, and optimizing data pipelines within our data ecosystem. The Data Engineer will leverage cutting-edge technology in Snowflake, Python, Informatica, and Azure to improve our data capabilities and support our Finance business partners in their decision-making processes. Key responsibilities include: Design, build, and maintain efficient, reliable data pipelines … use within our subsidized onsite canteen Must have skills Experience as a Data Engineer, with a solid background in data pipeline construction and data architecture using Snowflake, Python, and Informatica Familiarity with Azure and other cloud-native technologies Understanding of finance industry data more »
Data Technical Lead Location: Hybrid - London Package: Negotiable + Benefits Brown & Brown, Inc. is one of the largest insurance brokers globally having in excess of $4bn revenue in 2023 by providing risk management solutions to help protect what our customers value most. We have experienced significant growth over … size of the business by improving customer experience, growing organically and acquiring, accelerating knowledge and capabilities of teammates and using technology with purpose. Data and Data services are a key strategic enabler in Brown and Brown Europe as we look to integrate acquired businesses, enable them … to provide great customer service and grow through the provision of superior data services. As a Technical Lead for our Data Team, you will play a critical role in shaping our data strategy, driving technical excellence, and ensuring the successful delivery of data more »
are a tech for good business dedicated to promoting transparency in the corporate world. They are seeking a highly skilled and experienced Lead Data Engineer. You will be the first hire in the Data Engineering team, allowing you to take the driver's seat regarding the direction … taken by the data team to follow. Key Responsibilities: Platform Development: Design, develop, maintain, test, and optimise a scalable and reliable data platform for collecting, processing, and transforming large volumes of company data from various sources. Collaboration: Work with cross-functional teams to understand … comprehensive documentation for data processes, schemas, and data dictionaries. Process Improvement: Identify and implement enhancements to improve data pipeline efficiency and reliability. Cloud Management: Actively manage cloud-based data services to balance performance and cost. Mentorship: Mentor and guide junior data more »
Requirements At least two years of professional experience as a data engineer with a proven ability to provide effective data solutions in a production setting. Knowledge of developing real-time data stream systems (ideally Kafka). Proven track record in developing data … systems using PySpark and Apache Spark for batch processing. Capable of managing data intake from various sources, including data streams, unstructured data, relational databases, and NoSQL databases. Extensive knowledge in distributed systems, cloud architecture, and data pipelines. Proficiency in Python … programming (knowledge of Scala or Rust is a plus). Familiarity with ETL principles in contemporary data applications (Dagster, Airflow, Prefect). Familiarity with AWS services such as Glue, Redshift, Athena, and S3. Proficiency with Terraform, Kubernetes, and ArgoCD (expertise not required; cloud team support available). Thorough more »
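Orchestrators such as the Dagster, Airflow, and Prefect tools named in this listing all model an ETL pipeline as a dependency graph of tasks. A minimal, library-free Python sketch of that idea, with invented task names:

```python
from graphlib import TopologicalSorter

# Invented task graph: each task maps to the tasks it depends on, which is
# essentially how orchestrators like Dagster/Airflow/Prefect represent a DAG.
pipeline = {
    "extract": set(),
    "validate": {"extract"},
    "transform": {"validate"},
    "load": {"transform"},
    "report": {"load", "validate"},
}

def run(graph):
    """Execute tasks in dependency order, returning the run sequence."""
    executed = []
    for task in TopologicalSorter(graph).static_order():
        # A real orchestrator would dispatch work and handle retries here.
        executed.append(task)
    return executed

run_order = run(pipeline)
```

The orchestrators add scheduling, retries, and observability on top, but the core contract is the same: no task runs before its dependencies.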
AWS Data Architect (Must have current SC) Amber Labs was born out of the recognition that many organisations require support on their data maturity journey. Our team is passionate and experienced in delivering cutting-edge data capability. Amber Labs was established with a vision … to provide specialist services, automation, accelerators competitively to organisations across EMEA. Job Summary We are seeking a highly skilled AWS Data Architect to join our dynamic team. The ideal candidate will be responsible for designing, implementing, and optimizing our data architecture on the AWS platform. You … will work closely with various teams to ensure data integrity, security, and availability, driving our data strategy forward. Key Responsibilities Design and Implement Data Solutions: Develop scalable and efficient data architecture solutions on AWS. Create and maintain data pipelines using more »