teams to align goals and prioritise initiatives Design, write, optimise, and troubleshoot SQL queries to support marketing analysis and reporting. Work with cloud-based data platforms, particularly Snowflake, to extract and manipulate large datasets Deliver actionable insights through data storytelling, visualisation, and clear presentations. Build dashboards, visualisations, and analytical prototypes using tools such as Tableau or Sigma Leverage CRM data … from Salesforce for segmentation, campaign analytics, and reporting use cases Contribute to the design and execution of ETL/ELT workflows and data transformation processes Conduct rigorous data validation to ensure the accuracy, completeness, and reliability of reporting outputs Support data governance and documentation practices using tools such as Atlan to ensure transparency and accessibility Requirements Strong proficiency in SQL … visualisation tools such as Tableau or Sigma; experience building interactive dashboards and reports Familiarity with Salesforce or other CRM platforms, particularly for segmentation and campaign analytics Working knowledge of ETL/ELT processes and data transformation best practices High attention to detail with strong data validation and quality assurance skills Excellent communication and presentation skills, with the ability to translate …
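The segmentation and campaign-analytics SQL this listing describes usually reduces to grouping customers into bands. A minimal sketch using Python's built-in sqlite3 as a stand-in for a warehouse such as Snowflake; the `customers` table, its columns, and the spend threshold are all invented for illustration:

```python
import sqlite3

# In-memory database standing in for a cloud warehouse such as Snowflake.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER, region TEXT, total_spend REAL);
    INSERT INTO customers VALUES
        (1, 'UK', 1200.0), (2, 'UK', 150.0),
        (3, 'DE', 900.0), (4, 'DE', 50.0);
""")

# Segment customers into spend bands per region -- the kind of query
# that feeds campaign targeting and reporting dashboards.
rows = conn.execute("""
    SELECT region,
           CASE WHEN total_spend >= 500 THEN 'high' ELSE 'low' END AS segment,
           COUNT(*) AS n
    FROM customers
    GROUP BY region, segment
    ORDER BY region, segment
""").fetchall()

print(rows)  # [('DE', 'high', 1), ('DE', 'low', 1), ('UK', 'high', 1), ('UK', 'low', 1)]
```

In a real warehouse the same query shape applies; only the connection library and the table sizes change.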
possible. Ensure dashboards are user-friendly, automated and provide actionable insights for stakeholders. Ensure all dashboards follow a strict peer review, QA and UAT process. Utilise Alteryx to create ETL pipelines, ensuring data accuracy and completeness throughout the process. Ensure all workflows follow a strict peer review, QA and UAT process. Utilise data services, whether that be dashboards, ETL pipelines …
We are seeking an experienced Senior Developer to support the integration of Banking Book and Trading Book Accounting onto a single platform. This role requires expertise in ETL, Java, accounting, finance, and banking domain knowledge to understand and enhance both accounting engines. You will collaborate with cross-functional teams to design, develop, and optimize solutions using modern techniques. Qualifications include … ETL (Ab Initio/PL SQL) Angular Java development Deployment tools and pipelines Additional desirable skills: Financial Services and Banking Accounting experience Cloud technologies such as AWS/Azure Python development This role is based in Northampton. Role Purpose: Design, develop, and improve software solutions that enhance business, platform, and technology capabilities for our customers and colleagues. Key Responsibilities: Develop …
and chair multi-functional delivery teams, mentoring engineers, scientists and analysts. Partner with our delivery provider to onboard specialist roles per your recommendations. Oversee implementation of new data platforms, ETL pipelines, and analytical models. Standards, Compliance & Quality Monitor application of data standards, governance frameworks and security controls. Embed strong data quality, lineage and protection practices across all systems. Contribute to … engineering, data science or analytics; proven track record setting/executing data strategies in complex organisations. Technical Breadth: Cloud-based data platforms: AWS a must (Azure, AWS or equivalent) Modern ETL/ELT tools and data pipeline frameworks Data modelling, warehousing and transformation best practices Data science/ML lifecycle from prototype to production Government/Public Sector: Experience to GDS …
Pay rate: up to 41.35ph PAYE, depending on location - must be UK based Key Responsibilities Data Infrastructure Development Design, build, visualise, and maintain scalable data pipelines and ETL processes. Collaborate with engineers to integrate new data sources into existing systems. Data Analytics Support Partner with data scientists to deliver clean, reliable datasets for reporting and analysis. Develop and maintain dashboards … or smaller tasks alongside main projects. Success will be measured by the quality and timeliness of your project deliverables. Qualifications A minimum of 4 years' experience in: Data processing (ETL) A basic understanding of Python programming Experience with databases and data infrastructure tools This role is open for a limited time. Next steps will be shared with shortlisted candidates ASAP.
Lead integration of data from SAP (e.g., S/4HANA, BW/4HANA) and external platforms using cloud and API technologies. Oversee data governance, access control, and compliance. Optimize ETL pipelines and support both real-time and batch data processing. Monitor system performance and troubleshoot issues proactively. Collaborate with cross-functional teams to continuously improve data products and services. Act … data management ecosystems, including at least 3 years working with SAP Analytics Cloud (SAC). Deep expertise in SAP SAC architecture, SAP Data Services, SAP HANA, and BW. Strong ETL/data modelling skills and proficiency in cloud-based data platforms. Experience integrating data from SAP and non-SAP systems. Programming proficiency in Python, SQL, or Java. Proven leadership …
complex requirements into data-driven solutions. Beyond building new tools, you'll also optimise and enhance existing Qlik assets, ensuring they remain scalable and efficient. You'll manage robust ETL pipelines from a wide range of data sources, playing a critical role in maintaining the accuracy, consistency, and quality of data at every stage. On top of that, you'll … scripting and dashboard design Hands-on experience with Snowflake Experience with QlikView is beneficial but not essential Proficiency in SQL and working with relational databases Experience designing and implementing ETL processes Understanding of data warehousing and building data lakehouses Effective communication skills and the ability to collaborate across technical and non-technical teams Why apply? This is an excellent opportunity …
Python and Flask. You will work on building scalable, high-performance web applications with a focus on backend technologies. Besides that, you will develop and maintain complex data pipelines (ETL) that power YouGov's core products. If you thrive in an environment that values collaboration and customer happiness above all, you'll find yourself at home on our team. Key Responsibilities … applications using Python, Flask, Cloud technologies, relational and NoSQL databases Design and implement RESTful APIs and integrate with third-party services Design, implement and maintain high-availability data pipelines (ETL) Break down complex problems and make informed decisions based on thorough analysis to create realistic estimates for engineering tasks Optimize applications for performance, reliability, security, maintainability, monitoring and scalability Define …
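The data pipelines this listing mentions follow the usual extract → transform → load pattern. A minimal, self-contained sketch in plain Python; the source records, field names, and validation rule are invented for illustration, and a real pipeline would read from APIs or queues and load into a warehouse:

```python
# Minimal extract -> transform -> load pipeline, kept in pure Python
# for illustration only.

def extract():
    # Stand-in for pulling raw records from a source system.
    return [
        {"id": 1, "country": "gb", "score": "7"},
        {"id": 2, "country": "us", "score": "9"},
        {"id": 3, "country": "gb", "score": None},  # bad record
    ]

def transform(records):
    # Normalise fields and drop records that fail validation.
    out = []
    for r in records:
        if r["score"] is None:
            continue
        out.append({"id": r["id"],
                    "country": r["country"].upper(),
                    "score": int(r["score"])})
    return out

def load(records, target):
    # Stand-in for writing to a database; here we append to a list.
    target.extend(records)
    return len(records)

warehouse = []
loaded = load(transform(extract()), warehouse)
print(loaded, warehouse[0])  # 2 {'id': 1, 'country': 'GB', 'score': 7}
```

Keeping the three stages as separate functions is what makes such pipelines testable and easy to re-run on failure.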
the organisation Shape and deliver scalable data architecture that improves how data is shared and stored, aligned to the organisation’s goals Guide database and data integration work, using ETL tools and database systems to ensure quality and ease of access Support and guide the data architecture team, encouraging collaboration, learning and shared success Your skills and experiences: Essential Bachelor … entity-relationship (ER) diagrams and data normalisation techniques Desirable Knowledge of database technologies including RDBMS (e.g. Oracle, SQL Server), NoSQL (e.g. MongoDB, Cassandra), and data warehouse solutions Understanding of ETL processes, including data extraction, transformation, and loading Familiarity with cloud platforms such as AWS, Azure, and Google Cloud Platform TOGAF (The Open Group Architecture Framework), CDMP (Certified Data Management Professional …
Supply Chain Command Center (SC3) and Warehouse Automation and Optimization (WAO) • Work hands-on with new Services, including AI/Machine Learning, Serverless/Lambda IoT, Analytics (Map Reduce, ETL), Data Warehouse, BI and Security Services to build products and solutions with our customers • Structure and guide our Customers through the transformation journey, collaborating widely with business stakeholders to understand … AI/ML or Data Science solutions - Hands-on experience leading large-scale global data warehousing and analytics projects. - Experience designing and coding in Python or Scala or building ETL or data ingestion pipelines Amazon is an equal opportunity employer. We are convinced that a diverse workforce is essential to our success. We take …
platform and next generation Business Rules platform using the latest innovative Ab Initio technologies. The candidate is required to possess relevant design and development experience in the tools and technologies of ETL programming. The person should be a strong team player. Exposure to Finance or Risk functions on Retail Banking products or Wholesale/Investment Banking is preferred. This is a significant …
Wolverhampton, West Midlands, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
You would be coming into a team of 10 Data Engineers as a Senior Data Engineer, and be responsible for maintaining the SQL Server, using T-SQL and building ETL pipelines. The ideal candidate will have experience working in the financial sector and working with end-to-end processes. Role and Responsibilities: You will mentor junior members in the team … You will build ETL in SQL Server You will use SSIS and SSRS Skills and Experience: Essential to have experience with: Expert level SQL and T-SQL ETL pipelines SSRS Stakeholder skills Interview Process Interview with Data Engineering Manager Interview with Data Engineering Manager and Senior Data Engineer on the team, including T-SQL tech test and soft skills …
Leicester, Leicestershire, England, United Kingdom
CPS Group
with senior stakeholders, however, we do require someone who has come from a technical data warehouse testing background. You'll lead testing efforts across data warehouse solutions, driving quality across ETL processes, data pipelines, and system integrations.

What You'll Do:
* Define and execute testing strategies for data accuracy, quality, and performance
* Lead a high-performing testing team and collaborate across Agile … teams
* Implement automated testing frameworks integrated into CI/CD pipelines
* Ensure data validation, consistency, and completeness across complex systems

What You'll Bring:
* Background within data warehouse testing (ETL, pipelines, cloud-native platforms like Snowflake)
* Experience building and maintaining test harnesses using tools like dbt (Data Build Tool) for automating data validation across ETL processes and ensuring test integration with …
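The automated data-validation checks this role describes (the kind that dbt tests codify declaratively) reduce to simple assertions over a dataset: not-null, uniqueness, accepted values. A hedged sketch in plain Python; the `orders` table and its columns are invented for illustration:

```python
# Simple data-quality checks of the kind a warehouse test harness
# (e.g. dbt tests) automates: each returns the failing values, so an
# empty result means the check passed.

def check_not_null(rows, column):
    return [r for r in rows if r.get(column) is None]

def check_unique(rows, column):
    seen, dupes = set(), []
    for r in rows:
        if r[column] in seen:
            dupes.append(r[column])
        seen.add(r[column])
    return dupes

def check_accepted_values(rows, column, allowed):
    return [r[column] for r in rows if r[column] not in allowed]

orders = [
    {"order_id": 1, "status": "shipped"},
    {"order_id": 2, "status": "pending"},
    {"order_id": 2, "status": "unknown"},  # duplicate id, bad status
]

failures = {
    "null_ids": check_not_null(orders, "order_id"),
    "dupe_ids": check_unique(orders, "order_id"),
    "bad_status": check_accepted_values(orders, "status",
                                        {"shipped", "pending"}),
}
print(failures["dupe_ids"], failures["bad_status"])  # [2] ['unknown']
```

Wiring checks like these into a CI/CD pipeline so that a failing check blocks deployment is the "test integration" the listing refers to.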
Design, develop, and maintain scalable data pipelines using SPARQL and Python. Create and optimize complex SPARQL queries for data retrieval and analysis. Develop graph-based applications and models to extract insights and solve real-world problems. Collaborate with data scientists and analysts to translate data requirements into effective pipelines and models. Ensure data quality and integrity throughout the ETL process.
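Graph querying of the kind SPARQL performs is pattern matching over (subject, predicate, object) triples. A toy in-memory illustration of that idea, not a real SPARQL engine; the triples and names are invented:

```python
# Toy triple store illustrating the (subject, predicate, object)
# pattern matching that underlies SPARQL; a variable is any string
# starting with '?'.

triples = {
    ("alice", "knows", "bob"),
    ("bob", "knows", "carol"),
    ("alice", "worksAt", "acme"),
}

def match(pattern, store):
    """Return variable bindings for a single triple pattern."""
    results = []
    for triple in store:
        binding = {}
        for p, t in zip(pattern, triple):
            if p.startswith("?"):
                binding[p] = t       # bind the variable to this term
            elif p != t:
                break                # constant term does not match
        else:
            results.append(binding)
    return results

# Roughly: SELECT ?who WHERE { alice knows ?who }
bindings = match(("alice", "knows", "?who"), triples)
print(bindings)  # [{'?who': 'bob'}]
```

A real engine (e.g. via a library such as rdflib, or a triple store endpoint) adds joins across multiple patterns, filters, and indexing, but the core matching step looks like this.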
London, South East, England, United Kingdom Hybrid / WFH Options
Lorien
of Snowflake [and Redshift] based data warehousing solutions within an AWS environment Extensive experience in AWS services, e.g. EC2, S3, RDS, DynamoDB, Redshift, Lambda, API Gateway Develop and optimize ETL processes using AWS services (e.g. AWS Glue, Lambda) to ensure efficient data ingestion, transformation, storage, and cost optimization. Strong SQL skills for complex data queries and transformations ETL pipelining …
on the lookout for a skilled Data Engineer to join their Data & Analytics team on an initial 6-month contract. This role is perfect for someone with strong ETL expertise, deep experience in Google Cloud Platform (GCP), and a passion for building scalable, cloud-native data pipelines. You'll work with cutting-edge tech in a fast-paced environment … helping to deliver critical insights and analytics to the business. What You'll Be Doing: Designing and developing scalable ETL pipelines to process and deliver large volumes of data. Working hands-on with GCP services including BigQuery, Pub/Sub, and Dataflow. Automating infrastructure using Terraform, Ansible, and CI/CD tooling. Writing clean, efficient code in Python, Go … Collaborating with stakeholders to ensure data pipelines meet business needs and SLAs. What We're Looking For: Proven experience in data engineering with a strong focus on cloud-based ETL workflows. Solid background with Google Cloud Platform (GCP) and associated data tools. Skilled in Infrastructure as Code - Terraform and Ansible preferred. Confident working with CI/CD pipelines (Jenkins …
A leading Construction organisation is seeking a skilled SQL Server/SSIS Engineer to design, develop, and maintain robust ETL solutions within a dynamic data and reporting environment. This role is ideal for someone who thrives on problem-solving, ensuring data quality, and collaborating across teams to deliver seamless integrations. Key Responsibilities: Develop and maintain SSIS packages and T-SQL … procedures to support ETL processes. Ensure data accuracy and performance across systems. Monitor and troubleshoot daily data loads and SSIS jobs. Collaborate with analysts, architects, and engineers to align integration efforts. Act as deputy to the Integrations Team Lead when required. Provide expertise on best practices for SQL Server databases in BI/reporting contexts. What You'll Bring: Strong experience …
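Monitoring daily loads, as this role describes, usually starts with reconciliation checks between source and target, e.g. row counts and summed totals. A minimal sketch with Python's sqlite3 standing in for SQL Server; the `sales` table and its figures are invented for illustration:

```python
import sqlite3

# Two in-memory databases standing in for a source system and the
# warehouse an SSIS package loads into.
src = sqlite3.connect(":memory:")
tgt = sqlite3.connect(":memory:")

src.executescript("""
    CREATE TABLE sales (id INTEGER, amount REAL);
    INSERT INTO sales VALUES (1, 10.0), (2, 20.0), (3, 30.0);
""")
tgt.executescript("""
    CREATE TABLE sales (id INTEGER, amount REAL);
    INSERT INTO sales VALUES (1, 10.0), (2, 20.0);  -- one row missing
""")

def reconcile(src_conn, tgt_conn, table):
    # Compare row counts and summed amounts between source and target;
    # a real job would log discrepancies and alert rather than just
    # return them.
    q = f"SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM {table}"
    s_count, s_sum = src_conn.execute(q).fetchone()
    t_count, t_sum = tgt_conn.execute(q).fetchone()
    return {"missing_rows": s_count - t_count,
            "amount_diff": round(s_sum - t_sum, 2)}

report = reconcile(src, tgt, "sales")
print(report)  # {'missing_rows': 1, 'amount_diff': 30.0}
```

In a T-SQL/SSIS setting the same checks run as post-load stored procedures, with results written to an audit table the team monitors.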
City of London, London, United Kingdom Hybrid / WFH Options
Adecco
install, test, and maintain highly scalable data management systems. Ensure systems meet business requirements and adhere to industry best practices. Develop, implement, automate, and maintain large-scale enterprise data ETL processes. Build high-performance algorithms, prototypes, predictive models, and proofs of concept. Collaborate with data architects, modellers, and IT team members to achieve project goals. Execute plans and policies that … control, protect, and enhance the value of the organisation’s data assets. Communicate effectively with team members and stakeholders to ensure clarity and alignment. Qualifications Skills: Proficient in ETL processes and SQL. Experience in at least one programming language (Python preferred). Familiar with reporting tools and data visualisation. Strong understanding of security, trust, and safety in data management. Ability …
most well-funded and exciting InsurTech scale-ups in the UK. ROLE AND RESPONSIBILITIES Build and design robust data models, working end-to-end across modelling in DBT and ETL processes Develop and maintain scalable data pipelines using SQL and Snowflake Work closely with analysts, data scientists and product teams to ensure data is reliable and well-structured Own the … EXPERIENCE Required: 3+ years in an Analytics Engineering role Solid experience with DBT or similar modern modelling tools Advanced SQL skills and hands-on Snowflake experience Strong understanding of ETL concepts and building scalable pipelines Good communication skills, able to collaborate across functions and explain technical concepts clearly Insurance or motor industry experience is a plus, but not essential Interview …
who thrives in fast-paced, high-growth environments and is ready to take full ownership of the data function. Key Responsibilities: Build and maintain a robust data warehouse and ETL processes to support rapid growth and market expansion. Develop and own dashboards in Looker Studio; provide clear, actionable insights using SQL and related tools. Partner with key stakeholders (Head of … within e-commerce, gaming, or similarly fast-paced sectors, with advanced SQL skills, Python and proficiency in Looker Studio (or equivalent tools). Builder mindset: Hands-on experience designing ETL pipelines and data warehousing solutions from the ground up, with the structure and initiative to scale processes in a growing business. Commercially driven: Able to translate data into clear, actionable …