Manchester, Lancashire, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
clients to gather requirements and deliver solutions. Be willing to engage and assist in pre-sales activities, bids, proposals, etc. Use key techniques such as Governance, Architecture, Data Modelling, ETL/ELT, Data Lakes, Data Warehousing, Master Data, and BI. Consistently utilise key processes in the engineering delivery cycle, including Agile and DevOps, Git, APIs, Containers, Microservices and Data Pipelines.
that impact the data warehouse. Ensure data accuracy, consistency, and integrity across warehouse and source systems. Maintain and evolve the data dictionary and associated metadata for the warehouse and ETL systems. Mentor and support team members to build a high-performing, resilient data function. Keep up to date with industry developments and maintain relevant technical expertise. Complete all mandatory training …
Modelling Design and implement scalable, high-performance data warehouse and data lake architectures. Develop conceptual, logical, and physical data models to support analytical requirements. Build and optimise data pipelines (ETL/ELT) using tools such as Azure Synapse, Snowflake, Redshift, or similar. Ensure robust data governance, security, and quality management practices. Support cloud data migrations and architecture modernisation initiatives. Front …
platform for data analytics, including design and deployment of infrastructure. Expertise in creating CI/CD pipelines. Experience in creating FTP (SFTP/FTPS) configurations. Experience in working with ETL/ELT workflows for data analytics. Degree in Computer Science, Mathematics or related subject. Highly desirable skills & exposure: Working collaboratively as part of an Agile development squad. Experience and knowledge …
plus. Experience in data modeling and warehouse architecture design. Experience with data orchestration and workflow management tools such as Airflow. Solid understanding of data ingestion, transformation, and ELT/ETL practices. Expertise in data transformation frameworks such as DBT. Experience with Infrastructure as Code (IaC) tools, Terraform preferred. Experience with version control systems, GitHub preferred. Experience with continuous integration and delivery …
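A minimal sketch of the orchestration pattern this listing names, assuming Apache Airflow 2.4+ chained into a dbt project; every DAG id, path, and command below is hypothetical rather than taken from the listing:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Hypothetical daily ELT DAG: ingest raw data, then run dbt transformations.
with DAG(
    dag_id="daily_elt",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # the `schedule` argument assumes Airflow 2.4+
    catchup=False,
) as dag:
    ingest = BashOperator(
        task_id="ingest_raw",
        bash_command="python /opt/pipelines/ingest.py --date {{ ds }}",
    )
    transform = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/analytics",
    )
    ingest >> transform  # dbt runs only after ingestion succeeds
```

The `ingest >> transform` dependency is the core of what an orchestrator adds over cron: per-task ordering, retries, and backfills rather than one monolithic script.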
Aladdin by BlackRock, Yardi Investment Platform, Temenos Multifonds, or portfolio management tools. Understanding of fund structures, regulatory reporting requirements (e.g., UCITS, AIFMD), and investment operations. Experience with data migration, ETL, or systems integration. Familiarity with project delivery methodologies (e.g., Agile, Waterfall, Hybrid). Experience working in the Banking, Insurance, Asset Management, Healthcare, Life Science & Public sectors preferred. Life at Grant …
Business Intelligence solutions using Alteryx for data preparation, SQL for data transformation and storage, Tableau for data visualization, and R/Python for advanced analytics. •Design, implement, and maintain ETL pipelines using Alteryx to extract data from diverse source systems (e.g., relational databases, APIs, flat files), transform it according to defined business rules and data quality standards, and load it … or Master's degree in Computer Science, Statistics, Mathematics, Engineering, or a related quantitative field. •Experience: 1+ years of hands-on experience in Business Intelligence and Digital Transformation projects. •ETL Expertise: Proficiency in Alteryx Designer, including the ability to design and implement complex ETL workflows and optimize performance. •Data Visualization Proficiency: Proficiency in Tableau Desktop, including the ability to create …
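Alteryx workflows are assembled visually rather than in code, but the extract-transform-load pattern this listing describes can be sketched in Python with pandas; the file names, API endpoint, and business rules below are purely illustrative:

```python
import pandas as pd

# Extract: a flat file and a (hypothetical) REST API as source systems.
orders = pd.read_csv("orders.csv", parse_dates=["order_date"])
rates = pd.read_json("https://example.com/api/fx_rates.json")

# Transform: apply a data quality standard (no missing customer ids)
# and a business rule (restate amounts in GBP).
orders = orders.dropna(subset=["customer_id"])
orders = orders.merge(rates, on="currency", how="left")
orders["amount_gbp"] = orders["amount"] * orders["gbp_rate"]

# Load: land the conformed table in a warehouse staging area.
orders.to_parquet("warehouse/staging/orders.parquet", index=False)
```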
technical concepts into clear, actionable insights. ● Collaborate cross-functionally to align data strategy with business objectives. What we are looking for: ● 2-3+ years' experience in Data Engineering, ETL Development, or Database Administration. ● Prior experience working with business intelligence, analytics, or machine learning teams. ● Experience in cloud-native data solutions and real-time data processing. ● Proficiency in Databricks, Python … SQL (for ETL & data transformation). ● Knowledge of GDPR, data security best practices, and access control policies. ● Strong problem-solving and analytical skills to optimise data processes. ● Excellent collaboration and communication skills to work with cross-functional teams. ● Ability to translate business requirements into technical solutions. ● Strong attention to detail and commitment to data quality.
Required Skills: Proficiency in BI tools (Power BI, Tableau, QlikView) and data visualization techniques. Strong SQL skills and experience with relational databases (e.g., SQL Server, Oracle, MySQL). Familiarity with ETL processes and data warehousing concepts. Experience in data modeling and designing effective reporting structures. Knowledge of programming languages such as Python or R for data manipulation and analysis. Strong analytical …
of actuaries, data scientists and developers. Our role in this mission is to pioneer advancements in the field of pensions and beyond, leveraging state-of-the-art technology to extract valuable and timely insights from data. This enables the consultant to better advise Trustees and Corporate clients on a wide range of actuarial-related areas. The Role As a Machine … model drift, data-quality alerts, scheduled retraining pipelines. Data Management and Preprocessing. Collect, clean and preprocess large datasets to facilitate analysis and model training. Implement data pipelines and ETL processes to ensure data availability and quality. Software Development. Write clean, efficient and scalable code in Python. Utilize CI/CD practices for version control, testing and code review. Work …
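As one hedged illustration of the "collect, clean and preprocess" responsibility, a scikit-learn pipeline keeps those steps reproducible and versionable; the column names and source file here are invented, since the listing gives no schema:

```python
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.impute import SimpleImputer
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

# Hypothetical pensions-style schema, for illustration only.
numeric = ["salary", "years_of_service"]
categorical = ["scheme_type"]

preprocess = ColumnTransformer([
    ("num", Pipeline([
        ("impute", SimpleImputer(strategy="median")),  # clean: fill gaps
        ("scale", StandardScaler()),                   # preprocess: normalise
    ]), numeric),
    ("cat", OneHotEncoder(handle_unknown="ignore"), categorical),
])

df = pd.read_csv("members.csv")           # collect (illustrative file)
features = preprocess.fit_transform(df)   # one reusable transform for training
```

Wrapping the steps in a single fitted object means the same transformation can be reapplied at inference time and inside a scheduled retraining pipeline.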
City of London, London, United Kingdom Hybrid / WFH Options
Datatech Analytics
SQL, Spark SQL, and Python for data processing and automation. Knowledge of Microsoft Fabric and Azure Data Factory would be useful but not essential. Power BI. Solid understanding of ETL/ELT workflows, data modelling, and structuring datasets for analytics. Experience working with large, complex datasets and APIs across formats. Familiarity with workflow automation tools (e.g., Power Automate) and/…
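A small sketch of the Spark SQL side of such a role, structuring a raw dataset for analytics; the table and path names are hypothetical, not from the listing:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("analytics_prep").getOrCreate()

# Hypothetical raw event data registered as a temporary view.
spark.read.parquet("raw/events").createOrReplaceTempView("events")

# Shape the data into an analytics-friendly daily aggregate.
daily = spark.sql("""
    SELECT CAST(event_time AS DATE) AS event_date,
           event_type,
           COUNT(*)                 AS event_count
    FROM events
    GROUP BY CAST(event_time AS DATE), event_type
""")

daily.write.mode("overwrite").parquet("curated/daily_event_counts")
```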
Guildford, Surrey, United Kingdom Hybrid / WFH Options
Actica Consulting Limited
discuss how your expertise could strengthen our growing data practice. As a data engineer/scientist, you will: Data Engineering Focus: Design, implement, and maintain scalable data pipelines and ETL processes. Develop and maintain data warehouses and data lakes. Implement data quality monitoring and validation systems. Create and maintain data documentation and cataloguing systems. Optimize data storage and retrieval systems
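As a minimal sketch of what a "data quality monitoring and validation system" can start as; the rules, column names, and file path below are assumptions, not Actica's actual stack:

```python
import pandas as pd

def validate(df: pd.DataFrame) -> list:
    """Return a list of failed data-quality rules (rules are illustrative)."""
    failures = []
    if df["id"].duplicated().any():
        failures.append("duplicate primary keys")
    if (df["amount"] < 0).any():
        failures.append("negative amounts")
    if df["created_at"].isna().any():
        failures.append("missing timestamps")
    return failures

batch = pd.read_parquet("landing/batch.parquet")  # hypothetical landing file
issues = validate(batch)
if issues:
    raise ValueError(f"Data quality checks failed: {issues}")  # block the load
```

Teams often graduate from hand-rolled checks like these to a framework such as Great Expectations or dbt tests, but the pattern stays the same: validate before loading, and fail loudly.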
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Actica Consulting
agenda, we should discuss how your expertise could strengthen our growing data practice. As a data engineer/scientist, you will: Design, implement, and maintain scalable data pipelines and ETL processes. Develop and maintain data warehouses and data lakes. Implement data quality monitoring and validation systems. Create and maintain data documentation and cataloguing systems. Optimize data storage and retrieval systems
East Horsley, Surrey, United Kingdom Hybrid / WFH Options
Actica Consulting
agenda, we should discuss how your expertise could strengthen our growing data practice. As a data engineer/scientist, you will: Design, implement, and maintain scalable data pipelines and ETL processes. Develop and maintain data warehouses and data lakes. Implement data quality monitoring and validation systems. Create and maintain data documentation and cataloguing systems. Optimize data storage and retrieval systems
BI reporting. * Conveying technical concepts to non-technical stakeholders across the business. Senior Data Engineer - What will you need? * Extensive Data Warehouse Experience. * Proficiency in Azure: Azure Data Factory (ETL), Azure SQL Server, Azure DevOps, Key Vaults and Logic Apps. * Strong Data Modeling Skills. * Familiarity with CI/CD pipelines and agile workflows. * Experience integrating data from diverse sources, ensuring data accuracy and consistency throughout the ETL process. * Ambition to reduce processing times and improve efficiency. Robert Walters are proud to specialise in BI & Data recruitment across the UK, offering amazing opportunities on a permanent and interim basis. We are also proud to be heading up the Data Leaders roundtables and PBI BRUM meetup group, bringing new and exciting networking …
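One common pattern behind "reducing processing times" in an Azure SQL / Data Factory stack is watermark-based incremental loading; the sketch below assumes pyodbc and invents the DSN and table names for illustration:

```python
import pyodbc  # assumes an installed SQL Server ODBC driver

conn = pyodbc.connect("DSN=AzureSql")  # hypothetical connection
cur = conn.cursor()

# 1. Read the watermark: when did the last successful load finish?
cur.execute("SELECT last_loaded FROM etl.watermark WHERE table_name = ?", "orders")
last_loaded = cur.fetchone()[0]

# 2. Extract only rows changed since then, instead of a full scan.
cur.execute("SELECT * FROM src.orders WHERE modified_at > ?", last_loaded)
changed_rows = cur.fetchall()

# ... upsert changed_rows into the warehouse staging table here ...

# 3. Advance the watermark only after the load commits.
cur.execute(
    "UPDATE etl.watermark SET last_loaded = SYSUTCDATETIME() WHERE table_name = ?",
    "orders",
)
conn.commit()
```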
Manchester, Lancashire, England, United Kingdom Hybrid / WFH Options
Noir
Data Engineer - Leading Pharmaceutical Company - Manchester (Tech Stack: Data Engineer, Databricks, Python, Power BI, Azure, TSQL, ETL, Agile Methodologies) About the Role: We are seeking a talented and experienced Data Engineer on behalf of our client, a leading Software House. This is a fully remote position, offering the opportunity to work with cutting-edge technologies and contribute to exciting projects … an experienced Data Engineer to join their team in Manchester. This hybrid position involves working within the pharmaceutical industry, focusing on the design, development, and maintenance of data pipelines, ETL processes, and databases. The role is ideal for someone passionate about improving processes, ensuring data quality, and maintaining compliance with regulatory standards. … If you are passionate about driving continuous improvement and ensuring data quality and compliance, we want to hear from you. Key Responsibilities: Design, develop, maintain, and optimise data pipelines, ETL processes, and databases. Drive continuous improvement by refining processes, products, and identifying new tools, standards, and practices. Collaborate with teams across the business to define solutions, requirements, and testing …
Gaming, and TransferGo. Our products are recognised by industry analysts in reports such as Gartner's Magic Quadrant, the Forrester Wave, and the Frost Radar. Our tech stack: Superset and similar data visualisation tools. ETL tools: Airflow, DBT, Airbyte, Flink, etc. Data warehousing and storage solutions: ClickHouse, Trino, S3. AWS Cloud, Kubernetes, Helm. Relevant programming languages for data engineering tasks: SQL, Python, Java, etc. What you will be doing: Designing and developing scalable and efficient data pipelines, ETL processes, and data integration solutions to support data ingestion, processing, and storage needs. Ensuring data quality and reliability by implementing data validation, data cleansing, and data quality monitoring processes. Optimising database performance by tuning queries, implementing indexing strategies, and monitoring and analysing system performance metrics. Collaborating with … stakeholders to define data strategies, implement data governance policies, and ensure data security and compliance. About you: Strong technical proficiency in data engineering technologies, such as Apache Airflow, ClickHouse, ETL tools, and SQL databases. Deep understanding of data modeling, ETL processes, data integration, and data warehousing concepts. Proficiency in programming languages commonly used in data engineering, such as Python, Java …
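As a hedged sketch of the query-tuning point: in ClickHouse the MergeTree ORDER BY key plays the role an index does elsewhere, so choosing it to match common filters is usually the first optimisation. The names below are illustrative and assume the third-party clickhouse-driver package:

```python
from clickhouse_driver import Client  # third-party package: clickhouse-driver

client = Client(host="localhost")  # hypothetical connection

# Sorting by the columns dashboards filter on lets MergeTree skip whole
# granules instead of scanning the full table.
client.execute("""
    CREATE TABLE IF NOT EXISTS events (
        event_date Date,
        user_id    UInt64,
        event_type LowCardinality(String)
    ) ENGINE = MergeTree
    ORDER BY (event_date, user_id)
""")

# A typical dashboard query: pruned by the leading ORDER BY column.
rows = client.execute(
    "SELECT event_type, count() FROM events "
    "WHERE event_date = yesterday() GROUP BY event_type"
)
```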