London (City of London), South East England, United Kingdom
Qurated
Skills & Competencies: Advanced Salesforce DevOps (2GP, CI/CD, modular architecture). Deep expertise in Salesforce data architecture, Flows, Apex, Lightning Web Components (LWC), OmniStudio. Strong integration knowledge (APIs, ETL, asynchronous/event-driven patterns). Proficiency in Salesforce design patterns and scalable component-based development. UI/UX development aligned with accessibility and responsive design standards. Experience with AppExchange …
… to detail, with a commitment to delivering high-quality work. Self-starter with the ability to work independently in a fast-paced environment with minimal supervision. Desirable: Experience using ETL tools. Experience loading data en masse. Experience with Master Data Management (MDM). Familiarity with SAP ERP. A degree relevant to this field. What we can offer you! 25 days …
Warrington, Cheshire, England, United Kingdom Hybrid / WFH Options
Brookson
… a minimum requirement of 2 days in the office and the flexibility to work from home the rest of the week. What will you be doing as Data Engineer: ETL Maintenance: Collaborating with the Senior Data Engineer to ensure that the ETL process between source systems and the Data Warehouse remains fully operational, with minimal downtime and blockages. Implementing new … What are the qualities that can help you thrive as a Data Engineer? Essential Experience and Qualifications: Strong SQL skills - Tables, Stored Procedures, performance tuning etc. Experience with Azure ETL tools - Azure Data Factory/Synapse. Strong experience in Data movement methodologies and standards, ELT & ETL. A self-motivated, enthusiastic problem solver, with the ability to work under pressure and …
… in IT or an equivalent technical subject. Experience Required: Required - Minimum 5 years' experience working in a SQL Developer role or similar title. Required - Minimum 5 years working on ETL Data Migrations. Required - Minimum 3 years working in Azure SQL environments in a SQL Development capacity. Highly Desirable - Experience of Insolvency or Financial Services sectors. Technical Competencies Required: Advanced T-SQL, MS SQL Server, ETL and Migration Development, SSIS, Azure Data Factory, Databricks, Microsoft Fabric, Advanced Microsoft Excel. Highly Desirable: Azure DevOps, Git source control, Jira, Confluence, MySQL. Desirable: SSRS & Power BI, SSAS, SQL utilities such as Redgate. Other development experience includes C#, VB .NET, PHP, web applications development. Be able to demonstrate the following through relevant work experience: Organisation …
London (City of London), South East England, United Kingdom
Harrington Starr
… developing, and optimising modern data platforms on Azure Cloud — with a focus on an exciting Microsoft Fabric implementation. What you’ll do: Design and build robust data pipelines and ETL processes using Python and SQL. Contribute to the rollout and optimisation of Microsoft Fabric, shaping the next-generation data platform. Work across Azure services (Data Factory, Synapse, Databricks, Data Lake … Factory, Synapse, Databricks, etc.). Exposure to or keen interest in Microsoft Fabric and its integration into enterprise data strategies. Solid understanding of data architecture, pipelines, and cloud-based ETL/ELT frameworks. Familiarity with version control (Git) and CI/CD tools. Strong problem-solving skills and the ability to work in an agile, collaborative environment. Why join us …
Atherstone, Warwickshire, West Midlands, United Kingdom Hybrid / WFH Options
Aldi Stores
… industry-level best practices. Reporting to the Platform and Engineering Manager, the candidate will be required to design and manage data warehousing solutions, including the development of data models, ETL processes and data integration pipelines for efficient data consolidation, storage and retrieval, providing technical guidance and upskilling for the team, and conducting monitoring and optimisation activities. If you're a hardworking … winning employer, apply to join #TeamAldi today! Your New Role: Project Management of demands and initiatives. Lead the design and implementation of data warehousing. Design and develop data models, ETL processes and data integration pipelines. Complete Data Engineering end-to-end ownership of demand delivery. Provide technical guidance for team members. Providing 2nd or 3rd level technical support. About You …
Birmingham, England, United Kingdom Hybrid / WFH Options
Chapman Tate Associates
… single source of truth. You’ll: Design and build the enterprise data lake using Azure Data Lake Gen2 with a Bronze/Silver/Gold layer model. Develop robust ETL/ELT pipelines in Azure Data Factory to integrate data from key systems (AS400, Tagetik, CRM, Esker, Slimstock). Shape the near-term batch data integration strategy and prepare for … with Microsoft Azure Data Lake Gen2, Azure Data Factory, and Azure Active Directory. Expert-level SQL and data modelling skills (conceptual, logical, physical). Proven ability to build scalable ETL/ELT pipelines and implement strong data governance and security controls. Experience with batch and real-time integrations, ODBC connectors, and REST APIs. Exposure to Databricks or Microsoft Fabric (desirable) …
Develop clean, responsive front-end interfaces using frameworks like Vue.js or React, to present complex datasets and user workflows. • Collaborate with data scientists and engineers to integrate ML models, ETL pipelines, and cloud-based data storage solutions. • Optimise system performance and reliability for high-volume data operations across distributed environments. • Implement secure, compliant and scalable data access layers, ensuring compliance … • Experience working with containerised applications (e.g. Docker, Swarm or Kubernetes) in a Linux-based environment. • Solid understanding of RESTful API design, microservices architectures, and asynchronous workflows. • Familiarity with ETL processes, data warehousing and distributed systems. Not necessary for you to apply, but would be great if you also have: • Experience with data visualisation tools such as Apache Superset, Jupyter or …
London (City of London), South East England, United Kingdom
Puma Investments
… existing and new advanced Power BI dashboards and reports. Assume ownership of data within Microsoft Dynamics 365, including cleansing, validation, and reconciliation. Design scalable data models and implement robust ETL processes. Develop and maintain APIs and data interfaces to enhance system connectivity. Ownership of the Data team ticketing system, to track and resolve issues efficiently. Collaborate with technical and non … sales modules). Demonstrable full-stack development skills in SQL, Turbo C and Python, with proven experience in API development and data integration. Experience designing scalable data architectures and ETL pipelines. Proven ability to manage multiple priorities and deliver projects successfully. Exceptional communication skills, with the ability to translate complex technical concepts for non-technical audiences. Demonstrable understanding of data …
Job Summary: We are looking for a highly skilled Senior Data Engineering Developer with strong hands-on experience in Azure Data Factory, SQL development, and ETL tools. The ideal candidate will design and implement data pipelines, optimize performance, and collaborate with stakeholders in an Agile environment to deliver robust data solutions. Key Responsibilities: Design and Develop Pipelines: Build and … complex stored procedures, functions, and queries for high-performance data processing. Agile Delivery: Work in an Agile model, participate in sprint planning, and ensure timely delivery of data solutions. ETL Expertise: Utilize prior experience with ETL tools such as SSIS, Informatica, or DataStage for data integration tasks. Optional Skills: Exposure to SAP BODS is a plus. Technical Solutioning: Provide technical … Cloud & Data Warehouse Concepts: Apply knowledge of cloud architecture and data warehouse principles to design scalable solutions. Required Skills & Qualifications: 8–10 years of experience in data engineering and ETL development. Strong hands-on experience with Azure Data Factory and CI/CD implementation. Advanced proficiency in SQL, including performance tuning and optimization. Experience with ETL tools (SSIS …
… process. Strong specialist experience across several of the following areas: Cloud services (SaaS, PaaS) (AWS preferred); Enterprise integration patterns and tooling (MuleSoft preferred); Enterprise data, analytics and information management, ETL knowledge; High volume transactional online systems; Security and identity management; Service and micro-service architecture; Knowledge of continuous integration tools and techniques; Design and/or development of applications using …
London (City of London), South East England, United Kingdom
TechYard
… to specialise in one of the fastest-growing technology markets and collaborate with some of the most experienced leaders in the field. Responsibilities: Develop and maintain data pipelines and ETL processes. Analyse client challenges and design tailored solutions. Collaborate on implementing AI and machine learning models. Serve as a point of contact, providing consultative guidance and support. Requirements: Proven programming …
London (City of London), South East England, United Kingdom
Bonhill Partners
… the heart of a fast-paced trading environment, blending operational expertise with technical precision to ensure seamless trading execution and reliable market data delivery. Key Responsibilities: Oversee and validate ETL pipelines to ensure accurate and timely pricing data using Python and SQL. Bring trading systems online and provide Tier 1 and Tier 2 operational support across trading sessions. Streamline, automate …