London, South East England, United Kingdom Hybrid / WFH Options
iO Associates
tools such as dbt and Airflow. Additional skills in … The successful candidate should have the following skills: Extensive hands-on experience with the Snowflake data platform. Proficiency in SQL and ETL/ELT processes. Strong programming skills in Python. Extended skills across AWS, Azure, dbt, Airflow, etc. Could this be of interest? If so, please get in touch with Alex at …
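The ELT pattern this role describes (load raw data first, then transform inside the warehouse with SQL, as dbt/Airflow orchestrate) can be sketched in pure Python with SQLite standing in for Snowflake. This is a minimal illustration of the pattern only; the table and column names are invented and none of the actual Snowflake, dbt, or Airflow APIs appear here.

```python
import sqlite3

# Minimal ELT sketch: load raw rows as-is, then transform with SQL inside
# the "warehouse" -- SQLite is a stand-in for Snowflake.
raw_orders = [
    ("2024-01-05", "widget", 3, 9.99),
    ("2024-01-05", "gadget", 1, 24.50),
    ("2024-01-06", "widget", 2, 9.99),
]

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE raw_orders (order_date TEXT, sku TEXT, qty INTEGER, unit_price REAL)"
)
conn.executemany("INSERT INTO raw_orders VALUES (?, ?, ?, ?)", raw_orders)

# Transform step: a derived table, the kind of model a dbt project would manage.
conn.execute("""
    CREATE TABLE daily_revenue AS
    SELECT order_date, SUM(qty * unit_price) AS revenue
    FROM raw_orders
    GROUP BY order_date
""")

for row in conn.execute("SELECT * FROM daily_revenue ORDER BY order_date"):
    print(row)
```

The key design point of ELT versus classic ETL is visible even at this scale: the raw table is preserved untouched, and the transformation is a SQL statement the warehouse executes, so it can be re-run or versioned independently of ingestion.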
… services. Design solutions for client problem sets based on their descriptions and by exploration of the data landscape. Assist in the design, development, and maintenance of data pipelines and ETL processes to build data and action models that address workflow needs. Build and edit operational workflows, including front ends and decision-support toolsets, inclusive of native Palantir tooling and …
Data Engineer, you will have: Strong experience with SQL (e.g., MS SQL, Oracle). Knowledge of NoSQL technologies (e.g., MongoDB, InfluxDB, Neo4j). Experience with data exchange and processing (ETL, ESB, APIs). Proficiency in Python or similar programming languages. Familiarity with big data frameworks (e.g., the Hadoop ecosystem). Desirable Skills: Understanding of NLP (Natural Language Processing) and OCR (Optical Character Recognition).
collaborative, forward-thinking environment that values creativity, accountability, and growth, where your contributions will directly shape the future of real estate technology. Role Responsibilities: Architect, build, and optimise scalable ETL pipelines for diverse datasets. Onboard, document, and curate external datasets for internal use. Perform data validation, forensic analysis, and troubleshooting. Deliver high-quality, maintainable Python code and participate in peer reviews.
/X3/other variant) or similar systems. Strong IT administration background - understanding of networks, permissions, system setups, and troubleshooting. Experience in data migration (Excel, CSV imports, SQL, or ETL tools). Ability to work with stakeholders across IT, Finance, and Operations. Strong analytical and problem-solving skills. Excellent communication and documentation abilities. Company: Global real estate organisation.
London (City of London), South East England, United Kingdom
Capgemini
… will work closely with data scientists, analysts, and engineers to build efficient data solutions and enable data-driven decision-making. Key Responsibilities: Develop, optimize, and maintain data pipelines and ETL processes using Apache Spark and Scala. Design scalable and robust data processing solutions for batch and real-time data. Collaborate with cross-functional teams to gather requirements and translate them …
London, South East England, United Kingdom Hybrid / WFH Options
Gravitas Recruitment Group (Global) Ltd
and the ability to engage stakeholders at all levels. Commitment to continuous improvement and eagerness to learn. Desirable Skills: Integrations (Stu-talk). Strong knowledge of Oracle and SQL databases and ETL processes. Knowledge of the Azure development environment (Azure Data Factory, Logic Apps). Ability to translate non-technical requirements into technical solutions. Why Apply? Flexibility: remote-first or hybrid roles available. Be …
Sunderland, Tyne and Wear, England, United Kingdom Hybrid / WFH Options
Reed
… cause analysis to prevent future occurrences. Required Skills & Qualifications: Proven experience in database administration, particularly with SQL databases. Familiarity with high-availability solutions and disaster recovery techniques. Experience with ETL tools and processes (e.g., Informatica, Talend, SSIS, Azure Data Factory). Proficiency in version control systems (GitHub, Bitbucket). Strong experience in writing, maintaining, and troubleshooting Transact-SQL (T-SQL).
Cambridgeshire, Newmarket, Suffolk, United Kingdom
Mackenzie Jones IT
… which is part of a much larger IT function. Role: Data movement and transformation processes between applications, services, and solutions. Azure Data Factory pipelines and SSRS reporting solutions - maintenance and optimization. ETL - design, implement, and manage ETL processes, ensuring accuracy, quality, and consistency. Monitor daily data loads and ETL workflows. AWS - support the migration of data services to AWS with scalable, cloud-first solutions. Deliver UK … Liaison with IT business systems, business teams, and third parties. Drive continuous improvement. Technical Skills Required: Azure Data Factory. AWS Data Services (Redshift, S3, Lambda) - desirable. SSRS - hands-on experience. ETL - design, data quality frameworks, and pipeline management. SQL - star schema data modelling. Data warehouse design, development, and testing. Data warehousing methodologies. User support - Power BI (desirable experience). Undertake and support IT governance processes … Being Programme/Cycle to Work Scheme/Electric Car Salary Sacrifice Scheme. Keywords: Data Engineer, Azure Data Factory, AWS, AWS Data Services, Redshift, S3, Lambda, Azure BI Technologies, ETL, Data Warehousing, Data Modelling & Kimball Methodology, Power BI, SQL, SSRS. Cambridge, Permanent, T6/MN/1292110.
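The "SQL - star schema" requirement above refers to the Kimball-style layout in which one central fact table holds measurements and foreign keys into small descriptive dimension tables. A minimal sketch in SQLite follows; every table, column, and value is illustrative, not taken from the role.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Star schema: a central fact table keyed into small dimension tables.
cur.executescript("""
    CREATE TABLE dim_date    (date_id INTEGER PRIMARY KEY, iso_date TEXT);
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE fact_sales  (date_id INTEGER, product_id INTEGER, amount REAL);
""")
cur.executemany("INSERT INTO dim_date VALUES (?, ?)",
                [(1, "2024-01-05"), (2, "2024-01-06")])
cur.executemany("INSERT INTO dim_product VALUES (?, ?)",
                [(10, "widget"), (11, "gadget")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(1, 10, 29.97), (1, 11, 24.5), (2, 10, 19.98)])

# A typical warehouse query: join the fact to its dimensions and aggregate.
query = """
    SELECT d.iso_date, p.name, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_date d    ON d.date_id = f.date_id
    JOIN dim_product p ON p.product_id = f.product_id
    GROUP BY d.iso_date, p.name
    ORDER BY d.iso_date, p.name
"""
for row in cur.execute(query):
    print(row)
```

The design choice the schema embodies: facts grow large but stay narrow (keys plus measures), while dimensions stay small and carry the human-readable attributes, which keeps aggregate queries simple joins like the one above.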
Rugby, Warwickshire, England, United Kingdom Hybrid / WFH Options
Gleeson Recruitment Group
data solutions. Support the evaluation, design, and implementation of data platforms and tools. Structure and prepare data for analytics, data mining, machine learning, and application use. Manage and monitor ETL processes, data migrations, and conversions. Ensure compliance with data governance, security, and privacy standards. Contribute to continuous improvement, automation, and best-practice adoption across DataOps functions. Skills & Experience: Solid experience … Server database development, with strong T-SQL skills and query performance optimisation. Demonstrated expertise in data modelling and data mapping, including conceptual, logical, and physical design. Strong understanding of ETL design and data warehousing principles. Proficiency working with data in multiple formats (JSON, XML, CSV, etc.). Experience with BI or reporting tools such as Power BI or Tableau.
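"Working with data in multiple formats" in practice often means flattening nested JSON into tabular CSV rows an ETL stage can load. A small stdlib-only sketch; the record shape and field names are invented for illustration.

```python
import csv
import io
import json

# Hypothetical nested JSON records, flattened into CSV rows.
records = json.loads("""
[
  {"id": 1, "customer": {"name": "Acme",   "city": "Rugby"}, "total": 120.0},
  {"id": 2, "customer": {"name": "Globex", "city": "Leeds"}, "total": 75.5}
]
""")

buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["id", "customer_name", "customer_city", "total"])
for rec in records:
    # Flatten the nested "customer" object into two top-level columns.
    writer.writerow([rec["id"], rec["customer"]["name"],
                     rec["customer"]["city"], rec["total"]])

print(buf.getvalue())
```

In a real pipeline the flattened rows would feed a bulk-load step; the mapping from nested paths to columns is exactly the "data mapping" the role mentions.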
… Be Doing: Data Architecture & Platform Design. Design and implement an enterprise data lake on Azure Data Lake Gen2, using a Bronze/Silver/Gold architecture. Build and maintain scalable ETL/ELT pipelines in Azure Data Factory to integrate data from core systems (AS400, Tagetik, CRM, Esker, Slimstock). Develop the overall data model, data dictionaries, and lineage documentation. Deliver … Gen2, Azure Data Factory, and Azure Active Directory. Advanced skills in data modelling (conceptual, logical, physical) and SQL for complex transformations. Proven ability to design and build high-performance ETL/ELT pipelines. Understanding of data governance, security, and access control frameworks. Knowledge of batch and real-time data integration, and experience with ODBC connectors or REST APIs. Familiarity with …
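The Bronze/Silver/Gold ("medallion") layering named above means: Bronze holds data exactly as ingested, Silver holds cleaned and typed records, and Gold holds business-level aggregates. A pure-Python sketch of the idea follows; in the actual stack these layers would be Data Lake Gen2 folders populated by Data Factory pipelines, and all the sample data here is invented.

```python
# Bronze layer: raw records exactly as ingested, strings and all.
bronze = [
    {"order_id": "1", "amount": "100.0", "region": " North "},
    {"order_id": "2", "amount": "50.5",  "region": "north"},
    {"order_id": "3", "amount": "bad",   "region": "South"},  # malformed row
]

def to_silver(rows):
    """Silver layer: clean and type Bronze rows; drop records that fail validation."""
    out = []
    for r in rows:
        try:
            out.append({
                "order_id": int(r["order_id"]),
                "amount": float(r["amount"]),
                "region": r["region"].strip().title(),
            })
        except ValueError:
            continue  # a real pipeline would quarantine these; dropped here
    return out

def to_gold(rows):
    """Gold layer: aggregate Silver rows into a business-facing summary."""
    totals = {}
    for r in rows:
        totals[r["region"]] = totals.get(r["region"], 0.0) + r["amount"]
    return totals

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # → {'North': 150.5}
```

Note how the two "North" spellings only merge after the Silver cleaning step, and the malformed row never reaches Gold; keeping the raw Bronze copy means both decisions can be revisited later without re-ingesting.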