We are looking for a Lead Data Solutions Architect to work within a dynamic, remote-first data architecture capability, delivering cloud-based data solutions using best-in-class RDBMS, ETL/ELT, and cloud platforms for blue-chip customers across a range of sectors. You will lead cross-functional teams of Data Engineers, Architects, Business Analysts and Quality Assurance Analysts … solid understanding of big data technologies such as Apache Spark, and knowledge of Hadoop ecosystems. Knowledge of programming languages such as Python, R, or Java is beneficial. Exposure to ETL/ELT processes, SQL, and NoSQL databases is a nice-to-have, providing a well-rounded background. Experience with data visualization tools and DevOps principles/tools is advantageous. Familiarity with …
Deep technical knowledge of database development, design and migration Experience of deployment in cloud using Terraform or CloudFormation Automation or scripting experience using languages such as Python and Bash ETL and workflow management knowledge Experience of Agile methodologies Experience in the Financial Services sector Data Engineering or Data Science experience Job responsibilities Interface with client project sponsors to gather, assess … and principles from the AWS Well-Architected Framework Assess, document and translate goals, objectives, problem statements, etc. to offshore/onshore management teams Advise on database performance, altering the ETL process, providing SQL transformations, discussing API integration, and deriving business and technical KPIs Guide the transition of solutions into the hands of the client, providing documentation to operate and maintain …
City of London, London, United Kingdom Hybrid / WFH Options
Focused Futures Consultancy LTD
technologies (Azure, AWS, GCP) and tools like Databricks, Snowflake, Synapse. Shape cloud migration and modernization strategies with a strong focus on DevOps practices. Architect scalable data models and robust ETL/ELT pipelines using industry-standard frameworks. Implement data governance and quality frameworks to ensure data integrity and compliance. Collaborate with clients’ senior leadership to influence data-driven transformation initiatives. …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
The business is investing heavily in its platform over the next 12 months, offering significant career growth for curious and ambitious talent. Key Responsibilities Maintain and improve Python-based ETL pipelines (~60 currently in production) Support the migration from on-premise to cloud-based SQL environments Keep dashboards and reporting chartbooks (Power BI & PDFs) up to date Implement basic data …
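As an illustration of the kind of Python-based ETL pipeline this role maintains, here is a minimal, self-contained sketch; the table name, field names, and in-memory SQLite target are hypothetical stand-ins for the production SQL environment, not details from the listing.

```python
import sqlite3

def extract(rows):
    """Extract: in practice this would read from an on-premise SQL source."""
    return list(rows)

def transform(rows):
    """Transform: tidy names, coerce types, and drop incomplete records."""
    return [
        {"name": r["name"].strip().title(), "value": float(r["value"])}
        for r in rows
        if r.get("name") and r.get("value") is not None
    ]

def load(rows, conn):
    """Load: write transformed rows into the SQL target (SQLite here)."""
    conn.execute("CREATE TABLE IF NOT EXISTS metrics (name TEXT, value REAL)")
    conn.executemany(
        "INSERT INTO metrics (name, value) VALUES (:name, :value)", rows
    )
    conn.commit()

if __name__ == "__main__":
    source = [{"name": " alice ", "value": "3.5"}, {"name": "", "value": "1"}]
    conn = sqlite3.connect(":memory:")
    load(transform(extract(source)), conn)
    print(conn.execute("SELECT name, value FROM metrics").fetchall())
```

Migrating such a pipeline from on-premise to a cloud SQL environment would typically mean swapping only the `load` connection, which is why keeping the three stages separate pays off.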
and DevOps practices related to database development. Desirable Skills: Experience working with legal systems and understanding the data requirements within the legal industry. Knowledge of data warehousing concepts and ETL processes. Experience with automation and scripting (e.g., PowerShell) for database management tasks. Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent work experience). …
Fabric or expertise in related technologies such as Power BI, Azure Synapse, Data Factory, Azure Data Lake, Notebooks etc. A solid understanding of data engineering principles, including data modelling, ETL/ELT processes, and data warehousing. Strong communication and stakeholder management skills, with the ability to explain complex technical concepts to non-technical audiences. A proactive, client-focused mindset with …
experience Demonstrated knowledge of software applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile, etc.) Strong knowledge of Python and SQL Understanding of ETL best practices, data partitioning and schema evolution Experience with data modelling and working with large-scale datasets and a solid understanding of data lake architecture and data warehousing Preferred qualifications …
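A small sketch of date-based data partitioning, one of the ETL practices this listing asks about. The `dt=YYYY-MM-DD` key mirrors a common data lake directory convention; the record fields are invented for illustration.

```python
from collections import defaultdict
from datetime import date

def partition_by_day(records):
    """Group records into date-keyed partitions, mirroring a
    dt=YYYY-MM-DD directory layout in a data lake."""
    parts = defaultdict(list)
    for rec in records:
        key = rec["ts"].strftime("dt=%Y-%m-%d")
        parts[key].append(rec)
    return dict(parts)

records = [
    {"ts": date(2024, 5, 1), "user": "a"},
    {"ts": date(2024, 5, 1), "user": "b"},
    {"ts": date(2024, 5, 2), "user": "c"},
]
partitions = partition_by_day(records)
```

Partitioning this way lets downstream queries prune to the days they need instead of scanning the whole dataset, which is the core motivation behind the practice.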
London, South East, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
Fabric or strong expertise in related technologies such as Power BI, Azure Synapse, Data Factory, Azure Data Lake etc. A solid understanding of data engineering principles, including data modelling, ETL/ELT processes, and data warehousing. Hands-on experience with Power BI and DAX, and ideally some exposure to Notebooks, Pipelines, or Lakehouses within Fabric. Strong communication and stakeholder management …
tools such as GA4 and Adobe Analytics to track the right data Data engineering: For smaller clients: centralise and clean marketing data using proprietary tools For larger clients: manage ETL processes and build scalable, clean tables Lay strong data foundations to enable analytics, reporting and modelling Consulting & insight activation: Translate analysis into actionable guidance for media, CRO, CRM and creative …
Lambda, CloudWatch) is a significant plus. Excellent SQL skills and confident Python programming. Knowledge of Kotlin and Golang, and the ability to work with unfamiliar codebases. Experience building robust ETL/ELT pipelines. Understanding of data warehouse (DWH) design principles and data normalization. Ability to work under uncertainty, prioritize tasks, and propose practical solutions. Skill in reading documentation …
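For illustration, a minimal sketch of the data normalization idea mentioned above: splitting a denormalised order feed into separate customer and order tables, deduplicating customers by id. All field and table names here are hypothetical.

```python
def normalise(rows):
    """Split a denormalised order feed into customer and order tables
    (third-normal-form style), deduplicating customers by id."""
    customers, orders = {}, []
    for r in rows:
        # Each customer appears once, however many orders they have.
        customers[r["customer_id"]] = {
            "id": r["customer_id"],
            "name": r["customer_name"],
        }
        orders.append({
            "order_id": r["order_id"],
            "customer_id": r["customer_id"],
            "amount": r["amount"],
        })
    return list(customers.values()), orders

feed = [
    {"order_id": 1, "customer_id": 10, "customer_name": "Acme", "amount": 50},
    {"order_id": 2, "customer_id": 10, "customer_name": "Acme", "amount": 75},
    {"order_id": 3, "customer_id": 11, "customer_name": "Beta", "amount": 20},
]
customer_table, order_table = normalise(feed)
```

The pay-off is the usual DWH trade-off: customer attributes are stored once and updated in one place, at the cost of a join when reading.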
minute delivery service of Amazon BASIC QUALIFICATIONS - Bachelor's degree or equivalent - Experience defining requirements and using data and metrics to draw business insights - Experience with SQL or ETL - 2+ years of Excel or Tableau (data manipulation, macros, charts and pivot tables) experience - Knowledge of Microsoft Excel at an advanced level, including: pivot tables, macros, index/match, vlookup, VBA …
Cardiff, Wales, United Kingdom Hybrid / WFH Options
SCG Connected
Relevant Microsoft certifications in Dynamics 365 and/or Power Platform. Experience working in agile delivery environments, particularly at scale. Knowledge of SSRS, FetchXML, SSIS, or other reporting and ETL tools. Experience with Power Pages (Portals) and customer-facing apps. Familiarity with data compliance standards (e.g., GDPR) and security models within Dynamics. Soft Skills & Traits A confident problem-solver with …
Cardiff, South Glamorgan, Wales, United Kingdom Hybrid / WFH Options
Southern Communications Ltd
Relevant Microsoft certifications in Dynamics 365 and/or Power Platform. Experience working in agile delivery environments, particularly at scale. Knowledge of SSRS, FetchXML, SSIS, or other reporting and ETL tools. Experience with Power Pages (Portals) and customer-facing apps. Familiarity with data compliance standards (e.g., GDPR) and security models within Dynamics. Soft Skills & Traits A confident problem-solver with …
key. Your Experience Demonstrated experience in senior roles related to data engineering or data platform development. Proficient in Python and SQL. Familiar with data integration tools and frameworks (e.g., ETL/ELT, streaming technologies). Experience working with cloud infrastructure (e.g., AWS). Strong knowledge of data modeling, warehousing, and big data platforms. Skilled communicator and team collaborator. Background in …
Newcastle Upon Tyne, Tyne And Wear, United Kingdom Hybrid / WFH Options
Cybit
technologies, emerging tools, and best practices to stay ahead in the data and analytics field and bring new ideas into the team An understanding of core data concepts including ETL processes, data warehousing, relational databases, and data modelling Capability to work collaboratively in cross-functional teams and contribute to continuous improvement in delivery processes Project delivery experience Stakeholder management experience …
causes that drive business impact We're excited if you have 4+ years of relevant work experience in Analytics, Business Intelligence, or Technical Operations Mastery of SQL, Python, and ETL using big data tools (Hive/Presto, Redshift) Previous experience with web frameworks for Python such as Django/Flask is a plus Experience writing data pipelines using Airflow Fluency …
monitor robust data pipelines and transformative models using our modern data stack, including GCP, Airflow, dbt, and potentially emerging technologies like real-time streaming platforms Develop and manage reverse ETL processes to seamlessly integrate data with our commercial systems, ensuring operational efficiency Maintain and optimise our Customer Data Platform (CDP), ensuring effective data collection, unification, and routing Administer and support …
term strategic planning. Job Title : Data Analyst Job Focus : Software Reports To : Product Manager Location : Purfleet, Essex Region of Work : Technical Team Key Responsibilities Develop, implement, and manage efficient ETL (Extract/Transform/Load) processes. Manage existing data sources and investigate/develop new sources. Clean and collate data from our Marketplace platform. Produce analytics for our products, services …
infrastructure to ensure the reliability and efficiency of our data and systems used by our Machine Learning team. Your role will include creating and maintaining data pipelines that transform and load data from various products and managing the AWS infrastructure for our machine learning platform. Additionally, you will work with engineers, product managers, and data scientists to design and implement … privacy. In this role, you will Interact with product teams to understand how our safety systems interact with their data systems. Design and implement an automated end-to-end ETL process, including data anonymization, to prepare data for machine learning and ad hoc analysis. Manage and scale the tools and technologies we use to label data running on AWS. Devise … BSc or MSc in Computer Science/Software Engineering or related subject - candidates without a degree are welcome as long as they have extensive hands-on experience. Experience in ETL technical design, automated data quality testing, QA and documentation, data warehousing, and data modeling. Experience with Python for interaction with web services (e.g., REST and Postman). Experience with using …
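A sketch of the data anonymization step mentioned in the ETL duties above, using a keyed hash so identifiers remain joinable across tables without exposing raw values. The secret key and field names are illustrative, not from the listing; a real system would pull the key from a secrets manager and rotate it.

```python
import hashlib
import hmac

# Hypothetical key for illustration only; in production this would come
# from a secrets manager, never be hard-coded, and be rotated regularly.
SECRET = b"rotate-me"

def anonymise(record, fields=("email", "name")):
    """Replace direct identifiers with keyed HMAC-SHA256 digests.

    The digest is deterministic for a given key, so anonymised records
    from different tables can still be joined on the hashed value."""
    out = dict(record)
    for f in fields:
        if f in out:
            out[f] = hmac.new(SECRET, out[f].encode(), hashlib.sha256).hexdigest()[:16]
    return out

row = anonymise({"email": "x@y.z", "name": "Ann", "score": 3})
```

Non-identifying fields (like `score` above) pass through untouched, which is what lets the anonymised output still feed model training and ad hoc analysis.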
been awarded Actuarial Software of the Year. The role in this mission is to pioneer advancements in the field of pensions and beyond, leveraging state-of-the-art technology to extract valuable and timely insights from data. This enables the consultant to better advise Trustees and Corporate clients on a wide range of actuarial-related areas. ML Ops Engineer Main Duties … and maintain monitoring of model drift, data-quality alerts, and scheduled re-training pipelines. Collect, clean and preprocess large datasets to facilitate analysis and model training. Implement data pipelines and ETL processes to ensure data availability and quality. Write clean, efficient and scalable code in Python. Utilize CI/CD practices for version control, testing and code review. ML Ops Engineer …