City of London, London, United Kingdom Hybrid / WFH Options
Focused Futures Consultancy LTD
technologies (Azure, AWS, GCP) and tools like Databricks, Snowflake, Synapse. Shape cloud migration and modernization strategies with a strong focus on DevOps practices. Architect scalable data models and robust ETL/ELT pipelines using industry-standard frameworks. Implement data governance and quality frameworks to ensure data integrity and compliance. Collaborate with clients’ senior leadership to influence data-driven transformation initiatives.
to RfP response. Ability to be a SPOC for all technical discussions across industry groups. Excellent design experience, with the entrepreneurial skills to own and lead solutions for clients. Excellent ETL and data modeling skills. Excellent communication skills. Ability to define monitoring, alerting, and deployment strategies for various services. Experience providing solutions for resiliency, failover, monitoring, etc. Good to have …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
The business is investing heavily in its platform over the next 12 months, offering significant career growth for curious and ambitious talent. Key Responsibilities: Maintain and improve Python-based ETL pipelines (~60 currently in production). Support the migration from on-premise to cloud-based SQL environments. Keep dashboards and reporting chartbooks (Power BI & PDFs) up to date. Implement basic data …
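As a hedged illustration of the first responsibility above, here is a minimal Python ETL step of the kind such a pipeline might contain: move one table from an on-premise SQL Server to a cloud Postgres instance. The connection strings and table names are invented for the example; the listing does not specify them.

import pandas as pd
from sqlalchemy import create_engine

# Hypothetical connection strings: the on-premise source and cloud target
# below are invented for the example, not taken from the listing.
source = create_engine(
    "mssql+pyodbc://on-prem-server/reporting?driver=ODBC+Driver+17+for+SQL+Server")
target = create_engine("postgresql+psycopg2://user:pass@cloud-host:5432/warehouse")

def run_pipeline() -> None:
    # Extract: pull rows from the legacy on-premise table.
    df = pd.read_sql("SELECT * FROM dbo.daily_positions", source)
    # Transform: light cleaning before the cloud load.
    df.columns = [c.lower().strip() for c in df.columns]
    df = df.drop_duplicates()
    # Load: append into the cloud-based SQL environment.
    df.to_sql("daily_positions", target, schema="staging",
              if_exists="append", index=False)

if __name__ == "__main__":
    run_pipeline()

In a fleet of ~60 such pipelines, the source query and target table would normally be parameters, so one codebase can serve them all.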
London, South East, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
Fabric or strong expertise in related technologies such as Power BI, Azure Synapse, Data Factory, Azure Data Lake etc. A solid understanding of data engineering principles, including data modelling, ETL/ELT processes, and data warehousing. Hands-on experience with Power BI and DAX, and ideally some exposure to Notebooks, Pipelines, or Lakehouses within Fabric. Strong communication and stakeholder management …
tools such as GA4 and Adobe Analytics to track the right data. Data engineering: for smaller clients, centralise and clean marketing data using proprietary tools; for larger clients, manage ETL processes and build scalable, clean tables. Lay strong data foundations to enable analytics, reporting and modelling. Consulting & insight activation: translate analysis into actionable guidance for media, CRO, CRM and creative …
Lambda, CloudWatch) is a significant plus. Excellent SQL skills and confident Python programming. Knowledge of Kotlin and Golang, and the ability to work with unfamiliar codebases. Experience building robust ETL/ELT pipelines. Understanding of data warehouse (DWH) design principles and data normalization. Ability to work under uncertainty, prioritize tasks, and propose practical solutions. Skill in reading documentation …
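Since this listing flags AWS Lambda and CloudWatch alongside ETL/ELT work, here is a minimal sketch of a Lambda handler that kicks off a load step and logs to CloudWatch. The S3-trigger event shape and the load_to_warehouse stub are assumptions made for the illustration, not details from the ad.

import json
import logging

logger = logging.getLogger()
logger.setLevel(logging.INFO)

def load_to_warehouse(key: str) -> int:
    # Placeholder for the real COPY/INSERT into the warehouse.
    return 0

def handler(event, context):
    # Lambda entry point; anything sent to the logger lands in CloudWatch.
    # The event shape assumes an S3 trigger, which is an illustrative guess.
    key = event["Records"][0]["s3"]["object"]["key"]
    logger.info("ELT load triggered for %s", key)
    rows = load_to_warehouse(key)
    logger.info("Loaded %d rows", rows)
    return {"statusCode": 200, "body": json.dumps({"rows": rows})}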
field, with at least 5 years in a leadership role. Proven experience in managing large teams and complex data engineering projects. Strong knowledge of data architecture, data modeling, and ETL/ELT processes. Proficiency in programming languages such as Python, Java, or Scala. Experience with big data technologies such as Hadoop, Spark, and Kafka. Familiarity with cloud platforms like AWS …
Wandsworth, Greater London, UK Hybrid / WFH Options
Supermercados Guanabara
collaboration, with the ability to work closely with technical and non-technical teams. Desired Skills: Python expertise, especially with data libraries like Pandas, NumPy, and Scikit-learn. Familiarity with ETL tools and data pipeline orchestration. Exposure to data science tools like SAS or Posit. Experience in performance-critical environments, particularly with large-scale data platforms. Seniority level: Mid …
key. Your Experience: Demonstrated experience in senior roles related to data engineering or data platform development. Proficient in Python and SQL. Familiar with data integration tools and frameworks (e.g., ETL/ELT, streaming technologies). Experience working with cloud infrastructure (e.g., AWS). Strong knowledge of data modeling, warehousing, and big data platforms. Skilled communicator and team collaborator. Background in …
causes that drive business impact. We're excited if you have 4+ years of relevant work experience in Analytics, Business Intelligence, or Technical Operations. Mastery of SQL, Python, and ETL using big data tools (HIVE/Presto, Redshift). Previous experience with web frameworks for Python such as Django/Flask is a plus. Experience writing data pipelines using Airflow. Fluency …
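The Airflow requirement above lends itself to a short example: a minimal two-task DAG. The dag_id, task names, and schedule are invented for the sketch; the real pipelines would presumably read from HIVE/Presto or Redshift as the ad suggests.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_events():
    # Placeholder: in practice this might query HIVE/Presto or Redshift.
    print("extracting events")

def load_mart():
    # Placeholder: write the aggregated table back to the warehouse.
    print("loading mart")

with DAG(
    dag_id="analytics_daily",        # invented name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_events", python_callable=extract_events)
    load = PythonOperator(task_id="load_mart", python_callable=load_mart)
    extract >> load   # load runs only after extract succeeds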
monitor robust data pipelines and transformative models using our modern data stack, including GCP, Airflow, dbt, and potentially emerging technologies like real-time streaming platforms. Develop and manage reverse ETL processes to seamlessly integrate data with our commercial systems, ensuring operational efficiency. Maintain and optimise our Customer Data Platform (CDP), ensuring effective data collection, unification, and routing. Administer and support …
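To make the reverse ETL duty concrete, a small sketch that reads a modelled table out of the warehouse and pushes records to a downstream commercial system. The warehouse URL, table, and CRM endpoint are hypothetical stand-ins; the listing names GCP and dbt but no specific targets.

import pandas as pd
import requests
from sqlalchemy import create_engine

# Stand-in warehouse connection; the listing's stack is GCP, but any
# SQLAlchemy-compatible warehouse works the same way for this sketch.
warehouse = create_engine("postgresql+psycopg2://user:pass@warehouse-host/analytics")

def reverse_etl() -> None:
    # Read a modelled audience table out of the warehouse ...
    df = pd.read_sql("SELECT user_id, segment FROM analytics.audiences", warehouse)
    # ... and push each record to the downstream commercial system.
    for record in df.to_dict(orient="records"):
        requests.post("https://crm.example.com/api/contacts",  # hypothetical endpoint
                      json=record, timeout=10)

In practice records would be batched rather than posted one at a time, both for throughput and to respect the target API's rate limits.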
/few-shot learning). Collaborate with business stakeholders to translate “big problems” into technically feasible AI solutions. Data & Infrastructure: Oversee the creation and maintenance of scalable data pipelines (ETL/ELT) and data lakes/warehouses. Establish best practices for data labeling, versioning, and governance to ensure high data quality. Implement ML Ops processes: CI/CD for model …
looking for a Senior Data Engineer to join our growing team. Our data team is responsible for migrating clients' legacy data onto the Intapp DealCloud platform using Extract, Transform and Load processes. They provide expert guidance to clients, execute on the technical aspects of a migration delivery, and are knowledgeable on a wide variety of legacy data structures and best practices to ensure we are delivering a first-class service across the board. What you will do: Migrating legacy client data into Intapp products utilizing Python, SQL and other ETL pipeline processes. Lead Data Services workstreams as part of a Project team to deliver Data Services for Intapp's customers. Lead internal development processes and consult on ETL best practices. … JSON, XML, Parquet. T-SQL (relational, queries, joins, procedures, performance). Familiarity with Python and the Pandas library or similar. Familiarity with RESTful and SOAP APIs. Ability to build and execute ETL processes. An understanding of software development methodologies. Understanding of a wide range of relational data models. Excellent hands-on skills with office productivity tools. Data management and transformational mapping …
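A hedged sketch of the Pandas-style transform step such a migration might use: read a legacy Parquet extract, map legacy fields onto a target schema, and coerce types. FIELD_MAP and all column names are invented; real DealCloud mappings would come from the client engagement, not from this sketch.

import pandas as pd

# Invented legacy-to-target field mapping, for illustration only.
FIELD_MAP = {"CompName": "company_name", "DealVal": "deal_value"}

def transform_legacy(path: str) -> pd.DataFrame:
    df = pd.read_parquet(path)          # legacy extract, Parquet in this case
    df = df.rename(columns=FIELD_MAP)   # map legacy fields onto the target schema
    # Coerce types so bad legacy values become NaN rather than load errors.
    df["deal_value"] = pd.to_numeric(df["deal_value"], errors="coerce")
    return df.dropna(subset=["company_name"])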
infrastructure to ensure the reliability and efficiency of our data and systems used by our Machine Learning team. Your role will include creating and maintaining data pipelines that transform and load data from various products and managing the AWS infrastructure for our machine learning platform. Additionally, you will work with engineers, product managers, and data scientists to design and implement … privacy. In this role, you will: Interact with product teams to understand how our safety systems interact with their data systems. Design and implement an automated end-to-end ETL process, including data anonymization, to prepare data for machine learning and ad hoc analysis. Manage and scale the tools and technologies we use to label data running on AWS. Devise … BSc or MSc in Computer Science/Software Engineering or related subject - candidates without a degree are welcome as long as they have extensive hands-on experience. Experience in ETL technical design, automated data quality testing, QA and documentation, data warehousing, and data modeling. Experience with Python for interaction with Web Services (e.g., REST and Postman). Experience with using …
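For the anonymization step named above, a minimal sketch that replaces PII columns with salted SHA-256 digests, so records stay joinable across tables without exposing raw identifiers. The salt handling is simplified for illustration; a real pipeline would pull the salt from a secrets manager.

import hashlib

import pandas as pd

SALT = "replace-me"  # simplified; a real pipeline would fetch this from a secrets manager

def anonymise(df: pd.DataFrame, pii_columns: list[str]) -> pd.DataFrame:
    # Replace each PII value with a salted SHA-256 digest, so rows remain
    # joinable across tables without exposing the raw identifier.
    out = df.copy()
    for col in pii_columns:
        out[col] = out[col].astype(str).map(
            lambda v: hashlib.sha256((SALT + v).encode()).hexdigest())
    return out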
awarded Actuarial Software of the Year. The role in this mission is to pioneer advancements in the field of pensions and beyond, leveraging state-of-the-art technology to extract valuable and timely insights from data. This enables the consultant to better advise Trustees and Corporate clients on a wide range of actuarial-related areas. ML Ops Engineer Main Duties: … and maintain monitoring of model drifts, data-quality alerts, and scheduled re-training pipelines. Collect, clean and preprocess large datasets to facilitate analysis and model training. Implement data pipelines and ETL processes to ensure data availability and quality. Write clean, efficient and scalable code in Python. Utilize CI/CD practices for version control, testing and code review. …
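As a sketch of the model-drift monitoring duty, here is a simple check that flags numeric features whose live mean has shifted beyond a threshold of standard errors from the training baseline. The mean-shift test and the threshold are illustrative choices, not the firm's actual method.

import numpy as np
import pandas as pd

def drift_alerts(baseline: pd.DataFrame, live: pd.DataFrame,
                 threshold: float = 3.0) -> list[str]:
    # Flag numeric features whose live mean sits more than `threshold`
    # standard errors away from the training baseline.
    alerts = []
    for col in baseline.select_dtypes(include=np.number).columns:
        se = baseline[col].std() / np.sqrt(len(live))
        if not np.isfinite(se) or se == 0:
            continue  # skip constant or empty features
        shift = abs(live[col].mean() - baseline[col].mean()) / se
        if shift > threshold:
            alerts.append(col)
    return alerts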
HAVES OR EXCITED TO LEARN: Some experience designing, building and maintaining SQL databases (and/or NoSQL). Some experience with designing efficient physical data models/schemas and developing ETL/ELT scripts. Some experience developing data solutions in cloud environments such as Azure, AWS or GCP - Azure Databricks experience a bonus. … 30 minute video interview with the People & Operations …
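For the schema-design point in the list above, a minimal sketch of a normalised two-table physical model. SQLite is used purely so the example runs anywhere; the table and column names are invented for the illustration.

import sqlite3

# A one-to-many design: each order references exactly one customer, so
# customer attributes are stored once rather than repeated per order.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customer (
        customer_id INTEGER PRIMARY KEY,
        name        TEXT NOT NULL
    );
    CREATE TABLE orders (
        order_id    INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL REFERENCES customer(customer_id),
        total_gbp   REAL NOT NULL
    );
""")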