City of London, London, United Kingdom Hybrid / WFH Options
Focused Futures Consultancy LTD
technologies (Azure, AWS, GCP) and tools like Databricks, Snowflake, Synapse. Shape cloud migration and modernization strategies with a strong focus on DevOps practices. Architect scalable data models and robust ETL/ELT pipelines using industry-standard frameworks. Implement data governance and quality frameworks to ensure data integrity and compliance. Collaborate with clients’ senior leadership to influence data-driven transformation initiatives. More ❯
to RfP response. Ability to be a SPOC for all technical discussions across industry groups. Excellent design experience, with entrepreneurship skills to own and lead solutions for clients. Excellent ETL and data modeling skills. Excellent communication skills. Ability to define the monitoring, alerting, and deployment strategies for various services. Experience providing solutions for resiliency, failover, monitoring, etc. Good to have More ❯
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
The business is investing heavily in its platform over the next 12 months, offering significant career growth for curious and ambitious talent. Key Responsibilities Maintain and improve Python-based ETL pipelines (~60 currently in production) Support the migration from on-premise to cloud-based SQL environments Keep dashboards and reporting chartbooks (Power BI & PDFs) up to date Implement basic data More ❯
London, South East, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
Fabric or strong expertise in related technologies such as Power BI, Azure Synapse, Data Factory, Azure Data Lake etc. A solid understanding of data engineering principles, including data modelling, ETL/ELT processes, and data warehousing. Hands-on experience with Power BI and DAX, and ideally some exposure to Notebooks, Pipelines, or Lakehouses within Fabric. Strong communication and stakeholder management More ❯
tools such as GA4 and Adobe Analytics to track the right data Data engineering: For smaller clients: centralise and clean marketing data using proprietary tools For larger clients: manage ETL processes and build scalable, clean tables Lay strong data foundations to enable analytics, reporting and modelling Consulting & insight activation: Translate analysis into actionable guidance for media, CRO, CRM and creative More ❯
Lambda, CloudWatch) is a significant plus. Excellent SQL skills and confident Python programming. Knowledge of Kotlin and Golang, and the ability to work with unfamiliar codebases. Experience building robust ETL/ELT pipelines. Understanding of data warehouse (DWH) design principles and data normalization. Ability to work under uncertainty, prioritize tasks, and propose practical solutions. Skill in reading documentation More ❯
Wandsworth, Greater London, UK Hybrid / WFH Options
Supermercados Guanabara
collaboration, with the ability to work closely with technical and non-technical teams. Desired Skills Python expertise, especially with data libraries like Pandas, NumPy, and Scikit-learn. Familiarity with ETL tools and data pipeline orchestration. Exposure to data science tools like SAS or Posit. Experience in performance-critical environments, particularly with large-scale data platforms. Seniority level Mid More ❯
key. Your Experience Demonstrated experience in senior roles related to data engineering or data platform development. Proficient in Python and SQL. Familiar with data integration tools and frameworks (e.g., ETL/ELT, streaming technologies). Experience working with cloud infrastructure (e.g., AWS). Strong knowledge of data modeling, warehousing, and big data platforms. Skilled communicator and team collaborator. Background in More ❯
causes that drive business impact We're excited if you have 4+ years of relevant work experience in Analytics, Business Intelligence, or Technical Operations Mastery of SQL, Python, and ETL using big data tools (HIVE/Presto, Redshift) Previous experience with web frameworks for Python such as Django/Flask is a plus Experience writing data pipelines using Airflow Fluency More ❯
monitor robust data pipelines and transformative models using our modern data stack, including GCP, Airflow, dbt, and potentially emerging technologies like real-time streaming platforms Develop and manage reverse ETL processes to seamlessly integrate data with our commercial systems, ensuring operational efficiency Maintain and optimise our Customer Data Platform (CDP), ensuring effective data collection, unification, and routing Administer and support More ❯
/few-shot learning). Collaborate with business stakeholders to translate “big problems” into technically feasible AI solutions. Data & Infrastructure Oversee the creation and maintenance of scalable data pipelines (ETL/ELT) and data lakes/warehouses. Establish best practices for data labeling, versioning, and governance to ensure high data quality. Implement ML Ops processes: CI/CD for model More ❯
looking for a Senior Data Engineer to join our growing team. Our data team is responsible for migrating clients' legacy data onto the Intapp DealCloud platform using Extract, Transform and Load processes. They provide expert guidance to clients, execute on the technical aspects of a migration delivery, and are knowledgeable on a wide variety of legacy data structures and best … practices to ensure we are delivering a first-class service across the board. What you will do: Migrating legacy client data into Intapp products utilizing Python, SQL and other ETL pipeline processes. Lead Data Services workstreams as part of a Project team to deliver Data Services for Intapp's customers. Lead internal development processes and consult on ETL best practices. … JSON, XML, Parquet T-SQL (relational, queries, joins, procedures, performance) Familiarity with Python and the Pandas library or similar Familiarity with RESTful and SOAP APIs Ability to build and execute ETL processes An understanding of software development methodologies Understanding of a wide range of relational data models Excellent hands-on skills with office productivity tools Data management and transformational mapping More ❯
infrastructure to ensure the reliability and efficiency of our data and systems used by our Machine Learning team. Your role will include creating and maintaining data pipelines that transform and load data from various products and managing the AWS infrastructure for our machine learning platform. Additionally, you will work with engineers, product managers, and data scientists to design and implement … privacy. In this role, you will Interact with product teams to understand how our safety systems interact with their data systems. Design and implement an automated end-to-end ETL process, including data anonymization, to prepare data for machine learning and ad hoc analysis. Manage and scale the tools and technologies we use to label data running on AWS. Devise … BSc or MSc in Computer Science/Software Engineering or related subject - candidates without a degree are welcome as long as they have extensive hands-on experience. Experience in ETL technical design, automated data quality testing, QA and documentation, data warehousing, and data modeling. Experience with Python for interaction with Web Services (e.g., REST and Postman). Experience with using More ❯
awarded Actuarial Software of the Year. The role in this mission is to pioneer advancements in the field of pensions and beyond, leveraging state-of-the-art technology to extract valuable and timely insights from data. This enables the consultant to better advise Trustees and Corporate clients on a wide range of actuarial-related areas. ML Ops Engineer Main Duties … and maintain monitoring of model drift, data-quality alerts, and scheduled re-training pipelines. Collect, clean and preprocess large datasets to facilitate analysis and model training. Implement data pipelines and ETL processes to ensure data availability and quality. Write clean, efficient and scalable code in Python. Utilize CI/CD practices for version control, testing and code review. ML Ops Engineer More ❯
HAVES OR EXCITED TO LEARN: Some experience designing, building and maintaining SQL databases (and/or NoSQL) Some experience with designing efficient physical data models/schemas and developing ETL/ELT scripts Some experience developing data solutions in cloud environments such as Azure, AWS or GCP - Azure Databricks experience a bonus 30-minute video interview with the People & Operations More ❯
and technologies, particularly Google BigQuery. (Experience with Amazon Redshift can also be considered) Understanding of a variety of databases including both SQL and NoSQL databases Hands-on experience with ETL tools and processes to move and transform data between systems Experience with Google Cloud Platform (GCP) is highly preferred. (Experience with other cloud platforms like AWS, Azure can be considered. More ❯
and solving real-world problems, and building metrics and business cases to improve the customer's experience. The responsibilities include, but are not limited to: • Identify and build data sources • Extract, manipulate, assess, maintain data quality and convert data to facilitate and conduct analysis Key job responsibilities • Enable effective decision making by retrieving and aggregating data from multiple sources and compiling … and pivot tables) experience - Bachelor's degree or equivalent - Experience defining requirements and using data and metrics to draw business insights - Experience creating complex SQL queries joining multiple datasets, ETL/DW concepts - Experience in Excel (macros, index, conditional list, arrays, pivots, lookups) - Experience scripting for automation (e.g., Python, Perl, Ruby) - Experience using Python or R for data analysis or statistical More ❯
and curiosity Experience with a scientific computing language (such as R or Python) Experience with BI/Visualization tools like Tableau, Superset and Looker Experience with data modeling and ETL pipelines Experience working with product teams Experience leveraging AI tools to boost efficiency and creativity across the data science workflow - from ideation and coding to analysis and communication We are More ❯
driven processes across business intelligence and analytics platforms Key Requirements: Proven experience as a Data Engineer within insurance (essential) Strong proficiency in SQL and Python Hands-on experience with ETL frameworks and modern data tools Familiarity with cloud platforms, ideally Microsoft Azure Solid understanding of data modelling, data warehousing, and data architecture principles This is an excellent opportunity to join More ❯
London, South East, England, United Kingdom Hybrid / WFH Options
Sanderson
efficient data migration, and supporting our data warehousing needs. Key Responsibilities: Design, develop, and maintain data pipelines using DataStage, Redshift, QuickSight, and S3. Perform data migration and ETL processes for both batch and real-time data. Develop and optimise data warehouses to support business intelligence and analytics. Implement DevSecOps practices to ensure data security and compliance. Collaborate with … AWS services to support data integration and processing. Key Skills and Qualifications: Proven experience with DataStage, Redshift, QuickSight, and S3. Strong knowledge of data migration and ETL processes (both batch and real-time). Experience in data warehouse development and optimisation. Familiarity with DevSecOps practices and principles. Proficiency in Java, SQL, and relational databases. Understanding of data More ❯
Employment Type: Contractor
Rate: £425 - £462 per day, Negotiable, Inc benefits, OTE
Proven experience designing scalable data models, architectures, and pipelines, with proficiency in cloud platforms (AWS, GCP, or Azure) and data warehousing solutions. Hands-on experience with data integration tools, ETL processes, and statistical analysis tools (e.g., SAS, SPSS, STATA, R, Matlab) or general programming. Familiarity with SQL for large-scale datasets; training provided if needed. Prior experience at an insights … Architecture Leadership: Design and implement robust data architectures to support generative AI capabilities, ensuring scalability, performance, and compliance with data governance and security standards. Develop and maintain data pipelines, ETL processes, and integration tools to enable seamless data flow for AI-driven initiatives. Collaborate with data scientists, engineers, and business stakeholders to define data strategies and roadmaps aligned with business More ❯