Mathematics preferred. Experience in the financial services industry, preferably with exposure to market risk or counterparty credit risk applications. Experience in ETL (extract, transform, and load) development. Some cloud experience would be a plus. Personal Attributes: Strong analytical, verbal, and written communication skills; self-starter and entrepreneurial in approach.
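The ETL (extract, transform, load) development the posting above asks for can be sketched in plain Python. This is a minimal illustration only; the trade records, column names, and the "large trade" rule are invented for the example, not taken from the posting:

```python
import csv
import io
import sqlite3

# Extract: parse raw records (here from an in-memory CSV string;
# a real pipeline would read from files, APIs, or a message queue).
raw = "trade_id,notional\nT1,1000000\nT2,250000\n"
rows = list(csv.DictReader(io.StringIO(raw)))

# Transform: cast types and derive a flag field.
def transform(row):
    notional = float(row["notional"])
    return (row["trade_id"], notional, notional > 500_000)

records = [transform(r) for r in rows]

# Load: insert into a target store (an in-memory SQLite table here).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (trade_id TEXT, notional REAL, is_large INTEGER)")
conn.executemany("INSERT INTO trades VALUES (?, ?, ?)", records)

large = conn.execute("SELECT trade_id FROM trades WHERE is_large = 1").fetchall()
print(large)  # [('T1',)]
```

The same three stages appear in any production ETL tool; only the extract sources and load targets change.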
services. Knowledge of using PySpark in notebooks for data analysis and manipulation. Strong proficiency with SQL and data modelling. Experience with modern ELT/ETL tools within the Microsoft ecosystem. Solid understanding of data lake and lakehouse architectures. Hands-on experience with Power BI for data integration and visualisation. Familiarity …
transparency and standardization across teams. Data Migration & Transformation: - Lead data migration efforts, particularly during system upgrades or transitions to new platforms. - Define and implement ETL (Extract, Transform, Load) processes for transforming data into usable formats for analytics and reporting. Documentation and Reporting: - Document data architecture designs, processes, and standards for … big data platforms (e.g., Hadoop, Spark, Kafka). • Familiarity with cloud-based data platforms and services (e.g., AWS, Azure, Google Cloud). • Expertise in ETL tools and processes (e.g., Apache NiFi, Talend, Informatica). • Proficiency in data integration tools and technologies. • Familiarity with data visualization and reporting tools (e.g., Tableau, …)
Slough, South East England, United Kingdom (Hybrid / WFH options)
Substance Global
which will enable users across the organization to self-serve analytics. Strong Python (Pandas/SQLAlchemy) and SQL (BigQuery/Snowflake) skills for maintaining ETL/ELT pipelines. Developing tools and processes to ensure the quality and observability of data. Monitoring and improving the performance and efficiency of the data …
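The ETL/ELT distinction in the posting above can be illustrated with a minimal ELT step: land the raw rows first, then transform inside the database with SQL. SQLite stands in here for a warehouse such as BigQuery or Snowflake, and the table and column names are illustrative assumptions:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Load: land raw data untransformed in a staging table.
conn.execute("CREATE TABLE raw_events (user_id TEXT, amount TEXT)")
conn.executemany("INSERT INTO raw_events VALUES (?, ?)",
                 [("u1", "10.5"), ("u1", "4.5"), ("u2", "3.0")])

# Transform: do the casting and aggregation in SQL, inside the store,
# which is the defining difference between ELT and classic ETL.
conn.execute("""
    CREATE TABLE user_totals AS
    SELECT user_id, SUM(CAST(amount AS REAL)) AS total
    FROM raw_events
    GROUP BY user_id
""")

print(conn.execute("SELECT * FROM user_totals ORDER BY user_id").fetchall())
# [('u1', 15.0), ('u2', 3.0)]
```

In a warehouse setting the same pattern is typically expressed with a tool such as dbt, but the load-then-transform shape is identical.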
and deliver sustainable solutions. Monitor and troubleshoot data pipeline issues to maintain data integrity and accuracy. Assist in the development, maintenance, and optimization of ETL (Extract, Transform, Load) processes for efficiency and reliability. Project & Improvement: Assist in gathering, documenting, and managing data engineering requirements and workflows. Contribute to the development … reviews of designs, prototypes, and other work products to ensure requirements are met. Skills & Experience: Essential: Basic understanding of data engineering concepts, such as ETL processes, data pipelines, and data quality management. Hands-on experience with SQL (e.g., writing queries, basic database management). Familiarity with data tools and platforms …
senior stakeholders Comfortable in agile, fast-paced delivery environments ⭐ Desirable: Insurance sector experience Familiarity with cloud platforms (Azure, AWS, GCP) Experience with Data Lake/ETL tools (e.g. Databricks, Synapse, Dataiku) Documenting technical data requirements and designing data tests 🎓 Qualifications: BSc or higher in Computer Science, Maths, Economics, or related field
Slough, South East England, United Kingdom (Hybrid / WFH options)
Methods
and machine learning workloads. - Write clean, efficient, and maintainable code in Python and C# for data processing, transformation, and integration tasks. - Implement data ingestion, ETL/ELT processes, and data warehousing solutions. - Collaborate with data scientists, analysts, and stakeholders to understand data requirements and deliver solutions. - Ensure data quality, security …
IF YOU ARE Experienced with Python/PySpark Proficient working with Databricks Lakehouse architecture and principles Having 2+ years of designing data models, building ETL pipelines, and wrangling data to solve business problems Experienced with Azure cloud technologies Modern Data Estate such as Azure Data Factory, Azure DevOps, Azure Synapse …
collaborate with many technology partners like AWS, GCP, Microsoft, Databricks, Snowflake, Confluent, and others. IF YOU ARE A technology professional focused on Data Warehouses, ETL, and BI solutions development Experienced in eliciting business requirements to address the customer’s data visualization needs Ready to dive into a customer's subject …
using Azure D&A stack, Databricks, and Azure Open AI. Proficiency in coding (Python, PL/SQL, Shell Script), relational and non-relational databases, ETL tools like Informatica, and scalable data platforms. Experience with Azure Data and Analytics stack; familiarity with AWS and GCP data solutions. Knowledge of deploying AI …
CCPA), and data security best practices. Proven experience in enterprise-level architecture design and implementation. Hands-on knowledge of database systems (SQL/NoSQL), ETL/ELT processes, and data modeling techniques. Exceptional leadership, communication, and stakeholder management skills. Ability to work in fast-paced, agile environments and balance long …
Slough, South East England, United Kingdom (Hybrid / WFH options)
Pace
You'll be expected to lead by example across TDD practices, paired programming, CI/CD integration, and infrastructure automation. Key responsibilities: Develop and deploy ETL pipelines for data cleansing using Azure-based tooling. Build and maintain outbound/inbound API endpoints for downstream systems (e.g. reporting tools, finance systems).
initiatives and contribute to the development of innovative solutions. Responsibilities: Web Crawling and Data Extraction: Develop, deploy, and maintain web crawlers using Python to extract data from websites and social media platforms. Ensure the scalability, reliability, and efficiency of web scraping processes. Data Cleaning and Preprocessing: Perform data cleaning, standardization … to provide insights and support decision-making processes. Work with financial datasets to identify trends, patterns, and anomalies. Data Pipeline Development: Design and maintain ETL (Extract, Transform, Load) pipelines to streamline data workflows. Integrate data from multiple sources and ensure seamless data flow across systems. Collaboration and Communication: Work closely …
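The cleaning and standardization work this posting describes can be sketched in plain Python. The records, field names, and date formats below are invented for illustration; a real scraping pipeline would also need error handling for rows that match no known format:

```python
from datetime import datetime

# Raw scraped records with the kinds of inconsistencies the role describes:
# mixed date formats, stray whitespace, inconsistent casing, and duplicates.
raw = [
    {"ticker": " AAPL ", "date": "2024-01-05", "price": "185.6"},
    {"ticker": "aapl",   "date": "05/01/2024", "price": "185.6"},
    {"ticker": "MSFT",   "date": "2024-01-05", "price": "368.2"},
]

def clean(record):
    """Standardise one record: trim and upper-case the ticker,
    normalise the date to ISO format, and cast the price."""
    for fmt in ("%Y-%m-%d", "%d/%m/%Y"):
        try:
            date = datetime.strptime(record["date"], fmt).date().isoformat()
            break
        except ValueError:
            continue
    return {
        "ticker": record["ticker"].strip().upper(),
        "date": date,
        "price": float(record["price"]),
    }

# Clean, then deduplicate on the (ticker, date) key.
seen, cleaned = set(), []
for rec in map(clean, raw):
    key = (rec["ticker"], rec["date"])
    if key not in seen:
        seen.add(key)
        cleaned.append(rec)

print(cleaned)
```

After cleaning, the two AAPL rows collapse to one record, which is the behaviour a downstream ETL pipeline would rely on.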
Slough, South East England, United Kingdom (Hybrid / WFH options)
Careerwise
large datasets efficiently. Collaborate with cross-functional teams to understand business requirements and deliver data-driven insights. Design and build scalable data pipelines and ETL processes. Perform data exploration, preprocessing, and feature engineering. Conduct statistical analysis and machine learning model development. Communicate findings and insights to stakeholders through data visualization …
Slough, South East England, United Kingdom (Hybrid / WFH options)
Radley James
Lake Formation, and other standard data engineering tools. Familiarity with DevOps practices and infrastructure-as-code (e.g., Terraform, CloudFormation) Solid understanding of data modeling, ETL frameworks, and big data technologies Experience working in financial services or regulated industries is a plus What’s on Offer A collaborative and inclusive work …
using Azure D&A stack, Databricks, and Azure Open AI. Proficiency in coding (Python, PL/SQL, Shell Script), relational and non-relational databases, ETL tools (e.g., Informatica), and scalable data platforms. Knowledge of Azure Data and Analytics stack; familiarity with AWS and GCP data solutions. Experience deploying AI solutions …
Slough, South East England, United Kingdom (Hybrid / WFH options)
Careerwise
in Databricks, PySpark, Knowledge Graphs, Neo4J (Graph database and analytics), Power BI, SSRS, Azure Data Factory, and AI technologies. Strong understanding of data architecture, ETL processes, and data governance. Leadership Skills: Demonstrated ability to lead and inspire teams, manage complex projects, and drive organisational change. Analytical Skills: Strong analytical and …
Principal Data Scientist to operationalise RAG systems, fine-tune data retrieval processes, and optimise training datasets for AI model development. Automation and Optimisation: Automate ETL (Extract, Transform, Load) processes, reduce manual intervention, and continuously identify opportunities to enhance the efficiency and reliability of data workflows. Support Research and Prototyping: Build …
Reading, South East England, United Kingdom (Hybrid / WFH options)
Ingentive
On a daily basis your varied role will include, but will not be limited to: Design, build, and optimize high-performance data pipelines and ETL workflows using tools like Azure Synapse, Azure Databricks or Microsoft Fabric. Implement scalable solutions to ingest, store, and transform vast datasets, ensuring data availability and …
reporting development (advanced/expert levels only). Key Tools & Technologies Visualization Tools: Tibco Spotfire, Tableau, Power BI, or QlikView Data Management & Querying: SQL, ETL pipelines, Data Warehousing (e.g., ODW) Scripting/Programming: IronPython, R, R Shiny, SAS Collaboration & Platforms: SharePoint, clinical trial data platforms Qualifications Entry Level: Bachelor …