data architecture principles, big data technologies (e.g., Hadoop, Spark), and cloud platforms like AWS, Azure, or GCP. • Data Management Skills: Advanced proficiency in data modelling, SQL/NoSQL databases, ETL processes, and data integration techniques. • Programming & Tools: Strong skills in Python or Java, with experience in data visualization tools and relevant programming frameworks. • Governance & Compliance: Solid understanding of data governance …
data processing and reporting. Data Modelling using the Kimball Methodology. Experience in developing CI/CD pipelines using GitLab or similar. Comprehensive knowledge of data engineering, data modelling and ETL best practice. Experience of working within a global team. Experience of working with multiple stakeholders as part of an Agile team. Experience in developing production-ready data ingestion and processing …
software release and change management process using industry best practices to help DSS maintain legal compliance. Basic Qualifications: 5+ years of experience working on mission-critical data pipelines and ETL systems. 5+ years of hands-on experience with big data technology, systems and tools such as AWS, Hadoop, Hive, and Snowflake. Expertise with common software engineering languages such as Python …
the generation of LEIs and ISINs for the UK to the regulatory reporting systems like EMIR Trade Repository and MiFIR Reporting. All are underpinned by the PTRR Platform, an ETL and visualisation suite developed in-house since 2006. More information can be found under LSEG's Regulatory Reporting. The PTRR Solutions Engineering team traditionally manages the configuration layer of …
Loving Heart. These values guide how we serve our clients, grow our business, and support each other. Key Responsibilities Design, develop, and maintain interactive Power BI dashboards and reports Extract, transform, and load (ETL) data from Salesforce, Simpro, Unleashed and other systems into the Microsoft Fabric Data Lake (OneLake) Build and manage data pipelines into Fabric using tools like … on experience extracting data from systems like Salesforce, Simpro, and ERP platforms into a data lake environment Strong DAX, Power Query (M), and SQL skills Familiarity with data modeling, ETL frameworks, and structured/unstructured data handling Knowledge of Power BI administration, service workspaces, and security practices Understanding of business processes and workflows across CRM, ERP, and field service systems …
project lifecycle execution (design, execution and risk assessment) - Continually improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers - Interface with other technology teams to extract, transform, and load (ETL) data from a wide variety of data sources - Own the functional and nonfunctional scaling of software systems in your ownership area. - Implement big data solutions for … in AWS as well as the latest in distributed systems, forecasting algorithms, and data mining. BASIC QUALIFICATIONS - 3+ years of data engineering experience - Experience with data modeling, warehousing and building ETL pipelines - Experience with SQL PREFERRED QUALIFICATIONS - Experience with AWS technologies like Redshift, S3, AWS Glue, EMR, Kinesis, Firehose, Lambda, and IAM roles and permissions - Experience with non-relational databases …
the initial sentiment and topic classification models for review analysis Develop merchant benchmarking methodology across various industry categories Create data pipelines that integrate with our existing systems Implement efficient ETL processes to ensure data quality and reliability Design the data architecture with future expansion in mind Transform our product data into a strategic asset that delivers measurable value to merchants … data science roles Strong programming skills in Python and SQL Experience building NLP/ML models, particularly for text classification or sentiment analysis Background in developing data pipelines and ETL processes Proven ability to translate complex data models into business insights Experience working with large datasets (preferably e-commerce or review data) Strong communication skills to collaborate with product and …
roles or similar. Designed and implemented efficient and scalable data models that support reporting and analytics requirements. Used Power Query (or similar tools) for data extraction, transformation, and loading (ETL) processes to prepare data for analysis. Demonstrable skill in performing data analysis on large datasets and preferably worked in data engineering roles to analyse and transform data for data science.
South West London, London, United Kingdom Hybrid / WFH Options
TALENT INTERNATIONAL UK LTD
design and deliver robust, scalable data solutions that support mission-critical services across the UK. As a Senior Data Engineer, you will take ownership of building and optimising our ETL pipelines, managing large-scale datasets, and implementing cutting-edge data engineering practices using Python and modern Big Data technologies. As a Data Engineer you will be responsible for: Design, build … and maintain robust ETL/ELT pipelines across multiple data sources and formats Develop and optimise scalable data solutions using Python and distributed computing tools Work with structured and unstructured data across on-prem and cloud-based data platforms Support data quality, data governance, and compliance standards across projects Collaborate with data scientists, analysts, and key stakeholders to ensure data … following experience and skills: Proven experience as a Data Engineer, ideally in a senior or lead capacity Strong proficiency in Python for data processing and automation Deep knowledge of ETL/ELT frameworks and best practices Hands-on experience with Big Data tools (e.g. Hadoop, Spark, Kafka, Hive) Familiarity with cloud data platforms (e.g. AWS, Azure, GCP) Strong understanding of …
London, South East, England, United Kingdom Hybrid / WFH Options
Noir
Data Engineer - Investment Banking - London/Hybrid (Data Engineer, SQL Data Engineer, Java, Python, Spark, Scala, SQL, Snowflake, OO programming, Snowflake, Databricks, Data Fabric, design patterns, SOLID principles, ETL, Unit testing, NUnit, MSTest, Junit, Microservices Architecture, Continuous Integration, Azure DevOps, AWS, Jenkins, Agile, Data Engineer, SQL Data Engineer) We have several fantastic new roles including a Data Engineer position to …/Databricks and Java/Python, as well as experience with Microservices Architecture and Continuous Integration. Exposure to NUnit, MSTest and Junit would be beneficial, along with knowledge of ETL, Azure DevOps, AWS, and Jenkins. There is a vast amount of knowledge and experience within the business to share with the right person, so the right attitude, cultural fit, collaboration …
platform serves Amazon's finance, tax and accounting functions across the globe. As a Data Engineer, you should be an expert with data warehousing technical components (e.g. Data Modeling, ETL and Reporting), infrastructure (e.g. hardware and software) and their integration. You should have deep understanding of the architecture for enterprise level data warehouse solutions using multiple platforms (RDBMS, Columnar, Cloud … develop and define key business questions, and to build data sets that answer those questions. The candidate is expected to be able to build efficient, flexible, extensible, and scalable ETL and reporting solutions. You should be enthusiastic about learning new technologies and be able to implement solutions using them to provide new functionality to the users or to scale the … ambiguity environment, making use of both quantitative analysis and business judgment. BASIC QUALIFICATIONS - Experience with SQL - 1+ years of data engineering experience - Experience with data modeling, warehousing and building ETL pipelines - Experience with one or more query language (e.g., SQL, PL/SQL, DDL, MDX, HiveQL, SparkSQL, Scala) - Experience with one or more scripting language (e.g., Python, KornShell) PREFERRED QUALIFICATIONS …
a rota. WHAT YOU WILL BE DOING Lead the architectural design, implementation, and ongoing optimization of our international data platform, aligned with enterprise-wide standards. Build and maintain scalable ETL/ELT data pipelines using Databricks, Spark, and related technologies. Optimize Databricks clusters, workflows, and jobs to ensure cost efficiency and high performance. Design and manage data lakes, data warehouses … control systems, particularly Git. Strong background in designing and optimizing complex data pipelines and infrastructure. Experience leading and mentoring technical teams of data engineers. Deep understanding of data warehousing, ETL/ELT patterns, and real-time data processing. Strong problem-solving and analytical skills, with meticulous attention to detail. Ability to work proactively, identify challenges, and implement improvements. Excellent communication …
As well as a good understanding of data architecture, to the level that you are able to engage with technical staff and data architects or experts. Good knowledge of SQL, ETL technologies and data modelling. Knowledge of programming languages useful for data analytics, such as Python. Good knowledge of the Azure cloud data platform and the potential to use its services …
trades efficiently, integrating with existing backend databases. • Enable secure and efficient data file exchange between users. • Database Integration: • Work with SQL databases to handle data extraction, transformation, and loading (ETL), ensuring real-time accuracy. • Build capabilities to handle input and output of data in a range of formats including Excel for seamless data exchange and communication with corporate clients. • Azure … maintenance. Candidate Specification: Technical Skills: • Programming Languages: Strong proficiency in Python, Rust, C#, or Java, ASP.NET, or Django for backend development. • Database Expertise: Advanced skills in SQL (querying, optimization, ETL processes) and working knowledge of integrating relational databases. • File Handling: Hands-on experience processing and manipulating Excel files programmatically. • Cloud Technologies: Proficiency in Azure services, including Azure Functions, Azure SQL …
or risk. Technical expertise in: SQL, data analysis, and data visualization tools (Power BI/Tableau/Qlik) APIs (knowledge of JSON, RESTful services, Swagger/OpenAPI specs) Excel, ETL tools, and basic scripting (Python is a plus) Experience working in Agile/Scrum environments. Strong documentation and communication skills. Ability to collaborate with cross-functional global teams.
data architecture, or analytics, ideally within investment management, financial services, or private markets. Strong expertise in Azure cloud services, Synapse, Databricks, Spark, and data lake architectures. Deep understanding of ETL/ELT processes, data modeling, and high-performance data warehousing. Experience managing large-scale data platforms and optimizing data pipelines for analytics and reporting. Strong strategic mindset with the ability …
models, reports, and dashboards using Power BI Service and Report Server. Designing, developing, and maintaining Power BI paginated reports. Extracting, transforming, and loading data using Power Query and other ETL tools. Creating dynamic and engaging visualizations. Gathering requirements from stakeholders and translating them into technical specifications. Ensuring data accuracy, reliability, and compliance with regulations and standards. Collaborating with cross-functional …
Your work will be vital to ensuring data is accurate, structured, and available, powering both real-time business intelligence and future advanced analytics. Key Responsibilities Design and develop automated ETL/ELT pipelines using SQL and Python Integrate internal/external data sources via APIs and platform connectors Model and structure data for scalable analytics (e.g., star/snowflake schemas … Support groundwork for future data science and machine learning initiatives The successful applicant will be proficient in SQL and Python, with a proven track record of building and maintaining ETL/ELT pipelines. Experience working with Microsoft Fabric, Azure Data Factory, and modern Lakehouse or data warehouse architecture is essential. You’ll demonstrate a strong focus on data quality and …
are looking for a Lead Data Solutions Architect to work within a dynamic, remote-first data architectural capability to deliver cloud based data solutions using best-in-class RDBMS, ETL/ELT, and Cloud platforms for blue-chip customers across a range of sectors. You will lead cross-functional teams of Data Engineers, Architects, Business Analysts and Quality Assurance Analysts … solid understanding of big data technologies such as Apache Spark, and knowledge of Hadoop ecosystems. Knowledge of programming languages such as Python, R, or Java is beneficial. Exposure to ETL/ELT processes, SQL, NoSQL databases is a nice-to-have, providing a well-rounded background. Experience with data visualization tools and DevOps principles/tools is advantageous. Familiarity with …
to RfP response. Ability to be a SPOC for all technical discussions across industry groups. Excellent design experience, with entrepreneurship skills to own and lead solutions for clients. Excellent ETL and data modelling skills. Excellent communication skills. Ability to define the monitoring, alerting and deployment strategies for various services. Experience providing solutions for resiliency, failover, monitoring etc. Good to have …
technologies (Azure, AWS, GCP) and tools like Databricks, Snowflake, Synapse. Shape cloud migration and modernization strategies with a strong focus on DevOps practices. Architect scalable data models and robust ETL/ELT pipelines using industry-standard frameworks. Implement data governance and quality frameworks to ensure data integrity and compliance. Collaborate with clients’ senior leadership to influence data-driven transformation initiatives.