Key Responsibilities Lead the migration of data assets from SQL Server to Databricks, including both infrastructure setup and pipeline engineering Build and maintain parameterized ETL pipelines that support multiple data sets with minimal code duplication (e.g., a single ETL pattern adaptable for 8–12 similar data sets) Own the end … environments to Databricks, including rewriting stored procedures and workflows Deep understanding of data orchestration and automation, with proven ability to engineer reusable, parameter-driven ETL pipelines Solid experience with infrastructure-as-code, deployment automation, and DevOps in data environments (e.g., Azure DevOps, Terraform, ARM templates) Excellent skills in T-SQL …
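The "single ETL pattern adaptable for 8–12 similar data sets" idea above can be sketched as one generic pipeline driven by per-data-set parameters rather than near-identical copies of the code. This is an illustrative sketch only; all names (`run_etl`, `DATASET_CONFIGS`, the column names) are hypothetical, not from the listing.

```python
# Hypothetical sketch of a parameter-driven ETL pattern: one shared pipeline,
# configured per data set, instead of 8-12 near-duplicate scripts.

def extract(rows, source_columns):
    """Keep only the columns this data set needs."""
    return [{col: row[col] for col in source_columns} for row in rows]

def transform(rows, rename_map):
    """Apply this data set's column renames."""
    return [{rename_map.get(k, k): v for k, v in row.items()} for row in rows]

def run_etl(rows, config):
    """Run the shared pipeline with one data set's parameters."""
    return transform(extract(rows, config["source_columns"]), config["rename_map"])

# Per-data-set parameters: supporting a new, similar data set means adding one
# more config entry, not writing a new pipeline.
DATASET_CONFIGS = {
    "trades": {"source_columns": ["id", "amt"], "rename_map": {"amt": "amount"}},
    "orders": {"source_columns": ["id", "qty"], "rename_map": {"qty": "quantity"}},
}

raw = [{"id": 1, "amt": 10.0, "qty": 3, "extra": "x"}]
print(run_etl(raw, DATASET_CONFIGS["trades"]))  # [{'id': 1, 'amount': 10.0}]
```

The key design choice is that behaviour differences between data sets live in data (the config), so the pipeline code itself stays single-sourced and testable.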
and data characteristics. Integrate Dash applications with Impala to efficiently query and process large data sets. Implement and manage Oozie job schedulers for maintaining ETL processes to efficiently load, transform, and distribute daily data. Employ agile development practices to develop effective business solutions based on the business needs. Required Skills … Computer Science, Mathematics, Applied Mathematics, Statistics/Data Science, or related quantitative experience. Significant working experience in process automation, data analysis, and/or ETL development with a focus on banking and financial services. Domain Knowledge & Problem-Solving Skills: Demonstrable experience in the banking industry, with a solid understanding of … development concepts (HTML, CSS, JavaScript). Proficiency in data visualization libraries (Plotly, Seaborn). Solid understanding of database design principles and normalization. Experience with ETL tools and processes and Apache Oozie or similar workflow management tools. Understanding of Machine Learning and AI concepts is a plus. Leadership & Interpersonal Skills: Proven …
the next level. Responsibilities include: Owning business-critical reporting used globally by hundreds of users, from developing and maintaining large-scale data structures and ETL pipelines to creating reports in Quicksight or Excel Gathering business requirements from stakeholders in Finance and Business teams and translating them into scalable, automated solutions … data with Redshift, Oracle, NoSQL, etc. Experience with data visualization tools such as Tableau, Quicksight, or similar Experience with data modeling, warehousing, and building ETL pipelines Experience with statistical analysis packages like R, SAS, or Matlab Proficiency in SQL and scripting (Python) for data processing and modeling Preferred qualifications Experience … with AWS solutions such as EC2, DynamoDB, S3, and Redshift Experience in data mining, ETL, and working with large, complex datasets in a business environment Client Description Our client is a FTSE 100, multinational technology company renowned for various domains including e-commerce, cloud computing, digital streaming, artificial intelligence, and …
Design, develop, and maintain scalable, high-performance data pipelines. Ensure consistent and accurate integration of internal and external data sources. Follow best practices in ETL, data modelling (medallion architecture), and pipeline development. Collaborate with Traders and Quants to deliver data solutions aligned with business needs. Maintain data quality, troubleshoot issues … Databricks, including Unity Catalog (preferably on Azure) Strong programming in Scala and Python Deep understanding of Spark architecture and performance tuning Proven experience with ETL design, data modelling, and database design Strong grasp of the full SDLC and Agile delivery Solid knowledge of trading, derivatives, and risk systems (nice to …
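The medallion architecture mentioned above layers data as bronze (raw), silver (cleaned and validated), and gold (business-level aggregates). A minimal sketch of the idea, using plain Python stand-ins for Spark/Databricks tables; the trade schema and function names are invented for illustration:

```python
# Hedged sketch of medallion layering: bronze (raw) -> silver (clean) -> gold (aggregate).

def to_silver(bronze_rows):
    """Silver layer: de-duplicated, quality-checked, typed records."""
    seen, silver = set(), []
    for row in bronze_rows:
        if row["trade_id"] in seen or row["price"] is None:
            continue  # drop duplicate ingests and rows failing basic quality checks
        seen.add(row["trade_id"])
        silver.append({"trade_id": row["trade_id"], "desk": row["desk"],
                       "price": float(row["price"])})
    return silver

def to_gold(silver_rows):
    """Gold layer: business-ready aggregate, here total price per desk."""
    totals = {}
    for row in silver_rows:
        totals[row["desk"]] = totals.get(row["desk"], 0.0) + row["price"]
    return totals

bronze = [
    {"trade_id": 1, "desk": "rates", "price": "100.5"},
    {"trade_id": 1, "desk": "rates", "price": "100.5"},  # duplicate ingest
    {"trade_id": 2, "desk": "fx", "price": None},        # fails quality check
    {"trade_id": 3, "desk": "rates", "price": "99.5"},
]
print(to_gold(to_silver(bronze)))  # {'rates': 200.0}
```

In a real Databricks deployment each layer would be a Delta table and the functions would be Spark transformations; the layering logic is the same.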
Compliance, providing an opportunity to influence critical workflows and enhance data-driven decision-making across the organization. Responsibilities Design & implement robust data pipelines and ETL processes, ensuring high standards in data modeling, documentation, and testing. Develop and optimize SQL queries for data extraction, aggregation, and reporting. Collaborate with key business … when required. Train & mentor other developers in data engineering best practices. Requirements Strong programming skills in Python and SQL, with experience in data modeling, ETL pipelines, and large datasets. 5+ years of experience working with P&L reporting, financial analytics, or similar data-driven workflows in a financial institution. Experience …
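The "SQL queries for data extraction, aggregation, and reporting" responsibility above is the bread and butter of this kind of role. A minimal sketch using `sqlite3` as a stand-in database; the `pnl` table and its columns are invented for illustration:

```python
# Illustrative SQL aggregation for a reporting extract, using an in-memory
# sqlite3 database as a stand-in for the real warehouse.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE pnl (desk TEXT, trade_date TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO pnl VALUES (?, ?, ?)",
    [("credit", "2024-01-02", 120.0),
     ("credit", "2024-01-03", -20.0),
     ("equities", "2024-01-02", 55.0)],
)

# Aggregate P&L per desk for a report.
rows = conn.execute(
    "SELECT desk, SUM(amount) AS total FROM pnl GROUP BY desk ORDER BY desk"
).fetchall()
print(rows)  # [('credit', 100.0), ('equities', 55.0)]
```

Note the use of `?` placeholders rather than string formatting, which is the standard way to keep such queries safe and reusable.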
Troubleshoot and resolve pipeline issues to maintain seamless integration and deployment workflows. Responsibility 3: Data Engineering and Management Design and implement data pipelines and ETL workflows to process and analyze large datasets. Manage data warehouses and streaming solutions to ensure efficient data storage and retrieval. Collaborate with cross-functional teams … IAM. Expertise in setting up and optimizing CI/CD pipelines using modern tools. Strong background in data engineering, including experience with data pipelines, ETL processes, data warehousing, and data streaming. Proficiency with Infrastructure as Code (IaC) tools such as Terraform, OpenTofu, or CloudFormation. Deep understanding of cloud security best …
slick, automated data pipelines in Python. Key Responsibilities: Lead data migration and integration tasks, ensuring quality and alignment across systems. Design, implement, and optimise ETL processes to streamline data movement from diverse sources. Build and enhance a robust Snowflake data warehouse to support evolving reporting needs. Troubleshoot and resolve data … transformation to validation and analysis. Strong Python skills and confidence building automation for large-scale data tasks. Hands-on Snowflake experience. Deep understanding of ETL/ELT pipelines and data governance standards. A solid grasp of financial data or lending domain knowledge is highly desirable. A passion for mentoring and …
that underpin our work, and protect our digital security. The Role Within this role you will be responsible for delivering high quality Data applications, ETL processes and operational reports following development best practices and security standards, as well as providing 2nd and 3rd level support for those same applications. This … end-users are regularly updated on progress against their service request/incident/problem/change request. Responsible for monitoring data applications and ETL processes and proactively act upon any detected issues. Key Knowledge, Skills and Experience Good organisation skills with a logical, analytical approach to solving IT problems …
all requirements continue to be met. Scope and plan implementation of any customer changes/requirements. Maintain automated data pipelines to support data ingestion, ETL, and storage. What you'll bring: Experience in performing a technical leadership role on projects and experience with transitioning projects into a support program. Experience … a particular focus on libraries and tools commonly used in data engineering, such as Pandas, NumPy, Apache Airflow. Experience with data pipelines, ELT/ETL processes, and data wrangling. Dashboard analytics (PowerBI, Looker Studio or Tableau) experience. Excellent English, written and verbal. Worked in a client-facing role previously and …
London, England, United Kingdom Hybrid / WFH Options
IDEXX
and React frontends. You’ll utilise tools such as Terraform for infrastructure-as-code (IaC), AWS (Lambda, EC2, EKS, Step Functions, VPC etc.) for ETL, Airflow pipelines, Snowflake, and ensure architectural alignment with AI/ML initiatives and data-driven services. You will serve as the go-to engineer for … bespoke Conversion Framework. Build new and maintain existing bespoke systems. Implement .NET-based microservices with strong observability and integration with data platforms. Develop custom ETL pipelines using AWS, Python, and MySQL. Implement governance, lineage, and monitoring to ensure high availability and traceability. AI & Advanced Analytics Integration: Collaborate with AI/…
ensuring their successful execution Supervise the planning, management, and execution of analyses to support campaign strategy and planning Data Transformation: Design data architecture and ETL processes for efficiently processing large volumes of data periodically Oversee the building and maintenance of data pipelines to ingest data from diverse sources Design and … ensure delivery excellence Qualifications Required Experience in paid digital reporting within a media agency or client-side, and team management experience Proficient in advanced ETL techniques and adept at handling and processing large volumes of data efficiently Campaign and marketing analysis, identifying optimization opportunities, and deriving strategic insights Skilled in …
Power Apps Developer/Power Platform Developer Power Apps, Logic Apps, Power Automate, C#, SQL, Azure Data Factory, Azure Functions; Systems Integration, ETL Development. Permanent, London/Hybrid (3/2), £80k - £100k +Bonus +Benefits Global Law Firm seeks Power Apps Developer/Power Platform Developer to work on a … Azure Data Factory) and Azure Functions Development of low-code solutions and integrations using Logic Apps and related systems - Power Automate, PowerBI, ServiceBus etc ETL routine development using ADF Integration engineering of a range of business systems inc MS Dynamics, InTapp, Elite3E, iManage and Workday etc We are searching for … a Power Apps Developer/Power Platform Developer/ETL Developer/C# Developer with a solid understanding of architecture design and delivery within the Azure space who possesses: Experience of Power Platform Development (Power Apps, Power Automate, PowerBI) Experience of integration and enhancement of systems using Logic Apps Data …
crucial role in designing, developing, and maintaining data architecture and infrastructure. The successful candidate should possess a strong foundation in Python, Pyspark, SQL, and ETL processes, with a demonstrated ability to implement solutions in a cloud environment. Position - Jr Data Engineer Location - London Job Type - Hybrid, Permanent Mandatory Skills : Design … build, maintain data pipelines using Python, Pyspark and SQL Develop and maintain ETL processes to move data from various data sources to our data warehouse on AWS/AZURE/GCP. Collaborate with data scientists, business analysts to understand their data needs & develop solutions that meet their requirements. Develop & maintain …
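The "move data from various data sources to our data warehouse" duty described above is a classic extract-transform-load sequence. A minimal sketch, with a CSV string as the source and `sqlite3` standing in for the cloud warehouse; the `dim_user` table and its columns are invented for illustration:

```python
# Illustrative ETL: extract from a CSV source, transform, load into a warehouse
# table (sqlite3 here stands in for a warehouse on AWS/Azure/GCP).
import csv
import io
import sqlite3

source = io.StringIO("user_id,country\n1,gb\n2,us\n")

# Extract: read the raw records from the source.
records = list(csv.DictReader(source))

# Transform: cast ids to integers and standardise country codes to upper case.
records = [{"user_id": int(r["user_id"]), "country": r["country"].upper()}
           for r in records]

# Load: write the transformed records into the warehouse table.
wh = sqlite3.connect(":memory:")
wh.execute("CREATE TABLE dim_user (user_id INTEGER, country TEXT)")
wh.executemany("INSERT INTO dim_user VALUES (:user_id, :country)", records)

loaded = wh.execute("SELECT user_id, country FROM dim_user ORDER BY user_id").fetchall()
print(loaded)  # [(1, 'GB'), (2, 'US')]
```

In practice the extract step would read from production systems or files in object storage, and the load step would target Redshift, Synapse, BigQuery, or similar, but the E-T-L shape is the same.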
crucial role in designing, developing, and maintaining data architecture and infrastructure. The successful candidate should possess a strong foundation in Python, Pyspark, SQL, and ETL processes, with a demonstrated ability to implement solutions in a cloud environment. Position - Senior Data Engineer Experience - 6+ yrs Location - London Job Type - Hybrid, Permanent … Mandatory Skills : Design, build, maintain data pipelines using Python, Pyspark and SQL Develop and maintain ETL processes to move data from various data sources to our data warehouse on AWS/AZURE/GCP. Collaborate with data scientists, business analysts to understand their data needs & develop solutions that meet …
Senior ODI ETL & OAS BI Developer – Banking | £75,000 | 2-Year FTC - could go full-time perm | London We’re hiring a Senior Data Developer with strong experience in Oracle Data Integrator (ODI) and Oracle Analytics Server (OAS) to join a growing financial institution on a 2-year fixed-term … core banking systems like Flexcube, Oracle ERP , and ideally Finastra OPICS . This is a hands-on role covering everything from data warehousing and ETL design to business intelligence dashboards and stakeholder insights. What we’re looking for: Strong ODI and OAS experience Proficiency in SQL, PL/SQL, and …
with daily complimentary breakfast, all-day drinks, and a calendar full of social events and networking opportunities. What you will Bring: Proven excellence in ETL, scalable architectures, and automation. Experience in developing and deploying AI/ML pipelines. Excellent Python skills, good exposure to SQL, and both relational … developing and deploying AI/ML pipelines. Familiarity with Airflow, Redshift, and dbt. What you will do: Architect, build, and optimise robust ETL processes for efficient data extraction, transformation, and loading. Develop an automation model and sophisticated data pipelines using Python, Airflow, Redshift, and more. Collaborate with data …
robust systems that manage, process, and deliver business-critical data across the organisation. As a CDS Developer, you’ll work on: Data pipelines and ETL processes APIs and internal services that power core business systems Data modelling and architecture Maintaining data quality, security, and governance 📚 What’s in it for … we’re looking for: Experience as a Developer or Data Engineer Strong problem-solving skills and interest in data-driven systems Any knowledge of ETL, APIs, SQL, or data pipelines A passion for learning and growing in a supportive team This is more than just a job—it’s a …
interpreting data with Redshift, Oracle, NoSQL etc. Experience with data visualization using Tableau, Quicksight, or similar tools. Experience with data modeling, warehousing and building ETL pipelines. Experience in Statistical Analysis packages such as R, SAS and Matlab. Experience using SQL to pull data from a database or data warehouse and … Experience in the data/BI space. PREFERRED QUALIFICATIONS Experience with AWS solutions such as EC2, DynamoDB, S3, and Redshift. Experience in data mining, ETL, etc. and using databases in a business environment with large-scale, complex datasets. Amazon is an equal opportunities employer. We believe passionately that employing a …
extraction across websites and social media platforms Perform data cleaning, standardization, and normalization to ensure consistency and quality across all datasets Build and maintain ETL pipelines for processing structured and unstructured data Conduct data analysis and modeling using tools like Pandas, NumPy, Scikit-learn, and TensorFlow Leverage financial data expertise … Proficiency in Python for web crawling using libraries like Scrapy, BeautifulSoup, or Selenium Strong understanding of data cleaning, standardization, and normalization techniques Experience building ETL/ELT pipelines and working with large-scale data workflows Hands-on experience with data analysis and machine learning libraries such as Pandas, NumPy, Scikit …
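The cleaning, standardization, and normalization steps listed above can be sketched in a few lines. This uses plain Python and the stdlib `statistics` module for brevity; a real pipeline of this kind would typically use Pandas/NumPy, and the missing-value markers shown are illustrative:

```python
# Hedged sketch of data cleaning followed by z-score normalization.
from statistics import mean, pstdev

def clean(values):
    """Drop missing/placeholder entries and coerce strings to floats."""
    return [float(v) for v in values if v not in (None, "", "NA")]

def z_normalize(values):
    """Standardize to zero mean and unit variance (z-scores)."""
    mu, sigma = mean(values), pstdev(values)
    return [(v - mu) / sigma for v in values]

raw = ["10", None, "20", "NA", "30"]
cleaned = clean(raw)   # [10.0, 20.0, 30.0]
scores = z_normalize(cleaned)
print(scores)
```

Normalizing to a common scale like this is what makes features from heterogeneous scraped sources comparable before they are fed to models such as those in Scikit-learn.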
build key finance models for real-time insights. Create and maintain commercial dashboards using Power BI for executive-level consumption. Maintain SQL databases and ETL processes to ensure dataset integrity. Analyze large volumes of finance data to identify trends, risks, and opportunities. Facilitate and create operational budgets, distribute to stakeholders … cost controls; relevant qualifications (ACCA/ACA/CIMA) or QBE (5+ years) preferred. Experience in multi-site hospitality operations. Experience with Easy Morph ETL tool and API integrations is desirable. Experience with IBM Planning Analytics (TM1) is desirable. Knowledge of Aloha, Fourth Hospitality, and Sage accounting systems is desirable. …
actionable intelligence. Required qualifications to be successful in this role Data Analysis experience using SQL, ideally in Hadoop or other Big Data environments. ETL experience, including projects where you have been involved with data mappings and transformations. Analytical problem-solving capabilities. Ability to self-start new projects …
SD, MM, FI/CO. • MDM and Data Migration: Experience in delivering MDM and data migration programs using both standard and innovative approaches. • SAP ETL Technologies: 8-10 years of experience with recognized enterprise SAP ETL technologies such as SAP Data Services and Migration Cockpit, including migrating data to SAP …