IF YOU ARE: Experienced with Python/PySpark; proficient working with the Databricks Lakehouse architecture and principles; experienced (2+ years) in designing data models, building ETL pipelines, and wrangling data to solve business problems; familiar with Azure Modern Data Estate cloud technologies such as Azure Data Factory, Azure DevOps, and Azure Synapse …
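To make the Databricks/PySpark requirement concrete, here is a minimal sketch of the kind of ETL transform such a role involves; the table paths and column names are hypothetical, and a Databricks-style Spark session with Delta support is assumed.

```python
# Minimal PySpark ETL sketch; paths and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Extract: read raw data from a Delta table (hypothetical path)
raw = spark.read.format("delta").load("/mnt/raw/orders")

# Transform: basic cleansing plus a business aggregate
clean = (
    raw.dropDuplicates(["order_id"])
       .filter(F.col("amount") > 0)
       .withColumn("order_date", F.to_date("order_ts"))
)
daily = clean.groupBy("order_date").agg(F.sum("amount").alias("daily_revenue"))

# Load: write the curated result back as Delta
daily.write.format("delta").mode("overwrite").save("/mnt/curated/daily_revenue")
```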
South West London, South East England, United Kingdom
SoftServe
… collaborate with many technology partners, like AWS, GCP, Microsoft, Databricks, Snowflake, Confluent, and others. IF YOU ARE: A technology professional focused on Data Warehouses, ETL, and BI solutions development; experienced in eliciting business requirements to address the customer's data visualization needs; ready to dive into a customer's subject …
… similar warehousing technology, real-time systems). Experience with AWS services such as Lambda, SNS, S3, EKS, and API Gateway. Knowledge of data warehouse design, ETL/ELT processes, and big data technologies (e.g., Snowflake, Spark). Understanding of data governance and compliance frameworks (e.g., GDPR, HIPAA). Strong communication and …
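As an illustration of wiring the Lambda, S3, and SNS services named above together, a minimal handler sketch follows; the event shape assumes the standard S3-notification trigger, and the topic ARN is hypothetical.

```python
# Illustrative AWS Lambda handler: notified of new S3 objects, it publishes
# a summary message to SNS. The topic ARN below is hypothetical.
import json
import boto3

sns = boto3.client("sns")
TOPIC_ARN = "arn:aws:sns:eu-west-2:123456789012:data-landing-events"  # hypothetical

def handler(event, context):
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        sns.publish(
            TopicArn=TOPIC_ARN,
            Subject="New data file landed",
            Message=json.dumps({"bucket": bucket, "key": key}),
        )
    return {"statusCode": 200}
```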
London, South East England, United Kingdom (Hybrid / WFH Options)
Pace
… you'll be expected to lead by example across TDD practices, paired programming, CI/CD integration, and infrastructure automation. Key responsibilities: develop and deploy ETL pipelines for data cleansing using Azure-based tooling; build and maintain outbound/inbound API endpoints for downstream systems (e.g. reporting tools, finance systems). …
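A minimal sketch of the kind of cleansing step such a pipeline might run (for example inside an Azure Databricks job or Azure Function); the DataFrame columns are hypothetical.

```python
# Generic data-cleansing step; column names are hypothetical.
import pandas as pd

def cleanse(df: pd.DataFrame) -> pd.DataFrame:
    df = df.drop_duplicates(subset=["customer_id"])     # de-duplicate keys
    df["email"] = df["email"].str.strip().str.lower()   # normalise emails
    df = df.dropna(subset=["customer_id", "email"])     # drop unusable rows
    # Coerce bad timestamps to NaT rather than failing the whole batch
    df["created_at"] = pd.to_datetime(df["created_at"], errors="coerce")
    return df
```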
… plans. What does Leidos need from me? Proven experience in front-end development with a focus on building data-driven applications; experience working with ETL platforms; experience with Python and its numerical, data, and machine learning libraries; experience of working in an agile software development environment; experience estimating task effort …
… a culture of technical excellence within the team. Data Engineering Expertise: - Deep understanding of data engineering principles and best practices, including data modeling, observable ETL/ELT processes, data warehousing, and data governance. - Proficiency in data manipulation languages (e.g., SQL/DBT) and programming languages relevant to data engineering (e.g. …
… initiatives and contribute to the development of innovative solutions. Responsibilities: Web Crawling and Data Extraction: develop, deploy, and maintain web crawlers using Python to extract data from websites and social media platforms; ensure the scalability, reliability, and efficiency of web scraping processes. Data Cleaning and Preprocessing: perform data cleaning, standardization … to provide insights and support decision-making processes; work with financial datasets to identify trends, patterns, and anomalies. Data Pipeline Development: design and maintain ETL (Extract, Transform, Load) pipelines to streamline data workflows; integrate data from multiple sources and ensure seamless data flow across systems. Collaboration and Communication: work closely …
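As a sketch of the Python crawling work described above, the following fetches a page and extracts records with requests and BeautifulSoup; the URL and CSS selector are hypothetical placeholders, and real sites need site-specific selectors (and respect for robots.txt).

```python
# Minimal web-crawling sketch: fetch a page, parse it, extract records.
import requests
from bs4 import BeautifulSoup

def scrape_headlines(url: str) -> list[str]:
    resp = requests.get(url, headers={"User-Agent": "research-bot/0.1"}, timeout=10)
    resp.raise_for_status()
    soup = BeautifulSoup(resp.text, "html.parser")
    # Hypothetical selector; adapt per target site
    return [node.get_text(strip=True) for node in soup.select("h2.headline")]

if __name__ == "__main__":
    print(scrape_headlines("https://example.com/news"))
```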
London, South East England, United Kingdom (Hybrid / WFH Options)
Intec Select
… and deliver sustainable solutions. Monitor and troubleshoot data pipeline issues to maintain data integrity and accuracy. Assist in the development, maintenance, and optimization of ETL (Extract, Transform, Load) processes for efficiency and reliability. Project & Improvement: assist in gathering, documenting, and managing data engineering requirements and workflows; contribute to the development … reviews of designs, prototypes, and other work products to ensure requirements are met. Skills & Experience (Essential): basic understanding of data engineering concepts, such as ETL processes, data pipelines, and data quality management; hands-on experience with SQL (e.g., writing queries, basic database management); familiarity with data tools and platforms …
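To illustrate the kind of SQL-based integrity check such pipeline monitoring involves, here is a self-contained sketch using sqlite3; the table and quality rules are hypothetical.

```python
# Simple data-quality check: run SQL against a table and flag problems.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(1, 10.0), (1, 10.0), (2, -5.0), (3, 42.5)])

# Integrity rule 1: no duplicate business keys
dupes = conn.execute(
    "SELECT order_id, COUNT(*) FROM orders GROUP BY order_id HAVING COUNT(*) > 1"
).fetchall()
# Integrity rule 2: no negative amounts
negatives = conn.execute(
    "SELECT COUNT(*) FROM orders WHERE amount < 0"
).fetchone()[0]

print(f"duplicate keys: {dupes}, negative amounts: {negatives}")
```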
… Azure D&A stack, Databricks, and Azure OpenAI solutions. Proficiency in coding (Python, PL/SQL, shell script), relational and non-relational databases, ETL tooling (such as Informatica), and scalable data platforms. Proficiency in the Azure Data and Analytics stack; working knowledge of AWS and GCP data solutions. Good understanding of …
… services, and hands-on experience with Microsoft Fabric (OneLake, Lakehouse, DirectLake, Power BI integration, etc.) would be a distinct advantage. Strong understanding of data modelling, ETL/ELT pipelines, and data warehousing principles. Skilled in designing scalable and secure solutions using best practices and industry frameworks. Excellent communication and stakeholder engagement …
Employment Type: Permanent
Salary: £90,000 - £110,000 per annum, plus bonus and package
… large datasets efficiently. Collaborate with cross-functional teams to understand business requirements and deliver data-driven insights. Design and build scalable data pipelines and ETL processes. Perform data exploration, preprocessing, and feature engineering. Conduct statistical analysis and machine learning model development. Communicate findings and insights to stakeholders through data visualization …
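A compact sketch of the exploration, feature-engineering, and modelling loop listed above, using scikit-learn; the dataset, features, and target here are synthetic inventions for illustration.

```python
# Synthetic exploration -> features -> model sketch.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
df = pd.DataFrame({"spend": rng.gamma(2.0, 50.0, 1000),
                   "visits": rng.poisson(3, 1000)})
df["churned"] = (df["spend"] < 40) & (df["visits"] < 2)  # invented target

# Feature engineering: a simple derived ratio
df["spend_per_visit"] = df["spend"] / df["visits"].clip(lower=1)

X = df[["spend", "visits", "spend_per_visit"]]
y = df["churned"].astype(int)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, model.predict(X_test)))
```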
… Lake Formation, and other standard data engineering tools. Familiarity with DevOps practices and infrastructure-as-code (e.g., Terraform, CloudFormation). Solid understanding of data modeling, ETL frameworks, and big data technologies. Experience working in financial services or regulated industries is a plus. What's on Offer: a collaborative and inclusive work …
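As one way to illustrate infrastructure-as-code in Python — the ad names Terraform and CloudFormation; the AWS CDK shown here is a related tool that synthesises to CloudFormation — a minimal stack sketch follows. The stack and bucket names are hypothetical, and the aws-cdk-lib package must be installed.

```python
# Infrastructure-as-code sketch with the AWS CDK Python bindings (CDK v2).
import aws_cdk as cdk
from aws_cdk import aws_s3 as s3

class DataLakeStack(cdk.Stack):
    def __init__(self, scope, construct_id, **kwargs):
        super().__init__(scope, construct_id, **kwargs)
        # Versioned, encrypted bucket for raw landing data
        s3.Bucket(self, "RawLanding",
                  versioned=True,
                  encryption=s3.BucketEncryption.S3_MANAGED)

app = cdk.App()
DataLakeStack(app, "DataLakeStack")
app.synth()  # emits a CloudFormation template
```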
This role is an opportunity to lead the build of bespoke data systems for our clients. Responsibilities: design and implement scalable data pipelines and ETL processes using Azure and Databricks technologies, including Delta Live Tables; lead technical discussions with clients and stakeholders to gather requirements and propose solutions; help clients …
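A minimal sketch of what a Delta Live Tables definition looks like; it only runs inside a Databricks DLT pipeline, where the dlt module and the spark session are provided by the runtime, and the source path and columns are hypothetical.

```python
# Delta Live Tables sketch (Databricks DLT pipeline context assumed).
import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Raw events ingested from cloud storage")
def raw_events():
    # `spark` is injected by the DLT runtime; path is hypothetical
    return spark.read.format("json").load("/mnt/landing/events")

@dlt.table(comment="Cleansed events with valid timestamps only")
def clean_events():
    return (dlt.read("raw_events")
              .withColumn("event_time", F.to_timestamp("event_time"))
              .filter(F.col("event_time").isNotNull()))
```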
… in data migration, integration, and processing workflows. Contribute to project planning, risk identification, and mitigation strategies. Skills & Experience (Essential): strong hands-on experience with ETL tools such as SSIS, AWS Glue, or Azure Data Factory; proven background in setting up/administering cloud-based data platforms (AWS or Azure); proficient …
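To illustrate driving one of the ETL tools named above from orchestration code, here is a boto3 sketch that triggers an AWS Glue job and polls it to completion; the job name is hypothetical.

```python
# Trigger an AWS Glue job run and poll until it reaches a terminal state.
import time
import boto3

glue = boto3.client("glue")
JOB_NAME = "nightly-migration-job"  # hypothetical job name

run = glue.start_job_run(JobName=JOB_NAME)
run_id = run["JobRunId"]

while True:
    state = glue.get_job_run(JobName=JOB_NAME, RunId=run_id)
    status = state["JobRun"]["JobRunState"]
    if status in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT"):
        print("final state:", status)
        break
    time.sleep(30)  # poll every 30 seconds
```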
… Statistics, Mathematics & Engineering. Minimum 4 years of experience as a BI developer responsible for: dimensional modelling (Kimball methodology); data modelling (Kimball methodology); ELT/ETL; Dataverse (Data Flows); Data Factory pipeline development; dashboarding and reporting (Power BI); data warehouse design (Kimball methodology); generating/documenting technical specifications; effort estimations; Agile/ …
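As a small taste of the Kimball-style dimensional modelling listed above, the following generates a date dimension with pandas — commonly the first conformed dimension in a warehouse build; the date range is arbitrary.

```python
# Generate a Kimball-style date dimension with pandas.
import pandas as pd

dates = pd.date_range("2024-01-01", "2024-12-31", freq="D")
dim_date = pd.DataFrame({
    "date_key": dates.strftime("%Y%m%d").astype(int),  # surrogate key
    "date": dates,
    "year": dates.year,
    "quarter": dates.quarter,
    "month": dates.month,
    "day_name": dates.day_name(),
    "is_weekend": dates.dayofweek >= 5,
})
print(dim_date.head())
```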
… scalable data systems (data warehouses, data lakes, and data pipelines). Proficiency in cloud platforms such as AWS, Azure, or Google Cloud. Experience with ETL/ELT tools and real-time data processing. Strong knowledge of SQL, NoSQL databases, and data modeling techniques. Familiarity with BI tools like Tableau, Power …