Experience with Tableau, Alteryx, Tableau Prep. Hands-on experience with SAP SuccessFactors, SAP Analytics Cloud (SAC). Knowledge of data visualisation best practices and dashboard design principles. Experience with ETL processes and data transformation workflows. Understanding of HR data structures, key metrics, and reporting requirements. Experience supporting HR system migrations or implementations. Experience supporting technology implementations or large-scale transformation …
London, England, United Kingdom Hybrid / WFH Options
Ekimetrics
be a Senior Data Engineer or Backend Software Engineer, with hands-on experience building data pipelines and comfortable doing so autonomously. You are a subject-matter expert on building ETL processes and related technologies on cloud platforms. You’ll ideally have worked across multiple projects end-to-end and have experience partnering with data scientists or other senior stakeholders, both …
Liverpool, England, United Kingdom Hybrid / WFH Options
Evelyn Partners
needs. Extensive use of and fully conversant in SQL. Experience working with programming languages like C#, Python, Java, Spark. Create and maintain ETL/ELT processes to extract, transform, and load data from various sources into the data platforms. Design, develop, and deploy SSIS packages and ADF pipelines. Manage and troubleshoot SSIS packages and ADF pipelines, ensuring data integrity and error handling is robust. Document data warehouse architecture, ETL processes, and database configurations. Job Description: Evelyn Partners Data Services Team. At Evelyn Partners, we are expanding our Data Services team, investing in a large-scale modernisation programme to drive innovation and insights. We’re building a new team to enhance data modelling, reporting, AI initiatives, and cloud data platforms like …
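For context on the ADF side of this role, below is a minimal sketch of triggering an Azure Data Factory pipeline run from Python and polling it to completion. The subscription, resource group, factory and pipeline names are hypothetical placeholders, and the call shapes should be checked against the current azure-mgmt-datafactory documentation rather than read as the team's actual setup.

```python
# Minimal sketch: trigger an Azure Data Factory pipeline run and poll its status.
# All resource names below are hypothetical placeholders.
import time

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

SUBSCRIPTION_ID = "00000000-0000-0000-0000-000000000000"  # placeholder
RESOURCE_GROUP = "rg-data-platform"                       # placeholder
FACTORY_NAME = "adf-data-platform"                        # placeholder
PIPELINE_NAME = "pl_load_warehouse"                       # placeholder


def run_pipeline() -> str:
    """Start a pipeline run and block until it finishes, returning the final status."""
    client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

    run = client.pipelines.create_run(
        RESOURCE_GROUP, FACTORY_NAME, PIPELINE_NAME, parameters={}
    )

    # Poll the run until ADF reports a terminal state.
    while True:
        status = client.pipeline_runs.get(RESOURCE_GROUP, FACTORY_NAME, run.run_id).status
        if status in ("Succeeded", "Failed", "Cancelled"):
            return status
        time.sleep(30)


if __name__ == "__main__":
    print(f"Pipeline finished with status: {run_pipeline()}")
```

In practice runs like this are usually started by schedules or triggers inside ADF itself; scripting them is mainly useful for ad-hoc reruns and testing.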
will transition into a paid full-time role. Job Description: As a Data Scientist at Luupli, you will play a pivotal role in leveraging AWS analytics services to analyse and extract valuable insights from our data sources. You will collaborate with cross-functional teams, including data engineers, product managers, and business stakeholders, to develop data-driven solutions and deliver actionable … analysis strategies using AWS analytics services, such as Amazon Redshift, Amazon Athena, Amazon EMR, and Amazon QuickSight. Design and build robust data pipelines and ETL processes to extract, transform, and load data from diverse sources into AWS for analysis. Apply advanced statistical and machine learning techniques to perform predictive and prescriptive analyses, clustering, segmentation, and pattern recognition. Identify key metrics … cloud-based environment using AWS analytics services. 3. Strong proficiency in AWS analytics services, such as Amazon Redshift, Amazon Athena, Amazon EMR, and Amazon QuickSight. 4. Solid understanding of data modelling, ETL processes, and data warehousing concepts. 5. Proficiency in statistical analysis, data mining, and machine learning techniques. 6. Proficiency in programming languages such as Python, R, or Scala for data analysis and modelling. …
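As an illustration of the Athena-based analysis this role mentions, here is a minimal sketch of running a query from Python with boto3 and waiting for the result. The database name, query, and S3 output location are hypothetical, and error handling is deliberately thin.

```python
# Minimal sketch: run a SQL query against Amazon Athena and wait for the result.
# Database, table and S3 bucket names are hypothetical placeholders.
import time

import boto3

athena = boto3.client("athena", region_name="eu-west-2")

QUERY = "SELECT event_date, COUNT(*) AS events FROM app_events GROUP BY event_date"


def run_athena_query() -> list:
    start = athena.start_query_execution(
        QueryString=QUERY,
        QueryExecutionContext={"Database": "analytics_db"},                       # placeholder
        ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},   # placeholder
    )
    query_id = start["QueryExecutionId"]

    # Poll until Athena reports a terminal state.
    while True:
        state = athena.get_query_execution(QueryExecutionId=query_id)["QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            break
        time.sleep(2)

    if state != "SUCCEEDED":
        raise RuntimeError(f"Athena query ended in state {state}")

    # The first row of the result set is the header row.
    return athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]


if __name__ == "__main__":
    print(run_athena_query()[:3])
```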
London or East Kilbride, East Kilbride, South Lanarkshire, Scotland Hybrid / WFH Options
Government Digital & Data
other Cloud technologies such as Oracle and Amazon Web Services (AWS) are used in FCDO. The successful candidates will build complex data pipelines (both Extract, Transform and Load (ETL) and Extract, Load and Transform (ELT)) in the Azure cloud platform. You will work with structured and unstructured data, data lakes, and data warehouses to service operational and analytical business needs. Job …
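To make the ETL/ELT distinction concrete, the sketch below shows the two orderings side by side using pandas and SQLAlchemy. The connection string, tables and columns are hypothetical, and a production Azure build would normally use ADF or Synapse pipelines rather than a single script.

```python
# Minimal sketch contrasting ETL (transform before load) with ELT (load raw data,
# then transform inside the platform). Connection string, tables and columns are placeholders.
import pandas as pd
from sqlalchemy import create_engine, text

engine = create_engine(
    "mssql+pyodbc://user:pass@azure-sql-server/analytics?driver=ODBC+Driver+17+for+SQL+Server"
)


def etl(csv_path: str) -> None:
    """ETL: shape the data in Python, then load only the curated result."""
    df = pd.read_csv(csv_path)
    df["amount_gbp"] = df["amount_pence"] / 100          # transform in the pipeline
    df.to_sql("fact_payments", engine, if_exists="append", index=False)


def elt(csv_path: str) -> None:
    """ELT: land the raw data first, then transform with SQL inside the platform."""
    pd.read_csv(csv_path).to_sql("stg_payments_raw", engine, if_exists="append", index=False)
    with engine.begin() as conn:
        conn.execute(text(
            "INSERT INTO fact_payments (payment_id, amount_gbp) "
            "SELECT payment_id, amount_pence / 100.0 FROM stg_payments_raw"
        ))
```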
Wilmslow, England, United Kingdom Hybrid / WFH Options
The Citation Group
to support the successful integration of acquisitions and system upgrades. A strong background in Python programming is preferred, as the successful candidate will leverage Python to build and automate ETL (Extract, Transform, Load) processes, perform data validation and reconciliation, and develop custom scripts to facilitate data extraction and transformation between varied systems. Collecting and documenting data requirements. Pro-actively ensuring … platform and pipelines are reliable and robust, fixing issues as they arise and proposing & implementing stability improvements. Develop, deploy and maintain ETL/ELT pipelines to automate financial data collection, transformation, validation and exception reporting. Actively engaging with stakeholders across multiple departments. Data platform administration. Ensure relevant technical documentation is in place. About you: We are looking for someone who … should have: Excellent writing and communication skills with an attention to detail. Experience writing, troubleshooting, and debugging advanced SQL queries. Outstanding analytical and problem-solving skills. Experience working with ETL/ELT, and reporting and analytic tools. Experience designing data warehouse technical architectures and infrastructure components. Programming experience in Python. Skills and experience we’d love you to have... Understanding …
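A minimal sketch of the kind of Python validation and reconciliation step described above, using pandas. The key column, amount column and tolerance are assumptions, not the company's actual rules.

```python
# Minimal sketch: reconcile financial totals between a source extract and the
# warehouse load, flagging missing keys and amount mismatches. Names are placeholders.
import pandas as pd


def reconcile(source: pd.DataFrame, target: pd.DataFrame, key: str = "invoice_id",
              amount_col: str = "amount", tolerance: float = 0.01) -> pd.DataFrame:
    """Return an exception report of rows that fail reconciliation."""
    merged = source.merge(target, on=key, how="outer",
                          suffixes=("_src", "_tgt"), indicator=True)

    # Rows present on only one side of the merge.
    missing = merged[merged["_merge"] != "both"].assign(issue="missing in source or target")

    # Rows present on both sides but with amounts differing beyond the tolerance.
    matched = merged[merged["_merge"] == "both"]
    mismatched = matched[
        (matched[f"{amount_col}_src"] - matched[f"{amount_col}_tgt"]).abs() > tolerance
    ].assign(issue="amount mismatch")

    return pd.concat([missing, mismatched], ignore_index=True)


if __name__ == "__main__":
    src = pd.DataFrame({"invoice_id": [1, 2, 3], "amount": [10.0, 20.0, 30.0]})
    tgt = pd.DataFrame({"invoice_id": [1, 2, 4], "amount": [10.0, 25.0, 40.0]})
    print(reconcile(src, tgt)[["invoice_id", "issue"]])
```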
Wolverhampton, West Midlands (County), United Kingdom Hybrid / WFH Options
SF Recruitment
BI Developer will design and implement data-driven solutions that enhance both decision-making and operational efficiency. Key Responsibilities: Develop and maintain robust SQL queries and stored procedures for ETL processes across varied data sources. Design and deploy interactive dashboards and reports using tools such as Power BI. Work with stakeholders to gather requirements and translate them into technical deliverables. …
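By way of illustration, here is a minimal sketch of calling a SQL Server stored procedure that performs an ETL step from Python via pyodbc. The procedure, server and database names are hypothetical.

```python
# Minimal sketch: call a SQL Server stored procedure that performs an ETL step,
# as a BI developer might do from an orchestration script. All names are placeholders.
import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=sql-bi-server;DATABASE=ReportingDW;Trusted_Connection=yes;"
)


def run_etl_step(run_date: str) -> int:
    """Execute a hypothetical dbo.usp_LoadSalesFact procedure for one run date."""
    with pyodbc.connect(CONN_STR) as conn:
        cursor = conn.cursor()
        cursor.execute("EXEC dbo.usp_LoadSalesFact @RunDate = ?", run_date)
        conn.commit()
        # rowcount may be -1 depending on what the procedure itself returns.
        return cursor.rowcount


if __name__ == "__main__":
    print(f"Rows reported by driver: {run_etl_step('2024-01-31')}")
```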
Reading, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
Kimball methodology. DP-600 – Fabric Analytics Engineer Associate certification. Responsibilities: Your daily tasks will include, but are not limited to: Designing, building, and optimizing high-performance data pipelines and ETL workflows using tools like Azure Synapse, Azure Databricks, or Microsoft Fabric. Implementing scalable solutions to ingest, store, and transform large datasets, ensuring data quality and availability. Writing clean, efficient, and …
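For the pipeline work these Fabric/Databricks roles describe, a minimal PySpark sketch of an ingest-transform-write step is shown below. Paths, columns and the Delta target table are assumptions; on Databricks or Fabric the SparkSession is normally provided for you.

```python
# Minimal sketch: ingest a CSV drop, apply a simple transformation, and write the
# result as a Delta table. Paths, columns and table names are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("ingest_orders").getOrCreate()

raw = (
    spark.read.option("header", True)
    .option("inferSchema", True)
    .csv("/landing/orders/*.csv")          # placeholder landing path
)

cleaned = (
    raw.dropDuplicates(["order_id"])
    .withColumn("order_date", F.to_date("order_date"))
    .withColumn("net_amount", F.col("gross_amount") - F.col("vat_amount"))
    .filter(F.col("net_amount") >= 0)      # basic data-quality gate
)

(
    cleaned.write.format("delta")
    .mode("overwrite")
    .saveAsTable("silver.orders")          # placeholder target table
)
```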
Cheltenham, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
Kimball methodology. DP-600 – Fabric Analytics Engineer Associate certification. Responsibilities: Your daily tasks will include, but are not limited to: Designing, building, and optimizing high-performance data pipelines and ETL workflows using tools like Azure Synapse, Azure Databricks, or Microsoft Fabric. Implementing scalable solutions to ingest, store, and transform large datasets, ensuring data quality and availability. Writing clean, efficient, and …
Cambridge, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
Fabric Analytics Engineer Associate certification. Responsibilities: On a daily basis, your role will include, but is not limited to: Designing, building, and optimizing high-performance data pipelines and ETL workflows using tools like Azure Synapse, Azure Databricks, or Microsoft Fabric. Implementing scalable solutions to ingest, store, and transform large datasets, ensuring data quality and availability. Writing clean, efficient, and reusable …
Stoke-on-Trent, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
Kimball methodology. Certification: DP-600 – Fabric Analytics Engineer Associate. Responsibilities: Your daily tasks will include, but are not limited to: Designing, building, and optimizing high-performance data pipelines and ETL workflows using tools like Azure Synapse, Azure Databricks, or Microsoft Fabric. Implementing scalable solutions to ingest, store, and transform large datasets, ensuring data quality and availability. Writing clean, efficient, and …
Hemel Hempstead, England, United Kingdom Hybrid / WFH Options
Ingentive
Engineer Associate. PL-300 – Power BI Data Analyst Associate. Responsibilities: Your daily responsibilities will include, but are not limited to: Designing, building, and optimizing high-performance data pipelines and ETL workflows using tools like Azure Synapse, Azure Databricks, or Microsoft Fabric. Implementing scalable solutions to ingest, store, and transform large datasets, ensuring data quality and availability. Writing clean, efficient, and …
Coventry, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
Kimball methodology. DP-600 – Fabric Analytics Engineer Associate certification. Responsibilities: Your daily tasks will include, but are not limited to: Designing, building, and optimizing high-performance data pipelines and ETL workflows using tools like Azure Synapse, Azure Databricks, or Microsoft Fabric. Implementing scalable solutions to ingest, store, and transform large datasets, ensuring data quality and availability. Writing clean, efficient, and …
Chelmsford, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
Kimball methodology. Certification: DP-600 – Fabric Analytics Engineer Associate. Responsibilities: Your daily tasks will include, but are not limited to: Designing, building, and optimizing high-performance data pipelines and ETL workflows using tools like Azure Synapse, Azure Databricks, or Microsoft Fabric. Implementing scalable solutions to ingest, store, and transform large datasets, ensuring data quality and availability. Writing clean, efficient, and …
Chester, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
Kimball methodology. Certification: DP-600 – Fabric Analytics Engineer Associate. Responsibilities: Your daily tasks will include, but are not limited to: Designing, building, and optimizing high-performance data pipelines and ETL workflows using tools like Azure Synapse, Azure Databricks, or Microsoft Fabric. Implementing scalable solutions to ingest, store, and transform large datasets, ensuring data quality and availability. Writing clean, efficient, and …
Bath, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
Kimball methodology. DP-600 – Fabric Analytics Engineer Associate certification. Responsibilities: Your daily activities will include, but are not limited to: Designing, building, and optimizing high-performance data pipelines and ETL workflows using tools like Azure Synapse, Azure Databricks, or Microsoft Fabric. Implementing scalable solutions to ingest, store, and transform large datasets, ensuring data quality and availability. Writing clean, efficient, and …
York, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
Kimball methodology. DP-600 – Fabric Analytics Engineer Associate certification. Responsibilities: Your daily tasks will include, but are not limited to: Designing, building, and optimizing high-performance data pipelines and ETL workflows using tools like Azure Synapse, Azure Databricks, or Microsoft Fabric. Implementing scalable solutions to ingest, store, and transform large datasets, ensuring data quality and availability. Writing clean, efficient, and …
Chesterfield, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
of the Kimball methodology. Certification: DP-600 – Fabric Analytics Engineer Associate. Responsibilities: Daily responsibilities include, but are not limited to: Designing, building, and optimizing high-performance data pipelines and ETL workflows using tools like Azure Synapse, Azure Databricks, or Microsoft Fabric. Implementing scalable solutions to ingest, store, and transform large datasets, ensuring data quality and availability. Writing clean, efficient, and …
knowledge of Microsoft Fabric, Azure Data Factory, Power BI, and related Azure tools. Strong proficiency in SQL, Spark SQL, and Python for data processing and automation. Solid understanding of ETL/ELT workflows, data modelling, and structuring datasets for analytics. Experience working with large, complex datasets and APIs across formats (CSV, JSON, Parquet, etc.). Familiarity with workflow automation tools (e.g. …
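As a small illustration of working with APIs and Parquet in this kind of role, the sketch below pulls JSON from a REST endpoint with requests and lands it as Parquet with pandas. The endpoint, response shape and output path are hypothetical.

```python
# Minimal sketch: pull JSON from a REST API and land it as Parquet for downstream
# Spark SQL / Power BI consumption. The endpoint and field names are hypothetical.
import pandas as pd
import requests

API_URL = "https://api.example.com/v1/transactions"   # placeholder endpoint


def land_api_extract(output_path: str = "landing/transactions.parquet") -> int:
    response = requests.get(API_URL, params={"page_size": 500}, timeout=30)
    response.raise_for_status()

    # Flatten nested JSON into a tabular frame before writing Parquet.
    df = pd.json_normalize(response.json()["results"])
    df["ingested_at"] = pd.Timestamp.utcnow()
    df.to_parquet(output_path, index=False)
    return len(df)


if __name__ == "__main__":
    print(f"Landed {land_api_extract()} rows")
```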
City of London, London, United Kingdom Hybrid / WFH Options
Datatech Analytics
knowledge of Microsoft Fabric, Azure Data Factory, Power BI, and related Azure tools. Strong proficiency in SQL, Spark SQL, and Python for data processing and automation. Solid understanding of ETL/ELT workflows, data modelling, and structuring datasets for analytics. Experience working with large, complex datasets and APIs across formats (CSV, JSON, Parquet, etc.). Familiarity with workflow automation tools (e.g. …
South East London, London, United Kingdom Hybrid / WFH Options
Datatech Analytics
knowledge of Microsoft Fabric, Azure Data Factory, Power BI, and related Azure tools. Strong proficiency in SQL, Spark SQL, and Python for data processing and automation. Solid understanding of ETL/ELT workflows, data modelling, and structuring datasets for analytics. Experience working with large, complex datasets and APIs across formats (CSV, JSON, Parquet, etc.). Familiarity with workflow automation tools (e.g. …
London, England, United Kingdom Hybrid / WFH Options
ISx4 Group
and visualisation. Desirable: Proficiency in programming languages commonly used in data science, such as Python or R. Desirable: Experience with data engineering techniques and technologies, including data pipelines and ETL processes. Relevant cloud certifications (e.g., Azure Certified Cloud Engineer or AWS Certified Solutions Architect) are a strong plus. Personal Skills: Personal Integrity, Stakeholder Management, Project Management, Agile Methodologies, Automation, Solutions …
London, England, United Kingdom Hybrid / WFH Options
Endava
Apache Spark, Databricks, Snowflake or Airflow to automate ingestion, transformation, and delivery. Data Integration & Transformation: Work with Data Analysts to understand source-to-target mappings and quality requirements. Build ETL/ELT workflows, validation checks, and cleaning steps for data reliability. Automation & Process Optimization: Automate data reconciliation, metadata management, and error-handling procedures. Continuously refine pipeline performance, scalability, and cost …
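A minimal sketch of how ingestion, transformation and a reconciliation check might be wired together as an Airflow DAG (assuming Airflow 2.x); the DAG id and task bodies are placeholders rather than a real implementation.

```python
# Minimal sketch: an Airflow DAG wiring ingestion, transformation and a reconciliation
# check into one daily run. The DAG id and task bodies are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def ingest(**_):
    print("pull source extracts into the landing zone")


def transform(**_):
    print("apply source-to-target mappings and cleaning rules")


def reconcile(**_):
    print("compare row counts and control totals, raise on mismatch")


with DAG(
    dag_id="daily_finance_load",          # placeholder
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    ingest_task = PythonOperator(task_id="ingest", python_callable=ingest)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    reconcile_task = PythonOperator(task_id="reconcile", python_callable=reconcile)

    # Run the steps strictly in sequence.
    ingest_task >> transform_task >> reconcile_task
```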
London, England, United Kingdom Hybrid / WFH Options
Bertelsmann
knowledge of Microsoft Fabric, Azure Data Factory, Power BI, and related Azure tools. Strong proficiency in SQL, Spark SQL, and Python for data processing and automation. Solid understanding of ETL/ELT workflows, data modelling, and structuring datasets for analytics. Experience working with large, complex datasets and APIs across formats (CSV, JSON, Parquet, etc.). Familiarity with workflow automation tools (e.g. …