Erskine, Renfrewshire, Scotland and Newcastle Upon Tyne, Tyne and Wear, United Kingdom Hybrid / WFH Options
DXC Technology
deploy models using tools like TensorFlow Serving, TorchServe, ONNX, and TensorRT. Build and manage ML pipelines using MLflow, Kubeflow, and Azure ML Pipelines. Work with large-scale data using PySpark and integrate models into production environments. Monitor model performance and retrain as needed to ensure accuracy and efficiency. Collaborate with cross-functional teams to integrate AI solutions into scalable …
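For illustration of the MLflow pipeline work this listing describes, here is a minimal experiment-tracking sketch: log parameters, a metric, and a model artifact so retraining runs can be compared and promoted. The dataset, model choice, and metric names are assumptions for the example, not details from the posting.

```python
# Minimal MLflow tracking sketch: record params, a metric, and a model
# artifact per run so a retraining pipeline can compare candidates.
# Dataset, model, and names are illustrative placeholders.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

with mlflow.start_run(run_name="retrain-candidate"):
    model = LogisticRegression(max_iter=500).fit(X_train, y_train)
    accuracy = accuracy_score(y_test, model.predict(X_test))
    mlflow.log_param("max_iter", 500)
    mlflow.log_metric("accuracy", accuracy)
    mlflow.sklearn.log_model(model, "model")  # artifact a serving layer could load
```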
London, South East England, United Kingdom Hybrid / WFH Options
Ubique Systems
customer who has expertise in:
• Lead the design and implementation of robust data architectures to support business needs and data strategy.
• Utilize extensive experience in Azure Synapse, Python, PySpark, and ADF to architect scalable and efficient data solutions.
• Oversee and optimize SSIS, SSRS, and SQL Server environments, ensuring high performance and reliability.
• Write/review complex SQL queries …
London, South East, England, United Kingdom Hybrid / WFH Options
E.ON
environments. You're passionate about using data to create clarity and unlock positive change.
It would be great if you had:
Experience in the energy retail sector
Familiarity with PySpark
Background in pricing or commercial modelling
Here's what else you need to know: This role is open exclusively to internal applicants from E.ON UK and E.ON Next. Role …
that resonate. You're motivated, adaptable, and love working in empowered, fast-moving environments.
It would be great if you had:
Experience in the energy retail sector
Familiarity with PySpark
Background in pricing or commercial modelling
Here's what else you need to know: This role is open exclusively to internal applicants from E.ON UK and E.ON Next. Role …
Deerfield, Streamwood, Woodstock, and Vernon Hills, Illinois, United States Hybrid / WFH Options
Biolife Plasma Services
multiple pricing initiatives simultaneously, balancing tactical execution with strategic vision.
- Drive standardization and automation of reporting across pricing KPIs.
DIMENSIONS AND ASPECTS
Technical expertise: Proven hands-on experience with PySpark, Python, SQL, and BI tools (Power BI or Tableau), with advanced Excel skills for rapid analysis.
Analytical leadership: Demonstrated ability to proactively identify new opportunities, design models/experiments …
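As one illustration of the standardized PySpark reporting this role calls for, a pricing-KPI roll-up might look like the sketch below. The schema and KPI definitions are invented for the example.

```python
# Hypothetical pricing-KPI roll-up in PySpark: aggregate transaction-level
# rows into per-region, per-month KPIs for a standardized report.
# Column names and KPI definitions are invented for illustration.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("pricing-kpis").getOrCreate()

transactions = spark.createDataFrame(
    [("NE", "2024-01", 120.0, 100.0), ("NE", "2024-01", 80.0, 70.0),
     ("SW", "2024-02", 200.0, 150.0)],
    ["region", "month", "revenue", "cost"],
)

kpis = (
    transactions.groupBy("region", "month")
    .agg(
        F.sum("revenue").alias("total_revenue"),
        (F.sum("revenue") - F.sum("cost")).alias("margin"),
        F.count("*").alias("transactions"),
    )
    .withColumn("margin_pct", F.round(F.col("margin") / F.col("total_revenue") * 100, 1))
)
kpis.show()
```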
based data solutions. Hands-on knowledge of Microsoft Fabric and Power BI. Strong background in data architecture, modelling, and ETL/ELT. Useful: Azure Synapse, Data Factory, Databricks, PySpark; SSRS/SSAS/SSIS; Azure DevOps/Git. Previous consulting or pre-sales experience. Clear communication skills with both technical and non-technical audiences. Microsoft Data certification …
Arlington, Virginia, United States Hybrid / WFH Options
540.co
want to talk to you.
REQUIRED SKILLS & EXPERIENCE
10+ years of professional experience in data architecture, data engineering, data modeling, or related fields.
Strong hands-on experience with Spark (PySpark/Scala), SQL, and Python.
Experience designing and managing Databricks compute and ETL solutions.
Deep expertise in enterprise data modeling, database design, and AWS cloud data platforms.
Deep expertise in debugging and troubleshooting Spark jobs (PySpark/Scala) for failures, inefficiencies, and resource bottlenecks.
Familiarity with data governance frameworks and regulatory compliance requirements.
Strong ability to quickly assess complex projects, systems, and ecosystems, identifying relationships and dependencies.
Excellent communication skills, with the ability to translate technical concepts into business terms.
Proven ability to engage senior management and key …
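The Spark troubleshooting skill named above often starts with partition-skew triage. Here is a hedged sketch of that pattern; the DataFrame, key names, and partition count are assumptions, not details from the posting.

```python
# Hypothetical Spark-job triage sketch: inspect partition skew (a common
# cause of straggler tasks), then repartition on a better-distributed key.
# DataFrame and key names are assumptions for illustration.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("skew-triage").getOrCreate()
events = spark.range(1_000_000).withColumn("customer_id", (F.col("id") % 10).cast("string"))

# Rows per partition: a few huge partitions alongside many tiny ones signals skew.
sizes = events.groupBy(F.spark_partition_id().alias("partition")).count()
sizes.orderBy(F.desc("count")).show(5)

# Repartitioning by a higher-cardinality key spreads shuffle work more evenly;
# salting the key is the usual next step if one value still dominates.
rebalanced = events.repartition(64, "customer_id")
print(rebalanced.rdd.getNumPartitions())
```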
Manchester, Lancashire, England, United Kingdom Hybrid / WFH Options
Vermelo RPO
Knowledge of the technical differences between different packages for some of these model types would be an advantage.
Experience in statistical and data science programming languages (e.g. R, Python, PySpark, SAS, SQL)
A good quantitative degree (Mathematics, Statistics, Engineering, Physics, Computer Science, Actuarial Science)
Experience of WTW’s Radar software is preferred
Proficient at communicating results in a concise …
Manchester, Warrington and Bolton, North West England, United Kingdom Hybrid / WFH Options
Gerrard White
Knowledge of the technical differences between different packages for some of these model types would be an advantage.
Experience in statistical and data science programming languages (e.g. R, Python, PySpark, SAS, SQL)
A good quantitative degree (Mathematics, Statistics, Engineering, Physics, Computer Science, Actuarial Science)
Experience of WTW's Radar software is preferred
Proficient at communicating results in a concise …
London, South East, England, United Kingdom Hybrid / WFH Options
Asset Resourcing Limited
Data QA Engineer – Remote-first – £55,000-£65,000
Overview: As a Data QA Engineer, you will ensure the reliability, accuracy and performance of our client’s data solutions. Operating remotely, you will work closely with Data Engineers, Architects and Analysts …
for Technical Data Architect
Location: Central London
Type: Permanent, hybrid role (2-3 days from client location)
We are seeking a highly skilled Technical Data Architect with expertise in Databricks, PySpark, and modern data engineering practices. The ideal candidate will lead the design, development, and optimization of scalable data pipelines, while ensuring data accuracy, consistency, and performance across the enterprise … cross-functional teams.
________________________________________
Key Responsibilities
Lead the design, development, and maintenance of scalable, high-performance data pipelines on Databricks.
Architect and implement data ingestion, transformation, and integration workflows using PySpark, SQL, and Delta Lake.
Guide the team in migrating legacy ETL processes to modern cloud-based data pipelines.
Ensure data accuracy, schema consistency, row counts, and KPIs during migration … cloud platforms, and analytics.
________________________________________
Required Skills & Qualifications
10-12 years of experience in data engineering, with at least 3+ years in a technical lead role.
Strong expertise in Databricks, PySpark, and Delta Lake.
DBT
Advanced proficiency in SQL, ETL/ELT pipelines, and data modelling.
Experience with Azure Data Services (ADLS, ADF, Synapse) or other major cloud platforms …
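The migration-validation duty this listing names (row counts, schema consistency, KPIs) can be sketched as a small reconciliation step before cutover. The table names and session setup below are assumptions for illustration.

```python
# Hypothetical migration-reconciliation sketch: compare a legacy table with
# its Delta Lake replacement on schema and row count before cutover.
# Table names and session setup are assumptions, not details from the posting.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("migration-checks").getOrCreate()

legacy = spark.table("legacy_db.orders")    # assumed source table
migrated = spark.table("lake_db.orders")    # assumed Delta target

# Schema check: identical column names and types, order-insensitive.
legacy_schema = {f.name: f.dataType.simpleString() for f in legacy.schema.fields}
migrated_schema = {f.name: f.dataType.simpleString() for f in migrated.schema.fields}
assert legacy_schema == migrated_schema, \
    f"schema drift: {set(legacy_schema.items()) ^ set(migrated_schema.items())}"

# Row-count check: exact parity expected for a full historical load.
legacy_count, migrated_count = legacy.count(), migrated.count()
assert legacy_count == migrated_count, \
    f"row counts differ: {legacy_count} vs {migrated_count}"
```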
driven technical solutions, integrating data engineering and software development.
Hands-On Development: Build and optimize data pipelines using modern programming languages and frameworks, with a focus on Python/PySpark/Snowpark for development and integration.
Data Pipeline and Warehousing Optimization: Design, implement, and maintain efficient data workflows and data warehousing solutions using tools like Snowflake to ensure seamless … related applications.
Hands-on expertise with cloud platforms like Azure, CI/CD tools such as Azure DevOps pipelines, and Infrastructure as Code.
Professional-level skills in Python/PySpark development, database management, and data warehousing.
Professional experience with Snowflake, Azure Data Factory for pipelining, Azure Databricks for data processing, and Power BI for reporting and analytics.
Good knowledge in …
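A minimal Snowpark sketch of the Python-side development this listing refers to: open a session, read a table, and push an aggregation down to Snowflake. The connection parameters and table name are placeholders, not details from the posting.

```python
# Minimal Snowpark sketch: the aggregation executes inside Snowflake;
# only the result rows come back to the client.
# Connection values and the table name are placeholders.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, sum as sum_

connection_parameters = {
    "account": "<account>", "user": "<user>", "password": "<password>",
    "warehouse": "<warehouse>", "database": "<database>", "schema": "<schema>",
}
session = Session.builder.configs(connection_parameters).create()

orders = session.table("ORDERS")  # placeholder table
summary = (
    orders.filter(col("STATUS") == "SHIPPED")
    .group_by("REGION")
    .agg(sum_("AMOUNT").alias("TOTAL_AMOUNT"))
)
summary.show()
session.close()
```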
London, South East, England, United Kingdom Hybrid / WFH Options
Method Resourcing
Data Analyst/BI Developer - Financial Services (Power BI, PySpark, Databricks)
Location: London (Hybrid, 2 days per week onsite)
Salary: £65,000 to £75,000 + bonus + benefits
Sector: Private Wealth/Financial Services
About the Role
A leading Financial Services organisation is looking for a Data Analyst/BI Developer to join its Data Insight and Analytics … division.
Partner with senior leadership and key stakeholders to translate requirements into high-impact analytical products.
Design, build, and maintain Power BI dashboards that inform strategic business decisions.
Use PySpark, Databricks or Microsoft Fabric, and relational/dimensional modelling (Kimball methodology) to structure and transform data.
Promote best practices in Git, CI/CD pipelines (Azure DevOps), and data … analysis, BI development, or data engineering.
Strong knowledge of relational and dimensional modelling (Kimball or similar).
Proven experience with: Power BI (advanced DAX, data modelling, RLS, deployment pipelines); PySpark and Databricks or Microsoft Fabric; Git and CI/CD pipelines (Azure DevOps preferred); SQL for querying and data transformation.
Experience with Python for data extraction and API integration. …
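The Kimball-style dimensional modelling this role asks for could look like the PySpark sketch below: derive a date dimension and a fact table from raw transactions. The source columns and surrogate-key scheme are assumptions for illustration.

```python
# Hypothetical Kimball-style transform in PySpark: build a date dimension
# keyed on yyyymmdd integers (a common convention) and a fact table that
# references it. Source columns are invented for the example.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("star-schema").getOrCreate()
raw = spark.createDataFrame(
    [("2024-03-01", "ACME", 1200.0), ("2024-03-02", "Globex", 800.0)],
    ["txn_date", "client", "amount"],
)

# Date dimension with a surrogate key derived from the calendar date.
dim_date = (
    raw.select(F.to_date("txn_date").alias("date"))
    .distinct()
    .withColumn("date_key", F.date_format("date", "yyyyMMdd").cast("int"))
    .withColumn("year", F.year("date"))
    .withColumn("month", F.month("date"))
)

# Fact table referencing the dimension through the surrogate key.
fact_txn = (
    raw.withColumn("date_key", F.date_format(F.to_date("txn_date"), "yyyyMMdd").cast("int"))
    .select("date_key", "client", "amount")
)
fact_txn.join(dim_date, "date_key").show()
```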