South West London, London, United Kingdom Hybrid / WFH Options
La Fosse
Collaborate across technical and non-technical teams; troubleshoot issues and support wider team adoption of the platform. What You'll Bring: proficiency in Python, PySpark, Spark SQL or Java; experience with cloud tools (Lambda, S3, EKS, IAM); knowledge of Docker, Terraform, GitHub Actions; understanding of data quality frameworks; strong …
… experience in coding languages, e.g. Python, C++ (Python preferred); proficiency in database technologies, e.g. SQL and NoSQL, and Big Data technologies, e.g. PySpark, Hive; experience working with structured and unstructured data, e.g. text, PDFs, JPGs, call recordings, video; knowledge of machine learning modelling techniques and …
… using Azure services including Azure Databricks, Azure Data Factory, Delta Lake, Azure Data Lake Storage (ADLS) and Power BI. Solid hands-on experience with Azure Databricks, PySpark coding and Spark SQL coding is a must. Very good data warehousing skills, including dimensional modeling, slowly changing dimension patterns and time travel. …
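To make the Databricks warehousing requirements above concrete, here is a deliberately simplified sketch of a slowly changing dimension (Type 2) update and a Delta time-travel query. It assumes a Databricks notebook where `spark` is predefined; the table and column names (dim_customer, staging_customer, address and so on) are invented for illustration, not taken from the posting.

```python
# Simplified SCD Type 2 sketch on Delta Lake; table/column names are hypothetical.
from delta.tables import DeltaTable
from pyspark.sql import functions as F

dim = DeltaTable.forName(spark, "dim_customer")
updates = spark.table("staging_customer")

# Step 1: close out the current row for customers whose address has changed.
(dim.alias("t")
    .merge(updates.alias("s"), "t.customer_id = s.customer_id AND t.is_current = true")
    .whenMatchedUpdate(
        condition="t.address <> s.address",
        set={"is_current": "false", "end_date": "current_date()"},
    )
    .execute())

# Step 2: append new versions for changed customers and rows for brand-new customers.
current = spark.table("dim_customer").where("is_current = true")
changed_or_new = (
    updates.alias("s")
    .join(current.alias("t"), F.col("s.customer_id") == F.col("t.customer_id"), "left")
    .where(F.col("t.customer_id").isNull() | (F.col("t.address") != F.col("s.address")))
    .select("s.*")
)
(changed_or_new
    .withColumn("is_current", F.lit(True))
    .withColumn("start_date", F.current_date())
    .withColumn("end_date", F.lit(None).cast("date"))
    .write.format("delta").mode("append").saveAsTable("dim_customer"))

# Delta time travel: inspect the dimension as it stood at an earlier table version.
spark.sql("SELECT * FROM dim_customer VERSION AS OF 5").show()
```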
… for seamless data workflows. Collaborate with cross-functional teams to ensure data integrity, security and accessibility. Key Skills & Experience: strong programming skills in Python (PySpark); hands-on experience with Azure Data Services (Azure Data Factory, Databricks, Synapse, Data Lakes, etc.); experience with CI/CD pipelines for …
London, South East England, United Kingdom Hybrid / WFH Options
Insight Global
… transformation logic and build of all Power BI dashboards, including testing, optimization and integration with data sources; good exposure to all elements of data engineering (PySpark, Lakehouses, Kafka, etc.); experience building reports from streaming data; strong understanding of CI/CD pipelines; financial and/or trading exposure, particularly energy …
… experience, or PhD with 5-7 years of relevant experience; experience in research; data engineering experience (DE path); ETL; Big Data experience and tooling (PySpark, Databricks); Python testing frameworks; data validation and data quality frameworks; data handling (SQL & NoSQL); feature engineering; chunking; document ingestion; graph data structures (Neo4j); CI …
Ensure compliance with GDPR and other data regulations when handling sensitive information. Support the stability and performance of enterprise data platforms. Requirements: proficient with PySpark, Delta Lake, Unity Catalog and Python (including unit and integration testing); deep understanding of software development principles (SOLID, testing, CI/CD, version …
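Since the listing above asks for Python with unit and integration testing around PySpark and Delta, here is a minimal pytest-style sketch of how such a unit test often looks; the transformation and column names are assumptions, not taken from the posting.

```python
# Minimal pytest sketch for a PySpark transformation; names are illustrative only.
import pytest
from pyspark.sql import SparkSession, functions as F


@pytest.fixture(scope="session")
def spark():
    # Small local session so tests run without a cluster.
    return SparkSession.builder.master("local[1]").appName("unit-tests").getOrCreate()


def add_vat(df, rate=0.2):
    # Transformation under test: derive a VAT-inclusive price from the net price.
    return df.withColumn("gross", F.round(F.col("net") * (1 + rate), 2))


def test_add_vat(spark):
    df = spark.createDataFrame([(100.0,), (250.0,)], ["net"])
    result = {row["net"]: row["gross"] for row in add_vat(df).collect()}
    assert result == {100.0: 120.0, 250.0: 300.0}
```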
Leeds, West Yorkshire, Yorkshire, United Kingdom Hybrid / WFH Options
Skills for Care
… external data engineers and data scientists, who may not be familiar with the datasets, to accelerate development. Our technology stack consists of: Python and PySpark; AWS Glue jobs assembled into Step Functions; PyDeequ for data quality testing; CircleCI for deployment; Amazon Athena for querying data; hosted on AWS …
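The stack above names PyDeequ for data quality testing; the sketch below shows the usual PyDeequ verification pattern, assuming a Spark session with the Deequ JAR available and an invented S3 path and column names.

```python
# PyDeequ verification sketch; requires the Deequ JAR on the Spark classpath and
# the SPARK_VERSION environment variable set. Paths and columns are hypothetical.
from pyspark.sql import SparkSession
from pydeequ.checks import Check, CheckLevel
from pydeequ.verification import VerificationSuite, VerificationResult

spark = SparkSession.builder.getOrCreate()
df = spark.read.parquet("s3://example-bucket/curated/providers/")  # assumed path

check = Check(spark, CheckLevel.Error, "basic quality checks")
result = (
    VerificationSuite(spark)
    .onData(df)
    .addCheck(
        check.isComplete("provider_id")
        .isUnique("provider_id")
        .isNonNegative("staff_count")
    )
    .run()
)
VerificationResult.checkResultsAsDataFrame(spark, result).show(truncate=False)
```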
… frameworks and data governance practices, with an emphasis on scalability and compliance in research environments. Enterprise exposure to data engineering tools and products (Spark, PySpark, BigQuery, Pub/Sub), with an understanding of product/market fit for internal stakeholders; familiarity with cloud computing environments, including but not limited …
… ADF, Synapse, SQL, ADB, etc. Should be strong in Databricks notebook development for data ingestion, validation, transformation and metric builds; strong in PySpark and SQL; strong in ADF pipeline development, data orchestration techniques, monitoring and troubleshooting; strong in stored procedure development. Good knowledge …
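As a small illustration of the ADF-plus-Databricks orchestration described above, the fragment below shows the common pattern of an ADF Notebook activity passing a run parameter into a Databricks notebook via widgets; the table names and the run_date parameter are assumptions.

```python
# Databricks notebook fragment; dbutils and spark are predefined in notebooks.
# Table names and the run_date parameter are illustrative assumptions.
from pyspark.sql import functions as F

run_date = dbutils.widgets.get("run_date")  # supplied by the ADF Notebook activity

raw = spark.read.table("bronze.transactions")
validated = (
    raw.filter(F.col("event_date") == run_date)
       .dropDuplicates(["transaction_id"])
)
validated.write.mode("overwrite").saveAsTable("silver.transactions_daily")
```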
… shape and implement Shell's strategy. What you bring: substantial experience in technical and process guidance; experience in Python FastAPI development, Spark/PySpark, TypeScript/React, T-SQL/SQL/Azure SQL and other programming frameworks and paradigms; able to mix strategic and pragmatic approaches to …
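The listing above pairs Python FastAPI development with Spark and Azure SQL; as a rough sketch of that kind of service code, here is a minimal FastAPI endpoint. The route and request model are invented for illustration, and a real service would query a database rather than echo the request.

```python
# Minimal FastAPI sketch; the endpoint and request model are hypothetical.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="example-data-api")


class AssetQuery(BaseModel):
    asset_id: str
    as_of: str | None = None  # optional ISO date


@app.post("/assets/summary")
def asset_summary(query: AssetQuery) -> dict:
    # A real implementation would query Azure SQL or Spark; this just echoes the request.
    return {"asset_id": query.asset_id, "as_of": query.as_of, "status": "ok"}
```

A sketch like this would typically be served with uvicorn and fronted by the TypeScript/React client the advert mentions.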
East London, London, United Kingdom Hybrid / WFH Options
McGregor Boyall Associates Limited
… Science, Data Science, Mathematics, or a related field. 5+ years of experience in ML modeling, ranking, or recommendation systems. Proficiency in Python, SQL, Spark, PySpark, TensorFlow. Strong knowledge of LLM algorithms and training techniques. Experience deploying models in production environments. Nice to have: experience in GenAI/…
… systems. Strong expertise in ML/DL/LLM algorithms, model architectures, and training techniques. Proficiency in programming languages such as Python, SQL, Spark, PySpark, TensorFlow, or equivalent analytical/model-building tools. Familiarity with tools and technologies related to LLMs. Ability to work independently while also thriving in …
… queries for huge datasets. Has a solid understanding of blockchain ecosystem elements like DeFi, exchanges, wallets, smart contracts, mixers and privacy services. Databricks and PySpark; analysing blockchain data; building and maintaining data pipelines; deploying machine learning models; use of graph analytics and graph neural networks. If this sounds like …
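For the blockchain-analytics requirements above (Databricks, PySpark and graph analytics), a minimal sketch using the GraphFrames package is shown below; it assumes GraphFrames is installed on the cluster, and the address/transfer table and column names are invented.

```python
# Graph analytics sketch over a transfer graph using GraphFrames on Databricks.
# spark is predefined in notebooks; table and column names are hypothetical.
from graphframes import GraphFrame

vertices = spark.table("curated.addresses").withColumnRenamed("address", "id")
edges = spark.table("curated.transfers").selectExpr(
    "sender AS src", "receiver AS dst", "value"
)

g = GraphFrame(vertices, edges)

# PageRank as a simple importance measure over the transfer graph.
ranks = g.pageRank(resetProbability=0.15, maxIter=10)
ranks.vertices.orderBy("pagerank", ascending=False).show(20)
```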
… Catalogue, etc.). Proven experience designing high-volume, live data streaming solutions using Azure DLT (Delta Live Tables). Expert with Apache Spark and PySpark (ability to review code quality and debug issues). Experience with Qlik Replicate to move data from on-prem to the cloud. Background …
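Since the listing above asks for experience with Azure Databricks Delta Live Tables for live streaming, here is a minimal DLT sketch using Auto Loader and a single expectation; the landing path, table names and columns are assumptions.

```python
# Minimal Delta Live Tables sketch; runs inside a DLT pipeline where spark is
# predefined. The landing path and column names are illustrative assumptions.
import dlt
from pyspark.sql import functions as F


@dlt.table(comment="Raw trades ingested as a streaming table via Auto Loader")
def trades_bronze():
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .load("/mnt/landing/trades")  # hypothetical landing path
    )


@dlt.table(comment="Cleaned trades with a basic data quality expectation")
@dlt.expect_or_drop("valid_price", "price > 0")
def trades_silver():
    return dlt.read_stream("trades_bronze").withColumn(
        "ingested_at", F.current_timestamp()
    )
```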
… essential skills: typical data engineering experience required (3+ yrs); strong knowledge and experience of Azure Data Factory and Synapse data solution provision, Power BI, Python and PySpark (preference will be given to those who hold relevant certifications); proficient in SQL; knowledge of Terraform; ability to develop and deliver complex visualisation, reporting …
London, South East England, United Kingdom Hybrid / WFH Options
Carnegie Consulting Limited
… in programming languages and data structures such as SAS, Python, R and SQL is key, with a Python background: particularly familiarity with pandas/polars/PySpark and pytest; understanding of OOP principles; Git version control; knowledge of the following frameworks a plus: pydantic, pandera, sphinx. Additionally, experience in any or all …
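Because the listing above flags pandera and pydantic as a plus, the sketch below shows a small pandera schema validating a pandas frame; the column names, checks and allowed values are invented for illustration.

```python
# Illustrative pandera schema check on a pandas DataFrame; columns are hypothetical.
import pandas as pd
import pandera as pa
from pandera import Check, Column

schema = pa.DataFrameSchema(
    {
        "policy_id": Column(str, unique=True),
        "premium": Column(float, Check.ge(0)),
        "region": Column(str, Check.isin(["UK", "EU", "ROW"])),
    }
)

df = pd.DataFrame(
    {"policy_id": ["P1", "P2"], "premium": [120.0, 310.5], "region": ["UK", "EU"]}
)
validated = schema.validate(df)  # raises a SchemaError if any check fails
```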
Cardiff, South Glamorgan, Wales, United Kingdom Hybrid / WFH Options
Yolk Recruitment
… Azure-based platforms (Synapse, Data Factory, Databricks); familiarity with data regulations (GDPR, FCA) and SMCR environments; bonus points for experience with Python, R, or PySpark. Why You Should Apply: executive-level visibility and the chance to lead a high-impact transformation; full budget ownership with freedom to shape systems …
… requirements. Preferred Skills and Experience: Databricks; Azure Data Factory; Data Lakehouse; Medallion architecture; Microsoft Azure; T-SQL development (MS SQL Server 2005 onwards); Python, PySpark. Experience of the following systems would also be advantageous: Azure DevOps; MDS; Kimball dimensional modelling methodology; Power BI; Unity Catalogue; Microsoft Fabric. Experience of …
… on platforms such as AWS, GCP, and Azure. Extensive hands-on experience with cloud-based AI/ML solutions and programming languages (e.g., Python, PySpark), data modelling, and microservices. Proficient in LLM orchestration on platforms such as OpenAI on Azure, AWS Bedrock, GCP Vertex AI, or Gemini AI. Serve …
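For the LLM orchestration requirement above, here is a minimal sketch of a chat completion call against OpenAI on Azure using the openai Python SDK (v1.x); the endpoint, API version, key handling and deployment name are placeholders, not real values.

```python
# Minimal Azure OpenAI chat completion sketch; endpoint and deployment are placeholders.
import os
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://example-resource.openai.azure.com",  # placeholder endpoint
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)

response = client.chat.completions.create(
    model="gpt-4o-deployment",  # the Azure *deployment* name, not the raw model id
    messages=[{"role": "user", "content": "Summarise this dataset description in one line."}],
)
print(response.choices[0].message.content)
```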
AI Platforms: Google Cloud Platform, Amazon Web Services, Microsoft Azure, Databricks. Experience in one or more of the listed languages or packages: Python, R, PySpark, Scala, Power BI, Tableau. WHAT YOU'LL LOVE ABOUT WORKING HERE? Data Science Consulting brings an inventive quantitative approach to our clients' biggest business and …
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Capgemini
AI Platforms: Google Cloud Platform, Amazon Web Services, Microsoft Azure, Databricks. Experience in one or more of the listed languages or packages: Python, R, PySpark, Scala, Power BI, Tableau. Proven experience in successfully delivering multiple complex, data-rich workstreams in parallel, supporting wider strategic ambitions and supporting others in …