Role: Data Engineer
Location: Glasgow (3 days per week on-site)
Key Requirements/Expertise:
- Primary skills: data engineering, Python, PySpark
- Expertise in data engineering, leveraging technologies such as Snowflake, Azure Data Factory, ADLS, Databricks, etc.
- Expertise in writing SQL queries against any RDBMS, with query …
Location: East London, London, United Kingdom (Hybrid / WFH options)
Company: McGregor Boyall Associates Limited
…Science, Data Science, Mathematics, or a related field.
- 5+ years of experience in ML modeling, ranking, or recommendation systems
- Proficiency in Python, SQL, Spark, PySpark, TensorFlow
- Strong knowledge of LLM algorithms and training techniques
- Experience deploying models in production environments
Nice to Have:
- Experience in GenAI/…
…to work collaboratively with offshore teams
- Previous experience in financial services, ideally banking
- Leadership or previous mentoring experience (nice to have)
Tech Stack:
- Python, PySpark
- SQL (data validation, transformation logic)
Reasonable Adjustments: We understand that there is a wide range of reasons why you may require reasonable adjustments to …
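The "SQL (data validation, transformation logic)" work named in the tech stack above can be sketched as follows. This is purely an illustrative example, not part of the listing: Python's built-in sqlite3 stands in for the production RDBMS, and the table, columns, and validation rule are all invented for the sketch.

```python
import sqlite3

# In-memory database stands in for the production RDBMS.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE payments (id INTEGER, amount REAL, currency TEXT)")
conn.executemany(
    "INSERT INTO payments VALUES (?, ?, ?)",
    [(1, 100.0, "GBP"), (2, None, "GBP"), (3, 50.0, None)],
)

# Data-validation query: count rows failing basic completeness checks
# before they are allowed into downstream transformation logic.
failed = conn.execute(
    """
    SELECT COUNT(*) FROM payments
    WHERE amount IS NULL OR currency IS NULL
    """
).fetchone()[0]
print(failed)
```

The same shape of query (null checks, range checks, referential checks) is typically run as a gate between pipeline stages.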
…various data sources, performing complex transformations, and publishing data to Azure Data Lake or other storage services.
- Write efficient, standardized Spark SQL and PySpark code for data transformations, ensuring data integrity and accuracy across the pipeline.
- Automate pipeline orchestration using Databricks Workflows or integration with external tools (e.g. …)
- … ETL/ELT data pipelines in Azure Databricks, transforming raw data into usable datasets for analysis.
- Azure Databricks proficiency: strong knowledge of Spark (SQL, PySpark) for data transformation and processing within Databricks, along with experience building workflows and automation using Databricks Workflows.
- Azure Data Services: hands-on experience with …
Location: Bracknell, Berkshire, South East, United Kingdom (Hybrid / WFH options)
Company: Halian Technology Limited
…needs
- Lead the integration and optimisation of large-scale data platforms using Azure Synapse and Databricks
- Build and maintain robust data pipelines using Python (PySpark) and SQL
- Collaborate with data engineers, analysts, and stakeholders to ensure data quality, governance, and security
- Ensure all solutions adhere to financial regulations and …
- … Data Architect within the financial services sector
- Hands-on expertise with Azure Synapse Analytics and Databricks
- Strong programming and data engineering skills in Python (PySpark) and SQL
- Solid understanding of financial data and regulatory compliance requirements
- Excellent stakeholder communication and documentation skills