City of London, London, Westminster Abbey, United Kingdom Hybrid / WFH Options
Department of Work & Pensions
currently on-prem, but the direction of travel is cloud engineering, and you'll be executing code across the following tech stack: Azure, Databricks, PySpark and Pandas. DWP Digital is a great place to work; we offer a supportive and inclusive environment where you can grow your career and … make a real difference.
Essential criteria:
- Commercial experience of Databricks, PySpark and Pandas
- Commercial experience of Azure data engineering tools such as Azure Data Factory, dedicated SQL pools and ADLS Gen 2
- Experience of working with data lakes
- An understanding of dimensional modelling
a strong background in business change and transformation focused expressly on data analytics and big data platforms.
- 5+ years of big data experience utilising PySpark
- 5+ years of managing data analytical projects within a financial domain (Banking/Investments)
- Background within investment management, financial services, etc.
- Project management experience
City of London, London, United Kingdom Hybrid / WFH Options
Develop
Modeling within a cloud-based data platform
- Strong experience with SQL Server
- Azure data engineering stack, including Azure Synapse and Azure Data Lake
- Python, PySpark and T-SQL
In return you will be offered a competitive salary and benefits package, remote working options and an opportunity to work with …
- Data inventory and data familiarisation
- Efficient data ingestion and ingestion pipelines
- Data cleaning and transformation
- Databricks (ideally with Unity Catalog) & Databricks Notebooks
- Python and PySpark
- CI/CD (ideally with Azure DevOps)
- Unit testing (pytest)
If interested, please get in touch. Thanks, Will, Xpertise Recruitment
Requirements:
- 3+ years as a Business Analyst
- Proficiency in ERP/CRM solutions and data, including Workday HCM
- Strong Azure data skills
- Proficiency in PySpark, Java, or Python
- Familiarity with Kimball data modeling and SQL
- Experience with Power BI and CI/CD practices
Nice to Have: B2B supply …
City of London, London, United Kingdom Hybrid / WFH Options
Concept Resourcing
V2, Azure Databricks, Azure Function Apps & Logic Apps, Azure Stream Analytics, Azure Resource Manager skills (Terraform, Azure Portal, Az CLI and Az PowerShell). Strong PySpark, Delta Lake, Unity Catalog and Python skills, including the ability to write unit and integration tests in Python with unittest, pytest, etc. Strong understanding of …
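The listing above asks for the ability to write unit tests in Python with pytest. As a rough illustration of what that skill looks like in practice, here is a minimal sketch of a pytest-style test for a small data-cleaning step; the function `normalise_records` and its behaviour are hypothetical, not taken from any of these roles:

```python
# Hypothetical example: a small data-cleaning transformation of the
# kind a PySpark/Pandas pipeline might wrap, plus a pytest-style test.

def normalise_records(rows):
    """Trim and lower-case keys, trim string values, drop rows with no 'id'."""
    cleaned = []
    for row in rows:
        row = {k.strip().lower(): (v.strip() if isinstance(v, str) else v)
               for k, v in row.items()}
        if row.get("id") is not None:
            cleaned.append(row)
    return cleaned


# pytest discovers functions named test_*; run with: pytest this_file.py
def test_normalise_records_drops_rows_without_id():
    rows = [{" ID ": " 1 ", "Name": " Alice "}, {"name": "Bob"}]
    assert normalise_records(rows) == [{"id": "1", "name": "Alice"}]
```

Tests like this run locally with no cluster, which is why interviewers for these roles tend to probe for them alongside the platform skills.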