City of London, London, United Kingdom Hybrid / WFH Options
Concept Resourcing
V2, Azure Databricks, Azure Function Apps & Logic Apps, Azure Stream Analytics, Azure Resource Manager skills (Terraform, Azure Portal, Az CLI and Az PowerShell). Strong PySpark, Delta Lake, Unity Catalog and Python skills, including the ability to write unit and integration tests in Python with unittest, pytest, etc. Strong understanding of …
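Several of these listings ask for unit and integration tests written with pytest or unittest. A minimal sketch of what such a test can look like, using an invented `clean_records` helper that is not taken from any listing above:

```python
# Minimal pytest-style unit test for a data-cleaning helper.
# clean_records is a hypothetical function, invented for illustration.

def clean_records(rows):
    """Drop rows with a missing 'id' and strip whitespace from names."""
    return [
        {"id": r["id"], "name": r["name"].strip()}
        for r in rows
        if r.get("id") is not None
    ]

def test_clean_records_drops_missing_ids():
    rows = [{"id": 1, "name": " Ada "}, {"id": None, "name": "ghost"}]
    assert clean_records(rows) == [{"id": 1, "name": "Ada"}]
```

Saved as `test_cleaning.py`, the test function is discovered and run with a bare `pytest` invocation; the same assertion could be wrapped in a `unittest.TestCase` instead.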
data experience:
* Data inventory and data familiarisation
* Efficient data ingestion and ingestion pipelines
* Data cleaning and transformation
* Databricks (ideally with Unity Catalog)
* Python and PySpark
* CI/CD (ideally with Azure DevOps)
* Unit testing (pytest)
If you have the above experience and are looking for a new contract role …
for data engineering, e.g. Azure Functions
* Core skills in coding with SQL, Python and Spark
* Proven experience using Databricks, e.g. lakehouse, Delta Live Tables, PySpark, etc. …
* Data inventory and data familiarisation
* Efficient data ingestion and ingestion pipelines
* Data cleaning and transformation
* Databricks (ideally with Unity Catalog) & Databricks Notebooks
* Python and PySpark
* CI/CD (ideally with Azure DevOps)
* Unit testing (pytest)
If interested, please get in touch. Thanks, Will, Xpertise Recruitment …
Requirements:
* 3+ years as a Business Analyst
* Proficiency in ERP/CRM solutions and data, including Workday HCM
* Strong Azure data skills
* Proficiency in PySpark, Java, or Python
* Familiarity with Kimball data modeling and SQL
* Experience with Power BI and CI/CD practices
Nice to have: B2B supply …
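Kimball data modeling, mentioned above, centres on fact tables joined to dimension tables in a star schema. A minimal illustration of that join-and-aggregate pattern using Python's stdlib `sqlite3`; the table and column names are invented for the example:

```python
# Illustrative Kimball-style star-schema query using sqlite3 (stdlib).
# Table and column names are invented for this sketch.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE fact_sales  (product_key INTEGER, amount REAL);
    INSERT INTO dim_product VALUES (1, 'widget'), (2, 'gadget');
    INSERT INTO fact_sales  VALUES (1, 10.0), (1, 5.0), (2, 2.5);
""")

# Fact table joined to a dimension and aggregated: the core Kimball pattern.
rows = con.execute("""
    SELECT p.name, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_product p USING (product_key)
    GROUP BY p.name
    ORDER BY p.name
""").fetchall()
# rows == [('gadget', 2.5), ('widget', 15.0)]
```

The same shape (one wide fact table, narrow conformed dimensions) carries over directly to Delta tables queried with Spark SQL.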
South East London, London, United Kingdom Hybrid / WFH Options
The Bridge (IT Recruitment) Limited
a long-term contract, inside IR35, on a remote basis. The key skills required for this Python Developer role are: Python, ETL, Azure Databricks, PySpark. If you have the required skills for this remote Python Developer contract, please apply. …
Python, PySpark, AWS, Oracle, Kafka, Banking. Become a key member of an agile team designing and delivering a market-leading, secure and scalable reporting product. The team is migrating from Oracle to AWS cloud, so experience with either is nice to have. You will serve as a seasoned member of … candidates able to work under these requirements without visa sponsorship will be able to be considered at this time. Required skills:
* Either Python or PySpark experience. Ideally, you will be working as either a Python Developer (3+ years) or a Data Engineer using PySpark
* Either AWS or Oracle. Candidates …
months. What will you be doing? You will focus on the development of reproducible analytical pipelines using a range of coding languages (notably Python, PySpark and R) across a range of cloud-based and local platforms. Use cutting-edge data science analysis with the aim of producing timelier and … the role: Write code and functions in a way that is reusable, follows best-practice guidance, and is robustly unit tested (preferably in Python, PySpark or R). Create reproducible pipelines for the systematic processing of statistical data, based on initial design documents. This includes writing documentation, and designing and conducting tests …
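The reproducible analytical pipelines described in this listing amount to chains of small, individually unit-testable functions. A plain-Python sketch of that shape (the step names and the `'name,value'` input format are invented; the same structure applies to PySpark DataFrame transformations):

```python
# Sketch of a reproducible pipeline built from small, unit-testable steps.
# Step names and the 'name,value' line format are illustrative assumptions.

def parse(lines):
    """Split raw 'name,value' lines into (name, float) pairs."""
    return [(n, float(v)) for n, v in (line.split(",") for line in lines)]

def deduplicate(pairs):
    """Keep the first occurrence of each name, preserving order."""
    seen, out = set(), []
    for name, value in pairs:
        if name not in seen:
            seen.add(name)
            out.append((name, value))
    return out

def summarise(pairs):
    """Return the mean value, or 0.0 for empty input."""
    return sum(v for _, v in pairs) / len(pairs) if pairs else 0.0

def pipeline(lines):
    """Compose the steps; each one can be unit tested in isolation."""
    return summarise(deduplicate(parse(lines)))
```

Because each step is a pure function, a test suite can exercise `parse`, `deduplicate` and `summarise` separately, and the composed `pipeline` re-runs identically on the same input, which is the point of a reproducible analytical pipeline.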