snowflake schemas; knowledge of DevOps practices within a Power BI environment; familiarity with Microsoft Fabric and Databricks; SQL database expertise; data engineering with Python and PySpark; and knowledge of geospatial concepts and tools. As part of this engagement, you will work on initiatives that redefine business efficiency through AI. …
data experience:
- Data inventory and data familiarisation
- Efficient data ingestion and ingestion pipelines
- Data cleaning and transformation
- Databricks (ideally with Unity Catalog)
- Python and PySpark
- CI/CD (ideally with Azure DevOps)
- Unit testing (PyTest)
If you have the above experience and are looking for a new contract role, …
for data engineering, e.g. Azure Functions
* Core skills in coding with SQL, Python and Spark
* Proven experience using Databricks, e.g. lakehouse, Delta Live Tables, PySpark, etc.
South East London, London, United Kingdom Hybrid / WFH Options
The Bridge (IT Recruitment) Limited
inside IR35 on a remote basis for an energy provider. The key skills required for this Senior Data Engineer role are: Azure, Python, Databricks, PySpark. If you have the required skills for this remote Senior Data Engineer contract, please apply.
Apply online only) per day. IR35 status: Outside IR35. Required experience will include:
- Experience leveraging the cloud platform using Azure Databricks, Azure Synapse, PySpark, Informatica and Power BI.
- Strong knowledge of BI and warehousing technologies and associated methodologies such as Kimball or Inmon.
- Python, PySpark and Matillion …
Python, PySpark, AWS, Oracle, Kafka, Banking. Become a key member of an agile team designing and delivering a market-leading, secure and scalable reporting product. The team is migrating from Oracle to AWS cloud, so experience with either is nice to have. You will serve as a seasoned member of … candidates able to work under these requirements without visa sponsorship can be considered at this time. Required skills:
- Either Python or PySpark experience. Ideally, you will be working as either a Python Developer (3+ years) or a Data Engineer using PySpark.
- Either AWS or Oracle. Candidates …