Banking and Financial Services sector is advantageous. Deep knowledge of or experience with as many of the following as possible: Azure Cloud Data Components | Databricks | Python | PySpark | Terraform | APIs | Lakehouse | Data Mesh | NoSQL DBs | GitHub. Person Specification: Self-motivator with a desire to learn new skills and embrace new technologies …
North West London, London, United Kingdom Hybrid / WFH Options
Viqu Limited
a Senior Data Engineer with a strong focus on Databricks. Proficiency in Python and SQL for data processing and analysis. Spark Python API/PySpark. Hands-on experience with AWS services related to data storage and processing. In-depth knowledge of Databricks Delta Lake and Delta Live Tables. Familiarity with …
Leeds, West Yorkshire, Yorkshire, United Kingdom Hybrid / WFH Options
Viqu Limited
a Senior Data Engineer with a strong focus on Databricks. Proficiency in Python and SQL for data processing and analysis. Spark Python API/PySpark. Hands-on experience with AWS services related to data storage and processing (e.g., S3, Redshift, Glue). In-depth knowledge of Databricks Delta Lake …
City of London, London, United Kingdom Hybrid / WFH Options
Concept Resourcing
V2, Azure Databricks, Azure Function Apps & Logic Apps, Azure Stream Analytics, Azure Resource Manager skills (Terraform, Azure Portal, Az CLI and Az PowerShell). Strong PySpark, Delta Lake, Unity Catalog and Python skills, including the ability to write unit and integration tests in Python with unittest, pytest, etc. Strong understanding of …
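The ad above asks for unit and integration tests written in Python with unittest or pytest. A minimal pytest-style sketch of what that looks like for pipeline logic — the transformation and its names are hypothetical, and plain Python stands in for PySpark so the test runs without a Spark cluster:

```python
# Hypothetical transformation of the kind a Databricks pipeline might apply;
# plain Python is used here so the test runs without a Spark cluster.
def normalise_amounts(rows):
    """Convert pence to pounds and drop records with no amount."""
    return [
        {**row, "amount": row["amount"] / 100}
        for row in rows
        if row.get("amount") is not None
    ]

# pytest discovers functions named test_*; a bare `assert` is the whole API,
# and pytest rewrites it to show both sides on failure.
def test_normalise_amounts_converts_and_filters():
    rows = [
        {"id": 1, "amount": 250},
        {"id": 2, "amount": None},
    ]
    assert normalise_amounts(rows) == [{"id": 1, "amount": 2.5}]

if __name__ == "__main__":
    test_normalise_amounts_converts_and_filters()
    print("ok")
```

Keeping the transformation as a pure function, separate from any Spark or notebook glue, is what makes this kind of unit test cheap; integration tests would then exercise the same function against a real SparkSession.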
data warehouse design. Cloud data products such as: Data Factory, Event Hubs, Data Lake, Synapse, Azure SQL Server. Experience developing Databricks and coding with PySpark and Spark SQL. Proficient in ETL coding standards. Data encryption techniques and standards. Knowledge of relevant legislation such as: Data Protection Act, EU Procurement …
cloud data products like Data Factory, Event Hubs, Data Lake, Synapse, Azure SQL Server. Knowledge of developing in Databricks and experience coding with PySpark and Spark SQL. Experience in design and development of complex data and analytics solutions in an iterative manner for large enterprise business/data warehouse …
Manchester, North West, United Kingdom Hybrid / WFH Options
Viqu Limited
Engineer/Data Engineer with a strong focus on Databricks. Proficiency in Python and SQL for data processing and analysis. Spark Python API/PySpark. Hands-on experience with AWS services related to data storage and processing (e.g., S3, Redshift, Glue). In-depth knowledge of Databricks Delta Lake …
other stakeholders to develop sophisticated analytical solutions. Must have experience in setting up data pipelines (ELT) from complex APIs using Python/PySpark in Databricks/ADF. ersg are an equal opportunities employer; we are committed to promoting equality of opportunity for all job applicants. We …
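The ELT-from-complex-APIs skill mentioned above usually comes down to flattening nested JSON into tabular rows before loading. A stdlib-only sketch of that step — the payload shape, field names, and function are all hypothetical, and in the role described the flattening would typically run in PySpark inside Databricks or be orchestrated by ADF:

```python
import json

# Simulated API response with a nested, hypothetical shape; in practice this
# would come from an HTTP call rather than a literal string.
payload = json.loads("""
{
  "accounts": [
    {"id": "A1", "holder": {"name": "Ada"}, "balances": [{"ccy": "GBP", "value": 100}]},
    {"id": "A2", "holder": {"name": "Bob"}, "balances": [{"ccy": "EUR", "value": 50}]}
  ]
}
""")

def flatten(payload):
    """Emit one flat row per (account, balance) pair, ready to load."""
    rows = []
    for account in payload["accounts"]:
        for bal in account["balances"]:
            rows.append({
                "account_id": account["id"],
                "holder_name": account["holder"]["name"],
                "currency": bal["ccy"],
                "balance": bal["value"],
            })
    return rows

rows = flatten(payload)
print(rows[0])
```

In an ELT arrangement the raw payload would usually be landed untransformed first, with this flattening applied afterwards inside the warehouse or lakehouse.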
Data inventory and data familiarisation. Efficient data ingestion and ingestion pipelines. Data cleaning and transformation. Databricks (ideally with Unity Catalog) & Databricks Notebooks. Python and PySpark. CI/CD (ideally with Azure DevOps). Unit testing (pytest). If interested, please get in touch. Thanks, Will. Xpertise Recruitment
Mid- or Senior-level Data Scientist. Solid knowledge of Data Engineering principles, including productionisation. Technical experience with some or all of the following: Python, PySpark, scikit-learn, pandas, Azure Data Services, Databricks. If this sounds of interest, please apply.
Requirements: 3+ years as a Business Analyst. Proficiency in ERP/CRM solutions and data, including Workday HCM. Strong Azure data skills. Proficiency in PySpark, Java, or Python. Familiarity with Kimball data modeling and SQL. Experience with Power BI and CI/CD practices. Nice to Have: B2B supply …
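Kimball data modeling, as asked for above, means organising the warehouse into fact tables joined to dimension tables by surrogate keys. A minimal sketch of that star-schema pattern — table and column names are illustrative, and sqlite3 stands in for the warehouse engine purely so the example runs anywhere:

```python
import sqlite3

# A minimal Kimball-style star schema: one fact table keyed to one date
# dimension by a surrogate key. Names are hypothetical examples.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date (
    date_key  INTEGER PRIMARY KEY,   -- surrogate key
    full_date TEXT,
    year      INTEGER
);
CREATE TABLE fact_sales (
    date_key INTEGER REFERENCES dim_date(date_key),
    amount   REAL
);
INSERT INTO dim_date  VALUES (20240101, '2024-01-01', 2024);
INSERT INTO fact_sales VALUES (20240101, 99.5), (20240101, 0.5);
""")

# The typical star-join query: aggregate the fact, slice by the dimension.
total_by_year = conn.execute("""
    SELECT d.year, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_date d ON d.date_key = f.date_key
    GROUP BY d.year
""").fetchall()
print(total_by_year)
```

The same pattern carries over to Synapse or Databricks SQL; only the DDL dialect and the surrogate-key generation mechanism change.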
Senior Databricks Migration Consultant, Remote. This role demands in-depth knowledge of data engineering, cloud technologies (preferably AWS), and a successful record in enterprise-level data migrations into Databricks. You will ensure the efficient and secure transition of our data …
South East London, London, United Kingdom Hybrid / WFH Options
The Bridge (IT Recruitment) Limited
a long-term contract, inside IR35, on a remote basis. The key skills required for this Python Developer role are: Python, ETL, Azure Databricks, PySpark. If you do have the required skills for this remote Python Developer contract, please do apply.
City of London, London, United Kingdom Hybrid / WFH Options
TALENT INTERNATIONAL UK LTD
Azure services such as Data Factory, Event Hubs, Data Lake, Synapse, and Azure SQL Server. Create and optimize data processing workflows in Databricks using PySpark and Spark SQL. Ensure ETL coding standards are met, including self-documenting code and reliable testing. Apply best practice data encryption techniques and standards … design. Extensive experience with Azure data products including Data Factory, Event Hubs, Data Lake, Synapse, and Azure SQL Server. Proficient in developing with Databricks, PySpark, and Spark SQL. Strong understanding of ETL coding standards, including standardized, self-documenting code and reliable testing. Knowledge of data encryption techniques and standards.
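Alongside encryption of data at rest (normally handled by the platform, e.g. Azure Storage encryption), the "data encryption techniques and standards" asked for above often include protecting identifiers inside the data itself. One common, related pattern is column-level pseudonymisation with a keyed hash — a sketch, with a hypothetical key and function name, and noting that HMAC is one-way hashing rather than reversible encryption:

```python
import hmac
import hashlib

# Hypothetical key: in a real pipeline this would live in a secret manager
# (e.g. Azure Key Vault) and be rotated, never hard-coded.
SECRET_KEY = b"rotate-me-in-key-vault"

def pseudonymise(value: str) -> str:
    """Replace an identifier with a stable, non-reversible token (HMAC-SHA256)."""
    return hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()

token = pseudonymise("customer-42")
# The same input always yields the same token, so joins on the
# pseudonymised column still work downstream.
assert token == pseudonymise("customer-42")
print(token[:16])
```

Because the mapping is keyed, an attacker without the key cannot brute-force identifiers from tokens the way they could with a plain unsalted hash.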
Senior Azure Data Engineer/Developer - 12 months - INSIDE IR35. Senior Azure Data Engineer/Developer required for a 12-month contract. You will be office-based in South East London two days per week, with the rest remote. Experienced in …
hybrid data warehouse design principles. Utilize cloud data products such as Data Factory, Event Hubs, Data Lake, Synapse, and Azure SQL Server. Databricks and PySpark Development: Develop in Databricks with experience coding in PySpark and Spark SQL. Ensure ETL code is standardized, self-documenting, and can be reliably … using cloud data products like Data Factory, Event Hubs, Data Lake, Synapse, and Azure SQL Server. Experienced in developing with Databricks and coding in PySpark and Spark SQL. Thorough understanding of coding standards for ETL processes. Knowledgeable about best practice data encryption techniques and standards. Familiar with relevant legislation …
Daily rate: £430 inside IR35. Location: hybrid, 2 days/week in London. Duration: 12 months with possible extension. Key Knowledge/Skills: Detailed working knowledge of ETL/ELT, data warehousing/business intelligence methodologies and best practice, including …