City of London, London, United Kingdom Hybrid / WFH Options
83zero Ltd
Databricks Data Engineer - Contract
Day Rate: £550 - £650 P/day
Location: Hybrid (remote/2 days a week in London)
Start: Immediate start
Note: Capital Markets, or at least Banking, experience is a MUST!
Key expertise and experience we're looking for:
Data Engineering in Databricks - Spark programming with Scala, Python and SQL
Ideally experience with Delta Lake, Databricks workflows, jobs, etc.
Familiarity with Azure Data Lake: experience with data ingestion and ETL/ELT frameworks
Data Governance experience - Metadata, Data Quality, Lineage, Data Access Models
Good understanding of Data Modelling concepts, Data Products and Data Domains
Unity Catalog experience is a key differentiator - if not, then experience with a similar Catalog/Data Governance Management component
MS Purview (Metadata and Data Quality tool) experience is a bonus - experience in similar tools is valuable (Collibra, Informatica Data Quality/MDM/Axon, etc.)
Data Architecture experience is a bonus
Python, Scala, Databricks Spark and PySpark
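By way of illustration of the Databricks/Delta Lake engineering this role describes, the sketch below is a minimal, hypothetical PySpark example: it ingests a raw landing-zone file, applies a simple de-duplication step, and writes to a Delta table addressed by a Unity Catalog style catalog.schema.table name. The path, column name and table name are placeholders, not details taken from the role.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # on Databricks this returns the cluster's session

# Read a raw file from a (placeholder) landing-zone path.
raw = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv("/mnt/landing/trades/")
)

# Simple quality step before persisting: de-duplicate and stamp ingestion time.
cleaned = (
    raw.dropDuplicates(["trade_id"])        # "trade_id" is a hypothetical key column
       .withColumn("ingested_at", F.current_timestamp())
)

# Write to a governed Delta table using a Unity Catalog style three-level name.
(
    cleaned.write
    .format("delta")
    .mode("append")
    .saveAsTable("capital_markets.bronze.trades")
)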
Data Engineer
4 months - with extensions
Remote
Active SC clearance required
£640 per day inside IR35
REQUIRED
Strong understanding of data concepts - data types, data structures, schemas (both JSON and Spark), schema management, etc.
Strong understanding of complex JSON manipulation
Experience working with Data Pipelines using custom Python/PySpark frameworks
Strong understanding of the 4 core Data categories (Reference, Master, Transactional, Freeform) and the implications of each, particularly managing/handling Reference Data
Strong understanding of Data Security principles - data owners, access controls (row and column level), GDPR, etc., including …
… (both CLI usage and scripting)
Git
Markdown
Scala
DESIRABLE
Azure SQL Server as a Hive Metastore
DESIRABLE TECHNOLOGIES
Azure Databricks
Apache Spark
Delta Tables
Data processing with Python
Power BI (Integration/Data Ingestion)
JIRA
If you meet the above requirements, please apply for the vacancy to be considered.
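To illustrate the Spark/JSON schema management the REQUIRED list refers to, here is a minimal, hypothetical PySpark sketch: an explicit StructType schema parses a nested JSON payload, which is then flattened for downstream use. The field names and the sample record are invented for the example.

from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, ArrayType

spark = SparkSession.builder.getOrCreate()

# Explicit Spark schema for a nested JSON payload (field names are hypothetical).
payload_schema = StructType([
    StructField("record_id", StringType(), nullable=False),
    StructField("reference", StructType([            # nested reference-data block
        StructField("country_code", StringType()),
        StructField("currency", StringType()),
    ])),
    StructField("tags", ArrayType(StringType())),
])

df = spark.createDataFrame(
    [('{"record_id": "r1", "reference": {"country_code": "GB", "currency": "GBP"}, "tags": ["fx"]}',)],
    ["raw_json"],
)

# Parse against the schema and flatten the nested structure.
flattened = (
    df.withColumn("parsed", F.from_json("raw_json", payload_schema))
      .select(
          "parsed.record_id",
          F.col("parsed.reference.country_code").alias("country_code"),
          F.col("parsed.reference.currency").alias("currency"),
          F.explode("parsed.tags").alias("tag"),     # one output row per tag
      )
)
flattened.show()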
London (City of London), South East England, United Kingdom
Wilson Brown
The Role
Our Insurance client is looking for a Senior Python Engineer to play a key role in building their scalable data capabilities: designing ingestion pipelines, high-performance APIs, and real-time data processing systems.
Key Responsibilities
Stack: Python, PySpark, Linux, PostgreSQL, SQL Server, Databricks, Azure, AWS
Design and implement large-scale data ingestion, transformation, and analysis solutions
Model datasets and develop indicators to improve data quality
Collaborate on Infrastructure as Code, CI/CD pipelines, and application/API security standards
Evaluate new data sources and automate quality assurance in the …
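As a small, hypothetical illustration of the "develop indicators to improve data quality" responsibility above, the sketch below computes per-column null rates over an ingested PySpark DataFrame and flags columns that breach a threshold. The column names, sample rows and 10% threshold are assumptions, not details from the posting.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Toy ingested dataset; in practice this would come from a pipeline, not inline rows.
df = spark.createDataFrame(
    [("POL-1", 1200.0), ("POL-2", None), (None, 980.5)],
    ["policy_id", "premium"],
)

total = df.count()

# Single pass over the data: count nulls per column, then convert to rates.
null_counts = df.select(
    [F.sum(F.col(c).isNull().cast("int")).alias(c) for c in df.columns]
).first().asDict()
null_rates = {c: null_counts[c] / total for c in df.columns}

# Flag columns whose null rate breaches an (arbitrary) 10% threshold.
failing = [c for c, rate in null_rates.items() if rate > 0.10]
print(null_rates, failing)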
self-directed, action-oriented, and comfortable making high-impact decisions quickly. Your focus will include consulting on new product developments, refining complex real-time data pipelines handling millions of data points, and driving operational excellence across our SaaS platform.
Key Responsibilities:
Production Ownership: Manage and optimise our live … and product development, integrating these into the existing production infrastructure with minimal disruption.
Performance Optimisation: Continuously monitor, benchmark, and enhance system performance and reliability.
Data Pipeline Management: Oversee and improve our extensive real-time data ingestion pipelines, ensuring they are resilient, scalable, and future-proof.
Process Innovation: …
… growth startup or scale-up businesses is highly desirable.
Deep technical expertise in:
Databases such as MySQL, Singlestore, and BigQuery
Real-time, high-throughput data pipelines
Cloud platforms, particularly Google Cloud
Strong understanding of software engineering best practices, feature deployment processes and testing
Ability to drive rapid decision-making …
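The real-time pipeline responsibilities above typically come down to bounding both batch size and end-to-end latency. The sketch below is illustrative only and not the platform's actual pipeline: it micro-batches an event stream so that each flush can become one bulk write to a store such as BigQuery or Singlestore. The event fields, batch size and flush interval are invented.

import time
from typing import Dict, Iterable, Iterator, List

def micro_batches(
    events: Iterable[Dict],
    batch_size: int = 500,
    flush_seconds: float = 1.0,
) -> Iterator[List[Dict]]:
    # Group a stream of events into batches bounded by size and age.
    batch: List[Dict] = []
    last_flush = time.monotonic()
    for event in events:
        batch.append(event)
        # Flush on size or elapsed time, whichever comes first, to bound latency.
        if len(batch) >= batch_size or (time.monotonic() - last_flush) >= flush_seconds:
            yield batch
            batch = []
            last_flush = time.monotonic()
    if batch:  # flush whatever is left when the stream ends
        yield batch

# Toy usage: in production each flush would be a bulk insert into the target store.
events = ({"id": i, "value": i * 0.1} for i in range(1200))
for batch in micro_batches(events, batch_size=500):
    print(f"flushing {len(batch)} events")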