Experience working in a distributed team. Background in software development or in a hands-on technical role. Desirable: Spark data processing; cloud technologies (Azure, Databricks, Kubernetes); software architectural skills; DevOps; Agile/Scrum certified. At Kantar we have an integrated way of rewarding our people based around a simple, clear …
and coordinating release schedules. Preferred Qualifications: Experience owning/driving roadmap strategy and definition. Understanding of data warehouses or lakehouses, such as BigQuery, Databricks or Snowflake. Experience designing or architecting (design patterns, reliability and scaling) new and existing systems. WHO WE ARE: Do Your Best Work. The opportunity …
Greenford, London, United Kingdom Hybrid / WFH Options
Brompton Bicycle
unstructured datasets, and different types of data pertaining to a number of departments across Brompton (Finance, Planning, Commercial). Experience of working with Azure Databricks, B2B/B2C/D2C business models and Fivetran will stand you in good stead. Our team works in 2-week sprints, so experience of … data models that enhance data accessibility and facilitate deeper analysis. Implement and manage Unity Catalog for centralized data governance and unified access controls across Databricks. Maintain technical documentation for the entire code base. End-to-end ownership of the data engineering lifecycle. Implement and manage Fivetran for efficient and … equivalent experience. Extensive experience as a Senior Data Engineer/Cloud Data Architect, or similar role. Deep knowledge of Azure cloud architecture and Azure Databricks, DevOps and CI/CD. Extensive experience migrating on-premises data warehouses to the cloud. Proficiency with Spark, SQL, Python, R, and other data engineering …
South East London, London, United Kingdom Hybrid / WFH Options
The Bridge (IT Recruitment) Limited
on a long-term contract, inside IR35, on a remote basis. The key skills required for this Python Developer role are: Python, ETL, Azure Databricks, PySpark. If you have the required skills for this remote Python Developer contract, please do apply.
CX with potential extension. Requirements: 5+ years as an Azure Data Architect. Must have active SC Clearance. Background in financial services. Experience working with Databricks. Extensive data modelling knowledge. Will have designed data lakes …
using Azure AI services for various use cases. Integrating Azure AI services with other Azure products and services, such as Azure Machine Learning, Azure Databricks, Azure App Service, and Azure Cognitive Search. Enabling data capture, ingestion, processing, and storage to serve AI and analytics using Azure technologies. Optimising the performance …
have experience working on ML infrastructure using Docker and Kubernetes. Minimum of 3 to 5 years' experience. Extensive experience with Python. Familiarity with Terraform, Databricks and Azure. Ability to implement machine learning and software best practices. Ideally exposure to deep learning approaches and frameworks. Unfortunately sponsorship is not provided for …
the achievement of prioritised Use Cases and Business Objectives. Technical and Business Metadata Management: Develop a strategy for technical and business metadata capture on Databricks Unity Catalog and Purview, ensuring seamless integration between the two. Define the technical and business metadata required at different layers (Bronze, Silver, Gold). Implement …
London, Liverpool, Merseyside, United Kingdom Hybrid / WFH Options
Opus Recruitment Solutions
competitive day rate of £250-£400, falling inside IR35 regulations. Key Responsibilities: Design, develop, and maintain scalable data pipelines and ETL processes using AWS, Databricks, Python, Spark, and SQL. Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver high-quality data solutions. Optimize and troubleshoot … Engineer or in a similar role. Strong proficiency in AWS services related to data engineering (e.g., S3, Redshift, Glue). Hands-on experience with Databricks for data processing and analytics. Proficient in Python programming for data manipulation and automation. Solid understanding of Apache Spark for big data processing. Strong SQL …
Senior Data Engineer | Databricks | AWS/Azure | Fully Remote. Outside IR35, up to £500 per day, 6 months initial. Start - Immediate. FRG Data are looking for a Senior Data Engineer with strong Databricks experience. Skills & Experience: We are looking for someone who has a strong mix of the following skills … experience utilising the Microsoft Azure stack for data engineering, e.g. Azure Functions * Core skills in coding with SQL, Python and Spark * Proven experience using Databricks, e.g. Lakehouse, Delta Live Tables, PySpark etc. …
designing architectures and leading technical delivery for data-focused AWS projects. Knowledge of other cloud platforms and SaaS offerings such as Azure, GCP, Snowflake & Databricks. Relevant industry-recognised certifications, such as TOGAF, AWS Solution Architect etc. Experience of delivering projects using relevant data architectural paradigms such as Kimball, Data Vault …
in size, but the company are also investing in the technology that they use, with an emphasis on Microsoft tech such as Azure Synapse, Databricks and Data Factory. As part of this role, you will be responsible for some of the following areas. Develop and maintain the organisation's Azure data … alongside other benefits. To be successful in this role you will have: Previous experience working in a Data Engineering role. Excellent understanding of Azure Databricks. Experience creating data pipelines using Azure Data Factory. Experience working with Azure Data Lake Storage. Good working knowledge of other Azure cloud services and technologies …
Senior Data Engineering Lead - Analytics & Infrastructure Financial Services £120,000 + 20% Bonus 2 days in the office per week (Central London) A leading global insurer is on a transformative journey to become the benchmark for quality across all its …
Greater London, England, United Kingdom Hybrid / WFH Options
Anson McCade
Senior Data Engineer £65,000 - £75,000 + 10% Bonus London based - Hybrid working A consultancy at the heart of ground-breaking innovation and transformative solutions where expertise meets ambition. With a focus on collaborative brilliance shaping industries and propelling …
Principal Data Engineer £80,000 - £103,000 + 15% Bonus London based - Hybrid working A consultancy at the heart of ground-breaking innovation and transformative solutions where expertise meets ambition. With a focus on collaborative brilliance shaping industries and propelling …
Version 1 has celebrated over 26 years in the Technology industry and continues to be trusted by global brands to deliver IT solutions that drive customer success. Version 1 has several strategic technology partners including Microsoft, AWS, Oracle, Red Hat …
position. Required Experience and Skills: Proven track record in deploying emerging data technology stacks such as MS Fabric and AI. Expertise in MS Azure, Databricks, MS Fabric, and related technologies.
some greenfield project wins. I'm looking for an experienced Lead Data Engineer to play a pivotal role in building state-of-the-art Databricks and Snowflake Lakehouse solutions from scratch. If you have experience with Azure, Databricks or Snowflake, SQL, ETL, and a passion for consulting, please get in touch. … Key Responsibilities: Design, develop, and maintain cutting-edge Databricks or Snowflake Lakehouse solutions from scratch. Collaborate with cross-functional teams to understand data requirements and translate them into efficient ETL processes. Work with Azure services to ensure data storage, security, and scalability. Develop and optimize SQL queries for data … changes within the insurance sector to inform data strategies and solutions. Experience: Proven experience in building and maintaining data lakes or data warehouses using Databricks on Azure. Strong expertise in SQL and ETL processes. Recent experience working within consulting, with a deep understanding of how to be a true 'consultant …
Star, Data Vault, etc). Experience with SQL and query design on large, complex datasets. Experience with cloud and big-data tools and frameworks like Databricks/Spark, Airflow, Snowflake, etc. Expertise designing and developing with distributed data processing platforms like Databricks/Spark. Experience using ELT/ETL tools such …
Lead Azure Data Engineer | PySpark (Python) & Synapse | Tech for Good/Charity. Essential skills required: Azure – solid experience required of the Azure Data ecosystem. Python - ESSENTIAL, as PySpark is used heavily; you will be tested on PySpark. Azure Synapse - ESSENTIAL …
offers competitive compensation and ample room for professional growth. Key Responsibilities: Engineering: Design, develop, and maintain data pipelines using Azure Data Factory (ADF) and Databricks, ensuring the seamless flow of data across various systems and platforms. BI Reporting: Utilize Power BI to create visually appealing and insightful reports and dashboards … a strong background in building and maintaining data pipelines and Power BI reports. Skills: Proficiency in Azure services, including Azure Data Factory (ADF) and Databricks, is essential. Experience with Power BI is required, along with a solid understanding of SQL and data modelling principles. Mindset: Strong analytical and problem-solving …
play a crucial role in designing, developing, and maintaining high-quality software solutions. Your primary focus will be on leveraging Azure, .NET, Python, and Databricks to build robust and scalable applications. You will collaborate with cross-functional teams to understand business requirements and deliver solutions that meet the needs of … with Azure services. Proficiency in .NET framework and C# programming. Solid experience with Python for data processing and automation tasks. Hands-on experience with Databricks for big data processing and analytics. Excellent problem-solving skills and a keen attention to detail. Strong communication skills and the ability to work effectively …
About Tredence: - Tredence focuses on last-mile delivery of insights into actions by uniting its strengths in business analytics, data science, and software engineering. The largest companies across industries are engaging with Tredence and deploying its prediction and optimization solutions …
Python & Spark experience. Must have strong AWS experience. Must have Terraform experience. SQL & NoSQL experience. Have built out data warehouses & built data pipelines. Strong Databricks & Snowflake experience. Docker, ECS, Kubernetes & orchestration tools like Airflow or Step Functions are nice to have. Contracts are running for 6 months initially, paying up …
the ground to a $500 million acquisition previously - Company founders are ex-VPs of market-leading companies - Investor leadership team invested in the likes of Databricks, AppDynamics, Coinbase etc. - All executive team are 'start-up gurus' with multiple exits between them - VP Sales has previously scaled …