Data Engineer (6-Month Contract) – £450 per day inside IR35 – Birmingham/Hybrid. We are recruiting on behalf of a leading company dedicated to innovative solutions in waste management. As part of our ongoing commitment to excellence, we are currently …
Python & Spark experience. Must have strong AWS experience. Must have Terraform experience. SQL & NoSQL experience. Experience building out data warehouses and data pipelines. Strong Databricks & Snowflake experience. Docker, ECS, Kubernetes, and orchestration tools such as Airflow or Step Functions are nice to have. Contracts are running for 6 months initially, paying up …
the ground to a $500 million acquisition previously - Company founders are ex-VPs of market-leading companies - Investor leadership team has invested in the likes of Databricks, AppDynamics, Coinbase, etc. - The entire executive team are ‘start-up gurus’ with multiple exits between them - The VP Sales has previously scaled …
and marketing purposes around pricing, promotions, recommendations, Computer Vision, NLP, forecasting, and media performance optimisation. Experience with cloud platforms such as Snowflake, AWS, and Databricks. Excellent problem-solving skills to deliver best-in-class data science products and solutions for end clients. Exposure to cloud-based analytical platforms such as … Databricks, Snowflake, Google BigQuery, etc. …
business benefits. Regularly launching products and services resulting from your work and actively contributing to their success. Experience we are seeking: A good understanding of Databricks and PySpark. Proficient knowledge of SQL and CosmosDB (or similar) databases. Expertise in designing, constructing, administering, and maintaining data warehouses and data lakes. Familiarity with … Catalog, and Master Data Management (you will be skilled up in this sought-after area). Understanding of Advanced Analytics and Model Management, encompassing Azure Databricks, Azure ML/MLflow, and deployment of models using Azure Kubernetes Service. Excellent oral and written communication skills. Please apply if interested. …
on designing and developing data solutions within an Azure cloud-native environment 🌩️. The landscape is a greenfield setting, where a new Azure- and Databricks-based platform will be designed and implemented from scratch. Strong skills in Azure tooling (ADF), SQL, and Python are essential 🛠️. 🎯 Key Responsibilities: 🔹 Expertise in … cloud-native environment ☁️ 🔹 Proven ability to design, implement, and utilise various database structures, with a focus on cloud-based data services such as Azure, Databricks, and Snowflake ❄️ 🔹 Experience in building ETL/ELT data pipelines and applying DevOps (CI/CD) concepts to test, schedule, and deploy to a production …
Eames Consulting is delighted to be supporting an established investment management company that is looking for a Data Engineer. Responsibilities: Develop and maintain complex SQL queries, customise reports, and optimise data usage. Create tools to address business challenges and empower …
Job Title: Senior DevOps Engineer. Location: London. Job Type: Permanent. Work Type: Hybrid. Are you a sports enthusiast with a passion for working in the sports industry? Whether you're interested in the dynamic world of professional sports, the business …
Wandsworth, Greater London, Dundonald, United Kingdom
DataBuzz
business problems that can be solved using data, we would like to talk to you. We are looking to recruit an experienced Principal Azure Databricks Architect in a long-term contract role. Details follow: This is a client-facing role. You should have worked extensively with Azure Hub and the Azure Device SDK …
Global Enterprise Partners is currently looking for an Azure Databricks consultant for an assignment with our client in the Netherlands. As an Azure Databricks specialist, you will be responsible for designing, developing, and maintaining data solutions for data generation, collection, and processing. Requirements – Azure Databricks specialist. Role: Design and develop data … solutions for data generation, collection, and processing using Microsoft Azure Databricks. Good experience in Spark. Able to create, maintain, and optimise data pipelines to ensure data quality, and implement ETL processes to migrate and deploy data across systems. Collaborate with team members to monitor and troubleshoot data pipeline performance issues … data modelling) and support throughout the implementation life cycle. Conduct rigorous system testing and troubleshooting to optimise the performance of data solutions. Details – Azure Databricks specialist role: Start date: ASAP (flexible). Duration: Initially 6 months (with extensions). Hours per week: 40. Location: Utrecht, the Netherlands. Type of contract: Freelance …
project uses Java to ingest the data into the platform. The Consultant would help build out the Data Lake and Lakehouse on Databricks, working with AWS. Skill priority: AWS, Java (to support ingestion), Databricks; experience with Power BI would also be useful. You will be an Engineer with past … Java, Python, PySpark. Mechanisms: MongoDB, Redshift, AWS S3. Environments/Infra: AWS (required); [AWS Lambda, Terraform] (nice to have). Platforms: Creating data pipelines within Databricks or an equivalent such as Jupyter Notebook; Power BI (nice to have). Enterprise models for engineering: microservices, APIs, AWS services to support models (SQS, SNS, etc. …
talented Senior Data Engineer to join our team and build robust data pipelines. Responsibilities: Build and maintain scalable data pipelines. Maintain the pipelines through Databricks and Apache Beam. Ensure seamless data integration and quality. Implement data storage solutions using GCP (candidates with other cloud experience will still be considered). Apply best practices in data … management and the SDLC. Work with Data Scientists to bring AI/ML models into production. Use Databricks for daily data processing. Qualifications: Experience in data engineering and pipeline construction. GCP experience and familiarity with other cloud environments. Understanding of the SDLC. Experience with Databricks. Programming skills (Python and SQL). What we …
resources. Support Microsoft Power Platform components including CRM, Power Apps, Power Automate, and Power BI. Share knowledge of and support tools like Synapse, Data Factory, Databricks, the MS on-prem data gateway, Entra ID, and databases. Own the Kanban process, conduct bi-weekly stand-ups, and coordinate team tasks. Join our dynamic team … design, budget management, and cost reporting will be crucial. Experience with DevOps principles, Microsoft Power Platform components, and tools like Synapse, Data Factory, and Databricks is essential. Effective communication, coordination, and team management abilities will make you an invaluable asset. I look forward to hearing from you. …
governance and tools. Technical familiarity with data modelling and database technologies (ETL processes, DB staging, data platforms) as well as data tools (Power BI, Databricks, Palantir). (Re)insurance business understanding (specifically underwriting and claims processes). Excellent verbal and written communication and presentation skills (English), with the ability to explain … in the topics of technology, data governance, AI, health & wellness, the financial sector, insurance, and consumer behaviour. Familiarity with BI & data management tools (Power BI, Databricks, Palantir, etc.) and their data requirements. Familiarity with data science and engineering techniques. Required skills & experience: Bachelor's or master's degree in a quantitative …
Sheffield, South Yorkshire, Yorkshire, United Kingdom Hybrid / WFH Options
DWP Digital
on-prem, but the direction of travel is cloud engineering. You'll be executing code in different places across the following tech stack: Azure, Databricks, PySpark, and Pandas. You will steer the data engineering function within a wider product team. There'll be lots of connecting and interacting with stakeholders … and an inclusive environment where you can grow your career and make a real difference. Essential criteria: Enterprise-scale experience with Azure data engineering tools, Databricks, PySpark, and Pandas. Experience of data modelling and transforming raw data into datasets. Experience of building team capability through role modelling, mentoring, and coaching. Able …
Birmingham, West Midlands, United Kingdom Hybrid / WFH Options
DWP Digital
Manchester, North West, United Kingdom Hybrid / WFH Options
DWP Digital
Leeds, West Yorkshire, Yorkshire, United Kingdom Hybrid / WFH Options
DWP Digital
Longbenton, England, United Kingdom Hybrid / WFH Options
DWP Digital
Newcastle Upon Tyne, Tyne and Wear, North East, United Kingdom Hybrid / WFH Options
DWP Digital
Blackpool, Lancashire, Marton, United Kingdom Hybrid / WFH Options
Department of Work & Pensions
for optimal AI application performance. Advise peers and senior management on business and customer impact. Write production-grade Python code, leveraging CI/CD and Databricks, and ensuring appropriate test coverage; proficiency in SQL. Liaise with third-party partners and stay updated on field advancements and regulations. Collaborate with data governance … offerings by third-party vendors. Knowledge of OO programming and software design, i.e., SOLID principles. Knowledge and working experience of Agile methodologies. Familiarity with Databricks and RAG application architecture is a plus. Experience with latency optimisation and quantisation preferred. Why choose us? This is your opportunity to be at the forefront …
Reigate, England, United Kingdom Hybrid / WFH Options
esure