and transformation in Azure. Utilizing Azure Data Factory or comparable technologies, create and maintain ETL (Extract, Transform, Load) operations. Use Azure Data Factory and Databricks to assemble large, complex data sets. Implement data validation and cleansing procedures to ensure the quality, integrity, and dependability of the data. Ensure data quality … solving, and collaboration with cross-functional teams. Azure DevOps; Apache Spark; Python; strong SQL proficiency; data modeling understanding; ETL processes; Azure Data Factory; Azure Databricks knowledge; familiarity with data warehousing; big data technologies; data governance principles are a plus. Overview: Infosys is a global leader in next-generation digital services …
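By way of illustration only, the validation and cleansing step described above could be sketched in PySpark roughly as follows; the paths, column names, and rules are assumptions made for the example, not details taken from the posting.

```python
# Hypothetical validation/cleansing step for a Databricks-based ETL pipeline.
# Storage paths, table and column names are illustrative, not from the posting.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

raw = spark.read.format("delta").load("/mnt/raw/orders")  # assumed landing path

cleansed = (
    raw
    .dropDuplicates(["order_id"])                        # remove duplicate records
    .filter(F.col("order_id").isNotNull())               # reject rows missing the key
    .withColumn("order_date", F.to_date("order_date"))   # normalise the date format
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
)

# Quarantine rows that fail a basic integrity rule instead of silently dropping them.
valid = cleansed.filter(F.col("amount") >= 0)
rejected = cleansed.filter(F.col("amount") < 0)

valid.write.format("delta").mode("overwrite").save("/mnt/curated/orders")
rejected.write.format("delta").mode("append").save("/mnt/quarantine/orders")
```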
platforms to support business insights, analytics, and other data-driven initiatives. Job Specification (Technical Skills): Cloud Platforms: Expert-level proficiency in Azure (Data Factory, Databricks, Spark, SQL Database, DevOps/Git, Data Lake, Delta Lake, Power BI), with working knowledge of Azure WebApp and Networking. Conceptual understanding of Azure AI … collaborate with business segments (MW Snacking, Petcare, Royal Canin) to identify data platform requirements and develop tailored solutions, working with vendors such as Microsoft, Databricks, Snowflake, etc. Develop and implement strategies for activating new platforms, frameworks, and technologies within MARS, and deploy to segments and corporate teams within MGS. Ensure …
the design, development, testing, implementation, and maintenance of data-driven software solutions. The ideal candidate will possess expert-level knowledge of Python, C#, .NET, Databricks, and Azure, ensuring alignment with business objectives. They will identify and champion innovation in technology and processes to unlock new commercial opportunities. Collaborating closely with … latitude in decision-making. Key responsibilities include migrating an existing Python-based reporting/analytics application and its data workflows from Azure SQL to Databricks, creating and optimizing data pipelines in Databricks, and leveraging scheduling, job orchestration, and optionally machine learning features to serve trading and analytics needs. The Senior … code (e.g., data structures, error handling, code optimization). Proficiency in SQL – comfortable designing databases, writing complex queries, and handling performance tuning. Experience with Databricks (or a comparable Spark environment) – ability to build data pipelines, schedule jobs, and create dashboards/notebooks. Experience with Azure services (Data Factory, Synapse, or …
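For the Azure SQL-to-Databricks migration responsibility described above, one common pattern is to land the existing tables as Delta tables via a JDBC read; this is only a sketch of that pattern, and the server name, secret scope, and table names below are placeholders rather than details from the role.

```python
# Sketch of landing an Azure SQL table into Delta Lake on Databricks.
# Hostname, database, secret scope and table names are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

jdbc_url = (
    "jdbc:sqlserver://myserver.database.windows.net:1433;"
    "database=reporting;encrypt=true;loginTimeout=30"
)

source = (
    spark.read.format("jdbc")
    .option("url", jdbc_url)
    .option("dbtable", "dbo.daily_positions")
    # dbutils is provided automatically in Databricks notebooks; values come from a secret scope.
    .option("user", dbutils.secrets.get("kv-scope", "sql-user"))
    .option("password", dbutils.secrets.get("kv-scope", "sql-password"))
    .load()
)

# Write once as a managed Delta table; downstream jobs and dashboards read from here.
source.write.format("delta").mode("overwrite").saveAsTable("reporting.daily_positions")
```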
computing environment for Underwriting. In addition, you will act as technical leader for the development and delivery of a new Underwriting Datamart using the Databricks platform. To be successful in the role, you will be technically astute with experience of Python, SQL and Databricks. While an actuarial qualification is not …
month • Sector: Technology Required Skills: • LLM (Large Language Models): Expertise in using LLMs for advanced predictive modelling and analysis. • Databricks: Proficiency with the Databricks platform for big data processing and analytics. • Machine Learning: Strong background in developing and deploying machine learning algorithms and models. • Python: Excellent coding skills in …
wider data architecture. Be a subject matter expert in the use and management of our data platform tools such as Azure Data Factory, Azure Databricks, Unity Catalog, and SQL Server. Apply best practice to ensure our pipelines are transparent, our data lineage is understood, and our data quality is …
and experience in technical testing for data analytics/engineering. Hands-on experience with GCP or AWS and data warehousing platforms such as Snowflake or Databricks. Experience of working with relevant stakeholders to understand requirements, approach challenges, and discuss the architectural framework. Mentoring or training experience would be seen …
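As a rough illustration of the kind of technical testing for data engineering mentioned above, a minimal pytest module might look like the following; the cleanse_records function is invented purely for the example and is not part of the role.

```python
# test_cleansing.py -- illustrative pytest module for a data transformation.
# Run with: pytest test_cleansing.py
def cleanse_records(records):
    """Drop rows without an id and strip whitespace from names (example logic)."""
    return [
        {**r, "name": r["name"].strip()}
        for r in records
        if r.get("id") is not None
    ]


def test_rows_without_id_are_dropped():
    records = [{"id": 1, "name": " Ada "}, {"id": None, "name": "ghost"}]
    assert len(cleanse_records(records)) == 1


def test_names_are_trimmed():
    records = [{"id": 1, "name": " Ada "}]
    assert cleanse_records(records)[0]["name"] == "Ada"
```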
preferably within the fashion or retail industry. • Strong analytical skills with proficiency in Excel and data visualization tools (Tableau, Power BI). • Able to code in Databricks and comfortable working with large datasets. • Passion for fashion, customer-centric thinking, and a keen eye for detail. • Excellent communication skills and ability to work …
Data, Analytics and AI concepts, and experience in either delivery, pre-sales, or solutions on next-gen Data on Cloud technologies (e.g. Microsoft, Databricks, Snowflake, AWS, GCP). Successfully delivered Projects/Programs in building an Enterprise Data Platform on Cloud, Migration Programs, Data Consulting, or projects encompassing Data and Analytics …
Reading, Berkshire, South East, United Kingdom Hybrid / WFH Options
TM Group (UK) Limited
governance, quality frameworks, and lifecycle management is second to none, and you're comfortable working with modern data platforms such as Amazon EMR, Glue, Databricks, dbt, Power BI or RStudio. Communication is your superpower: whether you're presenting to the board or collaborating with IT, you know how to make …
controls. Preferred Qualifications: Candidates with the following knowledge areas, though not required, are preferred: Operating Systems: Windows, Unix, Linux. Databases: Oracle, Sybase, SQL Server, Databricks, Postgres. Code and Scripting Languages: SQL, PowerShell, Linux shell scripting, Python; plus an understanding of the Windows and Linux operating systems and account management on these …
A/B testing. Strong machine learning and statistical knowledge. Preferred Qualifications: Proficient with machine learning frameworks such as TensorFlow, PyTorch, MLlib. Experience with Databricks, Spark, Tecton, Kubernetes, Helm, Jenkins. Familiarity with standard methodologies in large-scale DL training/inference. Experience with reducing model serving latency and memory footprint. Experience …
Guildford, Surrey, United Kingdom Hybrid / WFH Options
Allianz Popular SL
Institute and Faculty of Actuaries in the UK. Our team values continuous learning; you can enjoy free access to top training platforms like DataCamp, Databricks Academy, and LinkedIn Learning. Additionally, we host Lunch and Learn sessions on key business topics, led by experts across the team, and monthly insight sessions …
Bournemouth, Dorset, United Kingdom Hybrid / WFH Options
Allianz Popular SL
Institute and Faculty of Actuaries in the UK. Our team values continuous learning; you can enjoy free access to top training platforms like DataCamp, Databricks Academy, and LinkedIn Learning. Additionally, we host Lunch and Learn sessions on key business topics, led by experts across the team, and monthly insight sessions …
Data Scientist with Databricks Experience. Salary: up to £90K base + bonus + benefits. Location: work from home, but accessible to travel to London when needed. Our client is an international company that requires a senior Data Scientist with experience in Azure Databricks, Knowledge Graph, Neo4j Graph Database, and RAG … pipelines for LLM to join the team. Job Description: Responsibilities: Develop and implement data models and algorithms to solve complex business problems. Utilize Databricks to manage and analyse large datasets efficiently. Collaborate with cross-functional teams to understand business requirements and deliver data-driven insights. Design and build scalable data … industry trends and best practices in data science and big data technologies. Requirements: Proven experience as a Data Scientist or similar role. Proficiency with Databricks and its ecosystem. Strong programming skills in Python, R, or Scala. Experience with big data technologies such as Apache Spark, Databricks. Knowledge of SQL and …
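A rough sketch of the graph-retrieval half of such a knowledge-graph-backed RAG pipeline, assuming the official neo4j Python driver, is shown below; the connection details, Cypher query, prompt template and the generate() call are placeholders rather than the client's actual implementation.

```python
# Illustrative retrieval step for a knowledge-graph-backed RAG pipeline (Neo4j).
# URI, credentials, Cypher query and the generate() call are placeholders.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("neo4j://localhost:7687", auth=("neo4j", "password"))


def retrieve_context(topic: str, limit: int = 5) -> list[str]:
    """Pull facts about a topic from the knowledge graph to ground the LLM answer."""
    query = (
        "MATCH (e:Entity {name: $topic})-[r]->(n) "
        "RETURN type(r) AS rel, n.name AS target LIMIT $limit"
    )
    with driver.session() as session:
        result = session.run(query, topic=topic, limit=limit)
        return [f"{topic} {record['rel']} {record['target']}" for record in result]


def answer(question: str, topic: str) -> str:
    context = "\n".join(retrieve_context(topic))
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
    return generate(prompt)  # generate() stands in for whichever LLM endpoint is used
```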
Greater Bristol Area, United Kingdom Hybrid / WFH Options
ADLIB Recruitment | B Corp™
FTSE100 Company. Join a small, collaborative team with big impact. Bring ideas to life using modern tech like Databricks and AWS. Flexible hybrid working and strong support from leadership. We’re working with a well-established organisation that’s on an exciting journey to bring more of their data capability … building new ones from scratch, helping shape best practices, mentoring more junior engineers, or exploring the latest tools and tech. The team are using Databricks and AWS and they’re keen for someone who’s worked across data warehouse architecture, orchestration tools like Airflow, and configuration-driven development. You’ll … PySpark preferred). Experience with cloud platforms (AWS/Azure). Solid understanding of data architecture, modelling, and ETL/ELT pipelines. Experience using tools like Databricks, Redshift, Snowflake, or similar. Comfortable working with APIs, CLIs, and orchestration tools like Airflow. Confident using Git and familiar with CI/CD processes (Azure …
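As a loose illustration of the configuration-driven orchestration this role refers to, an Airflow DAG that fans out one load task per configured source might be sketched as follows (assuming Airflow 2.4+); the source names and the load_table logic are invented for the example.

```python
# Minimal configuration-driven Airflow DAG: one task per configured source.
# Source names and the load_table logic are illustrative only.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

SOURCES = ["orders", "customers", "products"]  # would normally come from a YAML/JSON config file


def load_table(source: str) -> None:
    # Real logic would trigger a Databricks job, dbt run, or warehouse load here.
    print(f"Extracting and loading {source}")


with DAG(
    dag_id="config_driven_ingest",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    for source in SOURCES:
        PythonOperator(
            task_id=f"load_{source}",
            python_callable=load_table,
            op_kwargs={"source": source},
        )
```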
evaluate cutting-edge data technologies, tools, and practices to improve data engineering processes. Experience Required: Developing data processing pipelines in Python and SQL for Databricks, including many of the following technologies: Spark, Delta, Delta Live Tables, PyTest, Great Expectations (or similar), and Jobs. Developing data pipelines for batch and stream … data governance, privacy regulations (e.g. GDPR), and security best practices. Delivering data engineering and designing for cloud-native data platforms (AWS/Azure/Databricks). Building DevOps pipelines for data engineering solutions using Terraform, GitHub, and DevOps. Working within highly agile, multidisciplinary teams using Scrum/Kanban. Mentoring data …
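A compressed sketch of a Delta Live Tables definition with declarative data-quality expectations, of the kind listed above, is shown below; the table and column names are invented, and the code only runs inside a Databricks DLT pipeline.

```python
# Sketch of a Delta Live Tables pipeline with declarative quality expectations.
# Runs only inside a Databricks DLT pipeline; table and column names are illustrative.
import dlt
from pyspark.sql import functions as F


@dlt.table(comment="Raw trades landed from cloud storage")
def trades_raw():
    # spark is provided by the DLT runtime
    return spark.read.format("json").load("/mnt/landing/trades")


@dlt.table(comment="Cleansed trades used by downstream reporting")
@dlt.expect_or_drop("valid_trade_id", "trade_id IS NOT NULL")  # drop rows that fail
@dlt.expect("positive_quantity", "quantity > 0")               # record violations, keep rows
def trades_clean():
    return (
        dlt.read("trades_raw")
        .withColumn("trade_date", F.to_date("trade_date"))
    )
```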
Belfast, City of Belfast, County Antrim, United Kingdom
Innovative Tech People
Skills: Development experience with Microsoft (Azure) technologies, including Azure Data Factory, Synapse, and Power BI, or relevant ETL tools. Expertise in Microsoft Fabric or Databricks. Experience with technology partners or consulting organizations is highly desirable. Leadership experience in technical teams (engineers, analysts, architects) for data-intensive systems. Proficiency in SQL … enterprise architecture patterns. Proven track record in delivering data-oriented solutions, including data warehousing, operational insight, data management, or business intelligence. Certifications: Azure/Databricks data certifications are desirable. If you want the opportunity to take your career to the next level, please apply now.
data integration, automation, and problem-solving. In-depth experience with cloud data platforms (AWS, Azure, Google Cloud) and Microsoft services (e.g., Azure Storage, Azure Databricks, Azure Synapse Analytics, Azure SQL Database, Microsoft Fabric, Microsoft Purview). Deep understanding of data warehousing concepts and ETL/ELT methodologies, including hands-on experience with relevant tools and technologies (e.g., Databricks). Proven track record of diagnosing complex issues within data processing workflows and optimizing data delivery and performance at scale. Analytical & Problem-Solving Skills: Strong analytical abilities, with experience in identifying and solving highly complex data issues quickly and efficiently. Ability to …