… reliable data movement and transformation.
• Data modelling using Kimball, 3NF or Dimensional methodologies.
• Use SQL and Python to extract, transform, and load data from various sources into Azure Databricks and Azure SQL/SQL Server.
• Design and implement metadata-driven pipelines to automate data processing tasks.
• Collaborate with cross-functional teams to understand data requirements and implement appropriate solutions.
… are some of the required skills and experience we will be seeking out. For this role we'd expect to see:
• Solid experience in Azure, specifically Azure Databricks and Azure SQL/SQL Server.
• Proficiency in SQL and Python.
• Hands-on experience in designing and building data pipelines using Azure Data Factory (ADF).
• Familiarity with building …
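The listing above mentions metadata-driven pipelines landing data in Azure Databricks with SQL and Python. As a rough, hypothetical sketch of that pattern (the paths, table names, and control-list structure are invented for illustration, not taken from the role), a small PySpark loop driven by a configuration list might look like this:

```python
# Minimal sketch of a metadata-driven ingestion loop on a Spark/Databricks-style platform.
# All paths and table names below are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("metadata_driven_ingest").getOrCreate()

# In practice this metadata would live in a control table or ADF pipeline parameters;
# a hard-coded list keeps the sketch self-contained.
PIPELINE_METADATA = [
    {"source_path": "/mnt/raw/customers.csv", "target_table": "bronze_customers", "load_mode": "overwrite"},
    {"source_path": "/mnt/raw/orders.csv", "target_table": "bronze_orders", "load_mode": "append"},
]

def run_ingestion(entry: dict) -> None:
    """Read one configured source file and land it in its target table with the configured mode."""
    df = (
        spark.read
        .option("header", "true")
        .option("inferSchema", "true")
        .csv(entry["source_path"])
    )
    df.write.mode(entry["load_mode"]).saveAsTable(entry["target_table"])

for entry in PIPELINE_METADATA:
    run_ingestion(entry)
```

Swapping the hard-coded list for a control table (or ADF pipeline parameters) is typically what "metadata-driven" means in this context: new sources are onboarded by adding a row of configuration rather than new code.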
Liverpool, England, United Kingdom Hybrid / WFH Options
Intuita - Vacancies
to Haves": • Certification in dbt or Google Cloud Platform or related technologies. • Experience with other cloud platforms (e.g. AWS, Azure, Snowflake) and data warehouse/lakehouse technologies (e.g. Redshift, Databricks, Synapse) • Knowledge of distributed big data technologies. • Proficiency in Python. • Familiarity with data governance and compliance frameworks. Your characteristics as a Consultant will include: • Driven by delivering quality work, with More ❯
This start-up focuses on healthcare AI and is looking for a Data Engineer to join its team. You will be joining a team of 45 people, including Data Scientists, ML …
• … warehouses using Azure Synapse, ensuring data integrity and security.
• Build, deploy, and manage ETL processes to support real-time and batch data processing using tooling across the Azure estate, Databricks, PySpark, and SQL.
• Oversee data storage across both relational and non-relational databases, ensuring efficient data retrieval.
• Design and implement data security protocols to safeguard sensitive information.
• Collaborate with DBAs … technical guidance and fostering a culture of continuous learning and improvement.
What we are looking for:
• 5+ years of experience in data engineering.
• Expertise in Azure DWH and AWS Databricks.
• Strong programming skills in Python/PySpark or other relevant languages for data manipulation and ETL workflows.
• Proficiency in SQL and experience with both relational (e.g., SQL Server, MySQL) and …
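A hedged illustration of the kind of batch ETL step this listing describes: pulling a table from a relational source over JDBC, transforming it with PySpark, and writing the result back as a curated table. The connection details, table names, and columns are all assumptions made up for the example.

```python
# Sketch of a batch ETL step: relational source -> PySpark transform -> curated table.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("batch_etl_sketch").getOrCreate()

# Hypothetical SQL Server connection; a real pipeline would pull credentials from a secret store.
jdbc_url = "jdbc:sqlserver://example-server.database.windows.net:1433;database=sales"
connection_properties = {
    "user": "etl_user",
    "password": "<secret>",
    "driver": "com.microsoft.sqlserver.jdbc.SQLServerDriver",
}

orders = spark.read.jdbc(url=jdbc_url, table="dbo.orders", properties=connection_properties)

# Example transform: aggregate yesterday's revenue per customer.
daily_revenue = (
    orders
    .filter(F.col("order_date") >= F.date_sub(F.current_date(), 1))
    .groupBy("customer_id")
    .agg(F.sum("amount").alias("revenue"))
)

daily_revenue.write.mode("overwrite").saveAsTable("silver_daily_revenue")
```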
• Strong background in data architecture, including data modeling, warehousing, real-time and batch processing, and big data frameworks.
• Proficiency with modern data tools and technologies such as Spark, Databricks, Kafka, or Snowflake (bonus).
• Knowledge of cloud security, networking, and cost optimization as it relates to data platforms.
• Experience in total cost of ownership estimation and managing its impact …
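This listing pairs Spark and Kafka for real-time and batch processing. As a purely illustrative sketch (the broker address, topic, and output paths are invented, and the spark-sql-kafka connector is assumed to be on the classpath), a minimal Structured Streaming job reading a Kafka topic and appending it to storage could look like:

```python
# Sketch of real-time ingestion: read a Kafka topic with Spark Structured Streaming, append to storage.
# Assumes the spark-sql-kafka connector is available to the Spark session.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("kafka_stream_sketch").getOrCreate()

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker-1:9092")  # hypothetical broker
    .option("subscribe", "page_views")                   # hypothetical topic
    .load()
)

# Kafka delivers key/value as binary, so cast the payload to string before downstream parsing.
decoded = events.select(
    F.col("value").cast("string").alias("payload"),
    F.col("timestamp"),
)

query = (
    decoded.writeStream
    .format("parquet")
    .option("path", "/mnt/bronze/page_views")
    .option("checkpointLocation", "/mnt/checkpoints/page_views")
    .outputMode("append")
    .start()
)
query.awaitTermination()
```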
• Discretionary bonus scheme
• And many more
Role and Responsibilities
• Design and deliver data science solutions that drive business outcomes.
• Lead the development of predictive and optimisation models using Python, Databricks, and GCP.
• Enhance the business's AI and ML capabilities to modernise the data platform.
• Collaborate with stakeholders to understand their goals and ensure analytics work is aligned to practical business … the data science function.
Requirements
• Experience in a data science leadership position.
• Expert use of Python, PySpark, and PyTorch.
• Strong experience with AI and ML best practice and development.
• Databricks and GCP experience.
• Strong business and commercial acumen.
• Excellent stakeholder management.
My client has already begun holding interviews for this role, so if you're interested, get in touch ASAP …
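Given the emphasis on predictive modelling with Python and PyTorch, here is a deliberately tiny, self-contained training loop on synthetic data; it only illustrates the mechanics, not any model the client actually uses.

```python
# Minimal PyTorch training loop on synthetic data; purely illustrative, not a production model.
import torch
from torch import nn

torch.manual_seed(0)

# Synthetic features and a noisy linear target stand in for real business data.
X = torch.randn(256, 4)
y = X @ torch.tensor([1.5, -2.0, 0.5, 3.0]) + 0.1 * torch.randn(256)

model = nn.Sequential(nn.Linear(4, 16), nn.ReLU(), nn.Linear(16, 1))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.MSELoss()

for epoch in range(200):
    optimizer.zero_grad()
    pred = model(X).squeeze(-1)   # shape (256,)
    loss = loss_fn(pred, y)
    loss.backward()
    optimizer.step()

print(f"final training loss: {loss.item():.4f}")
```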
Principal Data and AI Engineer, Liverpool. Location: Liverpool, United Kingdom. Job Category: Other. EU work permit required: Yes. Posted: 07.06.2025. Expiry …
Liverpool, England, United Kingdom Hybrid / WFH Options
BJSS
… Professional Services team offers significant freedom to develop our data platform accelerators. We welcome candidates from diverse backgrounds and skills, provided you meet the following criteria:
• Strong experience with Databricks
• Experience working in AWS or Azure environments
• Comfortable with Infrastructure as Code
• Ability to automate repetitive tasks and workloads
• Passion for working with and evangelizing data solutions
• Interest in staying …
… end-to-end data flows. Experience in implementing data governance, including data cataloging, data lineage tracking, and metadata management to ensure data accuracy, accessibility, and compliance.
Preferred:
• Experience with Databricks
• Understanding of how data platforms interact with marketing and customer engagement platforms.
• Knowledge of service-oriented architecture, including exposing and consuming data via APIs, streams, and webhooks.
• Good understanding of …
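The service-oriented architecture point above mentions exposing data via APIs, streams, and webhooks. A minimal, hypothetical sketch of the "exposing" side using Flask follows; the route name and payload are invented for illustration only.

```python
# Toy example of exposing a dataset over a REST endpoint for downstream consumers.
from flask import Flask, jsonify

app = Flask(__name__)

# In a real platform this would query the warehouse or a serving layer;
# an in-memory list keeps the sketch self-contained.
CUSTOMER_SEGMENTS = [
    {"customer_id": 1, "segment": "high_value"},
    {"customer_id": 2, "segment": "churn_risk"},
]

@app.route("/api/v1/customer-segments", methods=["GET"])
def customer_segments():
    """Return the current customer segmentation as JSON."""
    return jsonify(CUSTOMER_SEGMENTS)

if __name__ == "__main__":
    app.run(port=8000)
```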
… stakeholders and business users.
Key Skills:
• Proven background in platform selection, configuration, and onboarding (ideally across AWS, cloud, or on-prem solutions).
• Hands-on familiarity with tools like Databricks, Python, and SaaS analytics environments.
• Experience working within a CTO or principal engineering team to translate complex technical concepts into language understood by functional users.
• Strong stakeholder management - able to …