Manchester, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
forward-thinking organisation using data to drive innovation and business performance. They're expanding their team and are looking for a talented Data Engineer with experience in Azure and Databricks. Salary and Benefits: £55,000 – £65,000 depending on experience; 10% performance-related bonus; hybrid working model – 2 days in the Greater Manchester office; supportive … What do I need to apply for the role: Solid hands-on experience with Azure data tools, including Data Factory, Data Lake, Synapse Analytics, and Azure SQL. Strong proficiency with Databricks, including Spark, Delta Lake, and notebooks. Skilled in Python and SQL for data transformation and processing. Experience with Git and modern CI/CD workflows. Strong analytical mindset and effective …
any required data migrations from on-premises or third-party hosted databases/repositories. Build and support data pipelines using ETL tools such as MS Azure Data Factory and Databricks. Design and manage a standard access method to both cloud and on-premises data sources for use in data visualisation and reporting (predominantly using Microsoft Power BI). Own and build … government services. Your knowledge and certifications: Any MS Azure data certifications. 2+ years working with Azure data engineering tools, e.g. Azure Data Factory, Azure Synapse, Azure SQL, Azure Databricks, Microsoft Fabric, Azure Data Lake. Exposure to other data engineering and storage tools: Snowflake; AWS tools – Kinesis/Glue/Redshift; Google tools – BigQuery/Looker. Experience working with open …
data quality solutions. The ideal candidate should have strong expertise in ETL framework testing (preferably Talend or DataStage), BI report testing (preferably Power BI, Cognos), cloud technologies (preferably Azure, Databricks), SQL/PL/SQL coding, and Unix/Python scripting. Key Responsibilities: Lead and mentor a team of test engineers, assisting them with technical challenges and guiding them on best testing … frameworks. Understanding of DataOps and TestOps concepts for continuous data quality testing and automation. Experience validating unstructured data formats, including XML, JSON, and Parquet. Knowledge of cloud data platforms such as Azure and Databricks for data processing and analytics. In addition to our open-door policy and supportive work environment, we also strive to provide a competitive salary and benefits package. TJX considers all …
Lutterworth, England, United Kingdom Hybrid / WFH Options
PharmaLex
database. Collaborate with Data Analysts and Scientists to optimise data quality, reliability, security, and automation. Skills & Responsibilities: The core responsibility will be using the NHS Secure Data Environment, which utilises Databricks, to design and extract regular datasets. Configure and troubleshoot Microsoft Azure; manage data ingestion using Logic Apps and Data Factory. Develop ETL scripts using MS SQL and Python; handle web scraping, APIs …
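As an illustration of the ETL scripting described above, here is a minimal sketch in plain Python. The payload, field names, and in-memory SQLite target are all hypothetical stand-ins (a real pipeline would pull from an API and load into the actual SQL database):

```python
import json
import sqlite3

# Hypothetical extract step: in a real pipeline this payload would come
# from an API call (e.g. via urllib or requests), not a literal string.
RAW_PAYLOAD = json.dumps([
    {"record_id": "r1", "value": "42.0", "unit": "mg"},
    {"record_id": "r2", "value": "n/a", "unit": "mg"},
])

def transform(records):
    """Keep rows with a numeric value; cast types ready for loading."""
    clean = []
    for rec in records:
        try:
            clean.append((rec["record_id"], float(rec["value"]), rec["unit"]))
        except ValueError:
            continue  # a real pipeline would quarantine or log bad rows
    return clean

def load(rows, conn):
    """Load the cleaned rows into the target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS measurements "
        "(record_id TEXT, value REAL, unit TEXT)"
    )
    conn.executemany("INSERT INTO measurements VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")  # stand-in for the target SQL database
rows = transform(json.loads(RAW_PAYLOAD))
load(rows, conn)
print(conn.execute("SELECT COUNT(*) FROM measurements").fetchone()[0])  # 1
```

The extract/transform/load split shown here is the same shape regardless of whether the orchestration is a Logic App, Data Factory pipeline, or Databricks job.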
You'll play a key role in shaping the data strategy, enhancing platform capabilities, and supporting business intelligence initiatives. Key Responsibilities: Design and develop Azure data pipelines using Data Factory, Databricks, and related services. Implement and optimize ETL processes for performance, reliability, and cost-efficiency. Build scalable data models and support analytics and reporting needs. Design and maintain Azure-based data …
West Midlands, England, United Kingdom Hybrid / WFH Options
Amtis - Digital, Technology, Transformation
You'll play a key role in shaping the data strategy, enhancing platform capabilities, and supporting business intelligence initiatives. Key Responsibilities: Design and develop Azure data pipelines using Data Factory, Databricks, and related services. Implement and optimize ETL processes for performance, reliability, and cost-efficiency. Build scalable data models and support analytics and reporting needs. Design and maintain Azure-based data …
London, England, United Kingdom Hybrid / WFH Options
ScanmarQED
Experience: 3–5 years in Data Engineering, Data Warehousing, or programming within a dynamic (software) project environment. Data Infrastructure and Engineering Foundations: Data Warehousing: knowledge of tools like Snowflake, Databricks, ClickHouse, and traditional platforms like PostgreSQL or SQL Server. ETL/ELT Development: expertise in building pipelines using tools like Apache Airflow, dbt, Dagster. Cloud providers: proficiency in Microsoft Azure …
data and analytics, unlocking quality growth and operational excellence. What are we looking for? Hands-on experience designing greenfield, scalable data platforms in the cloud using the Azure D&A stack, Databricks, and Azure OpenAI solutions. Proficiency in coding (Python, PL/SQL, shell scripting), relational and non-relational databases, ETL tooling (such as Informatica), and scalable data platforms. Proficiency in …
Maidstone, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
Analytics Engineer Associate. Responsibilities: Your daily tasks will include, but are not limited to: Designing, building, and optimizing high-performance data pipelines and ETL workflows using Azure Synapse, Azure Databricks, or Microsoft Fabric. Implementing scalable solutions for data ingestion, storage, and transformation, ensuring data quality and availability. Writing clean, efficient, reusable Python code for data engineering tasks. Collaborating with data …
Nottingham, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
Associate. Responsibilities: Your daily tasks will include, but are not limited to: Designing, building, and optimizing high-performance data pipelines and ETL workflows using tools like Azure Synapse, Azure Databricks, or Microsoft Fabric. Implementing scalable solutions to ingest, store, and transform large datasets, ensuring data quality and availability. Writing clean, efficient, and reusable Python code for data engineering tasks. Collaborating …
Stockport, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
Analytics Engineer Associate. Responsibilities: Daily responsibilities include, but are not limited to: Designing, building, and optimizing high-performance data pipelines and ETL workflows using tools like Azure Synapse, Azure Databricks, or Microsoft Fabric. Implementing scalable solutions for data ingestion, storage, and transformation, ensuring data quality and availability. Writing clean, efficient, and reusable Python code for data engineering tasks. Collaborating with …
have what it takes? Extensive experience in data engineering, analytics engineering, or a related field. Extensive experience working with data modeling and transformation workflows. Experience developing on AWS and Databricks, and experience with leveraging infrastructure as code to deploy data infrastructure. Excellent communication and collaboration skills. Experience working with Git, practicing code reviews and branching strategies, CI/CD and …
delivery of strategic programmes, taking ownership of designing and building innovative data solutions, including managing teams of Data Scientists and Engineers. Work with a mix of cloud services (majoring in Databricks and Azure, alongside AWS and Snowflake). Support in setting the direction and vision of the Data Engineering part of our business, putting in place frameworks and guidance to support …
particularly in a Data & Analytics function. Expert proficiency in Python, R, SQL, and distributed computing frameworks (e.g., Spark, Hadoop). Advanced knowledge of data engineering tools (e.g., Airflow, Kafka, Snowflake, Databricks). Proficiency in machine learning frameworks (TensorFlow, PyTorch, Scikit-learn). Ability to implement robust data governance and AI model explainability frameworks. Commitment to ethical AI practices and responsible data …
London, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
teams, where required. Essential Skills and Experience: Educated to degree level or with equivalent professional experience. Experience translating business requirements into solution design and implementation. Experience of MS Azure Databricks. Experience working with database technologies such as SQL Server, and Data Warehouse Architecture, with knowledge of big data, data lakes, and NoSQL. Experience following product/solution development lifecycles using …
Slough, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
teams, where required. Essential Skills and Experience: Educated to degree level or with equivalent professional experience. Experience translating business requirements into solution design and implementation. Experience of MS Azure Databricks. Experience working with database technologies such as SQL Server, and Data Warehouse Architecture, with knowledge of big data, data lakes, and NoSQL. Experience following product/solution development lifecycles using …
London, England, United Kingdom Hybrid / WFH Options
Datatech
teams, where required. Essential Skills and Experience: Educated to degree level or with equivalent professional experience. Experience translating business requirements into solution design and implementation. Experience of MS Azure Databricks. Experience working with database technologies such as SQL Server, and Data Warehouse Architecture, with knowledge of big data, data lakes, and NoSQL. Experience following product/solution development lifecycles using …
data-heavy systems, ideally in a startup or fast-moving environment. Technical Stack: Languages/Tools: Python (REST API integrations), dbt, Airbyte, GitHub Actions. Modern Data Warehousing: Snowflake, Redshift, Databricks, or BigQuery. Cloud & Infra: AWS (ECS, S3, Step Functions), Docker (Kubernetes or Fargate a bonus). Data Modelling: strong grasp of transforming structured/unstructured data into usable models (facts, dimensions …
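The facts-and-dimensions modelling mentioned above can be sketched in a few lines. This is a minimal illustration in plain Python of the transformation a dbt model would typically express in SQL; the field names and data are entirely hypothetical:

```python
# Raw order events, as they might land from an ingestion tool.
raw_orders = [
    {"order_id": 1, "customer": "Acme", "country": "UK", "amount": 120.0},
    {"order_id": 2, "customer": "Acme", "country": "UK", "amount": 80.0},
    {"order_id": 3, "customer": "Beta", "country": "DE", "amount": 200.0},
]

# Dimension table: one row per customer, with a surrogate key.
dim_customer = {}
for row in raw_orders:
    dim_customer.setdefault(
        row["customer"],
        {"customer_key": len(dim_customer) + 1, "country": row["country"]},
    )

# Fact table: one row per order, referencing the dimension by surrogate key.
fact_orders = [
    {
        "order_id": r["order_id"],
        "customer_key": dim_customer[r["customer"]]["customer_key"],
        "amount": r["amount"],
    }
    for r in raw_orders
]

print(len(dim_customer), len(fact_orders))  # 2 3
```

Descriptive attributes live once in the dimension, while the fact table stays narrow and append-friendly, which is the core of the star-schema pattern.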
Chester, England, United Kingdom Hybrid / WFH Options
Forge Holiday Group Ltd
ETL/ELT processes. Exposure to Python or any other object-oriented programming language. Experience with modern data stack tools and cloud-based data warehouses like MS Fabric, Snowflake, Databricks, Teradata, or AWS. Experience in designing and constructing effective reports and dashboards that transform data into actionable insights with Tableau or Power BI. Proven ability to manage work within set …
London, England, United Kingdom Hybrid / WFH Options
First Central Services
Power BI. Experience & knowledge: Extensive experience developing and implementing end-to-end data solutions in the cloud, preferably in Azure. Experience engineering with big data technologies such as Databricks and/or Synapse Analytics, using PySpark. Solution design experience across end-to-end data solutions (sourcing to consumption). Experience in Azure services such as Data Factory, Azure Functions, ADLS …
Skills/Experience – Essential: Extensive experience in data engineering, including designing and developing data pipelines for retrieval/ingestion/presentation/semantics in an Azure environment. Strong ADF, Databricks, SQL, Python, and Power BI skills. Data acquisition from various data sources including Salesforce, APIs, XML, JSON, Parquet, flat file systems, and relational data. Excellent team player able to work under pressure. …
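The multi-format acquisition described above (JSON, XML, flat files) usually reduces to normalising each source into one common row shape. A minimal sketch using only the Python standard library; the payloads and field names are illustrative, not from any real system:

```python
import csv
import io
import json
import xml.etree.ElementTree as ET

# Three hypothetical source payloads in different formats.
json_src = '[{"id": "1", "name": "alpha"}]'
xml_src = '<rows><row id="2" name="beta"/></rows>'
csv_src = "id,name\n3,gamma\n"

rows = []
# JSON: parse and pick out the common fields.
rows += [(r["id"], r["name"]) for r in json.loads(json_src)]
# XML: iterate the child elements and read their attributes.
rows += [(el.get("id"), el.get("name")) for el in ET.fromstring(xml_src)]
# Flat file: DictReader maps the header row onto each record.
rows += [(r["id"], r["name"]) for r in csv.DictReader(io.StringIO(csv_src))]

print(rows)  # [('1', 'alpha'), ('2', 'beta'), ('3', 'gamma')]
```

Once every source lands in the same shape, the downstream pipeline (ADF, Databricks, or plain SQL) only has to handle one schema.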
Glasgow, Scotland, United Kingdom Hybrid / WFH Options
Capgemini
Driven, use of Dataverse) and/or Power Automate. Copilot Studio experience is also desirable. Programming languages (Python, R, or SQL). Hands-on experience with tools such as Microsoft Fabric, Databricks, or Snowflake. Consulting Experience: current experience in a major consulting firm and/or a significant consulting background, with evidence of effective stakeholder management to address business challenges. Experience working across …
Data Engineer – Databricks. About the Role: We're looking for a Databricks Champion to design, build, and optimize data pipelines using Databricks. You'll work with clients and internal teams to deliver scalable, efficient data solutions tailored to business needs. Key Responsibilities: Develop ETL/ELT pipelines with Databricks and Delta Lake. Integrate and process data from diverse sources. Collaborate with data scientists, architects, and analysts. Optimize performance and manage Databricks clusters. Build cloud-native solutions (Azure preferred; AWS/GCP also welcome). Implement data governance and quality best practices. Automate workflows and maintain CI/CD pipelines. Document architecture and processes. What We're Looking For – Required: 5+ years in data engineering with hands-on Databricks experience. Databricks Champion status (Solution Architect/Partner). Proficient in Databricks, Delta Lake, Spark, Python, SQL. Cloud experience (Azure preferred; AWS/GCP a plus). Strong problem-solving and communication skills.
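The Delta Lake pipelines mentioned in the listing above typically centre on MERGE (upsert) semantics: incoming records update matching rows in the target table and insert the rest. A minimal sketch of that logic in plain Python, so it runs without a Spark/Delta environment; the keys and fields are hypothetical:

```python
# Target "table" keyed by a business key, plus a batch of incoming updates.
target = {"k1": {"value": 10}, "k2": {"value": 20}}
updates = [("k2", {"value": 25}), ("k3", {"value": 30})]

for key, row in updates:
    if key in target:
        target[key].update(row)   # WHEN MATCHED THEN UPDATE
    else:
        target[key] = dict(row)   # WHEN NOT MATCHED THEN INSERT

print(sorted(target))  # ['k1', 'k2', 'k3']
```

In an actual Delta Lake pipeline this would be a `MERGE INTO` statement (or the `DeltaTable.merge` API) running on a Databricks cluster; the plain-Python version just makes the matched/not-matched branching explicit.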
Passionate about Databricks and Data Engineering? Aivix has been in operation for over a year and recently achieved AWS Select Partner and Databricks Registered Partner status. We excel in offering analytics solutions that facilitate data-driven digital transformations for our customers. Aivix is currently looking for data engineers who are interested in Databricks. You will be shaping the beating heart of … we are an enthusiastic group. We encourage everyone to be themselves while actively fostering an entrepreneurial mindset. Ready to contribute and help build our growing story? As a Databricks Engineer you will be involved in: making use of best practices in Databricks, Big Data, and AI projects; having conversations with the customer to correctly identify their data needs; developing …
10+ years. Job Summary: We are seeking a highly experienced and visionary Data Architect to lead the design and implementation of the data architecture for our cutting-edge Azure Databricks platform focused on economic data. This platform is crucial for our Monetary Analysis, Forecasting, and Modelling efforts. The Data Architect will be responsible for defining the overall data strategy, data … within the economic domain. Key Experience: Extensive Data Architecture Knowledge: They possess a deep understanding of data architecture principles, including data modeling, data warehousing, data integration, and data governance. Databricks Expertise: They have hands-on experience with the Databricks platform, including its various components such as Spark, Delta Lake, MLflow, and Databricks SQL. They are proficient in using Databricks for various data engineering and data science tasks. Cloud Platform Proficiency: They are familiar with cloud platforms like AWS, Azure, or GCP, as Databricks operates within these environments. They understand cloud-based data architectures and best practices. Leadership and Communication Skills: They can lead technical teams, mentor junior architects, and effectively communicate complex technical concepts to both technical and non-technical stakeholders. …