Nottingham, Nottinghamshire, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
are now looking for a hands-on Data Engineer to join their talented delivery team. This is a fantastic opportunity to work with a modern data stack - including Snowflake, dbt, AWS, Python, and SQL - on impactful projects that power reporting, automation, predictive analytics and Artificial Intelligence. This role is fully remote, and is therefore open to candidates across the UK. … a self-starter mentality, who wants to be part of a company that is growing and maturing whilst continually learning.

Key responsibilities:
- Designing and building robust data pipelines using SQL, dbt, and Python within Snowflake
- Developing scalable, testable, and maintainable code
- Collaborating with analytics, product, and client teams to deliver high-quality data solutions
- Supporting the development of data products like … tooling
- Contributing to internal best practices and agile delivery processes

Experience required:
- Proven experience as a Data Engineer or Analytics Engineer
- Strong experience with Snowflake
- Hands-on experience with dbt
- Proficiency in Python and SQL
- Solid understanding of Git and development lifecycle best practices
- Experience integrating APIs or working with event/log data streams
- Ability to manage multiple priorities
data platform, ensuring scalability, reliability, and security.
- Drive modernisation by transitioning from legacy systems to a lean, scalable platform.
- Act as a lead expert for technologies such as AWS, dbt, Airflow, and Databricks.
- Establish best practices for data modelling, ingestion, storage, streaming, and APIs.

Governance & Standards
- Ensure all technical decisions are well-justified, documented, and aligned with business needs.
- Lead …
- Expertise in data engineering and cloud engineering, including data ingestion, transformation, and storage.
- Significant hands-on experience with AWS and its data services.
- Expert-level skills in SQL, Python, dbt, Airflow and Redshift.
- Confidence in coding, scripting, configuring, versioning, debugging, testing, and deploying.
- Ability to guide and mentor others in technical best practices.
- A product mindset, focusing on user needs
We have partnered with a company that empowers underwriters to serve their insureds more effectively. They are using advanced data intelligence tools to rebuild the way that underwriters share and exchange
Nottingham, England, United Kingdom Hybrid / WFH Options
Cloud2 Consult
tooling, and approaches within the business. Stay up to date with emerging data engineering trends and identify opportunities for continuous improvement.

Essential Skills & Experience

Technical Skills:
- Data Build Tool (dbt): Minimum 1 year's hands-on project delivery experience.
- Microsoft Fabric: Minimum 1 year of practical experience working with Fabric-based solutions.
- Azure Ecosystem: At least 2 years of experience
Azure, Fabric, Dataverse, Synapse, Data Lake, Purview. Deep expertise in data engineering tools and practices, including Python, SQL, and modern ETL/ELT frameworks (e.g., Azure Data Factory, Talend, dbt). Experience designing and implementing scalable data pipelines and integration patterns across structured and unstructured data sources (e.g., Azure SQL, MySQL, MongoDB). Familiarity with data governance, metadata management, and
resolution, customer segmentation, and real-time personalization. Hands-on experience with agile product development methodologies. Excellent communication and stakeholder management skills. Knowledge of modern data tools (e.g., Snowflake, Databricks, dbt, Kafka). Understanding of machine learning workflows and personalization engines. Product certifications (e.g., SAFe, Pragmatic, CSPO).

Key Success Metrics:
- Consistent development roll-outs of the Horizon CDP platform
- Increased