London, South East, England, United Kingdom Hybrid/Remote Options
Tenth Revolution Group
Azure Data Engineer - £400PD - Remote Seeking an experienced Data Engineer to design, build, and optimise data solutions within the Microsoft Azure ecosystem. The role focuses on pipeline development, data modelling, governance, and supporting analytics teams with high-quality, reliable data. Key Responsibilities: Develop and maintain scalable data pipelines using Azure Data Factory, Synapse, Databricks, and Microsoft Fabric. Build efficient … Skills & Experience: Proven experience as a Data Engineer working with Azure. Strong skills in Azure Data Factory, Synapse Analytics, Databricks, SQL Database, and Azure Storage. Excellent SQL and data modelling (star/snowflake, dimensional modelling). Knowledge of Power BI dataflows, DAX, and RLS. Experience with Python, PySpark, or T-SQL for transformations. Understanding of CI/ …
Central London, London, England, United Kingdom Hybrid/Remote Options
E-Solutions IT Services UK Ltd
data products on which they can add reporting and analytics. The candidate will be required to deliver to all stages of the data engineering process – data ingestion, transformation, data modelling and data warehousing, and build self-service data products. The role is a mix of Azure cloud delivery and on-prem (SQL) development. Ultimately all on-prem will be … and knowledge: Excellent data analysis and exploration using T-SQL Strong SQL programming (stored procedures, functions) Extensive experience with SQL Server and SSIS Knowledge and experience of data warehouse modelling methodologies (Kimball, dimensional modelling, Data Vault 2.0) Experience in Azure – one or more of the following: Data Factory, Databricks, Synapse Analytics, ADLS Gen2 Experience in building robust …
Demonstrated experience in data engineering within the Azure cloud environment. Proficiency with Azure Data Factory, Synapse Analytics, Databricks, Azure SQL, and Azure Storage. Strong SQL skills and expertise in dimensional modelling (e.g., star/snowflake schemas). Familiarity with Power BI dataflows, DAX, and RLS setup. Hands-on experience with Python, PySpark, or T-SQL for data processing …
months), ensuring that near-term work supports broader programme and client objectives. About You Professional knowledge and experience Essential Proven experience in data engineering, data integration and data modelling Expertise with cloud platforms (e.g. AWS, Azure, GCP) Expertise with modern cloud data platforms (e.g. Microsoft Fabric, Databricks) Expertise with multiple data analytics tools (e.g. Power BI) Deep understanding of … data warehousing concepts, ETL/ELT pipelines and dimensional modelling Proficiency in advanced programming languages (Python/PySpark, SQL) Experience in data pipeline orchestration (e.g. Airflow, Data Factory) Familiarity with DevOps and CI/CD practices (Git, Azure DevOps etc) Ability to communicate technical concepts to both technical and non-technical audiences Proven experience in delivery of …
of experience in BI architecture, data engineering, and analytics delivery. Expertise in BI platforms such as Power BI, Tableau, Qlik, Looker, or similar. Strong experience with SQL, data modeling, dimensional modeling, and star/snowflake schemas. Hands-on experience building ETL/ELT solutions using tools like Informatica, Talend, SSIS, Data Factory, or similar. Proficiency in cloud ecosystems (Azure …
Highly experienced with SQL, Python, Spark (Spark SQL & PySpark), Git Familiarity with other Microsoft data products such as Power BI, Power Automate, Power Platform Understanding of data modeling concepts, dimensional modeling, normalization, and medallion architecture Understanding of data security, data privacy and their related compliance standards Proficiency with Git for source control management and an understanding of CI/ …
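Several of these ads ask for an understanding of the medallion (bronze/silver/gold) architecture. As a minimal sketch of the idea, assuming plain Python lists in place of Delta tables and entirely hypothetical record shapes: raw rows land in bronze, get cleaned and conformed into silver, and are aggregated into a business-ready gold layer.

```python
# Illustrative medallion layering (bronze/silver/gold) using plain Python
# lists in place of Delta tables. All record shapes are hypothetical.

# Bronze: raw ingested records, kept as-is (including malformed rows)
bronze = [
    {"order_id": "1", "amount": "10.50", "country": "uk"},
    {"order_id": "2", "amount": "oops", "country": "UK"},   # bad amount
    {"order_id": "3", "amount": "4.25", "country": "uk"},
]

# Silver: cleaned and conformed - types enforced, values normalised
def to_silver(rows):
    out = []
    for r in rows:
        try:
            out.append({"order_id": int(r["order_id"]),
                        "amount": float(r["amount"]),
                        "country": r["country"].upper()})
        except ValueError:
            pass  # quarantine/drop rows that fail validation
    return out

silver = to_silver(bronze)

# Gold: business-level aggregate ready for BI consumption
gold = {}
for r in silver:
    gold[r["country"]] = gold.get(r["country"], 0.0) + r["amount"]

print(gold)  # {'UK': 14.75}
```

In a real Databricks lakehouse each layer would be a Delta table and the validation step would be expressed as pipeline expectations, but the layering discipline is the same.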
Leeds, West Yorkshire, Yorkshire, United Kingdom Hybrid/Remote Options
Fruition Group
Senior Analytics Engineer/Data Modeller role offers the chance to make a significant impact within an innovative, tech-driven organisation. You'll work at the forefront of data modelling, analytics engineering, and AI-enablement, helping shape scalable data products that power enterprise reporting, machine learning, and self-service insights. Senior Analytics Engineer/Data Modeller Responsibilities Design, build … and maintain analytical and AI-ready data marts sourced from the enterprise data lakehouse. Develop semantic models and dimensional structures optimised for BI, dashboarding, and machine learning. Ensure documentation, governance, and data consistency across domains. Collaborate with data engineers to support robust ETL/ELT pipelines and maintain end-to-end data lineage. Deploy analytics engineering solutions using dbt … single source of truth for reporting. Work with business teams to translate complex requirements into scalable, high-impact models. Support agile delivery of data products and continuous improvement of modelling standards. Senior Analytics Engineer/Data Modeller Requirements Proven experience as an Analytics Engineer, Data Modeller, or similar position in a modern cloud-based data environment. Strong SQL skills …
London, South East, England, United Kingdom Hybrid/Remote Options
Oscar Technology
Work collaboratively and effectively within a Data Analytics team to create, inform, and champion data and analytics best practices. About You 3-5 years' experience applying relational data modelling and data warehousing techniques, including the Kimball Methodology or other similar dimensional modelling standards, is essential to the role. Technical experience building and deploying models and reports …
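The Kimball-style dimensional modelling these ads keep asking for centres on a star schema: a fact table of measures joined to dimension tables via surrogate keys. As a minimal sketch, using SQLite purely for illustration (table and column names are hypothetical, not from any of the postings):

```python
import sqlite3

# Minimal Kimball-style star schema: one fact table with foreign keys
# into two dimension tables. All names are hypothetical illustrations.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date (
    date_key INTEGER PRIMARY KEY,     -- surrogate key, e.g. 20240115
    calendar_date TEXT,
    month_name TEXT
);
CREATE TABLE dim_product (
    product_key INTEGER PRIMARY KEY,  -- surrogate key
    product_name TEXT,
    category TEXT
);
CREATE TABLE fact_sales (
    date_key INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    quantity INTEGER,
    revenue REAL
);
""")
conn.execute("INSERT INTO dim_date VALUES (20240115, '2024-01-15', 'January')")
conn.execute("INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware')")
conn.execute("INSERT INTO fact_sales VALUES (20240115, 1, 3, 29.97)")

# A typical analytical query joins the fact to its dimensions (a "star join")
row = conn.execute("""
    SELECT d.month_name, p.category, SUM(f.revenue)
    FROM fact_sales f
    JOIN dim_date d ON f.date_key = d.date_key
    JOIN dim_product p ON f.product_key = p.product_key
    GROUP BY d.month_name, p.category
""").fetchone()
print(row)  # ('January', 'Hardware', 29.97)
```

The same shape carries over to Synapse, Databricks SQL, or a Power BI semantic model: dimensions hold descriptive attributes, facts hold additive measures, and BI queries aggregate facts grouped by dimension attributes.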
Spotfire Lead Consultant (Spotfire to Power BI Migration) Location: Central London & Remote Type: Contract Spotfire Architect with strong data modelling skills to lead and support the migration of complex Spotfire reports to Power BI. The role requires deep technical expertise in Spotfire's visual and scripting frameworks, alongside the ability to reverse-engineer, redesign and optimise data models for …/dashboards in Spotfire 8+ years of experience in BI & analytics with at least 5 years in Spotfire (including architecture, scripting, and advanced configurations). Strong experience with data modelling for BI platforms; dimensional modelling, fact/dimension tables, normalisation/denormalisation. Hands-on expertise with IronPython scripting, custom expressions, data functions (TERR/Python/R …
Analytics will also be considered. Proven expertise with Databricks, including extensive hands-on experience with PySpark, Python, SQL, Kafka, and Databricks notebooks. Strong experience with data modeling techniques (e.g., dimensional modeling, data vault) and database design. Experience building and optimizing data pipelines for batch and/or streaming data. Experience with cloud platforms (e.g., AWS, Azure, GCP) and services …
ADF, Azure Data Lake Storage Gen2). Strong SQL skills (T-SQL), with experience in Python/Scala and Spark/PySpark for big-data transformations. Demonstrated experience in data modelling and warehousing (star schema, snowflake, dimensional modelling, medallion architecture). Experience designing and implementing ETL/ELT pipelines using ADF, dataflows, or Synapse pipelines. Knowledge of data …
e.g. subscription billing), SQL sources, SharePoint and SFTP using ADF + metadata-driven frameworks. Apply Lakeflow expectations for data quality, schema validation and operational reliability. Curated Data Layers & Modelling Build clean, conformed Silver/Gold models aligned to enterprise business domains (customers, subscriptions, deliveries, finance, credit, logistics, operations). Deliver star schemas, harmonisation logic, SCDs and business marts … delivering production workloads on Azure + Databricks. Strong PySpark/Spark SQL and distributed data processing expertise. Proven Medallion/Lakehouse delivery experience using Delta Lake. Solid dimensional modelling (Kimball) including surrogate keys, SCD types 1/2, and merge strategies. Operational experience: SLAs, observability, idempotent pipelines, reprocessing, backfills. Mindset Strong grounding in secure Azure Landing … Clear communicator who can translate technical decisions into business outcomes. Nice to Have Databricks Certified Data Engineer Associate Streaming ingestion experience (Auto Loader, structured streaming, watermarking) Subscription/entitlement modelling experience Advanced Unity Catalog security (RLS, ABAC, PII governance) Terraform/Bicep for IaC Fabric Semantic Model/Direct Lake optimisation
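The "SCD types 1/2, and merge strategies" requirement refers to the Kimball Slowly Changing Dimension patterns: Type 1 overwrites an attribute in place, while Type 2 expires the old row and inserts a new current version, so history is preserved. A minimal sketch of the Type 2 merge logic in plain Python (a Delta Lake MERGE would express the same logic declaratively; all field names here are hypothetical):

```python
# SCD Type 2 merge sketch: changed rows are expired (valid_to set,
# is_current cleared) and a new current version is appended with a
# fresh surrogate key. Field names are hypothetical.
from datetime import date

def scd2_merge(dimension, updates, today):
    """Apply a batch of source updates to an SCD Type 2 dimension."""
    next_key = max((r["surrogate_key"] for r in dimension), default=0) + 1
    current = {r["customer_id"]: r for r in dimension if r["is_current"]}
    for upd in updates:
        row = current.get(upd["customer_id"])
        if row is not None and row["segment"] == upd["segment"]:
            continue  # no change: keep the existing current row
        if row is not None:
            row["valid_to"] = today    # expire the old version
            row["is_current"] = False
        dimension.append({
            "surrogate_key": next_key,
            "customer_id": upd["customer_id"],
            "segment": upd["segment"],
            "valid_from": today,
            "valid_to": None,          # open-ended validity
            "is_current": True,
        })
        next_key += 1
    return dimension

dim = [{"surrogate_key": 1, "customer_id": "C1", "segment": "Retail",
        "valid_from": date(2023, 1, 1), "valid_to": None, "is_current": True}]
dim = scd2_merge(dim, [{"customer_id": "C1", "segment": "Corporate"}],
                 today=date(2024, 6, 1))
# C1 now has two versions: the expired Retail row and a current Corporate row
```

The "merge strategies" part of the ad concerns exactly this upsert: in Delta Lake it is typically implemented as a single MERGE INTO matching on the natural key plus the current-row flag.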
with Data Engineers to design, build, and maintain scalable data pipelines using Azure Data Factory and Databricks to automate data ingestion, transformation, and processing workflows. Create and maintain dimensional data models and semantic layers that support business intelligence and analytics use cases. Build and optimise data transformation workflows using dbt, SQL, and Python to create clean, well-documented … and modern data transformation tools (dbt strongly preferred), with experience in cloud data platforms (Azure Databricks, Snowflake, or similar). Proven experience designing and implementing scalable data architectures, including dimensional modelling, data lakehouse/warehouse concepts, and modern data stack technologies. Strong software engineering practices including version control (Git), CI/CD pipelines, code testing, and infrastructure as …
South Jordan, Utah, United States Hybrid/Remote Options
Strider Technologies
modern data stack (e.g., dbt, Airflow, Fivetran, Dagster). Proven ability to manage and transform large, complex datasets across structured and unstructured formats. Solid understanding of data warehousing concepts, dimensional modeling, and Git-based version control. Experience leveraging AI-assisted development tools and enthusiasm for scaling their use internally. Comfortable working in fast-paced environments and collaborating across technical …
using Azure + Databricks, with Power BI sitting on top of a fully curated Gold Layer. They develop everything using PBIP + Git + Tabular Editor 3, and semantic modelling is treated as a first-class engineering discipline. This is your chance to own the creation of high-quality datasets and dashboards used across Operations, Finance, Sales, Logistics and … into insights the business relies on every day. Why This Role Exists To turn clean, curated Gold Lakehouse data into trusted, enterprise-grade Power BI insights. You'll own semantic modelling, dataset optimisation, governance and best-practice delivery across a modern BI ecosystem. What You'll Do Semantic Modelling with PBIP + Git Build and maintain enterprise PBIP datasets … fully version-controlled in Git. Use Tabular Editor 3 for DAX, metadata modelling, calc groups and object governance. Manage branching, pull requests and releases via Azure DevOps. Lakehouse-Aligned Reporting (Gold Layer Only) Develop semantic models exclusively on top of curated Gold Databricks tables. Work closely with Data Engineering on schema design and contract-first modelling. Maintain …
Analyst at Atlanta, GA. Job Title: Lead Azure Data Analyst Job Location: Atlanta, GA. Job Type: Long term contract Responsibilities: Lead Data Analyst will be responsible for analysing and modelling data to support our wealth management initiatives. Bachelor's or Master's degree in computer science, Information Technology, Finance, or related field. 10+ years of experience in data analysis, data … modelling, and data management, preferably in the financial services industry with a focus on wealth management. Strong understanding of source-to-target transformations, ETL processes, and data integration techniques. Proficiency in data modelling tools such as Erwin, Erwin Data Modeler, or similar tools. Hands-on experience with SQL, relational databases (e.g., SQL Server, Oracle), and data warehousing concepts. … Knowledge of industry-standard data modelling methodologies (e.g., ER/Studio, Kimball Dimensional Modelling) and best practices. Experience working in Agile environments and collaborating with cross-functional teams. Excellent analytical, problem-solving, and communication skills, with the ability to translate complex business requirements into technical solutions. Would you like to know more about this opportunity? For immediate …
field. 5+ years of experience in data modeling for large-scale data warehouse environments. Strong experience with Teradata databases and architecture. Expertise in Erwin Model Mart. Strong understanding of normalized and dimensional modeling techniques. Experience with ETL design concepts, data governance, and metadata management. Solid understanding of SQL and database performance tuning. Excellent analytical, communication, and documentation skills. Preferred Qualifications Experience in …
Lansing, Michigan, United States Hybrid/Remote Options
A.J. Boggs & Company
and deploying data pipelines. Familiarity with other Azure data services (e.g., Azure Data Lake Storage, Azure Synapse Analytics, Azure Databricks) is preferred. Strong understanding of data warehousing concepts and dimensional modeling. Experience with data quality management and data governance principles. Experience working in a shared service, hybrid environment. Benefits The annual salary range is $85,000 - $101,000 Remote …
with AWS data ecosystem including Redshift, S3, Glue, Athena, EMR, EC2, DynamoDB, Lambda, and Redis • Strong understanding of data warehousing and data modeling principles (e.g., star/snowflake schema, dimensional modeling) • Familiarity with dbt Labs and modern ELT/analytics engineering practices • Experience working with structured, semi-structured, and unstructured data • Knowledge of data governance, quality assurance, and observability …
business requirements into clear, actionable reporting solutions Essential Skills and Experience Strong, proven experience in Power BI development within an Asset Management environment Strong, proven experience in advanced data modelling, DAX, and data transformation Expertise with star schemas, dimensional modelling, semantic models, and dataflows Solid understanding of data security, access controls, and distribution practices Experience with data …