London, England, United Kingdom Hybrid / WFH Options
Peaple Talent
now looking for a Senior Data Consultant, specialising in Microsoft Fabric. We are looking for: Demonstrable data engineering/BI skills, with a focus on having delivered in Microsoft Azure Strong experience designing and delivering data solutions in Fabric, Azure Databricks or Azure Synapse Proficient with SQL and Python Great communication skills, effectively participating with Senior More ❯
skills including: MS SQL Server, T-SQL, indexing, stored procedures, relational/dimensional modelling, data dashboards. Any cloud platform experience would be helpful, such as Snowflake, Databricks, BigQuery, Azure SQL. Working closely with key stakeholders including data architects, analysts, testers and managers. Working across the full SQL development life-cycle including: design, development, documentation and testing. Advantageous Skills … Power BI, NoSQL, Azure Synapse, data visualisation tools, Data Lakes, streaming technologies, MDM, Git, DevOps pipelines. Benefits include: 25 days holiday (plus BHs) + pension + company bonus + More. More ❯
relational/dimensional modelling, data dashboards. Building/optimising data pipelines and integrations across cloud platforms. Any cloud platform experience would be helpful, such as Snowflake, Databricks, BigQuery, Azure SQL. Working closely with key stakeholders including data architects, analysts, testers and managers. Working across the full SQL development life-cycle including: design, development, documentation and testing. Advantageous Skills … Power BI, NoSQL, Azure Synapse, data visualisation tools, Data Lakes, streaming technologies, MDM, Git, DevOps pipelines. Benefits include: £40k-50k Base (DOE) + 25 days holiday (plus BHs) + pension + company bonus + More. More ❯
and a focus on GDPR at all times To be successful in this role you will have: Coding experience with Python/PySpark Data pipeline development experience utilising Azure Data Factory or Fabric Pipelines Experience working within an Azure environment such as Lakehouse Architecture, Data Lake, Delta Lake, Azure Synapse Strong SQL knowledge Strong communication More ❯
practice approaches to monitoring and error-handling Writing unit tests, Git version control Awareness of reliability patterns in ETL pipelines Techniques and tools for sanitizing data prior to use Azure data certifications Azure Synapse/Fabric: Synapse Link, Fabric Link, on-demand SQL engine Use of Python in Jupyter notebooks for data processing Azure storage More ❯
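To illustrate the data-sanitising and unit-testing practices these listings mention, here is a minimal Python sketch (pandas is assumed; the column names and cleaning rules are hypothetical, not taken from any listing):

```python
# Illustrative only: a minimal data-sanitisation step with a unit test.
# Column names and rules are hypothetical placeholders.
import pandas as pd


def sanitise_customers(df: pd.DataFrame) -> pd.DataFrame:
    """Trim whitespace, normalise emails and drop rows missing a customer id."""
    out = df.copy()
    out["customer_id"] = out["customer_id"].astype("string").str.strip()
    out["email"] = out["email"].astype("string").str.strip().str.lower()
    out = out.dropna(subset=["customer_id"])
    return out.drop_duplicates(subset=["customer_id"])


def test_sanitise_customers():
    raw = pd.DataFrame(
        {"customer_id": [" 1 ", None, " 1 "], "email": ["A@X.COM", "b@x.com", "A@X.COM"]}
    )
    clean = sanitise_customers(raw)
    assert list(clean["customer_id"]) == ["1"]
    assert list(clean["email"]) == ["a@x.com"]
```

A test like this would normally sit in version control next to the pipeline code and run in CI, which is where the Git and reliability-pattern requirements above come into play.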
London, England, United Kingdom Hybrid / WFH Options
Experis - ManpowerGroup
practice approaches to monitoring and error-handling Writing unit tests, Git version control Awareness of reliability patterns in ETL pipelines Techniques and tools for sanitizing data prior to use Azure data certifications Azure Synapse/Fabric: Synapse Link, Fabric Link, on-demand SQL engine Use of Python in Jupyter notebooks for data processing Azure storage More ❯
London, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
and a focus on GDPR at all times To be successful in this role you will have: Coding experience with Python/PySpark Data pipeline development experience utilising Azure Data Factory or Fabric Pipelines Experience working within an Azure environment such as Lakehouse Architecture, Data Lake, Delta Lake, Azure Synapse Strong SQL knowledge Strong communication More ❯
City of London, London, United Kingdom Hybrid / WFH Options
Cognitive Group | Part of the Focus Cloud Group
Full-time, Permanent Eligible for SC Clearance/Actively SC Cleared Be part of a growing consulting team with a leading Microsoft Partner. This role is for a Senior Azure Data & AI Consultant, who will deliver high-impact solutions that transform how clients operate using Artificial Intelligence. You’ll play a critical role in guiding organisations through the AI … and technical teams, helping clients unlock the full value of AI through innovation, collaboration, and expertise. What You'll Do Lead and deliver complex AI and data projects using Azure Data Services, Microsoft Fabric, and other modern data platforms Work closely with clients to understand business challenges and identify where AI and data-driven solutions can drive value Architect … development within the consultancy What We’re Looking For Proven experience delivering AI and data solutions in a Microsoft Partner or specialist technology consultancy environment Strong technical proficiency across Azure Synapse, Data Lake, Databricks, Power BI, Microsoft Fabric, and relevant AI services (e.g. Azure OpenAI, ML Studio) Deep understanding of modern data architectures, data governance, and operationalisation More ❯
South East London, England, United Kingdom Hybrid / WFH Options
Cognitive Group | Part of the Focus Cloud Group
Full-time, Permanent Eligible for SC Clearance/Actively SC Cleared Be part of a growing consulting team with a leading Microsoft Partner. This role is for a Senior Azure Data & AI Consultant, who will deliver high-impact solutions that transform how clients operate using Artificial Intelligence. You’ll play a critical role in guiding organisations through the AI … and technical teams, helping clients unlock the full value of AI through innovation, collaboration, and expertise. What You'll Do Lead and deliver complex AI and data projects using Azure Data Services, Microsoft Fabric, and other modern data platforms Work closely with clients to understand business challenges and identify where AI and data-driven solutions can drive value Architect … development within the consultancy What We’re Looking For Proven experience delivering AI and data solutions in a Microsoft Partner or specialist technology consultancy environment Strong technical proficiency across Azure Synapse, Data Lake, Databricks, Power BI, Microsoft Fabric, and relevant AI services (e.g. Azure OpenAI, ML Studio) Deep understanding of modern data architectures, data governance, and operationalisation More ❯
Lincoln, England, United Kingdom Hybrid / WFH Options
Experis UK
practice approaches to monitoring and error-handling Writing unit tests, Git version control Awareness of reliability patterns in ETL pipelines Techniques and tools for sanitizing data prior to use Azure data certifications Azure Synapse/Fabric: Synapse Link, Fabric Link, on-demand SQL engine Use of Python in Jupyter notebooks for data processing Azure storage More ❯
focus on quality, scalability, and value. What you'll be doing: Designing, building and maintaining robust, cloud-native data pipelines Automating and orchestrating workflows using tools like AWS Glue, Azure Data Factory, or Google Cloud Dataflow Working with platform services such as Amazon S3, Azure Synapse, Google BigQuery, and Snowflake Implementing Lakehouse architectures using tools like Databricks … got solid experience working with Python, SQL, Spark and data pipeline tools such as dbt or Airflow You're comfortable working across cloud platforms - especially AWS (Glue, Lambda, Athena), Azure (Data Factory, Synapse), or GCP (Dataflow, BigQuery, Cloud Composer) You have a good understanding of data modelling, data warehousing and performance optimisation You care deeply about data quality More ❯
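As a rough illustration of the workflow orchestration these roles describe, below is a minimal Apache Airflow sketch (Airflow 2.4+ is assumed; the DAG id, schedule and task bodies are hypothetical placeholders, and in practice the callables would trigger Glue jobs, Data Factory pipelines, Dataflow jobs or dbt runs):

```python
# Illustrative sketch of workflow orchestration with Apache Airflow.
# Names and task bodies are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pull raw files from the source system")  # placeholder


def transform():
    print("clean and model the data, e.g. with Spark or dbt")  # placeholder


def load():
    print("write curated tables to the warehouse")  # placeholder


with DAG(
    dag_id="daily_sales_pipeline",     # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                 # Airflow 2.4+ keyword
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load
```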
Edinburgh, Scotland, United Kingdom Hybrid / WFH Options
CreateFuture
focus on quality, scalability, and value. What you'll be doing: Designing, building and maintaining robust, cloud-native data pipelines Automating and orchestrating workflows using tools like AWS Glue, Azure Data Factory, or Google Cloud Dataflow Working with platform services such as Amazon S3, Azure Synapse, Google BigQuery, and Snowflake Implementing Lakehouse architectures using tools like Databricks … got solid experience working with Python, SQL, Spark and data pipeline tools such as dbt or Airflow You’re comfortable working across cloud platforms – especially AWS (Glue, Lambda, Athena), Azure (Data Factory, Synapse), or GCP (Dataflow, BigQuery, Cloud Composer) You have a good understanding of data modelling, data warehousing and performance optimisation You care deeply about data quality More ❯
as PL-SQL or Python/Spark, and experience of delivering metadata-driven development processes would be desirable. In addition, experience of building pipelines in cloud technologies such as Azure Data Factory/Synapse would be a bonus. Enjoy a healthy work/life balance with our great range of benefits When you join us, you’ll be … will we help you in return? You’ll have access to a wide range of tools that will enable you to develop and enhance your data skills – these include Microsoft Azure training courses via our partnership with Microsoft, online training such as DataCamp and Pluralsight, webinars, taught courses, one-to-one coaching and access to a large peer-support network. More ❯
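For readers unfamiliar with the metadata-driven development pattern mentioned above, here is a minimal illustrative sketch (the control-table schema and the copy step are hypothetical; in a real Azure Data Factory or Synapse implementation the control table would typically drive parameterised pipelines):

```python
# Illustrative sketch of a metadata-driven load: a small driver reads a
# control table and copies each listed source. Everything here is a placeholder.
import pandas as pd

# In practice this control table would live in a database or the lake;
# it is inlined here for illustration.
control = pd.DataFrame(
    [
        {"source": "crm.customers", "target": "staging.customers", "load_type": "full"},
        {"source": "erp.orders", "target": "staging.orders", "load_type": "incremental"},
    ]
)


def copy_table(source: str, target: str, load_type: str) -> None:
    # Placeholder for the real copy, e.g. triggering an ADF pipeline or a Spark job.
    print(f"{load_type} load: {source} -> {target}")


for row in control.itertuples(index=False):
    copy_table(row.source, row.target, row.load_type)
```

The appeal of the pattern is that adding a new source becomes a row in the control table rather than a new hand-written pipeline.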
30pm What you will be doing: Lead the design, development, and optimization of scalable data pipelines in a cloud-based environment Implement and manage data warehouses using Azure Synapse, ensuring data integrity and security Build, deploy, and manage ETL processes to support real-time and batch data processing using tooling across the Azure estate, Databricks, PySpark … and business needs Manage the ingestion, transformation, and processing of large datasets utilizing big data tooling Ensure optimal performance of data pipelines and infrastructure using cloud services such as Azure and AWS S3 Lead a team of data engineers, providing technical guidance and fostering a culture of continuous learning and improvement What we are looking for: 5+ years of … experience in data engineering Expertise in Azure DWH and AWS Databricks Strong programming skills in Python/PySpark or other relevant languages for data manipulation and ETL workflows Proficiency in SQL and experience with both relational (e.g., SQL Server, MySQL) and non-relational databases (e.g., MongoDB, Cassandra) Experience with AWS S3 and other AWS services related to big data More ❯
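To give a flavour of the PySpark/Databricks ETL work this role describes, here is a minimal batch-pipeline sketch (paths, mount points and column names are hypothetical; Delta Lake support is assumed, as on Databricks):

```python
# Illustrative PySpark batch ETL: ingest, transform, and write a Delta table.
# All paths and columns are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_batch_etl").getOrCreate()

# Ingest raw CSV landed in the data lake (e.g. ADLS Gen2 or S3).
raw = spark.read.option("header", True).csv("/mnt/landing/orders/*.csv")

# Transform: type the columns, derive a load date, drop incomplete rows.
curated = (
    raw.withColumn("order_amount", F.col("order_amount").cast("double"))
       .withColumn("order_date", F.to_date("order_date"))
       .withColumn("load_date", F.current_date())
       .dropna(subset=["order_id", "order_date"])
)

# Load: write a partitioned Delta table for downstream consumption,
# e.g. exposed to Azure Synapse or Power BI.
(curated.write
        .format("delta")
        .mode("overwrite")
        .partitionBy("order_date")
        .save("/mnt/curated/orders"))
```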
Birmingham, England, United Kingdom Hybrid / WFH Options
Kerv
practice approaches to monitoring and error-handling Writing unit tests, Git version control Awareness of reliability patterns in ETL pipelines Techniques and tools for sanitizing data prior to use Azure data certifications Azure Synapse/Fabric: Synapse Link, Fabric Link, on-demand SQL engine Use of Python in Jupyter notebooks for data processing Azure storage More ❯
Bolton, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
Job Description: My client is a leading data partner of Microsoft and is looking for an experienced consultant and architect with expertise in MS Fabric, Databricks, and the Azure tech stack to join their expanding team. Salary and Benefits: Competitive salary of £80k - £100k (DOE) Performance-related bonus of up to 10% Hybrid remote working (between once a … days annual leave Private medical care Free provision of MS data certifications Role and Responsibilities: Implementing MS Fabric solutions including Fabric Lakehouse and Warehouse solutions. Implementing Databricks and MS Azure solutions catered to specific client requirements. Presenting and explaining architectural decisions to a variety of key stakeholders. Support clients across a range of industries and sectors. Debugging and optimization … cost optimization. Develop governance of enterprise-level data environments. What do I need to apply for the role: Consultancy experience. Strong experience implementing Modern Data Warehouse architectures. Experience with Azure Data Engineer technologies like ADF, Azure SQL Database and Synapse. Confident in designing, implementing, and maintaining solutions using MS Fabric. Strong understanding of SQL and NoSQL databases. Solid More ❯
skills, technologies, and frameworks. Demonstrated ability to communicate complex technical information in a condensed manner to various stakeholders verbally and in writing. An expert working knowledge of Python and Azure Synapse. 2-3 years' experience working in a data analyst/engineering capacity. This is a fantastic opportunity for an individual looking to be an SME to a More ❯
London, England, United Kingdom Hybrid / WFH Options
Sarafin Partners
skills, technologies, and frameworks. Demonstrated ability to communicate complex technical information in a condensed manner to various stakeholders verbally and in writing. An expert working knowledge of Python and Azure Synapse. 2-3 years' experience working in a data analyst/engineering capacity. This is a fantastic opportunity for an individual looking to be an SME to a More ❯
London, England, United Kingdom Hybrid / WFH Options
Hamilton Barnes
Location: Work From Home - Need to be based around either London or the Midlands Role: As an Azure Data Engineer within a managed service provider, you will design, develop, and maintain modern data solutions leveraging the Microsoft Azure ecosystem. Your responsibilities include building robust data pipelines, integrating diverse data sources … and enabling advanced analytics using services like Azure Data Factory, Synapse Analytics, Data Lake, and Microsoft Fabric. You'll play a key role in helping clients unlock the value of their data through scalable, secure, and efficient solutions, while supporting ongoing data platform operations in a managed services environment. Responsibilities: Work for one of their end … clients, providing expertise from Business Intelligence (BI) to Microsoft Fabric. Plan for core data migration to Fabric as part of an Azure migration. Engage with and interpret data to derive insights. Skills: Microsoft Fabric Power BI What’s in It for You? Opportunity to work with cutting-edge Microsoft technologies. Career progression opportunities within a leading IT solutions provider. More ❯
Azure Data Engineer - IBM Introduction In this role, you’ll work in one of our IBM Consulting Client Innovation Centres (Delivery Centres), where we deliver deep technical and industry expertise to … pension plan of an additional 5% of your base salary paid by us monthly to save for your future. Your Role And Responsibilities A data engineer with expertise in the Azure toolset advises on, develops, and maintains data engineering solutions on the Azure Cloud ecosystem. They design, build, and operate batch and real-time data pipelines using Azure services such as Azure Synapse Analytics, Azure Data Factory, Azure Databricks, and Event Hub. This role also involves designing, building, and operating the data layer on Azure Synapse Analytics, SQL DW, and Cosmos DB. The data engineer is proficient in Azure Data Platform components, including ADLS2, Blob Storage, SQL DW, Synapse More ❯
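As an illustration of the real-time ingestion side mentioned above, below is a minimal sketch using the azure-eventhub Python SDK (the connection string, hub name and downstream handling are hypothetical placeholders, not taken from the listing):

```python
# Illustrative sketch: consume events from Azure Event Hubs.
# Connection string and hub name are hypothetical placeholders.
from azure.eventhub import EventHubConsumerClient

CONNECTION_STR = "<event-hub-namespace-connection-string>"  # placeholder
EVENTHUB_NAME = "telemetry"                                 # placeholder


def on_event(partition_context, event):
    # In a real pipeline this record would be validated and written onward,
    # e.g. to a Delta table or a Synapse staging table.
    print(f"partition {partition_context.partition_id}: {event.body_as_str()}")
    partition_context.update_checkpoint(event)


client = EventHubConsumerClient.from_connection_string(
    CONNECTION_STR, consumer_group="$Default", eventhub_name=EVENTHUB_NAME
)

with client:
    client.receive(on_event=on_event, starting_position="-1")  # "-1" = read from the start
```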
Job Title: Solutions Architect (Data Analytics) - Pre-sales, RFP creation Location: London (3 days/week onsite) Duration: Permanent Responsibilities: 16-18+ years of total experience in DWBI, Big Data, Cloud Technologies Implementation experience and hands-on experience in either of the 2 Cloud technologies – Azure, AWS, GCP, Snowflake, Databricks Must Have Hands-on experience on at least 2 Hyperscalers (GCP/AWS/Azure platforms) and specifically in Big Data processing services (Apache Spark, Beam or equivalent). In-depth knowledge of key technologies like BigQuery/Redshift/Synapse/Pub Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF etc. Excellent consulting experience and ability to design and build solutions, actively contribute More ❯
Solutions Architect (Data Analytics), Slough, United Kingdom. Job Description: Responsibilities: 16-18+ years of total experience in DWBI, Big Data, Cloud Technologies Implementation experience and hands-on experience in either of the 2 Cloud technologies – Azure, AWS, GCP, Snowflake, Databricks Must Have Hands-on experience on at least 2 Hyperscalers (GCP/AWS/Azure platforms) and specifically in Big Data processing services (Apache Spark, Beam or equivalent). In-depth knowledge of key technologies like BigQuery/Redshift/Synapse/Pub Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF etc. Excellent consulting experience and ability to design and build solutions, actively contribute to RfP response. Ability to be a SPOC for all technical discussions across industry groups. Excellent design experience, with entrepreneurship skills to own and lead solutions for clients More ❯
Vallum Associates Job Title: Solutions Architect (Data Analytics) - Pre-sales, RFP creation Location: London (3 days/week onsite) Duration: Permanent Responsibilities: 16-18+ years of total experience in DWBI, Big Data, Cloud Technologies Implementation experience and hands-on experience in either of the 2 Cloud technologies – Azure, AWS, GCP, Snowflake, Databricks Must Have Hands-on experience on at least 2 Hyperscalers (GCP/AWS/Azure platforms) and specifically in Big Data processing services (Apache Spark, Beam or equivalent). In-depth knowledge of key technologies like BigQuery/Redshift/Synapse/Pub Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF etc. Excellent consulting experience and More ❯
London, England, United Kingdom Hybrid / WFH Options
RGH-Global Limited
About the job Data and Analytics Senior Developer Join Our Client's Team as a Data and Analytics Senior Developer! Location: Greenwich, Hybrid (minimum 2 days in office per week) Hours: Monday - Friday, 8:30am - 5:30pm Contract: 1 Year Role Overview: We are seeking a skilled Data and Analytics Senior Developer to join our Client's dynamic … data. Schema Design: Proficiency in star schema structure & design and understanding of Kimball & Inmon hybrid data warehouse design. Cloud Data Products: Experience with Data Factory, Event Hubs, Data Lake, Synapse, Azure SQL Server. Databricks & PySpark: Expertise in developing with Databricks and coding with PySpark and Spark SQL. Coding Standards: Ensuring ETL code is standardized, self-documenting, and reliably …/Public Sector IT. Emerging Technologies: Interest in the latest technologies for designing and delivering enterprise-wide solutions. Agile Environment: Experience with tools and technology supporting the Data and Analytics development lifecycle in an agile scrum environment. Complex Solutions Design: Expertise in designing and developing complex data and analytics solutions for large enterprise business/data warehouse implementations. More ❯
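To illustrate the star-schema and Spark SQL skills this role calls for, here is a minimal sketch in the Kimball spirit (table and column names are hypothetical; Delta Lake is assumed, as on Databricks):

```python
# Illustrative star-schema sketch using Spark SQL.
# Table and column names are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("star_schema_demo").getOrCreate()

# Dimension: one row per customer, keyed by a surrogate key.
spark.sql("""
    CREATE TABLE IF NOT EXISTS dim_customer (
        customer_key   BIGINT,
        customer_id    STRING,
        customer_name  STRING,
        country        STRING
    ) USING delta
""")

# Fact: one row per order line, referencing the dimension by surrogate key.
spark.sql("""
    CREATE TABLE IF NOT EXISTS fact_sales (
        date_key       INT,
        customer_key   BIGINT,
        product_key    BIGINT,
        quantity       INT,
        net_amount     DECIMAL(18,2)
    ) USING delta
""")

# Typical analytical query: revenue by country via a fact-to-dimension join.
revenue_by_country = spark.sql("""
    SELECT d.country, SUM(f.net_amount) AS revenue
    FROM fact_sales f
    JOIN dim_customer d ON f.customer_key = d.customer_key
    GROUP BY d.country
""")
revenue_by_country.show()
```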