countries in North America, EMEA, and APAC. Our teams bring extensive cross-sector knowledge in critical technology areas such as mobility, software services, robotics, simulations, cybersecurity, AI, and data analytics, enabling clients to tackle complex challenges in today’s rapidly evolving markets. Scope: Akkodis is launching a new technical delivery team to drive a UK national programme in collaboration … dedicated to building a scalable, data-driven recruitment ecosystem. By redesigning, building, and rolling out a sophisticated Big Data system, our diverse roles span architecture, project management, data analytics, development, and technical support, giving you the chance to shape a dynamic, next-generation digital infrastructure. You’ll be integral to our mission of crafting a seamless, powerful platform … Glue, DataSync, DMS, Step Functions, Redshift, DynamoDB, Athena, Lambda, RDS, EC2, S3 data lake, and CloudWatch for building and optimising ETL pipelines and data migration workflows. Working knowledge of Azure data engineering tools, including ADF (Azure Data Factory), Azure DB, Azure Synapse, Azure Data Lake, and Azure Monitor, provides added flexibility for diverse …
dedicated to building a scalable, data-driven recruitment ecosystem. By redesigning, building, and rolling out a sophisticated Big Data system, our diverse roles span architecture, project management, data analytics, development, and technical support, giving you the chance to shape a dynamic, next-generation digital infrastructure. You’ll be integral to our mission of crafting a seamless, powerful platform … data modelling techniques (Star Schema, Snowflake Schema). Familiarity with security frameworks (GDPR, HIPAA, ISO 27001, NIST, SOX, PII) and AWS security features (IAM, KMS, RBAC). Knowledge of Azure data engineering tools (ADF, Azure DB, Azure Synapse, Azure Data Lake, Azure Monitor) for hybrid migration scenarios. Proficiency with cloud automation/orchestration tools … quality, identify inconsistencies, and resolve migration issues. Ability to manage end-to-end migration projects, ensuring accuracy, meeting timelines, and collaborating with technical and non-technical stakeholders. AWS or Azure cloud certifications preferred. Required Experience: Proven experience in data migration, data management, or ETL development. Experience working with ETL tools and database management systems. Familiarity with data integrity and …
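For illustration, the Star Schema modelling technique named in the requirements above can be sketched as a central fact table joined out to dimension tables. This is a minimal, self-contained example using SQLite; the table and column names (`fact_application`, `dim_candidate`, `dim_date`) are hypothetical and not taken from the role description.

```python
import sqlite3

# A toy star schema: one fact table with foreign keys into two dimensions.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
CREATE TABLE dim_candidate (candidate_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE dim_date (date_id INTEGER PRIMARY KEY, iso_date TEXT);
-- The fact table holds measures plus keys referencing each dimension.
CREATE TABLE fact_application (
    candidate_id INTEGER REFERENCES dim_candidate(candidate_id),
    date_id INTEGER REFERENCES dim_date(date_id),
    applications INTEGER
);
""")

cur.execute("INSERT INTO dim_candidate VALUES (1, 'A. Smith')")
cur.execute("INSERT INTO dim_date VALUES (20240101, '2024-01-01')")
cur.execute("INSERT INTO fact_application VALUES (1, 20240101, 3)")
conn.commit()

# A typical star-schema query: join the fact table out to its dimensions.
row = cur.execute("""
    SELECT c.name, d.iso_date, f.applications
    FROM fact_application f
    JOIN dim_candidate c USING (candidate_id)
    JOIN dim_date d USING (date_id)
""").fetchone()
print(row)  # ('A. Smith', '2024-01-01', 3)
```

A Snowflake Schema differs only in that the dimension tables themselves are further normalised into sub-dimension tables.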
Luton, Bedfordshire, England, United Kingdom Hybrid/Remote Options
Tenth Revolution Group
Consultant Location: Hybrid in London Salary: Up to £90,000 + Benefits Are you passionate about data engineering? Join a leading Microsoft-partnered consultancy delivering cutting-edge solutions with Azure, Power BI, and Microsoft Fabric. This is your chance to work on exciting projects, design robust data architectures, and help organisations unlock the full potential of their data. Key … Responsibilities: Build and optimise data pipelines and ETL workflows using Microsoft Fabric and Azure Synapse or Databricks. Implement scalable solutions for data ingestion, storage, and transformation. Develop clean, reusable Python code for data engineering tasks. Research and integrate the latest cloud-based technologies. Requirements: Proven experience in data engineering with Azure tools (Synapse, Data Factory, Databricks …
building scalable BI solutions and working with cutting-edge data technologies? We are seeking a BI Developer to join a dynamic team and help shape the future of data analytics within a global organisation. About the role: Design and develop BI solutions using Microsoft Fabric and related technologies. Build and manage data pipelines leveraging Data Factory. Develop semantic models … Collaborate with data architects, analysts, and stakeholders to deliver actionable insights. Optimise data models for performance and reusability. Support governance, security, and compliance best practices. Key Responsibilities: Deliver scalable Azure-based data platforms, including Data Warehouses and reporting tools. Provide technical support and manage a modern technology stack (Azure Synapse, SSIS, SQL, Data Lake). Assist with …
ETL pipelines using Microsoft-based tools (Data Factory, Fabric). Maintain a medallion architecture (Bronze–Gold) for trusted, refined datasets. Develop, optimise, and maintain complex SQL queries to support analytics and reporting requirements. Implement data quality, testing, and observability; ensure lineage, accuracy, and compliance. Enable self-serve analytics through well-documented models and transformation logic. Integrate internal/… Partner cross-functionally to deliver fit-for-purpose data solutions. Proactively identify opportunities for continuous improvement. What you can already do: Minimum 3 years’ experience in data engineering, data analytics, and BI. Proficiency in Python and SQL. Experience delivering technology projects in fast-paced, medium-sized organisations. Deliver solutions within the appropriate framework and methodology … whilst ensuring the supportability of services delivered. Experience working with ETL (Extract-Transform-Load) pipelines. Proven experience building and operating pipelines on Azure (ADF, Synapse, Fabric). Familiarity with version control systems (Git, GitHub) and CI/CD best practices. Excellent understanding of Power BI Service and Fabric. Strong grasp of data modelling and warehousing concepts such as MS …
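The medallion architecture mentioned above refines raw data through successive layers, conventionally Bronze (raw), Silver (cleaned), and Gold (aggregated, reporting-ready). The sketch below is a minimal plain-Python illustration of that pattern only; the field names and toy records are assumptions for demonstration, not the real schema or tooling (an actual implementation would run in Data Factory, Fabric, or Spark).

```python
# Bronze layer: raw, as-ingested records, possibly containing bad rows.
bronze = [
    {"id": "1", "amount": "10.5", "region": " uk "},
    {"id": "2", "amount": "n/a", "region": "UK"},   # unparseable amount
    {"id": "3", "amount": "4.5", "region": "uk"},
]

def to_silver(rows):
    """Silver layer: drop unparseable rows, normalise types and values."""
    out = []
    for r in rows:
        try:
            amount = float(r["amount"])
        except ValueError:
            continue  # data-quality rule: reject rows with bad amounts
        out.append({"id": int(r["id"]),
                    "amount": amount,
                    "region": r["region"].strip().upper()})
    return out

def to_gold(rows):
    """Gold layer: aggregate refined rows into a reporting-ready summary."""
    totals = {}
    for r in rows:
        totals[r["region"]] = totals.get(r["region"], 0.0) + r["amount"]
    return totals

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'UK': 15.0}
```

Each layer is persisted separately so downstream consumers can trust the Gold tables while lineage back to Bronze remains auditable.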
+ package Basildon (4 days a week on site) On-site parking available Are you an experienced Data Engineer with strong cloud experience in GCP, Azure, or AWS? This is an exciting opportunity to join a global data engineering organisation delivering one of Europe’s most significant data platform modernisation programmes. You’ll help migrate complex legacy data systems … ability to lead on projects whilst also supporting the development of more junior engineers. Experience working with GCP is preferable, but if you come from an Azure/AWS environment and are able to work with GCP, that is fine. As a Senior Data Engineer focused on migration and transformation, you will: Support and lead … the transition from legacy data systems into a modern cloud platform (GCP/Azure/AWS). Design, build, and optimise production-grade data pipelines and cloud-native data engineering solutions. Develop reusable data patterns and implement automated data lineage and scalable architectures. Contribute to the migration from on-premise systems (e.g., Teradata) to a cloud environment, ensuring performance …
office) Type: Permanent Responsibilities: 15–20 years of total experience in DW/BI, Big Data, and Cloud Technologies. Implementation and hands-on experience in at least two of the following cloud technologies: Azure, AWS, GCP, Snowflake, Databricks. Must have hands-on experience with at least 2 hyperscalers (GCP/AWS/Azure platforms), specifically in Big Data processing services (Apache … Spark, Beam, or equivalent). In-depth knowledge of key technologies such as BigQuery/Redshift/Synapse, Pub/Sub/Kinesis/MQ/Event Hubs, Kafka, and Dataflow/Airflow/ADF. Excellent consulting experience and ability to design and build solutions, actively contributing to RFP responses. Ability to be a SPOC for all technical discussions … and verbal communication skills. A minimum of 5 years’ experience in a similar role. Ability to lead and mentor architects. Mandatory Skills [at least 2 hyperscalers]: GCP, AWS, Azure, Big Data, Apache Spark, Beam on BigQuery/Redshift/Synapse, Pub/Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF. Desirable Skills …