Glasgow, Lanarkshire, United Kingdom Hybrid / WFH Options
Uniting People
global compliance standards. Core Responsibilities Solution Architecture & Design Define target architecture for Amazon Connect across voice, chat, and tasks, ensuring high availability, disaster recovery, and multi-region resilience. Apply AWS Well-Architected Framework principles and enterprise architecture standards for security, sustainability, and cost optimization. Implementation & Migration Lead end-to-end migration from legacy platforms (Avaya, Cisco, Genesys) to Amazon … with CRM/ITSM platforms (Salesforce, ServiceNow, Dynamics, Pega), WFM/QM systems, and identity management solutions. Build API-driven architectures, event streams, and real-time analytics pipelines using AWS services (Lambda, EventBridge, Kinesis, Glue, Athena). Security, Compliance & Governance Implement IAM, KMS encryption, VPC networking, and PrivateLink for secure connectivity. Ensure compliance with GDPR/UK GDPR … NHS DSP Toolkit, HIPAA). Define data retention policies, PIA/DPIA frameworks, and lawful intercept/emergency call handling. DevOps & Automation Establish CI/CD pipelines for flows, Lambda, Lex bots, and infrastructure using CloudFormation/Terraform/CDK. Implement automated testing and version control for safe, repeatable deployments. Operational Excellence Define KPIs/SLAs (AHT, CSAT/…
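The listing above mentions Lambda integrations behind Amazon Connect contact flows. As an illustrative sketch only (not the employer's actual code), here is a minimal Lambda handler of the kind such a flow would invoke to look up a caller before routing; the DynamoDB table name and returned attribute keys are hypothetical.

```python
# Minimal sketch: an AWS Lambda handler invoked from an Amazon Connect contact flow.
# Connect passes contact details in the event payload; the handler looks the caller
# up in a hypothetical DynamoDB table and returns flat string attributes for the flow.
import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("customer-profiles")  # hypothetical table name


def lambda_handler(event, context):
    # Amazon Connect places the caller's number under Details.ContactData.
    phone = (
        event.get("Details", {})
        .get("ContactData", {})
        .get("CustomerEndpoint", {})
        .get("Address", "")
    )

    item = table.get_item(Key={"phone_number": phone}).get("Item")

    # Connect expects a flat dictionary of string values back from Lambda.
    if item:
        return {"customerFound": "true", "customerName": str(item.get("name", ""))}
    return {"customerFound": "false"}
```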
We are a Global Recruitment specialist that provides support to clients across EMEA, APAC, US and Canada. Role Title: Sr. Databricks Engineer (AWS) Location: Glasgow Duration: 31/12/2026 Days on site: 2-3 Rate: £402/day on Umbrella Role Description: We are currently migrating our data pipelines from AWS to Databricks, and are seeking … this transformation. This is a hands-on engineering role focused on designing, building, and optimizing scalable data solutions using the Databricks platform. Key Responsibilities: Lead the migration of existing AWS-based data pipelines to Databricks. Design and implement scalable data engineering solutions using Apache Spark on Databricks. Collaborate with cross-functional teams to understand data requirements and translate them … Must-Have Skills: Strong hands-on experience with Databricks and Apache Spark (preferably PySpark). Proven track record of building and optimizing data pipelines in cloud environments. Experience with AWS services such as S3, Glue, Lambda, Step Functions, Athena, IAM, and VPC. Proficiency in Python for data engineering tasks. Familiarity with GitLab for version control and CI/…
Sr. Databricks Engineer Location: Glasgow Days on site: 2-3 Pay Rate: £405 per day MUST BE PAYE THROUGH UMBRELLA Role Description: We are currently migrating our data pipelines from AWS to Databricks, and are seeking a Senior Databricks Engineer to lead and contribute to this transformation. This is a hands-on engineering role focused on designing, building, and optimizing scalable data solutions using the Databricks platform. Key Responsibilities: Lead the migration of existing AWS-based data pipelines to Databricks. Design and implement scalable data engineering solutions using Apache Spark on Databricks. Collaborate with cross-functional teams to understand data requirements and translate them into efficient pipelines. Optimize performance and cost-efficiency of Databricks workloads. … Strong hands-on experience with Databricks and Apache Spark (preferably PySpark). Proven track record of building and optimizing data pipelines in cloud environments. Experience with AWS services such as S3, Glue, Lambda, Step Functions, Athena, IAM, and VPC. Proficiency in Python for data engineering tasks. Familiarity with GitLab for version control and …
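Both Databricks listings describe moving AWS-based pipelines onto Apache Spark on Databricks. The sketch below illustrates that kind of job, assuming a hypothetical S3 source path and target Delta table; it is a minimal example, not part of either role's codebase.

```python
# Illustrative PySpark sketch: read raw data from S3, apply simple transformations,
# and write a partitioned Delta table on Databricks. Paths and table names are
# placeholders, not client specifics.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()  # provided by the Databricks runtime

# Hypothetical source bucket of JSON event files.
raw = spark.read.json("s3://example-raw-bucket/events/")

cleaned = (
    raw.filter(F.col("event_type").isNotNull())
    .withColumn("event_date", F.to_date("event_timestamp"))
    .dropDuplicates(["event_id"])
)

# Write the result as a Delta table, partitioned by day.
(
    cleaned.write.format("delta")
    .mode("overwrite")
    .partitionBy("event_date")
    .saveAsTable("analytics.events_cleaned")
)
```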
Edinburgh, York Place, City of Edinburgh, United Kingdom
Bright Purple
helping shape and deliver scalable, cloud-native data solutions for household-name clients. What you’ll be doing Designing, building and maintaining robust data pipelines Automating and orchestrating workflows (AWS Glue, Azure Data Factory, GCP Dataflow) Working across leading cloud platforms (AWS, Azure, or GCP) Implementing and optimising modern data architectures (e.g. Databricks, Snowflake) Collaborating with multidisciplinary teams … experience with Python, SQL, and pipeline tools such as dbt or Airflow Proven background in data modelling, warehousing, and performance optimisation Hands-on experience with cloud data services (Glue, Lambda, Synapse, BigQuery, etc.) A consultancy mindset – adaptable, collaborative, and delivery-focused The details Location: Edinburgh – 2 days onsite per week Duration: 3 months initially Day Rate: c.£500/…
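The Edinburgh role lists dbt and Airflow among its pipeline tools. As a minimal sketch (assuming Apache Airflow 2.4+ with the standard Bash operator), the DAG below runs and then tests a dbt project on a daily schedule; the DAG id, schedule, and project path are placeholders, not details of the engagement.

```python
# Illustrative Airflow DAG: orchestrate a daily dbt build followed by dbt tests.
# Assumes dbt is installed on the Airflow worker; paths are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_dbt_build",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Build the dbt models against the warehouse.
    run_models = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/project",
    )

    # Run dbt tests only after the models build successfully.
    test_models = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/project",
    )

    run_models >> test_models
```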