queries, including common table expressions (CTEs), window functions, and complex joins. Experience with query optimization and performance tuning on relational databases such as PostgreSQL, MySQL, or similar.
Cloud data ecosystem (AWS): hands-on experience with core AWS data services. Key services include S3 for data lake storage, AWS Glue for ETL and data cataloging, Amazon Redshift or Athena … for data warehousing and analytics, and Lambda for event-driven data processing.
ETL/ELT pipeline development: experience designing, building, and maintaining robust, automated data pipelines. You should be comfortable with both the theory and the practical application of extracting, transforming, and loading data between systems.
Programming for data: strong scripting skills, including Python.
Infrastructure as code (IaC): experience deploying … and managing cloud infrastructure using tools like Terraform or AWS CDK/CloudFormation.
Data modelling and warehousing:
Dimensional Data Modeling: deep understanding of data warehousing concepts and best practices, and experience transforming raw transactional data into well-structured, analytics-ready datasets using schemas such as the star schema (Kimball methodology).
Data Quality & Governance: build trust in data
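For illustration, the kind of SQL this listing describes might look like the following minimal sketch, which combines a CTE with a window function. The table, column names, and data are invented, and SQLite is used here only so the snippet is self-contained and runnable; the same pattern applies to PostgreSQL, MySQL, Redshift, or Athena.

```python
# Minimal, hypothetical sketch: a CTE plus a window function, run against an
# in-memory SQLite database purely for illustration. All names are invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer_id INTEGER, order_date TEXT, amount REAL);
    INSERT INTO orders VALUES
        (1, '2024-01-05', 120.0),
        (1, '2024-02-10',  80.0),
        (2, '2024-01-20', 200.0);
""")

query = """
WITH monthly AS (                                -- CTE: aggregate to customer/month grain
    SELECT customer_id,
           substr(order_date, 1, 7) AS month,
           SUM(amount)              AS monthly_spend
    FROM orders
    GROUP BY customer_id, month
)
SELECT customer_id,
       month,
       monthly_spend,
       SUM(monthly_spend) OVER (                 -- window function: running total per customer
           PARTITION BY customer_id ORDER BY month
       ) AS running_spend
FROM monthly
ORDER BY customer_id, month;
"""

for row in conn.execute(query):
    print(row)
```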
Technical Lead – Power BI/AWS – SC Cleared (short term contract)
We are seeking an experienced Technical Lead to guide the final delivery phase of a Proof of Concept, developing a KPI Framework Power BI Dashboard. The successful candidate will shape the future-state solution architecture, evaluate the existing data and reporting infrastructure, and steer the technical direction …
Required Skills & Experience
Proven experience designing and implementing Power BI architectures at enterprise scale.
Strong expertise in data modelling, ETL, and enterprise data architecture.
Hands-on experience with AWS data services (e.g. Glue, S3, Lambda, Redshift).
Experience building reusable data ingestion and transformation pipelines.
Knowledge of metadata management, data governance, and data lineage practices.
Skilled at
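As a rough sense of the event-driven ingestion work this role mentions, the sketch below shows a hypothetical AWS Lambda handler that picks up a newly landed S3 object, applies a trivial transformation, and writes the result to a curated prefix. The bucket layout, prefixes, and record fields are assumptions for illustration; error handling, schema validation, and cataloging (e.g. via Glue) are omitted.

```python
# Hypothetical sketch of an event-driven ingestion step: Lambda handler
# triggered by an S3 put event. All names and the record shape are invented.
import json

import boto3

s3 = boto3.client("s3")


def handler(event, context):
    # S3 put events arrive as a list of records; process each new object.
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]

        raw = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
        rows = json.loads(raw)

        # Trivial transform: keep only the fields downstream reporting needs.
        curated = [{"id": r["id"], "amount": r["amount"]} for r in rows]

        s3.put_object(
            Bucket=bucket,
            Key=key.replace("raw/", "curated/"),
            Body=json.dumps(curated).encode("utf-8"),
        )
```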
Edinburgh, York Place, City of Edinburgh, United Kingdom
Bright Purple
helping shape and deliver scalable, cloud-native data solutions for household-name clients.
What you’ll be doing
Designing, building and maintaining robust data pipelines
Automating and orchestrating workflows (AWS Glue, Azure Data Factory, GCP Dataflow)
Working across leading cloud platforms (AWS, Azure, or GCP)
Implementing and optimising modern data architectures (e.g. Databricks, Snowflake)
Collaborating with multidisciplinary teams … experience with Python, SQL, and pipeline tools such as dbt or Airflow
Proven background in data modelling, warehousing, and performance optimisation
Hands-on experience with cloud data services (Glue, Lambda, Synapse, BigQuery, etc.)
A consultancy mindset – adaptable, collaborative, and delivery-focused
The details
Location: Edinburgh – 2 days onsite per week
Duration: 3 months initially
Day Rate: c.£500/
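For a sense of what the orchestration side of this work can look like in practice, here is a minimal, hypothetical Airflow DAG that extracts and cleans some data with a Python task and then triggers a dbt run. The DAG id, schedule, and dbt project path are invented, and a real pipeline would be considerably more involved.

```python
# Hypothetical Airflow DAG sketch: one Python extract/clean task followed by a
# dbt run. All identifiers and paths are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def extract_and_clean(**context):
    # Placeholder transform step: a real pipeline might read from S3 (e.g. via
    # boto3), validate the data, and land it in a staging table.
    print("extracting and cleaning raw data")


with DAG(
    dag_id="example_daily_sales_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",          # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    extract = PythonOperator(
        task_id="extract_and_clean",
        python_callable=extract_and_clean,
    )

    # Run dbt models over the staged data (assumes a dbt project at this path).
    transform = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/example_project",
    )

    extract >> transform
```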