Swindon, Wiltshire, South West, United Kingdom Hybrid/Remote Options
Experis
engineering, with a strong focus on Azure Databricks.
Key Skills & Experience
Proven expertise in Azure Databricks (must-have)
Hands-on experience with DBT (Data Build Tool)
Strong knowledge of Snowflake
Solid background in designing, building, and optimizing data pipelines
Ability to collaborate effectively in hybrid working arrangements
All profiles will be reviewed against the required skills and experience. Due to More ❯
have a number of years' hands-on experience with cloud services, especially Databricks, for building and managing scalable data pipelines. ETL process expertise is essential. Proficiency in working with Snowflake or similar cloud-based data warehousing solutions is also essential. Experience in data development and solutions in highly complex data environments with large data volumes is also required. You will More ❯
Cognitive Services and OpenAI
Proven ability to deploy Azure services and Infrastructure as Code (IaC)
Deep knowledge of Terraform, automation and Infrastructure as Code (IaC)
Strong working knowledge of Snowflake engineering and operations
Solid understanding of security, networking, identity management and access control (automated via Terraform)
Good awareness of the DevOps approach
Excellent communication skills and stakeholder engagement
Nice to More ❯
Strong background in modern data platform technologies, with hands-on experience in programming (such as SQL, Python, and Terraform), cloud environments (Azure preferred), and advanced data warehousing solutions (ideally Snowflake). Strong knowledge of data management and governance practices. Experience in the life science industry is advantageous. Technical proficiency in data integration and analytics tools. Ability to collaborate with cross More ❯
Desford, Peckleton Common, Leicestershire, United Kingdom
Seismic Recruitment
Extract information from internal systems using SQL or EDFL to push data into Power BI
Create and format data structures within safe databases and/or internal systems like Snowflake or SharePoint
Continuously enhance data collection, storage, and analysis methods
What We Are Looking For:
Strong understanding of statistical analysis and data visualization tools (e.g., Tableau, Power BI)
Degree in More ❯
Leicester, Leicestershire, East Midlands, United Kingdom
Entech Technical Solutions Ltd
of standard work, automation of data flows and understanding stakeholder
Extensive experience of creating dashboards using most of the following: Power BI, Power Apps, Power Automate, Alteryx, Tableau, Snowflake databases
Strong Excel skills
To £21.50 FCSA Umbrella (this role is deemed inside IR35) or to £16.00 PAYE
Required hours: 8.00am-4.45pm Mon-Thurs, earlier finish Friday
Please apply for further More ❯
management; using Python and R for data analysis and data automation; creating and maintaining business intelligence systems, including MS Excel spreadsheets and MS Access databases; data visualization tools, including Snowflake and Tableau; and Agile methodologies, C++, erwin, MySQL Server, Python, R, RDBMS, SAS, Scrum methodologies, and SQL. Telecommuting is available up to 2 days per week. Job Location: New York More ❯
Leeds, West Yorkshire, Yorkshire and the Humber, United Kingdom
AND Digital
architecture (e.g., cloud platform or service fabric) in your choices.
Technology Adaptation: Model data structures across various environments including relational databases, NoSQL stores, and cloud-based data platforms (e.g., Snowflake, Databricks).
Experience:
Proven experience in data modelling, architecture, or database design
Proficiency with modelling tools such as ERwin, PowerDesigner, or similar
Strong understanding of relational and NoSQL data structures More ❯
such as modern data platforms, data product engineering, data marketplace architecture, data developer portals, platform engineering. Experience co-selling partner solutions with hyperscalers or platforms (e.g. AWS, Azure, GCP, Snowflake, Databricks). Outstanding communication skills - able to translate complex ideas for both technical and business audiences. Demonstrated thought leadership in AI/ML such as speaking at industry events, contributing More ❯
buildkite) and Docker
Cloud: You have worked with cloud-based environments before (we use AWS)
SQL: You have a good grasp of SQL, particularly with cloud data warehouses like Snowflake
Version control: You are proficient with git
Soft Skills: You are an excellent communicator, with an ability to translate non-technical requirements into clear, actionable pieces of work You have More ❯
CPG environments. Proven success managing large, distributed teams and complex client programmes. Strong understanding of the programmatic ecosystem, ad verification tools, data privacy regulations, and analytics technologies (e.g., BigQuery, Snowflake, APIs, CDPs, DMPs). Strategic and analytical mindset with a data-driven approach to decision-making. Excellent communication and stakeholder management skills, able to influence across all levels. Highly organised More ❯
shape and maintain a modern enterprise data platform. In this role, you'll design and optimise scalable data pipelines and models that bring data from core business systems into Snowflake, enabling analytics, reporting, and data-driven insights across the organisation. You'll translate strategic data architecture into robust technical solutions, ensuring the platform is reliable, performant, and well-structured. You'll … modelled data to support decision-making, operational reporting, and future AI/ML capabilities.
Key Responsibilities
Data Engineering Delivery: Build and maintain high-quality data pipelines and models in Snowflake to support analytics and reporting needs.
Architecture Implementation: Apply defined data architecture standards to ingestion, transformation, storage, and optimisation processes.
Pipeline Development: Develop robust ELT/ETL workflows using dbt … and orchestration tools, ensuring reliability and maintainability.
Performance & Cost Optimisation: Configure Snowflake warehouses and implement query optimisation techniques for efficiency.
Data Quality & Governance: Apply data quality checks, lineage tracking, and security standards aligned with InfoSec and regulatory requirements.
Feature Adoption: Leverage Snowflake capabilities (Tasks, Streams, Snowpark, Time Travel, Secure Data Sharing) to improve automation and accessibility.
Collaboration: Work closely with More ❯
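The Feature Adoption point above names Snowflake Tasks and Streams. As a rough illustration of that pattern, and not something taken from the advert, the sketch below uses the Snowflake Python connector to create a stream on a hypothetical RAW.ORDERS table and a scheduled task that copies new rows into a reporting table; every object, warehouse, and credential name is an assumption.

```python
# Illustrative sketch only: account, warehouse, and table names are hypothetical.
import snowflake.connector

# Connection details would normally come from a secrets manager, not literals.
conn = snowflake.connector.connect(
    account="my_account",      # hypothetical account identifier
    user="etl_service_user",   # hypothetical service user
    password="***",
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
)

statements = [
    # A stream captures row-level changes (CDC) on the raw table.
    "CREATE STREAM IF NOT EXISTS RAW.ORDERS_STREAM ON TABLE RAW.ORDERS",
    # A task runs on a schedule and only does work when the stream has new data.
    """
    CREATE TASK IF NOT EXISTS RAW.LOAD_ORDERS_TASK
      WAREHOUSE = TRANSFORM_WH
      SCHEDULE = '15 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('RAW.ORDERS_STREAM')
    AS
      INSERT INTO ANALYTICS.ORDERS_CLEAN
      SELECT ORDER_ID, CUSTOMER_ID, ORDER_TS, AMOUNT
      FROM RAW.ORDERS_STREAM
      WHERE METADATA$ACTION = 'INSERT'
    """,
    # Tasks are created suspended; resume to start the schedule.
    "ALTER TASK RAW.LOAD_ORDERS_TASK RESUME",
]

with conn.cursor() as cur:
    for stmt in statements:
        cur.execute(stmt)
conn.close()
```

Consuming the stream inside the task's INSERT advances its offset, so each run only processes rows that arrived since the previous run.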
into a more engineering-focused position, someone who enjoys understanding the business context just as much as building the data solutions behind it. You'll work extensively with Python, Snowflake, SQL, and dbt to design, build, and maintain scalable, high-quality data pipelines and models that support decision-making across the business. This is a hands-on, collaborative role, suited … s confident communicating with data, product, and engineering teams, not a 'heads-down coder' type.
Top 4 Core Skills
Python - workflow automation, data processing, and ETL/ELT development.
Snowflake - scalable data architecture, performance optimisation, and governance.
SQL - expert-level query writing and optimisation for analytics and transformations.
dbt (Data Build Tool) - modular data modelling, testing, documentation, and version control.
… build, and maintain dbt models and SQL transformations to support analytical and operational use cases. Develop and maintain Python workflows for data ingestion, transformation, and automation. Engineer scalable, performant Snowflake pipelines and data models aligned with business and product needs. Partner closely with analysts, product managers, and engineers to translate complex business requirements into data-driven solutions. Write production-grade More ❯
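To make the "Python workflows for data ingestion" point concrete, here is a minimal, hedged sketch using pandas and the Snowflake connector's write_pandas helper. The file path, column names, connection details, and ORDERS_STAGING table are hypothetical, and the target table is assumed to already exist.

```python
# Minimal ingestion sketch; all names and paths are illustrative placeholders.
import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

def load_daily_extract(csv_path: str) -> int:
    """Read a source extract, apply light cleaning, and bulk-load it to Snowflake."""
    df = pd.read_csv(csv_path, parse_dates=["order_ts"])          # hypothetical column
    df = df.dropna(subset=["order_id"]).drop_duplicates(subset=["order_id"])
    df.columns = [c.upper() for c in df.columns]  # match Snowflake's upper-case identifiers

    conn = snowflake.connector.connect(
        account="my_account", user="etl_user", password="***",    # use a secrets manager in practice
        warehouse="LOAD_WH", database="RAW", schema="SALES",
    )
    try:
        # write_pandas stages and copies the DataFrame; ORDERS_STAGING is assumed to exist.
        success, _, nrows, _ = write_pandas(conn, df, table_name="ORDERS_STAGING")
        if not success:
            raise RuntimeError("Snowflake load failed")
        return nrows
    finally:
        conn.close()

if __name__ == "__main__":
    print(f"Loaded {load_daily_extract('orders_extract.csv')} rows")
```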
London, South East, England, United Kingdom Hybrid/Remote Options
Robert Half
requirements and support business deliverables.
* Collect, transform, and process datasets from various internal and external sources, ensuring data quality, governance, and integrity.
* Implement efficient data models and schemas within Snowflake, and use DBT for transformation, orchestration, and workflow management.
* Optimise ELT/ETL processes for improved performance, cost efficiency, and scalability.
* Troubleshoot and resolve data pipeline issues swiftly and effectively … technologies in data engineering, and continuously improve your skills and knowledge.
Profile
* Minimum 3 years' experience working as a Data Engineer in a commercial environment.
* Strong commercial experience with Snowflake and DBT.
* Proficient in SQL and experienced in data modelling within cloud data warehouses.
* Familiarity with cloud platforms such as AWS or Azure.
* Experience with Python, Databricks, or related data More ❯
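Since this role pairs Snowflake with DBT for transformation and orchestration, a common pattern is to drive the dbt CLI from a Python scheduler step. A minimal sketch, assuming a hypothetical project path and model selector (neither comes from the advert):

```python
# Hedged sketch of orchestrating dbt from Python; the project directory and
# the "staging.sales+" selector are assumptions for illustration only.
import subprocess
import sys

DBT_PROJECT_DIR = "/opt/analytics/dbt_project"   # hypothetical project location

def run_dbt(*args: str) -> None:
    """Run a dbt CLI command and fail loudly on a non-zero exit code."""
    cmd = ["dbt", *args, "--project-dir", DBT_PROJECT_DIR]
    result = subprocess.run(cmd, capture_output=True, text=True)
    print(result.stdout)
    if result.returncode != 0:
        print(result.stderr, file=sys.stderr)
        raise RuntimeError(f"dbt command failed: {' '.join(cmd)}")

if __name__ == "__main__":
    # Build the selected models and their children, then run their tests.
    run_dbt("run", "--select", "staging.sales+")
    run_dbt("test", "--select", "staging.sales+")
```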
Sheffield, South Yorkshire, England, United Kingdom Hybrid/Remote Options
DCS Recruitment
Research and recommend new tools, technologies, and processes to improve performance, scalability, and efficiency. Contribute to migrations and modernisation projects across cloud and data platforms (e.g. AWS, Azure, GCP, Snowflake, Databricks). Create and maintain documentation aligned with internal processes and change management controls.
Experience & Technical Skills
Proven hands-on experience as a Data Engineer or in a similar data … of ETL/ELT pipelines, data modelling, and data warehousing principles. Experience working with cloud platforms such as AWS, Azure, or GCP. Exposure to modern data tools such as Snowflake, Databricks, or BigQuery. Familiarity with streaming technologies (e.g., Kafka, Spark Streaming, Flink) is an advantage. Experience with orchestration and infrastructure tools such as Airflow, dbt, Prefect, CI/CD pipelines More ❯
boundaries of what's possible with data and AI.
What You'll Do
Design & build high-performance ETL/ELT pipelines in modern cloud environments (including Azure, AWS, GCP, Snowflake or Databricks).
Lead CI/CD automation, environment versioning, and production deployments for data products.
Integrate AI and ML outputs into scalable, automated data workflows.
Implement monitoring, alerting, and … modern data tools and cloud platforms to turn data into something powerful.
You'll bring:
3+ years' experience in data engineering or cloud platform development (including Azure, AWS, GCP, Snowflake or Databricks)
Strong proficiency in SQL and Python.
Proven experience with CI/CD tools, DevOps, and automation practices.
Solid understanding of data modelling, orchestration, and workflow management.
A desire More ❯
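The monitoring and alerting responsibility above is often implemented as a simple freshness check wired into CI/CD or an orchestrator. A hedged sketch, with the table name, SLA threshold, and connection details all assumed for illustration:

```python
# Illustrative data-freshness check; FACT_ORDERS, the 2-hour SLA, and the
# connection parameters are assumptions, not details from the advert.
from datetime import datetime, timedelta, timezone
import snowflake.connector

FRESHNESS_THRESHOLD = timedelta(hours=2)   # hypothetical SLA

def check_freshness() -> None:
    conn = snowflake.connector.connect(
        account="my_account", user="monitor_user", password="***",
        warehouse="MONITOR_WH", database="ANALYTICS", schema="MARTS",
    )
    try:
        with conn.cursor() as cur:
            # LOADED_AT is assumed to be a TIMESTAMP_TZ column, so the value
            # comes back timezone-aware and can be compared with UTC now.
            cur.execute("SELECT MAX(LOADED_AT) FROM FACT_ORDERS")
            (last_loaded,) = cur.fetchone()
    finally:
        conn.close()

    if last_loaded is None:
        raise RuntimeError("FACT_ORDERS is empty - the pipeline may never have run")

    age = datetime.now(timezone.utc) - last_loaded
    if age > FRESHNESS_THRESHOLD:
        # In production this would notify an on-call channel (Slack, PagerDuty, etc.).
        raise RuntimeError(f"FACT_ORDERS is stale: last load was {age} ago")

if __name__ == "__main__":
    check_freshness()
```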
Job Title: Lead Data Engineer
Job Location: Roseland, New Jersey 07068 (Onsite)
Requirements:
Databricks - must be skilled in data pipelines
AWS - Experienced in AWS services related to data migration
PySpark - Proficient in Python or PySpark to build data ingestion for More ❯
London, South East, England, United Kingdom Hybrid/Remote Options
Akkodis
Akkodis is partnering with a reputable client in Finance and they are looking for a Data Engineer with strong Databricks experience to expand their IT Team. They are currently improving their CX transformation programme and making changes to their institutional More ❯
Sacramento, California, United States Hybrid/Remote Options
KK Tech LLC
Snowflake Developer (SQL Specialist)
Location: 100% Remote
Looking: Chinese Candidates
Client: Virtuoso
Job Summary: We are looking for a Snowflake Developer with strong SQL and data engineering experience. The ideal candidate will design, develop, and maintain Snowflake data solutions, ensuring performance, scalability, and data integrity.
Key Responsibilities:
Design and implement Snowflake data models, pipelines, and transformations.
Develop, optimise, and troubleshoot … complex SQL queries.
Integrate Snowflake with other data sources and ETL tools.
Ensure data accuracy, performance tuning, and security best practices.
Collaborate with BI, analytics, and data science teams to support reporting and insights.
Work with cross-functional teams in both English and Mandarin.
Required Skills & Qualifications:
Bachelor's degree in Computer Science, Data Engineering, or related field.
8+ years … of experience in Snowflake development.
Strong hands-on experience with SQL, ETL processes, and data warehousing concepts.
Knowledge of Snowpipe, Tasks, Streams, and Time Travel features.
Experience with Python or ETL tools (e.g., Airflow, Talend, Informatica) is a plus.
Fluent in Mandarin Chinese (spoken and written) and proficient in English. More ❯
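As a brief illustration of the Time Travel feature listed in the qualifications, the sketch below runs Time Travel queries through the Snowflake Python connector; the account, table name, and one-hour offset are placeholders rather than details from the posting.

```python
# Time Travel sketch; ORDERS, ORDERS_RESTORED, and the connection are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="dev_user", password="***",
    warehouse="DEV_WH", database="SALES", schema="PUBLIC",
)

with conn.cursor() as cur:
    # Query the table as it looked one hour ago (AT(OFFSET => ...) is in seconds).
    cur.execute("SELECT COUNT(*) FROM ORDERS AT(OFFSET => -3600)")
    print("Row count 1 hour ago:", cur.fetchone()[0])

    # Recover an accidentally modified table into a zero-copy clone for inspection.
    cur.execute("CREATE OR REPLACE TABLE ORDERS_RESTORED CLONE ORDERS AT(OFFSET => -3600)")

conn.close()
```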
on position involves architecting and optimising scalable data pipelines to support advanced analytics, AI/ML initiatives, and actionable insights across the organisation. You'll take full ownership of the Snowflake platform implementation and adoption, ensuring it becomes the central hub for trusted, secure, and high-performing data. Acting as the technical authority, you'll define best practices, establish governance frameworks … to maximise platform value. This is an opportunity to shape the data landscape and deliver solutions that empower decision-making and innovation.
Key Responsibilities
Platform Leadership: Design, implement, and manage Snowflake as the enterprise data hub, ensuring scalability, security, and performance.
Data Architecture & Strategy: Define frameworks for ingestion, replication, storage, and transformation across diverse data sources.
Pipeline Development: Build efficient ELT … pipelines using tools such as DBT and Python, integrating operational, financial, and network data.
Performance Optimisation: Configure Snowflake warehouses and partitioning strategies for cost efficiency and speed.
Governance & Compliance: Implement data quality, lineage, and access control aligned with regulatory and security standards.
Innovation: Drive adoption of advanced Snowflake features (Snowpark, Streams, Tasks, Secure Data Sharing) to enhance platform capabilities.
Mentorship More ❯
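The Performance Optimisation duty above (warehouse configuration and partitioning strategy) typically comes down to auto-suspending warehouses and clustering large tables. A minimal sketch under assumed names: the warehouse, fact table, and clustering columns are illustrative, not taken from the advert.

```python
# Hedged cost/performance configuration sketch; all object names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="platform_admin", password="***",
    role="SYSADMIN", database="ANALYTICS",
)

statements = [
    # A small, auto-suspending warehouse keeps idle compute costs near zero.
    """
    CREATE WAREHOUSE IF NOT EXISTS REPORTING_WH
      WITH WAREHOUSE_SIZE = 'XSMALL'
           AUTO_SUSPEND = 60
           AUTO_RESUME = TRUE
           INITIALLY_SUSPENDED = TRUE
    """,
    # Clustering a large fact table on common filter columns helps Snowflake
    # prune micro-partitions, reducing scan time for date-bounded queries.
    "ALTER TABLE ANALYTICS.MARTS.FACT_NETWORK_EVENTS CLUSTER BY (EVENT_DATE, REGION)",
]

with conn.cursor() as cur:
    for stmt in statements:
        cur.execute(stmt)
conn.close()
```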
Milton Keynes, Buckinghamshire, United Kingdom Hybrid/Remote Options
Tria
Confidence working independently as the sole Power BI Developer
* Solid experience with SQL databases
* Proficiency with Power Query (this is a nice to have)
* Exposure to modern cloud environments, Snowflake experience is a bonus
Benefits of the opportunity:
* Full ownership of Power BI reporting with the chance to shape its future
* Competitive salary up to £50,000
* Hybrid working More ❯