data pipelines, models and maintaining data warehouses for reporting and analytics. Strong skills in SQL, Python, problem-solving and data analysis. Deep experience with Snowflake and AWS. Deep experience with dbt. Excellent communication and collaboration skills. An eagerness to collaborate with others, learn quickly, and the ability to work More ❯
Brighton, South East England, United Kingdom Hybrid / WFH Options
Trust In SODA
and assist with the continued scaling and optimisation of these. Their ideal candidate would have good knowledge of: Cloud (AWS, GCP, Azure) Data Warehousing (Snowflake, Redshift, BigQuery) ETL (Data Fabric, Data Mesh) DevOps (IaC, CI/CD, Containers) Consulting/Client-Facing Experience In return they would be offering Free More ❯
Brighton, South East England, United Kingdom Hybrid / WFH Options
Trust In SODA
Their ideal candidate would have 10+ years' experience in Data Engineering/Architecture and good knowledge of: Cloud (AWS, GCP, Azure) Data Warehousing (Snowflake, Redshift, BigQuery) ETL (Data Fabric, Data Mesh) DevOps (IaC, CI/CD, Containers) Leadership/Line Management Consulting/Client-Facing Experience In return they More ❯
Brighton, South East England, United Kingdom Hybrid / WFH Options
Maxwell Bond
events. Key Responsibilities: Design and build large-scale data warehouses, ETL pipelines, and reporting platforms that are robust and efficient (the team is looking at Snowflake or BigQuery). Utilize your expertise in backend languages; the tech stack is flexible, including Java, Python, .NET, Ruby, and more. Implement strong coding principles, including More ❯
Brighton & Hove, East Sussex, UK Hybrid / WFH Options
James Chase
Expertise in large-scale data processing and pipeline optimisation Proficient in Python, SQL, and cloud platforms (AWS or similar) Hands-on with Airflow, Fivetran, Snowflake, Docker (or equivalents) Experience with real-time and batch data pipelines Ready for your next move? Apply now or send your CV across to chinmaye.ramnath More ❯
A supportive environment that encourages growth and innovation What we’re looking for: 5+ years of experience in data analytics with advanced SQL skills (Snowflake, BigQuery, etc.) Strong experience with BI tools – ideally Looker (LookML a definite plus) Excellent communication skills and the confidence to engage with cross-functional teams More ❯
Brighton, South East England, United Kingdom Hybrid / WFH Options
McCabe & Barton
pipelines and analytics platforms that empower business teams to unlock insights through self-service tools. You’ll work with modern cloud-native tools like Snowflake, Databricks, Python, and advanced visualisation platforms to create solutions that drive measurable business outcomes. Key Responsibilities: Develop Scalable Data Pipelines: Design and implement ETL/… SQL, or tools like Apache Airflow, and expertise in visualisation tools (Power BI, Tableau, or Looker). Cloud Expertise: Familiarity with cloud platforms like Snowflake, Databricks, or AWS/GCP/Azure for scalable data solutions. Data Modelling Mastery: Strong understanding of dimensional and relational modelling techniques for analytics use More ❯
Role: Snowflake Data Architect Location: Hove, UK Permanent Role Work Mode: Hybrid Role & Responsibilities · Define and implement the end-to-end architecture of the data warehouse on Snowflake. · Create and maintain Conceptual, Logical and Physical Data Models in Snowflake. · Design data pipelines and ingestion frameworks using Snowflake native tools. · Work with Data More ❯
Brighton, South East England, United Kingdom Hybrid / WFH Options
Falcon Smart IT (FalconSmartIT)
Job Title: Snowflake Data Architect Job Location: Hove, UK/Hybrid Permanent or Contract: Permanent Job Description Who we are: We are a leading global IT Services company, dedicated to driving digital transformation and innovation for businesses around the world. Founded in 1990, Client has grown into a … architecture of the data warehouse on Snowflake. Create and maintain Conceptual, Logical and Physical Data Models in Snowflake. Design data pipelines and ingestion frameworks using Snowflake native tools. Work with Data Governance teams to establish data lineage, data quality and access control mechanisms. Engage with data stewards and other stakeholders More ❯
Brighton, South East England, United Kingdom Hybrid / WFH Options
Uneek Global
supportive, forward-thinking team where your growth is genuinely encouraged, this could be the perfect next step! Why This Role? ✅ Work with modern tech - Snowflake, DBT, Python, Azure ✅ Be part of a friendly, collaborative, remote-first team ✅ Get real investment in your training & development ✅ Take your career to the next … cloud-native tech Shape reusable data engineering patterns Communicate clearly with both technical and non-technical stakeholders Scope and evolve solutions alongside clients Tech: Snowflake, DBT, Airflow, Azure Data Factory Python, Spark, SQL Cloud (Azure, AWS, or GCP) Power BI, Tableau, Looker Modern architecture concepts Why Join? Remote-first, with More ❯
Brighton, South East England, United Kingdom Hybrid / WFH Options
Intec Select
SQL Database Administrator – Transport Manufacturing – £30-35k – Hybrid/Remote Overview: An exciting opportunity has arisen for a SQL Database Administrator to join one of the world’s leading transportation manufacturers who specialise in providing safety and securement solutions More ❯
strong benefits package and flexible working. The ideal Data Engineering Manager will come from a data engineering background and have strong knowledge of SQL, Snowflake, Microsoft Azure, Azure Data Factory and Azure DevOps. The Engineering Manager will design, improve and maintain robust data pipelines within the data architecture. To be considered … robust data pipelines Strong SQL programming skills. Knowledge of other programming languages such as Python, C++ and Java beneficial Possesses a strong understanding of Snowflake - beneficial Experience managing small teams of Data Engineers Strong experience working in a cloud environment, with knowledge of the following very beneficial: Microsoft Azure, Azure More ❯
Brighton, South East England, United Kingdom Hybrid / WFH Options
Wyoming Interactive
day you’ll be designing and maintaining high-quality, analysis-ready datasets that support CRM, product, and marketing teams in making informed decisions. Using Snowflake alongside tools like dbt and SQL, you’ll model data to enable segmentation, performance tracking, and personalised marketing. You will have the opportunity to work … around privacy, compliance, and governance. What You’ll Bring: 3+ years working in data, analytics, or martech-adjacent engineering roles Strong skills in SQL, Snowflake, and data transformation/modelling tools (e.g. dbt) Experience integrating data from Google Analytics and HubSpot Familiarity with CDPs or event tracking platforms such as More ❯
roadmap planning, with an openness to exploring innovative solutions like generative AI where appropriate. We work with an Airflow/AWS/Fivetran/Snowflake/Looker stack and typically use Python and Docker in our pipelines. You’ll need to be highly proficient in these or similar tools and … ELT patterns Comfortable evaluating both business and technical requirements Skilled at working with large datasets and optimising data flows Experience with Airflow, AWS, Fivetran, Snowflake, Docker (or similar) Strong in Python, SQL, and cloud platforms (AWS or comparable) Experienced in handling real-time data pipelines Experienced in evolving data pipelines More ❯
will define the data-driven future of a business with a 300-year heritage. With a clear plan and the technology in place (InvestCloud, Snowflake, Alteryx, and Power BI), it’s the ideal setting to show how your data leadership skills can transform business performance and client outcomes. Reporting to … ll look to you for direction and coaching. Leading on data engineering to ensure high-quality, accessible data; designing and protecting data pipelines (e.g. Snowflake). Implementing data governance frameworks, ensuring regulatory and security compliance; embedding these principles across the business, and managing emerging risks. Providing strategic insight and recommendations More ❯