our customers to navigate the rapidly changing Digital-First world we live in. We foster strong partnerships with leading technology giants including Microsoft, AWS, Oracle, Red Hat, OutSystems, and Snowflake, ensuring that our customers are provided with the highest quality solutions and services. We're an award-winning employer reflecting how our employees are at the very heart of Version More ❯
the Data Architect will play a pivotal role in designing, governing, and evolving the enterprise data architecture underpinning ILIM's data platform. Our chosen technology stack is Microsoft ADF, Snowflake and Power BI. You will ensure our data infrastructure supports current and future business needs, aligns with ILIM's data strategy, and enables scalable, compliant, and performant data operations. This … ensuring alignment with group-wide data initiatives. What you will help us to achieve Deliver a best-in-class, user-centric, high-performing, and scalable cloud data platform using Snowflake, supporting ILIM's data strategy and business needs. Design and govern best practice data models (Operational Data Store & Data Warehouse), pipelines, data engineering standards, and security best practices within Snowflake … oriented person who is passionate about data and the impact it can have in an organisation. An engineering mindset with proven experience building out large-scale data platforms using Snowflake (big advantage) or similar, with extensive knowledge of best practices for data pipelines, deployment, orchestration, cost and performance optimisation. Forward-thinking, ensuring we can continue to incrementally build out our More ❯
in a collaborative, fast-paced environment. What You'll Be Doing: Building and maintaining scalable data pipelines across Azure and GCP Working with tools such as Azure Data Factory, Snowflake, BigQuery, and Cloud Functions Supporting data warehousing and integration across diverse industry projects Collaborating with data professionals and contributing to cross-functional delivery teams Assisting in reporting and data visualisation … years of experience in a data engineering or similar role (graduates welcome) Proficiency in SQL and Python Exposure to any of the following: Azure Data Factory GCP BigQuery Snowflake Cloud Functions Strong communication and problem-solving skills Must be eligible for SC (Security Check) clearance (UK residency required) What's On Offer: £60,000 base salary Hybrid working (London office More ❯
and interpersonal skills Involves migration from a standard database to a data warehouse Day-to-day responsibilities include working with Python, PySpark, and ETL jobs moving data from a standard database to a data warehouse; Snowflake is preferred but not mandatory Quick learner of new technologies, tools, and concepts, with the ability to translate them into action. Excellent problem-solving skills and attention to detail Effective communication and … API, Integration/EAI technologies like Informatica 3+ years' experience with technologies including Web Service API, XML, JSON, JDBC, Java, Python. 3+ years working with SaaS platforms such as Snowflake, Collibra, Mongo/MongoDB Atlas Knowledge of enterprise data models, information classification, metadata models, taxonomies and ontologies. Exposure to full-stack enterprise application development (Angular, Spring Boot, automation testing … data analysis, and data management role Experience with different query languages such as PL/SQL, T-SQL, and ANSI SQL Experience with database technologies such as DB2, PostgreSQL, Snowflake Knowledge of data warehousing and business intelligence concepts including data mesh, data fabric, data lake, data warehouse, and data marts Skills: Python PySpark Informatica SQL Education: Bachelor's degree in More ❯
and familiar with HIPAA and other data privacy controls. Preferred experience/skills: Amazon Web Services (AWS) Certification or Cloud Data Engineer Certification Informatica (IICS) and Airflow (Orchestration) experience Snowflake certification Cognos and/or Tableau reporting experience JOB DESCRIPTION DPH is executing large-scale data modernization across the agency as part of DMI. This includes creating a consolidated Enterprise … Data Platform on modern cloud tools (including AWS & Snowflake) and re-platforming existing business data applications from solutions such as SAS and legacy databases to the new platform. DPH data systems enable epidemiologists, public health researchers, state and local public health officials, and business leaders to analyze public health trends and drive policy decisions. The ideal candidate is an expert … Conducts thorough analysis and documentation of cloud data engineering strategies and designs. Establish regular overall system performance assessment processes and action plans. Expertise in Data Integration leveraging Informatica (IICS), Snowflake, SQL and Python code to solve complex use cases Provide direction for the design and development of the data layers including review of ETL requirements. Capable of leading team design More ❯
make a meaningful impact on our data-driven journey. We have an exciting opportunity for you to work with a talented team, harness cutting-edge technologies like Data Vault, Snowflake, DBT, Airflow and AWS/Azure, and drive innovative solutions that shape the future of our organization. As a Data Architect, you will play a crucial role in designing and … the data they generate, leveraging opportunities to structure and utilise this data for driving business value. Be part of a cutting-edge retail transformation, leveraging the latest technologies like Snowflake, DBT, Data Vault and Azure/AWS. Collaborate with top talent in data engineering, analytics, and Products teams across the organization to create impactful data solutions. Key Skills & Experience As More ❯
on Python/PySpark/Snowpark for development and integration. Data Pipeline and Warehousing Optimization: Design, implement, and maintain efficient data workflows and data warehousing solutions using tools like Snowflake to ensure seamless data ingestion, transformation, storage, and retrieval. Cloud-Native Solutions: Leverage Azure cloud platforms to deploy scalable data and software solutions, incorporating serverless architectures, containerization (e.g., Docker, Kubernetes … CI/CD tools such as Azure DevOps pipelines, and Infrastructure as Code. Professional-level skills in Python/PySpark development, database management, and data warehousing. Professional experience with Snowflake, Azure Data Factory for pipelining, Azure Databricks for data processing, and Power BI for reporting and analytics. Good knowledge of streaming and interfacing technologies including Kafka, Terraform, and Azure Microservices. Collaborative More ❯
Be part of something groundbreaking At AIG, we are making long-term investments in a brand-new, innovative Generative AI team, designed to explore new possibilities for how artificial intelligence can be applied in insurance and beyond, and we need More ❯
for motivated graduates and early-career talent to join our growing team. We’ve placed over 20,000 people into high-growth tech companies, helping build industry giants like Snowflake, MongoDB, Databricks, Zscaler, Rubrik, and more. Now, we’re partnering with the next wave of innovative startups across AI, Data, Fintech, and Cybersecurity — and we want you to be part More ❯
years of professional experience in Data Extensive experience in data and analytics consulting, with a track record of leading strategic programmes Deep technical expertise in cloud data platforms (e.g. Snowflake, Databricks, Matillion) and data architecture Strong stakeholder engagement skills, including experience working with board-level clients A commercial mindset and ability to align delivery with business outcomes A passion for More ❯
Major Financials Company in New York City is seeking a Database Engineer (with DBA experience in MongoDB, Snowflake, and Confluent Kafka) for a hybrid role (3 days On-Site/2 days Remote). This is a full-time perm employee role (direct-hire) with an excellent benefits package. Role: Responsible for designing, implementing, and maintaining modern database systems to ensure … their optimal performance, security, governance, and reliability. Skills: Design, implement, and maintain database solutions using MongoDB, Snowflake, and Confluent Kafka. Design, implement, and maintain reporting platforms such as Tableau, Power BI, Sigma. Design, implement, and maintain data caching solutions such as Redis/ElastiCache. Ensure high availability, performance, and security of database systems. Monitor and optimize database performance, including query More ❯
City of London, London, United Kingdom Hybrid / WFH Options
I3 Resourcing Limited
Senior Data Engineer MUST HAVE SNOWFLAKE Salary - £90-100k with 15% bonus Hybrid working - couple of days in the office City of London We are looking for: A good understanding of data engineering principles A strong technical grasp of Snowflake, automating it, and transforming complex datasets AWS skill set and delivery experience Experience building solutions in Snowflake and implementing data warehousing solutions using Snowflake and AWS Hands-on experience with AWS services such as Glue (Spark), Lambda, Step Functions, ECS, Redshift, and SageMaker. Enthusiasm for cross-functional work and adaptability beyond traditional data engineering - for example building APIs, integrating with microservices, or contributing to backend systems, not just data pipelines or data modelling. Familiarity with tools like GitHub Actions, Jenkins, AWS CDK, CloudFormation … Terraform. MUST HAVE SNOWFLAKE, AWS Skills and Experience: Proven experience as a data engineer with strong hands-on programming skills and software engineering fundamentals, with experience building scalable solutions in cloud environments (AWS preferred) Extensive experience with AWS services, e.g. EC2, S3, RDS, DynamoDB, Redshift, Lambda, API Gateway Solid foundation in software engineering principles, including version control (Git), testing, CI More ❯
We are looking for a talented Data Engineer, specialising in designing and optimising scalable data pipelines and cloud-based data solutions. This position requires expertise in Python, SQL, and dbt, and experience with Snowflake. If you have industry experience working with More ❯
Chelsea, London, England, United Kingdom Hybrid / WFH Options
Opus Recruitment Solutions Ltd
Data Scientist – Location: London (Hybrid) 4 days in the office Salary: £50,000 to £60,000 + Benefits CRM | Predictive Modeling | Snowflake | dbt Are you a Data Analyst or mid-level Data Scientist who’s ready to move beyond dashboards and into real impact? This is your chance to join a global data team that’s driving strategy across marketing … ll be building predictive models (LTV, lead scoring), developing analytics-ready data marts, and deploying solutions that directly influence decision-making. You’ll work with a cutting-edge stack: Snowflake, dbt, Tableau, Salesforce, Python, SQL and collaborate with stakeholders who genuinely value data. What You’ll Be Doing as a Data Scientist: Designing and refining predictive models to guide marketing … with drift monitoring and retraining triggers Partnering with marketing and enrolment teams to run A/B tests and measure impact Building analytics-ready data marts in dbt/Snowflake with proper documentation and SLAs Developing dashboards in Tableau to track ROI, pipeline health, and market segmentation Maintaining pipelines from Salesforce, Marketing Cloud, GA, Facebook, and more Using reverse-ETL More ❯
Manchester Area, United Kingdom Hybrid / WFH Options
Teem
employees across 3 offices (North America and Europe). APSCo’s Recruitment Company of the Year 2021. Winner for Culture in the People's Platform Awards 2023. Clients: Snowflake, Kong, Confluent, Fivetran, Looker, Workato, Dataiku and many more Venture Capital-backed software scale-ups! Benefits: 20 vacation days + public holidays. Quarterly activity and annual club trip abroad (last year More ❯
Hedge Funds, or Asset Management or Trading Environments: Proven expertise in Power BI: Advanced DAX, Power Query (M), and complex data modelling. Strong hands-on experience with SQL Server, Snowflake, and Azure cloud data warehouses. Experience with financial market data, trading systems, and portfolio/risk reporting (preferred but not necessary) Strong knowledge of BI architectures, cloud platforms (Azure preferred … to maintain performance and adaptability. Skills & Experience: Required More ❯
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
be doing: Leading complex analytics and BI engagements end-to-end, from scoping to delivery. Designing cloud-native data architectures and building robust analytics solutions. Developing with modern platforms (Snowflake, Azure, AWS, or GCP), SQL, and BI tools. Applying and exploring AI/ML techniques (LLMs, predictive modelling, automation). Engaging with senior stakeholders, running workshops, and shaping technical roadmaps. … and service design. What we're looking for: 7+ years' experience in analytics with a strong track record of delivery and client impact. Technical expertise in cloud data platforms (Snowflake, Azure/AWS/GCP), SQL, and data modelling. Strong BI experience - e.g. Power BI or Tableau (expert-level in at least one). Consulting skills: confident engaging with senior More ❯