analytics teams with reliable, well-modelled data. Maintain documentation, contribute to architectural decisions, and drive best practices. Essential Requirements: Strong SQL skills and experience with modern cloud data warehouses (Snowflake preferred). ELT pipeline development and modern data stack understanding. Solid software engineering practices (Git, testing, CI/CD). Own pipeline health, monitoring, and optimisations. API integration experience (REST APIs, webhooks …
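The API integration requirement in the listing above typically comes down to writing small, robust extraction clients. A minimal sketch in Python using the requests library; the endpoint, parameters, and response shape are hypothetical, not taken from the listing:

import requests

def fetch_records(base_url: str, api_key: str, since: str) -> list[dict]:
    # Pull paginated records from a hypothetical REST endpoint.
    records, page = [], 1
    while True:
        resp = requests.get(
            f"{base_url}/records",  # hypothetical endpoint
            params={"updated_since": since, "page": page},
            headers={"Authorization": f"Bearer {api_key}"},
            timeout=30,
        )
        resp.raise_for_status()
        batch = resp.json().get("results", [])
        if not batch:
            return records
        records.extend(batch)
        page += 1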
at least one major cloud provider (AWS, Azure, or GCP). Strong experience building cloud data lakes, warehouses, and streaming architectures. Proficiency with data processing tools such as Spark, Databricks, Snowflake, Hadoop, or similar. Strong knowledge of ETL/ELT frameworks, API integration, and workflow orchestration tools (Airflow, Azure Data Factory, AWS Glue, etc.). Deep understanding of relational and NoSQL databases …
Kafka, Airflow, or dbt. Strong SQL skills and experience with cloud platforms (Azure preferred). Solid programming background in Python, Scala, or Java. Knowledge of data warehousing solutions (e.g. Snowflake, BigQuery, Redshift). Strong understanding of data governance, security, and compliance (experience within financial services is a plus). Leadership experience, with the ability to mentor, influence, and set technical …
stakeholders, including business teams. Build relationships across the bank, establishing a strong peer network and helping to strengthen collaboration. Skills and Experience: Essential: Advanced proficiency in databases - SQL Server or Snowflake. Advanced experience with a low-code/no-code data engineering/ETL tool, preferably Markit EDM (v19.2 or above); similar tools such as Informatica PowerCenter may be acceptable …
Snowflake - Amsterdam, North Holland, Netherlands - Solutions Architect. Snowflake is about empowering enterprises to achieve their full potential - and people too. With a culture that's all in on impact, innovation, and collaboration, Snowflake is the sweet spot for building big, moving fast and taking technology - and careers - to the next level. For our Amsterdam office, we are looking for technology … a strong background in building data platforms and love to use their skills to generate value by supercharging data teams. Our team helps customers on their data journey with Snowflake, which is the central data platform for companies big and small. That means you will work on some of the most challenging data projects in the world, in a post-sales capacity. WHO WE ARE: The Snowflake AI Data Cloud's mission is to mobilize the world's data, so that businesses can be truly data-driven. We believe in the importance of making our platform easy to use, which allows us to power companies at all stages of the data maturity journey. On our YouTube channel you can find …
skills, with the ability to engage both technical and non-technical audiences. Experience working in agile, cross-functional environments. Hands-on experience with modern data platforms such as Databricks, Snowflake, or similar. Knowledge of Docker, Kubernetes, and DevOps practices for data solution deployment. Solid understanding of data security, compliance frameworks, and industry standards. Relevant certifications (e.g., AWS Solutions Architect, Azure …
support governance and compliance requirements. Skills & Experience: Advanced technical proficiency in SQL, Python, and modern data transformation tools (dbt strongly preferred), with experience in cloud data platforms (Azure Databricks, Snowflake, or similar). Proven experience designing and implementing scalable data architectures, including dimensional modelling, data lakehouse/warehouse concepts, and modern data stack technologies. Strong software engineering practices including version …
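Where dbt is the transformation tool, mart-layer models are usually plain SQL, but dbt 1.3+ also supports Python models on platforms such as Snowflake and Databricks. A minimal sketch of a dbt Python model on Snowflake (Snowpark); the staging model and column names are hypothetical:

import snowflake.snowpark.functions as F

def model(dbt, session):
    # Materialise the model as a table in the warehouse.
    dbt.config(materialized="table")

    # "stg_orders" is a hypothetical upstream staging model.
    orders = dbt.ref("stg_orders")

    # A typical mart-layer aggregation: lifetime value and order count per customer.
    return (
        orders.group_by("CUSTOMER_ID")
              .agg(F.sum("ORDER_TOTAL").alias("LIFETIME_VALUE"),
                   F.count("ORDER_ID").alias("ORDER_COUNT"))
    )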
and data modeling concepts. Excellent analytical skills, attention to detail, and ability to interpret SAP business data. Preferred Skills: Experience working with SAP extractors, IDOCs, or BW data flows. Familiarity with Snowflake, BigQuery, AWS Redshift, or other cloud data platforms. Exposure to data testing frameworks like Great Expectations or PyTest. Understanding of BI tools such as Power BI, Tableau, or SAP Analytics …
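Data testing frameworks of the kind named above assert expectations against an extract or table. A minimal PyTest sketch over a pandas DataFrame; the file path and column names are hypothetical:

import pandas as pd
import pytest

@pytest.fixture
def orders() -> pd.DataFrame:
    # In a real pipeline this would come from the warehouse or a staged extract.
    return pd.read_parquet("data/orders.parquet")  # hypothetical path

def test_order_id_is_unique(orders):
    assert orders["order_id"].is_unique

def test_amounts_are_non_negative(orders):
    assert (orders["amount"] >= 0).all()

def test_customer_keys_are_present(orders):
    assert orders["customer_id"].notna().all()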
modelling. · Strong knowledge of software development methodologies, tools, and frameworks, particularly Agile. · Proficiency in both SQL and NoSQL database management systems (e.g. SQL Server/Oracle/MongoDB, CosmosDB, Snowflake, Databricks). · Hands-on experience with data modelling tools, data warehousing, ETL processes, and data integration techniques. · Experience with at least one cloud data platform (e.g. AWS, Azure, Google Cloud) …
time streaming architectures, to support advanced analytics, AI, and business intelligence use cases. Proven experience in designing architectures for structured, semi-structured, and unstructured data, leveraging technologies like Databricks, Snowflake, Apache Kafka, and Delta Lake to enable seamless data processing and analytics. Hands-on experience in data integration, including designing and optimising data pipelines (batch and streaming) and integrating cloud …
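Streaming integration with Apache Kafka, as referenced above, reduces to producing and consuming messages on topics. A minimal producer sketch using the kafka-python client; the broker address, topic name, and payload are illustrative:

import json
from kafka import KafkaProducer

# Connect to a broker (hypothetical address) and serialise payloads as JSON.
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

event = {"order_id": 1234, "status": "created", "amount": 99.50}
producer.send("orders.events", value=event)  # topic name is illustrative
producer.flush()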
integration. Deep understanding of ETL concepts, data warehousing principles, and data modeling techniques. Proficiency in SQL and PL/SQL with experience working on major RDBMS platforms (Oracle, SQL Server, Teradata, Snowflake, etc.). Experience with performance tuning and optimization of Informatica mappings and sessions. Strong understanding of data governance, data quality, and metadata management. Familiarity with cloud-based data integration platforms (Informatica Cloud, AWS Glue …
London, South East, England, United Kingdom Hybrid/Remote Options
Randstad Technologies
monitoring in Databricks. CI/CD: knowledge of DevOps practices for data pipelines. Certifications: Azure Data Engineer or Azure Solutions Architect certifications. Mandatory Skills: Python for Data, Java, Python, Scala, Snowflake, Azure Blob, Azure Data Factory, Azure Functions, Azure SQL, Azure Synapse Analytics, Azure Data Lake, ANSI-SQL, Databricks, HDInsight. If you're excited about this role then we would like …
and SQL. Hands-on experience with data orchestration tools (Airflow, dbt, Dagster, or Prefect). Solid understanding of cloud data platforms (AWS, GCP, or Azure) and data warehousing (Snowflake, BigQuery, Redshift). Experience with streaming technologies (Kafka, Kinesis, or similar). Strong knowledge of data modelling, governance, and architecture best practices. Excellent leadership, communication, and stakeholder management skills.
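Orchestration tools such as Airflow, listed above, express pipelines as DAGs of dependent tasks. A minimal sketch using the Airflow 2.x TaskFlow API; the schedule and task bodies are placeholders:

from datetime import datetime
from airflow.decorators import dag, task

@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def orders_pipeline():
    @task
    def extract() -> list[dict]:
        # Placeholder for pulling rows from a source system.
        return [{"order_id": 1, "amount": 42.0}]

    @task
    def load(rows: list[dict]) -> None:
        # Placeholder for writing rows to the warehouse.
        print(f"Loading {len(rows)} rows")

    load(extract())

orders_pipeline()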
Kent, England, United Kingdom Hybrid/Remote Options
Harnham - Data & Analytics Recruitment
stakeholder engagement - able to translate technical progress into business outcomes. Strategic thinker with a delivery mindset and the ability to influence at senior levels. Nice-to-haves: Experience with Snowflake or hybrid multi-cloud data solutions. Familiarity with banking or financial data frameworks such as BCBS 239 or IRB. Background in agile delivery and infrastructure automation using Terraform or similar. Package …
s degree in Computer Science, Information Technology, or related field. Experience with the following: Power BI Governance Framework; Power BI vendor and stakeholder management. Ideally, you will also have: Experience with Snowflake, Tableau, and other BI tools is a plus. Knowledge of the Specialty Insurance domain is advantageous. What we look for: You are detail-oriented, proactive, and thrive in a fast-paced …
on colleagues for guidance. Strong SQL skills and understanding of data models and data principles. Experience with mobile-responsive dashboards and self-service analytics. Desirable: Experience with cloud platforms (Snowflake, Azure, BigQuery). Python skills. Experience in agile delivery environments. Understanding of GDPR and data governance. BI tool certifications. The BI team has historically been highly technical; you will bring …
Crewe, Cheshire, United Kingdom Hybrid/Remote Options
Manchester Digital
model development. Expertise in Python, R, or Julia, with proficiency in pandas, NumPy, SciPy, scikit-learn, TensorFlow, or PyTorch. Experience with SQL, NoSQL, and big data technologies (Spark, Hadoop, Snowflake, Databricks, etc.). Strong background in statistical modelling, probability theory, and mathematical optimization. Experience deploying machine learning models to production (MLOps, Docker, Kubernetes, etc.). Familiarity with AWS/GCP …
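As a concrete illustration of the model-development stack named above, a minimal scikit-learn training loop; synthetic data stands in for a real feature matrix:

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic features and labels in place of real data.
rng = np.random.default_rng(42)
X = rng.normal(size=(1_000, 8))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

print("holdout accuracy:", accuracy_score(y_test, model.predict(X_test)))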
across geographies. Deep expertise in commercial pharma data assets (IQVIA, Symphony, Veeva, Salesforce, Adobe, etc.). Strong technical proficiency in: cloud platforms: AWS (preferred), Azure, or GCP; data tools: Databricks, Snowflake, dbt, Fivetran; visualization tools: Tableau, Power BI; version control & CI/CD: GitHub, Jenkins. English fluency required; German language skills are considered a plus. BioNTech is committed to the wellbeing …
SQL and data manipulation across large datasets. Familiarity with data visualization tools (e.g., Matplotlib, Seaborn, Plotly, Tableau, or Power BI). Exposure to modern collaborative data platforms (e.g., Databricks, Snowflake, Palantir Foundry) is a plus. Excellent problem-solving skills, eagerness to learn, and ability to work in fast-paced, evolving environments. Strong written and verbal communication skills, with the ability …
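For the visualization tools mentioned above, a minimal Matplotlib sketch; the figures are invented purely for illustration:

import matplotlib.pyplot as plt
import pandas as pd

# Hypothetical monthly revenue figures standing in for a real query result.
monthly = pd.DataFrame(
    {"month": ["Jan", "Feb", "Mar", "Apr"], "revenue": [120_000, 135_000, 128_000, 150_000]}
)

fig, ax = plt.subplots(figsize=(6, 3))
ax.bar(monthly["month"], monthly["revenue"])
ax.set_title("Monthly revenue")
ax.set_ylabel("Revenue (GBP)")
fig.tight_layout()
fig.savefig("monthly_revenue.png")  # illustrative output path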
modelling practices (conceptual, logical, physical) and enterprise data management principles. Hands-on experience with data modelling tools (e.g., Erwin, Sparx, PowerDesigner) and big data/cloud ecosystems (e.g., Databricks, Snowflake, Redshift, Spark). Solid grasp of data governance, metadata, and data quality frameworks. Excellent stakeholder engagement skills — able to communicate complex data concepts to both technical and business audiences. Bonus Points …
with AI/ML and AI-enabled analytics (LLMs, RAG, agents). Strong hands-on coding skills with Spark (PySpark/Scala), SQL, dbt, and modern data platforms (Databricks, Snowflake, BigQuery). Experience with cloud platforms (AWS preferred). Proven expertise in BI and semantic modeling using tools such as Power BI, Tableau, Looker, or Mode. Strong understanding of data …
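The Spark (PySpark) requirement above typically translates to batch transformations along these lines; the input path, column names, and output location in this minimal sketch are hypothetical:

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-aggregation").getOrCreate()

# Hypothetical input; in practice this might be a Delta table or warehouse extract.
orders = spark.read.parquet("s3://example-bucket/orders/")

daily_revenue = (
    orders.withColumn("order_date", F.to_date("created_at"))
          .groupBy("order_date")
          .agg(F.sum("amount").alias("revenue"),
               F.countDistinct("customer_id").alias("customers"))
)

daily_revenue.write.mode("overwrite").parquet("s3://example-bucket/marts/daily_revenue/")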
Swindon, Wiltshire, South West, United Kingdom Hybrid/Remote Options
Neptune (Europe) Ltd
techniques. Familiarity with SSIS, SSMS, and reporting tools such as Power BI or equivalent platforms. Experience working with cloud and on-premise databases, including solutions like Google Cloud Platform, Snowflake, Databricks, or Amazon Redshift. Meticulous attention to detail and a logical, structured approach to work. The ability to prioritise effectively and manage multiple tasks under time pressure. Strong communication …