design and data transformation, supported by ETL processing. Managing data pipelines and orchestration that enable the transfer and processing of data (Databricks, Microsoft Fabric, Alteryx, Snowflake, Apache). Coding and programming: working through complex problems with others on the team and adapting quickly to changing market trends and business needs …
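As a rough illustration of the ETL work described above, here is a minimal Python sketch of an extract-transform-load step; the file names, columns, and cleaning rules are hypothetical, and in practice this logic would usually run on a platform such as Databricks or under an orchestrator rather than as a standalone script.

```python
# Minimal ETL sketch: extract a CSV, apply light cleaning, load the result.
# File names and column handling are illustrative only.
import pandas as pd

def extract(path: str) -> pd.DataFrame:
    # Extract: read the raw source export.
    return pd.read_csv(path)

def transform(df: pd.DataFrame) -> pd.DataFrame:
    # Transform: drop exact duplicates and normalise column names.
    df = df.drop_duplicates()
    df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
    return df

def load(df: pd.DataFrame, path: str) -> None:
    # Load: write the cleaned data for downstream consumers.
    df.to_csv(path, index=False)

if __name__ == "__main__":
    load(transform(extract("raw_orders.csv")), "clean_orders.csv")  # hypothetical paths
```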
skills. Preferred Qualifications: Strong communication skills and demonstrated ability to engage with business stakeholders and product teams. Experience in data modeling, data warehousing (e.g., Snowflake, AWS Glue, EMR, Apache Spark), and working with data pipelines. Leadership experience, whether technical mentorship, team leadership, or managing critical projects. Familiarity with Infrastructure …
tools, e.g.: Azure Data Factory, Azure Synapse, Azure SQL, Azure Databricks, Microsoft Fabric, Azure Data Lake. Exposure to other data engineering and storage tools: Snowflake; AWS tools – Kinesis/Glue/Redshift; Google tools – BigQuery/Looker. Experience working with open datasets – ingesting data/building API-based queries. Here …
Nottingham, England, United Kingdom Hybrid / WFH Options
Digital Native
use. Developing foundational coding skills in Python and SQL to work with and manipulate data. Gaining hands-on experience with modern data platforms like Snowflake, Databricks, and BigQuery. Exploring cloud services such as AWS, Azure, or Google Cloud Platform (GCP) and their associated tools (e.g., Glue, Lambda, BigQuery, etc.). Being …
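To make the Python-plus-SQL combination above concrete, here is a small, self-contained sketch using only the standard-library sqlite3 module; the table and values are invented for the example, and a real role would typically run similar SQL against a warehouse such as Snowflake or BigQuery.

```python
# Illustrative only: run SQL from Python against an in-memory SQLite database.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("North", 120.0), ("South", 80.0), ("North", 45.5)],  # made-up rows
)

# Aggregate with SQL, then consume the results in Python.
for region, total in conn.execute(
        "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"):
    print(f"{region}: {total:.2f}")

conn.close()
```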
Coventry, England, United Kingdom Hybrid / WFH Options
Berkeley Square IT
This role is 100% remote and sits outside of IR35. Must-have technologies: experience in an ETL toolset (Talend, Pentaho, SAS DI, Informatica, etc.); Snowflake; experience in a database (Oracle, RDS, Redshift, MySQL, Hadoop, Postgres, etc.); experience in data modelling (Data Warehouse, Marts); a job scheduling toolset (Job Scheduler, TWS, etc. …
architecture. Experience with data warehouse development, SQL performance tuning, and managing metadata. Familiarity with Data Vault methodology and cloud platforms such as AWS, GCP, Snowflake, or Databricks. Support & Teamwork: Managing upgrades, deployments, and updates smoothly with stakeholders. Leading support incident triage and resolution on critical systems. Keeping an eye on …
CX KPIs, retention analytics, and customer segmentation models. Basic experience with predictive modeling techniques (regression models, clustering, or time-series analysis). Familiarity with Python, Snowflake ML, BigQuery ML, or Azure AI for foundational ML applications. Experience with BI tools (Tableau, Power BI) for CX reporting and visualization. Ability to optimize …
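As a sketch of the basic predictive modeling mentioned above (regression in this case), the example below fits a simple linear model with scikit-learn; the data is synthetic and the feature and target are invented for illustration.

```python
# Toy regression example with scikit-learn; data and names are synthetic.
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical feature/target: customer tenure (months) vs. spend.
tenure_months = np.array([[1], [6], [12], [24], [36]])
spend = np.array([40.0, 55.0, 80.0, 120.0, 150.0])

model = LinearRegression().fit(tenure_months, spend)
print("Predicted spend at 18 months:", round(float(model.predict([[18]])[0]), 2))
```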
transformation, business intelligence, AI, and advanced analytics. Proven hands-on capability with relevant technology: Azure Platform, Azure Data Services, Databricks, Power BI, SQL DW, Snowflake, BigQuery, and Advanced Analytics. Proven ability to understand low-level data engineering solutions and languages (Spark, MPP, Python, Delta, Parquet). Experience with Azure …
data practices. Expertise in data architecture, data modelling, data analysis, and data migration. Strong knowledge of SQL, NoSQL, cloud-based databases (e.g., AWS Redshift, Snowflake, Google BigQuery, Azure Synapse). Experience in ETL development, data pipeline automation, and data integration strategies. Familiarity with AI-driven analytics. Strong understanding of data …
data pipelines, models and maintaining Data Warehouses for reporting and analytics. Strong skills in SQL, Python, problem-solving and data analysis. Deep experience with Snowflake and AWS. Deep experience with dbt. Excellent communication and collaboration skills. An eagerness to learn and collaborate with others, to learn quickly, and to be able to work …
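For context on the Snowflake-and-Python pairing listed here, the sketch below queries Snowflake from Python with the snowflake-connector-python package; every connection detail and the table name are placeholders, and in a dbt-based stack the transformation SQL itself would normally live in dbt models.

```python
# Sketch: query Snowflake from Python (requires snowflake-connector-python).
# All connection parameters and the table below are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",     # placeholder
    user="my_user",           # placeholder
    password="my_password",   # placeholder; key-pair or SSO auth is more typical
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
    schema="REPORTING",
)
try:
    cur = conn.cursor()
    cur.execute("SELECT order_date, SUM(amount) FROM orders GROUP BY order_date")
    for order_date, total in cur.fetchall():
        print(order_date, total)
finally:
    conn.close()
```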
and a strong focus on data governance, compliance, and cloud optimisation. Qualifications: * 3+ years of Python-based data engineering experience * Proven experience with Databricks, Snowflake or equivalent big data platforms * Hands-on ETL/ELT experience across cloud services like AWS, Azure, or GCP * Strong grasp of data governance, lineage …
Walsall, West Midlands, West Midlands (County), United Kingdom Hybrid / WFH Options
Adecco
to shape clear, actionable data requirements, even when things are complex or incomplete. What You'll Bring: Strong hands-on expertise in dbt, SQL, Snowflake, and orchestration tools like Airflow. Experience with Azure Data Factory and visualisation tools such as Power BI. Deep knowledge of agile data development methodologies and …
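As an illustration of the orchestration tooling named above, here is a minimal Airflow DAG sketch (assuming Airflow 2.4+); the DAG id, schedule, and task body are hypothetical.

```python
# Minimal Airflow DAG sketch; IDs, schedule, and task logic are illustrative only.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def run_transformation():
    # Placeholder for a dbt run, a SQL transformation, or a similar step.
    print("transforming data")

with DAG(
    dag_id="daily_warehouse_refresh",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="transform", python_callable=run_transformation)
```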
Nottingham, England, United Kingdom Hybrid / WFH Options
ZipRecruiter
tools (SQL, Python, Spark). Hands-on experience with cloud platforms (Azure, AWS, GCP). Hands-on experience with data platforms (Azure Synapse, Databricks, Snowflake). Ability to translate clients' business needs into technical solutions. Ability to attend and contribute to keynote talks and events. Customer engagement and relationship management skills. Understanding …
Newark, Nottinghamshire, East Midlands, United Kingdom Hybrid / WFH Options
Future Prospects
modelling techniques. Strong analytical and problem-solving skills, with the ability to work independently in an agile development environment. Experience with data warehouse platforms (e.g., Snowflake, Azure Synapse, Redshift, BigQuery, or similar). Ability to work independently and manage multiple projects simultaneously. Excellent communication and collaboration skills. THE BENEFITS: As a …
and real-time personalization. Hands-on experience with agile product development methodologies. Excellent communication and stakeholder management skills. Knowledge of modern data tools (e.g., Snowflake, Databricks, dbt, Kafka). Understanding of machine learning workflows and personalization engines. Product certifications (e.g., SAFe, Pragmatic, CSPO). Key Success Metrics: Consistent development roll …
building reports using SQL and data visualization tools like Tableau. Web development skills for risk management UI applications. Development experience with databases such as Snowflake, Sybase IQ, and distributed systems like HDFS. Ability to interact with business users for issue resolution. Design and support batch processes with scheduling infrastructure. Leadership …
applications. Experience with data modeling and curation for large datasets. Experience with cloud technologies, including building finance systems on cloud platforms like AWS S3, Snowflake, etc. Preferred Qualifications: Interest or knowledge in investment banking or financial instruments. Experience with big data concepts, such as Hadoop for Data Lake. Experience with …
RDBMS knowledge. Experience developing distributed, microservices-based applications. Data modeling and curation for large-scale datasets. Experience with cloud technologies such as AWS S3, Snowflake, etc. Preferred Qualifications: Interest or knowledge in investment banking or financial instruments. Experience with big data concepts and tools like Hadoop for Data Lake. Experience …
Utilize web development technologies to facilitate application development for front-end UIs used for risk management actions. Develop software for calculations using databases like Snowflake, Sybase IQ, and distributed HDFS systems. Interact with business users to resolve issues with applications. Design and support batch processes using scheduling infrastructure for calculation …