Lead Data Engineer - Snowflake/DBT/Python - eCommerce - £95k Leading eCommerce client is now searching for a Lead Data Engineer to play a pivotal role in delivering their innovative data strategy. This role will shape our client’s data function, delivering end-to-end data requirements and designing data … business success. The role: Design, develop and test an array of data migration processes to streamline the conversion from the existing system to a new Snowflake infrastructure. Manage the end-to-end data lifecycle, enabling the use of data to influence overall business decisions. Streamline our client’s data processes to … decisions. Solid experience using AWS architecture to design cost-effective and scalable cloud data solutions. Extensive experience with Python and DBT. Strong experience with Snowflake and Data Warehousing. Advanced SQL and Data Modelling skills. Strong experience developing data pipelines. Excellent communication skills with the ability to build and lead a more »
to Python/DBT-based pipelines and load data to Snowflake. Converting SAS-based modules to Python-based ones. Candidates who have data management experience in Snowflake with expertise in Python, DBT, Airflow or similar technologies. Must have hands-on experience in DBT & SQL. Snowflake (added advantage more »
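As a rough illustration of the SAS-to-Python conversion work this listing describes (the column names and business rule here are invented; a real migration would typically land the result in Snowflake via dbt models or COPY INTO), a simple SAS data step that filters rows and derives a column might port to plain Python like this:

```python
import csv
import io

def convert_orders(sas_like_csv: str) -> list[dict]:
    """Hypothetical port of a SAS data step: keep rows where
    status = 'SHIPPED' and derive net = amount - discount."""
    rows = []
    for rec in csv.DictReader(io.StringIO(sas_like_csv)):
        if rec["status"] != "SHIPPED":
            continue  # SAS equivalent: if status ne 'SHIPPED' then delete;
        rec["net"] = float(rec["amount"]) - float(rec["discount"])
        rows.append(rec)
    return rows

raw = "order_id,status,amount,discount\n1,SHIPPED,100,5\n2,PENDING,50,0\n"
print(convert_orders(raw))  # one row kept, with net = 95.0
```

In practice the derived logic would usually move into a dbt SQL model so transformations run inside Snowflake rather than in application code.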
Company Description Version 1 has celebrated over 27 years in the IT industry and continues to be trusted by global brands to deliver IT solutions that drive customer success. Version 1 has several strategic technology partners including Microsoft, AWS, Oracle more »
Version 1 has celebrated over 26 years in Technology Services and continues to be trusted by global brands to deliver solutions that drive customer success. Version 1 has several strategic technology partners including Microsoft, AWS, Oracle, Red Hat, OutSystems and more »
City of London, London, United Kingdom Hybrid / WFH Options
Syntax Consultancy Limited
the migration process. The ability to write high-quality, maintainable code using languages such as Python and SQL Server. PC, PowerShell, SharePoint, Azure, Desktop Support, Snowflake, Azure Data Lake Services (ADLS) Gen 2. Expertise in developing and testing Proof of Concepts (PoCs) and delivering Minimum Viable Products (MVPs) is essential for achieving project … optimize SQL queries for data retrieval and manipulation. Plan, design, and execute PoCs to validate new technologies, frameworks, or solutions. Establish data tools like Snowflake and Azure Data Lake Services (ADLS) Gen 2. Utilize PowerBI, Tableau, or similar tools to design and create interactive and visually appealing dashboards and reports. more »
London, England, United Kingdom Hybrid / WFH Options
BBC
Extensive use of cloud technologies such as AWS and GCP. • Good working knowledge of Data Warehousing technologies (such as AWS Redshift, GCP BigQuery or Snowflake). • Experience in deploying and scheduling code bases in a data development environment, using technologies such as Airflow. • Demonstrable experience of working alongside cross-functional more »
first use cases for genAI. Qualifications: What you’ll need: • Technical expertise in tools spanning data warehousing, ETL, internal visualisation and analytics. Good examples are Snowflake, GCP, Azure Analytics, SageMaker, Databricks, Tableau, PowerBI, Looker, QuickSight, Airflow, astronomer.io, Alteryx, Collibra. • Hands-on experience and/or a detailed and deep understanding of the more »
Data Engineer/Software Engineer/ML Engineer on an open-source tech stack with Python, Scala, Spark, ‘modern data’ platforms such as Databricks, Snowflake etc., cloud platforms, database technologies. · Track record of leading and motivating high-performance engineering staff through a hands-on, ‘leading by example’ approach. Senior EM would have more »
experience with Python. Experience building scalable, high-quality data models that serve complex business use cases. Knowledge of dbt and experience with cloud data warehouses (BigQuery, Snowflake, Databricks, Azure Synapse etc.). Proficiency in building BI dashboards and self-service capabilities using tools like Tableau and Looker. Excellent communication skills and experience in managing more »
and pipeline development. Extensive experience with BI/visualisation tools. Experience working with cloud data warehouses. Added bonus: Proficiency in dbt. Skills in Python/R. Experience with Fivetran, Prefect, Snowflake, and Periscope. Familiarity with writing ETL pipelines using SQL and Python, and orchestration tools like Airflow or Prefect. Background in experimentation. Experience in fast-paced, venture-backed startup more »
for achieving project success. Key Responsibilities: Software Development: Write high-quality, maintainable code using languages such as Python and SQL. Establish data tools like Snowflake and Azure Data Lake Services (ADLS) Gen 2. Utilize PowerBI, Tableau, or similar tools to design and create interactive and visually appealing dashboards and reports. more »
an extremely fast-paced environment. Within this role, you will be responsible for building data pipelines for a cloud-based warehouse using Azure and Snowflake, enhancing data capabilities for analytics and science. What you need: 3+ years of hands-on experience as a Data Engineer, building ETL pipelines and managing more »
design, implement and manage data lake/data warehouse platforms (some of the following types of providers: AWS, Microsoft Azure, Google Cloud Platform, Databricks, Snowflake, Cloudera, Spark, MongoDB). Done this at companies using high volumes of data, ideally in retailing. Other sectors where high-volume data is used would also be more »
Engineer Experience working with AI frameworks and libraries (PyTorch) Confidence collaborating in complex and cross-functional teams Strong skills with: AWS, Python, Airflow, Snowflake Interested? Apply now or reach out to daisy@wearenumi.com for further details and a chat more »
environment ☁️ 🔹 Proven ability to design, implement, and utilize various database structures with a focus on cloud-based data services such as Azure, Databricks, and Snowflake ❄️ 🔹 Experience in building ETL/ELT data pipelines and applying DevOps (CI/CD) concepts to test, schedule, and deploy to a production environment 🔄 If more »
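Listings like this expect pipeline logic to ship with automated checks that a CI/CD stage can run before deploying to production. A minimal sketch of such a pre-load data-quality gate (the schema and rules here are invented for illustration, not taken from any listing) might look like:

```python
def validate_batch(rows: list[dict]) -> list[str]:
    """Return a list of data-quality failures for a batch of records.
    An empty list means the batch passes and may be loaded."""
    errors = []
    seen_ids = set()
    for i, row in enumerate(rows):
        # Rule 1: primary-key style uniqueness on "id"
        if row.get("id") in seen_ids:
            errors.append(f"row {i}: duplicate id {row['id']}")
        seen_ids.add(row.get("id"))
        # Rule 2: "amount" must be present and non-negative
        if row.get("amount") is None or row["amount"] < 0:
            errors.append(f"row {i}: amount must be non-negative")
    return errors

good = [{"id": 1, "amount": 10.0}, {"id": 2, "amount": 0.0}]
bad = [{"id": 1, "amount": 10.0}, {"id": 1, "amount": -5.0}]
print(validate_batch(good))       # []
print(len(validate_batch(bad)))   # 2
```

In a CI pipeline this kind of check would run as a test step (e.g. under pytest) so a failing batch blocks the scheduled deploy rather than corrupting the warehouse.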
interface into IT. The ability to resolve complex and non-routine issues and identify improvements in the testing and validation of data accuracy. Extensive experience with Snowflake is essential, and working knowledge of dbt, Airflow and AWS is highly desirable. Strong background developing, constructing, testing, and maintaining practical data architectures and driving improvements more »
London, England, United Kingdom Hybrid / WFH Options
Understanding Recruitment
business pitches We’d love to see: Strong proficiency with Python and SQL Extensive Machine Learning experience: forecasting, predictive performance analysis & optimisation Experience with Snowflake, DataIQ or Databricks Cloud platform experience (AWS preferred) Adobe analytics & sales platform experience is a bonus Previous experience working with Marketing Ecosystems Hybrid working with more »
and investigatory skills; with RDBMS systems, with the vision to convert data requirements into logical models and physical solutions; with data warehousing solutions (e.g. Snowflake) and data lake architecture, Azure Databricks/PySpark & ADF; retail data model standards (ADRM); communication skills, organisational and time management skills; to innovate, support change more »
London, England, United Kingdom Hybrid / WFH Options
Understanding Recruitment
reporting We’d love to see: Strong proficiency with Python and SQL Extensive Machine Learning experience: recommendation, forecasting, predictive performance analysis & optimisation Experience with Snowflake, DataIQ or Databricks AWS experience – Redshift & FastAPI NLP & Computer Vision experience Previous experience working with Marketing Ecosystems Hybrid working with offices in Central London more »