… varying data proficiency. Thorough understanding of data lake and data warehousing principles, and full project involvement in one or more major technology platforms, e.g. Snowflake, Databricks. Proven experience with one or more cloud services providers, e.g. AWS, Azure or Google Cloud Platform. Good understanding of role-based access control, its …
… processes, and technologies. Strong SQL skills (ideally with Azure SQL), experience working with relational databases, and programming experience in Python or Scala. Experience with Snowflake and its tooling (Snowpark, Snowpipe, etc.). Familiarity with Fivetran, dbt, TensorFlow, PyTorch, and other modern data stack components. Knowledge of data integration and ETL …
… skills in Python and Java 11+, with a good grasp of frameworks like Dropwizard. Lakehouse architectures: familiarity with modern data technologies such as Dremio, Snowflake, Iceberg, (Py)Spark/Glue/EMR, dbt, and Airflow/Dagster. AWS services: hands-on experience with AWS, especially S3, ECS, and EC2/…
… automation, data visualization tools, DevOps practices, machine learning frameworks, performance tuning, and data governance tools. Technical proficiency in Microsoft Azure SQL (PaaS & IaaS), Azure Cosmos DB, Snowflake Data Warehouse, Power Apps, Reporting Services, Tableau, T-SQL, Python programming, and Azure Purview. If you're ready to join a dynamic team and drive …
… Spark experience · Must have strong AWS experience · Must have Terraform experience · SQL & NoSQL experience · Have built out data warehouses & data pipelines · Strong Databricks & Snowflake experience · Docker, ECS, Kubernetes & orchestration tools like Airflow or Step Functions are nice to have. Contracts are running for 6 months initially, paying up to …
… Finance Team. Requirements: to qualify for this role, you will require: · Strong experience with dbt and SQL · Experience working within cloud environments (Redshift, BigQuery, Snowflake). Salary: a successful candidate will receive: · A salary of up to £75,000 · Excellent progression opportunities. Process: two interview stages. 1st stage: a short conversation with …
… experience designing data pipelines/warehouses using AWS and Snowflake. Exposure to big data technologies such as Kafka, Spark, or Hadoop. Solid experience with Snowflake, including performance optimisation and cost management. Strong experience with SQL and data modelling. Excellent understanding of AWS architecture and the ability to design effective, scalable …
… best practices. A team player with excellent communication and collaboration skills. Desirable: experience developing and maintaining data warehouses/lakes using big data solutions (e.g. Snowflake, Databricks); familiarity with SQL Server, PostgreSQL or NoSQL database technologies. Previous experience within banking or finance is desirable, but by no means necessary. If you …
… their investment models. Tech is also completely flexible: most of the work is done in Python, C# and Scala with a range of databases. Snowflake is widely used, as are Docker and Kubernetes for containerisation. ETL and ELT technologies are also used every day, primarily Airflow, Spark, Hive and a …
… years of Glide experience. Proficiency in PL/SQL and an additional object-oriented programming language. Experience with big data platforms such as Cloudera, Azure, Snowflake, etc. Ability to roll out analytics dashboards (e.g. Power BI, Tableau Server). Ability to translate business needs into technical requirements. Qualifications: knowledge of the ServiceNow …
… in data modeling techniques such as entity-relationship diagramming, dimensional modeling, and data normalization. • Proficient in SQL and database management systems (e.g. MySQL, Oracle, Snowflake). • Demonstrated experience and familiarity with emerging technologies, including IoT and AI. • Proficient in data visualization tools such as Tableau, Power BI and Sigma. …
London, England, United Kingdom Hybrid / WFH Options
Data Idols
… you are looking for a new challenge, please submit your CV for initial screening and more details. Data Architect. Desired skills and experience: AWS | Snowflake | ETL …
Northampton, England, United Kingdom Hybrid / WFH Options
Harnham
… ETL work, architecture design, advancing and maintaining warehouses, etc. YOUR SKILLS AND EXPERIENCE: High levels of proficiency in Azure Cloud. Strong capabilities with Python, Snowflake, and Dynamics 365. Experience with CI/CD. Strong communication skills for working in a collaborative environment. Strong stakeholder management skills. THE BENEFITS: Up to …
… enterprise architecture approaches (e.g. TOGAF). Experience in any of: data projects; AI or GenAI projects; development and software architecture. Experience with security architecture, Databricks or Snowflake would also be beneficial, but by no means necessary. Above all, we are looking for people who will be pragmatic and want to work collaboratively …
… from a Finance or Wealth/Investment background. Degree in a relevant subject such as Computer Science, Mathematics or similar. Desired: experience in C#, PowerShell, Snowflake, Linux; Python frameworks such as Flask; cloud expertise in Azure. If you would like to be considered for this position, please apply directly to this …
… Lead) with strong implementation experience of SAP S/4HANA deployments in the automobile industry. Proven knowledge of, and expertise in, major cloud services and IoT solutions (AWS and Snowflake Cloud). Excellent and broad technical knowledge: understanding of a wide variety of IoT technology and its application to automotive OEM use cases. TOGAF, depth …
… the effects to the business and enterprise architecture. Please see below for the required experience: Cloud architecture: Azure · Application integration: MuleSoft or Boomi · Data integration: Snowflake, Databricks · Familiarity with Infrastructure as Code, DevOps and CI/CD pipelines · Collaboration skills, comfortable leading design workshops. This role would be hybrid, with …
… is essential. We are seeking a DevOps Engineer with the following: experience maintaining and scaling AWS services; big data experience with tooling such as Snowflake/PostgreSQL (ideally over 1 TB daily); containerisation experience with Docker or Kubernetes. We can offer a DevOps Engineer in this team: remote working anywhere in …
… is instrumental in implementing the data strategy that supports front office stakeholders, systems, and clients. The Data Engineer will leverage cutting-edge technology in Snowflake, Python, SQL, and Azure to enhance our data capabilities and support the investment decision-making process. Key responsibilities include: design, build, and maintain efficient, reliable … data pipelines using ETL and ELT processes. Ensure the seamless flow and availability of high-quality data across the organization. Utilize Snowflake for data storage, processing, and analytics. Optimize data structures and queries to support analytics and BI initiatives. Develop scripts in Python and SQL to automate data processes, integrate … Must-have skills: proven experience as a Data Engineer, with a strong background in data pipeline construction, data architecture, and data warehousing. Expertise in Snowflake, Python, SQL, and cloud-native ETL/ELT tools. Familiarity with Azure and other cloud-native technologies. Understanding of finance industry data domains and their …
Senior Data Engineer - Snowflake, dbt, CDC (Fivetran, Rivery). Are you ready to dive into the dynamic world of data consultancy? We're seeking a passionate Senior Data Engineer to join a rapidly growing business for an exciting contract opportunity! 📅 Contract Duration: 2 months (with potential for future extension), outside IR35 … Pattern: fully remote, with 1-2 trips to Nottingham over 2 months. Essential requirements: a minimum of 8-10 years of data warehousing experience · Experience with Snowflake · Proficiency in dbt for ELT · Hands-on experience with Rivery or other CDC tools (e.g. Fivetran) · Azure Blob Storage expertise · Strong SQL skills · Data migration … expertise from AWS Redshift and on-prem SQL Server to Snowflake · Proficiency in star schema data modeling. Advantageous to have: experience in the energy supplier domain (Gorilla, Gentrack Junifer) · Familiarity with Salesforce · Knowledge of GitHub · Exposure to Power BI and/or Tableau. Main scope of the role: as a Senior …