of expertise. As an influencer with great communication skills, you love sharing your knowledge with others and helping them grow. Our Technology Stack: Snowflake, Salesforce CDP, AWS, AWS Lake Formation, AWS Kinesis, AWS EventBridge, Glue/Glue DataBrew, AppFlow, NoSQL databases (e.g. DynamoDB), SQL databases (e.g. MySQL) …
GCP Dataproc or GCP Cloud Data Fusion. * NoSQL databases: DynamoDB, Neo4j, Elastic, Google Cloud Datastore. * BigQuery and Data Studio/Looker. * Snowflake data warehouse/platform. * Streaming technologies and processing engines: Kinesis, Kafka, Pub/Sub and Spark Streaming. * Experience working with CI/CD technologies, Git …
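Engines such as Kafka, Kinesis, Pub/Sub and Spark Streaming all revolve around the same core idea: windowed aggregation over an unbounded event stream. None of those engines is used below; this is a minimal, framework-free sketch of a tumbling-window count in plain Python, with an illustrative event shape of `(unix_timestamp, key)` pairs.

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds=60):
    """Group timestamped events into fixed-size (tumbling) windows
    and count events per key within each window."""
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        # Each event belongs to exactly one window, keyed by its start time
        window_start = (ts // window_seconds) * window_seconds
        windows[window_start][key] += 1
    return {w: dict(counts) for w, counts in sorted(windows.items())}

# Hypothetical click events as (unix_timestamp, user_id) pairs
events = [(0, "a"), (10, "a"), (10, "b"), (65, "a"), (70, "b"), (130, "b")]
result = tumbling_window_counts(events, window_seconds=60)
# result == {0: {"a": 2, "b": 1}, 60: {"a": 1, "b": 1}, 120: {"b": 1}}
```

Real streaming engines add the hard parts this sketch omits: out-of-order arrival, watermarks, and incremental state, but the windowing logic is the same.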
an extremely fast-paced environment. Within this role, you will be responsible for building data pipelines for a cloud-based warehouse using Azure and Snowflake, enhancing data capabilities for analytics and science. What you need: 3+ years of hands-on experience as a Data Engineer, building ETL pipelines and managing …
CD. Professional experience with SQL and data transformation, ideally with dbt or similar, with at least one of these cloud technologies: AWS, Microsoft Azure, Snowflake, GCP. Apply to the Role: Roles like these are snapped up very quickly, so act now if you do not want to miss out! Reply …
Greater London, England, United Kingdom Hybrid / WFH Options
InterEx Group
definition of Big Data architecture with different tools and environments: cloud (AWS, Azure and GCP), Cloudera, NoSQL databases (Cassandra, MongoDB), ELK, Kafka, Snowflake, etc. Past experience with Data Engineering and data quality tools (Informatica, Talend, etc.). Previous involvement in working in a multilingual and multicultural environment. Proactive, tech …
have access to accurate, up-to-date data. We have a modern data stack already in place, comprising a Snowplow data pipeline, a Snowflake data warehouse, dbt as the data transformation tool and our BI tool, Looker. We don’t expect you to be fluent with all these technologies …
City of London, England, United Kingdom Hybrid / WFH Options
TECHOHANA
packages: pandas, SQLite. Power BI experience, including DAX, Power Query and M code. Strong Excel knowledge for data analysis, visualisation and report automation. Understanding of the Snowflake environment. Comfortable working with senior and junior business stakeholders. Comfortable explaining technical subjects and processes to a non-technical audience. Strong analytical and problem-solving …
London, England, United Kingdom Hybrid / WFH Options
Anson McCade
SSIS, Talend or Pentaho • Data governance and data management tools such as Informatica MDM, Informatica AXON, Informatica EDC, and Collibra • MySQL, SQL Server, Oracle, Snowflake, PostgreSQL and NoSQL databases • Programming languages and frameworks such as Spark or Python • Amazon Web Services, Microsoft Azure or Google Cloud and distributed processing technologies such as …
London, England, United Kingdom Hybrid / WFH Options
Chapman Tate Associates
Salary: £60,000 - £70,000 + 12.5% Bonus. Key areas: SQL Data Engineer, Azure Data Engineer, Azure Data Factory, Azure Databricks, Azure Data Services, Snowflake, SQL Server, Data, BI, BI Development, MS BI. Chapman Tate is collaborating with a leading Housing Association that is revolutionising the industry through innovative data …
BI. A background in Data Engineering would be beneficial to show programming aptitude (SQL proficiency/data warehousing practices). Azure Data Factory or Snowflake. Experience within a Real Estate Investment environment is beneficial. The client will offer a three-day hybrid working policy on site, which has potential flexibility …
in Docker. Good experience in Linux. Good experience in Airflow. Good knowledge of cloud architecture. Good experience in Terraform. Expert experience with database systems (Snowflake, SQL, Postgres, etc.). Experience of micro-service development and API development. Strong understanding of Agile delivery methodologies. Demonstrable ability to design and deliver complex systems …
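The micro-service and API development the snippet above asks for is easiest to reason about when request handling is a pure function, independent of any web framework. This is a minimal, hypothetical sketch; the `/health` and `/echo` routes and the `(status, body)` return shape are illustrative, not any particular service's API.

```python
import json

def handle_request(method, path, body=None):
    """Route a request to a handler and return (status_code, response_dict).
    Keeping this framework-free makes the logic trivially unit-testable."""
    if method == "GET" and path == "/health":
        return 200, {"status": "ok"}
    if method == "POST" and path == "/echo":
        try:
            payload = json.loads(body or "")
        except json.JSONDecodeError:
            # Reject malformed bodies rather than crashing the service
            return 400, {"error": "invalid JSON"}
        return 200, {"received": payload}
    return 404, {"error": "not found"}

status, resp = handle_request("GET", "/health")
# status == 200, resp == {"status": "ok"}
```

In a real service this function would sit behind a thin adapter for whatever framework is in use, so the routing and validation logic stays testable without a running server.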
London, England, United Kingdom Hybrid / WFH Options
Version 1
rapidly changing Digital-First world we live in. We foster strong partnerships with leading technology giants including Microsoft, AWS, Oracle, Red Hat, OutSystems, and Snowflake, ensuring that our customers are provided with the highest-quality solutions and services. We’re an award-winning employer, reflecting how our employees are at …
batch/risk reports or trade life cycle. Experience in at least one scripting (PowerShell, Python) and database (Oracle, MS SQL, MongoDB, Postgres or Snowflake) language. CI/CD deployment pipeline experience involving Azure DevOps, TeamCity, Git, Jenkins, etc. Ability to work well under pressure and strong communication skills. Demonstrated …
rules to analyse historical data to determine its accuracy. Experience with data strategy and governance tools such as Informatica, Collibra, MS Purview, Snowflake, Snowsight. Project experience using proven delivery methodologies (e.g. Waterfall, Agile, Scrum, DevOps, Testing), frameworks and best practices. Advanced working knowledge of SQL and relational databases …
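The snippet above opens with building rules to analyse historical data for accuracy. Stripped of any governance tool, that pattern is just named predicates applied row by row; this is a minimal sketch in plain Python, and the trade fields and rule names are hypothetical.

```python
def validate_rows(rows, rules):
    """Apply named validation rules to each row; return failures as
    (row_index, rule_name) pairs so accuracy can be quantified."""
    failures = []
    for i, row in enumerate(rows):
        for name, rule in rules.items():
            if not rule(row):
                failures.append((i, name))
    return failures

# Hypothetical historical trade data and accuracy rules
rows = [
    {"price": 101.5, "quantity": 10},
    {"price": -3.0, "quantity": 5},
    {"price": 99.0, "quantity": 0},
]
rules = {
    "positive_price": lambda r: r["price"] > 0,
    "nonzero_quantity": lambda r: r["quantity"] != 0,
}
failures = validate_rows(rows, rules)
# failures == [(1, "positive_price"), (2, "nonzero_quantity")]
```

An accuracy metric then falls out directly, e.g. the share of rows with no failures; tools like Informatica or Collibra wrap the same idea in rule catalogues, scheduling and reporting.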
is instrumental in implementing the data strategy that supports front office stakeholders, systems, and clients. The Data Engineer will leverage cutting-edge technology in Snowflake, Python, SQL, and Azure to enhance our data capabilities and support the investment decision-making process. Key responsibilities include: Design, build, and maintain efficient, reliable … data pipelines using ETL and ELT processes. Ensure the seamless flow and availability of high-quality data across the organization. Utilize Snowflake for data storage, processing, and analytics. Optimize data structures and queries to support analytics and BI initiatives. Develop scripts in Python and SQL to automate data processes, integrate … Must-have skills: Proven experience as a Data Engineer, with a strong background in data pipeline construction, data architecture, and data warehousing. Expertise in Snowflake, Python, SQL, and cloud-native ETL/ELT tools. Familiarity with Azure and other cloud-native technologies. Understanding of finance industry data domains and their …
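The responsibilities above (building ETL pipelines and scripting data automation in Python) reduce to a three-stage shape however large the stack gets. This is a deliberately minimal sketch with an in-memory source and target standing in for real systems such as Snowflake; the field names are illustrative.

```python
def extract(source_rows):
    """Extract: pull raw records (here, from an in-memory source)."""
    return list(source_rows)

def transform(rows):
    """Transform: normalise field values and drop incomplete records."""
    out = []
    for row in rows:
        if row.get("id") is None:
            continue  # skip records missing their key
        out.append({"id": row["id"],
                    "name": (row.get("name") or "").strip().title()})
    return out

def load(rows, target):
    """Load: upsert transformed rows into the target store (a dict here)."""
    for row in rows:
        target[row["id"]] = row
    return target

raw = [{"id": 1, "name": "  alice  "},
       {"id": None, "name": "x"},
       {"id": 2, "name": "BOB"}]
warehouse = load(transform(extract(raw)), {})
# warehouse == {1: {"id": 1, "name": "Alice"}, 2: {"id": 2, "name": "Bob"}}
```

In an ELT variant the raw rows would land in the warehouse first and the `transform` step would run there, typically as SQL (often managed by dbt), rather than in Python.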
sole Data Engineer, making this a role with huge growth potential. They're currently working with an external partner to build a brand-new Snowflake data warehouse, which brings a huge scope of work for you to lead on internally, including the development of scalable data pipelines and overseeing the integration … designing and optimising complex queries and databases. ETL process design using Azure Data Factory. Data warehouse development experience using Azure Synapse Analytics. Experience with Snowflake would be desirable but not essential. Excellent communication and stakeholder management skills. Benefits: Salary up to £85,000 depending on experience. Bonus up to …
City of London, England, United Kingdom Hybrid / WFH Options
DGH Recruitment
proof-of-concept exercises to validate and evaluate the benefits of a technology against a business need - Experience of Azure Databricks architecture - Experience of Snowflake architecture and/or implementation - Familiarity with Infrastructure as Code and CI/CD pipelines - Familiarity with DevOps services - Experience of translating business requirements …
design, development, and optimisation of the company's data infrastructure. You will work with some of the most innovative tools in the market, including Snowflake, AWS (Glue, S3), Apache Spark, Apache Airflow and dbt! The role is hybrid, with 2 days in the office in central London and the company … Data Engineering experience developing and maintaining data pipelines from scratch. Data modelling, data integration and transformation experience. Hands-on work with tools such as Snowflake, AWS, Airflow, and dbt. Proficiency in data manipulation, scripting and automation with Python. Desirable: Experience leading teams. Version control systems such as Git or Bitbucket …
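The data modelling and transformation work the snippet above describes often starts with one recurring step: collapsing raw, append-only event data to the latest version of each entity before it is modelled into warehouse tables. A minimal Python sketch of that deduplication, with hypothetical record fields:

```python
def latest_by_key(records, key="id", version="updated_at"):
    """Deduplicate records, keeping the most recent version of each key --
    a common first step when modelling raw event data into tables."""
    latest = {}
    for rec in records:
        k = rec[key]
        # Keep the record with the highest version value per key
        if k not in latest or rec[version] > latest[k][version]:
            latest[k] = rec
    return sorted(latest.values(), key=lambda r: r[key])

# Hypothetical order events: the same order id appears twice
records = [
    {"id": 1, "status": "new",  "updated_at": 1},
    {"id": 1, "status": "paid", "updated_at": 3},
    {"id": 2, "status": "new",  "updated_at": 2},
]
current = latest_by_key(records)
# current == [{"id": 1, "status": "paid", "updated_at": 3},
#             {"id": 2, "status": "new", "updated_at": 2}]
```

In a Snowflake/dbt stack the equivalent is usually a `ROW_NUMBER() OVER (PARTITION BY id ORDER BY updated_at DESC)` filter in SQL; the logic is identical.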
focused on building a massively distributed cloud-hosted data platform that would be used by the entire firm. Tech Stack: Python/Java, Dremio, Snowflake, Iceberg, PySpark, Glue, EMR, dbt and Airflow/Dagster. Cloud: AWS, Lambdas, ECS services. This role would focus on various areas of Data Engineering including … management and data quality tooling. Requirements: Strong Python/Java software engineering skills. Excellent AWS knowledge, ideally with exposure to Airflow, Glue, Iceberg and Snowflake. Previous experience with Dremio, dbt, EMR or Dagster. Good Computer Science fundamentals with a strong understanding of software and data architecture. If you would like …