for 5 or more consecutive years Demonstrated experience in data architecture or a similar role Practical experience across a variety of platforms and languages, e.g. Databricks, Snowflake, Azure, AWS, Oracle Cloud, R, Python or similar Understanding of data management best practices, including demonstrated experience with data profiling, sourcing, and cleansing routines utilizing more »
Knowledge of Scala, R is a plus. Experienced in SQL. Familiarity with various relational database platforms is a plus (SQL Server, MySQL, PostgreSQL, Oracle, Snowflake, Vertica, etc.). Ability to write efficient and robust queries. Familiarity with DevOps processes for model deployment and unit testing. Experience of working in cloud more »
Advanced knowledge of data visualization tools and dashboard design experience essential (Tableau preferred) Experience in the use of large databases and data warehouses required (Snowflake preferred) Experience with transitioning and deploying data science and quantitative models to Production environment required along with exposure to Agile development process -- ability to articulate more »
required. Experience of Machine Learning Engineering and CRM is also highly desirable. High coding standards and experience of DevOps/MLOps, Azure, Synapse, Snowflake and Databricks. Great people skills will also be vital in the role. You will be working with, and expected to influence, colleagues at every level more »
technologies and tracking the technology innovations applicable to the solutions. Proven experience of working effectively with senior business stakeholders Experience of using tools including Snowflake, DBT, ADF and Azure Synapse Ability to lead teams and projects towards a common architecture approach and language. Strong communication and collaboration skills. Additional Information more »
for someone who: - Can demonstrate extensive experience having designed and scaled a Data Platform - Has strong Python skills - Has great SQL, preferably Snowflake - Has previous experience working with dbt & Airflow - Is passionate about solving complex data problems & is interested in working with rich & diverse climate datasets - Cares deeply more »
/scale up environment. Having previously worked within a data engineering team you are familiar with cloud-based data warehouses such as BigQuery and Snowflake, as well as the principles of a modern data lake based on Apache Avro and Parquet. Experience managing data platforms as a product is essential and more »
to automate data pipelines and build analytical warehouses · Deep understanding of cloud-based data platforms (Azure SQL DB, Azure Synapse, ADLS, AWS, Hadoop, Spark, Snowflake, NoSQL etc.) · Proficient scripting in programming languages such as Java, Python, Scala · Expert in SQL · Machine Learning: Good basic understanding of the main types more »
interactive visual reporting dashboards + integrating visualizations into web applications. Azure Cloud: designing + implementing Azure cloud based solutions + using data tools like Snowflake and Azure Data Lake Services (ADLS). Developing solutions for Proof of Concept (POC) + Minimum Viable Products (MVP). Advantageous: Banking, JavaScript, HTML, CSS more »
City of London, London, United Kingdom Hybrid / WFH Options
Syntax Consultancy Limited
an extremely fast-paced environment. Within this role, you will be responsible for building data pipelines for a cloud-based warehouse using Azure and Snowflake, enhancing data capabilities for analytics and science. What you need: 3+ years of hands-on experience as a Data Engineer, building ETL pipelines and managing more »
an opportunity you would like to apply to, please review the necessary competencies below: Knowledge of ETL, Analytics & Data Warehousing Experience with Cloud, AWS, Snowflake, BigQuery Experience in building a data-driven culture, in the form of self-serve analytics Previous hands-on experience in building a best-in-class more »
for achieving project success. Key Responsibilities: Software Development: Write high-quality, maintainable code using languages such as Python and SQL. Establish data tools like Snowflake and Azure Data Lake Services (ADLS) Gen 2. Utilize Power BI, Tableau, or similar tools to design and create interactive and visually appealing dashboards and reports. more »
design, implement and manage data lake/data warehouse platforms. (Some of the following types of providers: AWS, Microsoft Azure, Google Cloud Platform, Databricks, Snowflake, Cloudera, Spark, MongoDB) Done this at companies using high volumes of data, ideally in retailing. Other sectors using high-volume data would also be more »
Kubernetes). Knowledge of big data technologies and frameworks (e.g., Hadoop, Spark). Familiarity with other cloud platforms (e.g. AWS, Google Cloud) and PaaS providers (e.g. Snowflake). Knowledge of Inner or Open Source paradigm and way of working. Knowledge of Cloud (Azure) Networking and Security Standards. Key Responsibilities: Data Platform Engineering: Contribute to designing more »
ECS, DynamoDB, S3, EventBridge, SQS, SNS, API Gateway, VPC, Security Hub, Control Tower and CloudTrail Extra: CDK as Infrastructure as Code, Snowflake, LaunchDarkly, PagerDuty Programming languages: we use Go heavily along with Python and TypeScript What you’ll get for this role: Our purpose - with you today more »
languages or toolsets: AutoSys, Azure Function App, Azure Git, Azure Portal, C#, Databricks, GraphQL/Graph API, Informatica CDI, Informatica PowerCenter, Java, JavaScript, Snowflake, Power BI, PyRecs, Python, Selenium, Spark, and SQL Nice to have skills Ability to propose and estimate the financial impact of architectural alternatives Existing knowledge of more »
Spark experience Must have strong AWS experience Must have Terraform experience SQL & NoSQL experience Have built out Data Warehouses & built Data Pipelines Strong Databricks & Snowflake experience Docker, ECS, Kubernetes & Orchestration tools like Airflow or Step Functions are nice to have Contracts are running for 6 months initially, paying up to more »
Data Engineer who has experience of working with large data sets. You will be a valuable member of the team helping them to implement Snowflake as their main Cloud source as well as enhancing their overall customer experience. The ideal candidate must have: Strong data engineering experience Experience with Snowflake Experience with Amazon more »
Analytics Engineer who has experience of working with large data sets. You will be a valuable member of the team helping them to implement Snowflake as their main Cloud source as well as enhancing their overall customer experience. The ideal candidate must have: Strong data engineering experience Strong analytics experience Experience with more »
experience designing data pipelines/warehouses using AWS and Snowflake. Exposure to big data technologies such as Kafka, Spark, or Hadoop. Solid experience with Snowflake, including performance optimisation and cost management. Strong experience with SQL and Data modelling. Excellent understanding of AWS architecture and the ability to design effective, scalable more »
best practices. A team player with excellent communication & collaboration skills. Desirable: Experience developing and maintaining data warehouses/lakes using big data solutions (e.g., Snowflake, Databricks). Familiarity with SQL Server, PostgreSQL or NoSQL database technologies. Previous experience within banking or finance desirable, but by no means necessary. If you more »
as well as maintaining a hands-on involvement in day-to-day tasks Strong SQL & Python, dbt Expertise in data warehouses such as Redshift, Snowflake, AWS If you are interested in this or other Analytics opportunities please contact Liam Wilson on liam.wilson@xcede.com or +44(0) 203 871 6363. Unfortunately more »
their investment models. Tech is also completely flexible. Most of the work is done within Python, C# and Scala with a range of databases. Snowflake is widely used, as are Docker and Kubernetes for containerisation. ETL and ELT tech are also used every day, primarily Airflow, Spark, Hive and a more »
Greater London, England, United Kingdom Hybrid / WFH Options
Lyra, formerly ICAS World
teams 2+ years' experience running Agile projects Strong awareness & understanding of data regulation similarities and differences across the international spectrum Desirable Working experience with Snowflake as a data warehouse Working experience with data integration platforms such as Fivetran Working experience with building visuals and dashboards on BI tools such as more »