… Proficiency in PL/SQL and an additional object-oriented programming language (highly desirable). Experience with big data platforms such as Cloudera, Azure, Snowflake, etc. Structured thinking with the ability to break down ambiguous problems and propose impactful data modeling designs. Ability to roll out analytics dashboards (e.g., Power …
East London, London, United Kingdom Hybrid / WFH Options
Wilmington
Passion for achieving excellent client satisfaction. Strong knowledge of SQL Server and Transact SQL. Knowledge of web-based technology and related products, such as Snowflake, Tableau, HTML, JavaScript and CSS. Ability to work to agreed timescales, keeping clients informed of any delays. Confidence in a customer facing role with excellent more »
… are travelling and how to get there; they just need your help to do so. This will suit someone with experience of Python, AWS, Snowflake, Terraform and Kubernetes. Both Engineering and DevOps knowledge and/or experience would be helpful. Expect in the next 12-24 months this will be a …
Hackney, Greater London, Shoreditch, United Kingdom
Talent Smart
We are seeking a highly skilled and motivated Data Engineer. The ideal candidate will have extensive experience in data engineering, particularly with Snowflake, ETL processes, and Power BI. The Data Engineer will be responsible for designing, developing, and maintaining our data infrastructure, ensuring seamless data integration and delivery of high-quality insights. Key Responsibilities: Data Pipeline Development: Design, build, and maintain efficient and reliable ETL processes to move data from various sources into our Snowflake data warehouse. Data Modelling: Develop and maintain data models and schemas to support business needs and ensure data integrity and quality. Data Integration: Integrate data from multiple sources, including APIs, databases, and flat files, into Snowflake. Performance Optimization: Optimize data storage and query performance in Snowflake to ensure fast and reliable data access. Data Governance: Implement and enforce data governance and security best practices to ensure data privacy and compliance with relevant regulations. Reporting and …
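The ETL responsibilities in the listing above can be made concrete with a small sketch. Below is a minimal, hypothetical transform step for such a pipeline, written in plain Python; the table shape and column names (`order_id`, `order_date`, `amount`) are invented for illustration, and a real pipeline would stage the cleaned rows into the Snowflake warehouse afterwards:

```python
import csv
import io
from datetime import datetime

def transform(raw_rows):
    """Clean raw source rows before staging them into a warehouse table.

    Drops rows with a missing primary key and normalises the date column
    to ISO-8601, the format Snowflake's DATE type accepts by default.
    """
    cleaned = []
    for row in raw_rows:
        if not row.get("order_id"):  # enforce key integrity: skip keyless rows
            continue
        # normalise e.g. "31/01/2024" to "2024-01-31"
        row["order_date"] = (
            datetime.strptime(row["order_date"], "%d/%m/%Y").date().isoformat()
        )
        row["amount"] = round(float(row["amount"]), 2)
        cleaned.append(row)
    return cleaned

# Example: parse a flat-file extract, then transform it
raw = io.StringIO(
    "order_id,order_date,amount\n"
    "1001,31/01/2024,19.50\n"
    ",05/02/2024,5.00\n"  # missing key: dropped by transform()
)
rows = transform(list(csv.DictReader(raw)))
```

The load step would then write `rows` to a staging table (for example via the Snowflake Python connector) rather than keep them in memory.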
Experience working in an electronic/systematic trading or investment firm. Experience working directly with Portfolio Managers, Traders, Quants and/or Researchers. AWS, Snowflake. JavaScript, TypeScript, HTML5, React. .NET, C#, Java, JEE, Jakarta EE, Spring, object-relational mappers (ORM). RESTful web services, microservices implementations. Data visualisation. Role Description …
… experience with BI tools such as Looker is highly advantageous. Experience working with cloud data warehouses, ideally with AWS/Redshift, Azure, GCP, or Snowflake. Experience with dbt is highly advantageous. Responsibilities: Analyze, organize, and prepare raw data for modeling and data analytics. Architect and assist in building data systems …
… solutions, including the choice of data sources and ETL approach. Familiar with engineering processes for developing APIs. Understanding of the principles of building solutions using Snowflake, open-source frameworks, and multi-cloud infrastructure. This is a contract position.
General knowledge of relational databases (e.g., SQL Server, Oracle, MySQL) and NoSQL databases (e.g., MongoDB, Cassandra, Couchbase). Experience with data warehousing solutions (e.g., Snowflake, Redshift, BigQuery). Excellent scripting skills (e.g., Python, SQL). Strong analytical and problem-solving skills. Excellent communication and interpersonal skills, with the ability to …
Advanced knowledge of data visualization tools and dashboard design experience essential (Tableau preferred). Experience in the use of large databases and data warehouses required (Snowflake preferred). Experience with transitioning and deploying data science and quantitative models to production environments required, along with exposure to Agile development processes; ability to articulate …
… languages or toolsets: AutoSys, Azure Function App, Azure Git, Azure Portal, C#, Databricks, GraphQL/Graph API, Informatica CDI, Informatica PowerCenter, Java, JavaScript, Snowflake, Power BI, PyRecs, Python, Selenium, Spark, and SQL. Nice-to-have skills: ability to propose and estimate the financial impact of architectural alternatives; existing knowledge of …
… CI/CD. Professional experience with SQL and data transformation, ideally with dbt or similar. Experience with at least one of these cloud technologies: AWS, Microsoft Azure, Snowflake, GCP. Apply to the Role: roles like these are snapped up very quickly, so act now if you do not want to miss out! Reply …
… total experience in DWBI, big data, and cloud technologies • Implementation and hands-on experience in at least two of these cloud technologies: Azure, AWS, GCP, Snowflake, Databricks • Must have hands-on experience on at least two hyperscalers (GCP/AWS/Azure platforms), specifically in big data processing services (Apache …
… required. Experience of machine learning engineering and CRM is also highly desirable. High coding standards and experience of DevOps/MLOps, Azure, Synapse, Snowflake and Databricks. Great people skills will also be vital in the role. You will be working with, and expected to influence, colleagues at every level …
… engineering experience (ideally 8-10 years' experience). SQL, PL/SQL, Oracle databases. Java. Web interfaces. Nice to have: exposure to cloud development (Azure, Snowflake). CANDIDATES MUST HAVE EXPERIENCE WITHIN THE PHARMACEUTICAL INDUSTRY OR MUST HAVE WORKED ON CLINICAL SYSTEMS. Interested? Please apply to this advert with an updated CV …
… with knowledge of data modeling concepts and data management techniques being advantageous. Skills in implementing reports/analytics are desirable. Knowledge of AWS and Snowflake is desirable.
… in product delivery models and business analysis. Risk and compliance experience within the financial services sector. Experience with the Microsoft Azure cloud platform, Databricks and Snowflake. Strong skills in data analysis, dimensional data modeling methodologies and business analysis. Solid understanding of the Software Development Lifecycle (SDLC) and version control disciplines. Works with …
… build, and maintain scalable data architectures, including pipelines and cloud-based data warehouses. Tech: Python (NumPy, Pandas), SQL, ETL, cloud (AWS, Azure or GCP), Snowflake, Airflow, BigQuery, Power BI/Tableau. Industry: fintech, maritime trading. Immersum are supporting the growth of a specialist consultancy who solely specialise in the maritime trading … with Git for code collaboration and change tracking. Data Pipeline Tools: proficiency with tools such as Apache Airflow. Cloud Platforms: familiarity with AWS, Azure, Snowflake, and GCP. Visualisation: Tableau or Power BI. Delivery Tools: familiarity with agile backlogs, code repositories, automated builds, testing, and releases.
… in Data Science, Data Management, Information Technology, or similar. Proficient in creating models in Excel, as well as utilising data visualisation tools (Tableau) and Snowflake. Prior experience working in a large, matrixed organisation, where you have played a pivotal role in supporting multiple departments on various projects. This is an …
… join their growing team. Key Responsibilities: KPI reporting for the Product team. Manage Google Analytics & Google Tag Manager. Manage the pipeline from GA4 to Snowflake via GCP. Implement tracking for new journeys and ensure accurate reporting. Create & maintain visualisations in ThoughtSpot & Tableau. Apply today and I'll be in touch …
… allowance for use within our subsidized onsite canteen. Must-have skills: working knowledge of Azure DevOps (Git, build and release pipelines), Python, Databricks, and Snowflake. Nice to haves: Power BI and Attunity Replicate. Significant information technology and/or application development, database development and/or Python … OpenID Connect, and JWT tokens. Database Skills: proficiency in working with databases, particularly SQL databases such as Microsoft SQL Server, Azure SQL Database or Snowflake. Agile Methodologies: experience with Agile development methodologies like Scrum or Kanban for project management and collaboration.
… Analytics technology team, tasked with developing, managing, and optimizing data pipelines within our data ecosystem. The Data Engineer will leverage cutting-edge technology in Snowflake, Python, Informatica, and Azure to improve our data capabilities and support our Finance business partners in their decision-making processes. Key responsibilities include: Design, build, and maintain efficient, reliable data pipelines using ETL and ELT processes. Ensure the seamless flow and availability of high-quality data across the organisation. Use Snowflake for data storage, processing, and analytics. Optimise data structures and queries to support analytics and BI initiatives. Develop scripts in Python and SQL to automate … our subsidized onsite canteen. Must-have skills: experience as a Data Engineer, with a solid background in data pipeline construction and data architecture using Snowflake, Python, and Informatica. Familiarity with Azure and other cloud-native technologies. Understanding of finance industry data domains and their application in data engineering. Strong problem …
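One common shape for the Snowflake side of an ELT pipeline like the one described above is a staged load followed by a `MERGE` reconciled inside the warehouse. The sketch below only builds the SQL string; the table and column names are invented, and in practice the statement would be executed through a Snowflake connection (for example a `snowflake-connector-python` cursor) rather than returned:

```python
def build_merge_sql(target, staging, key, columns):
    """Build a Snowflake MERGE that upserts staged rows into a target table.

    ELT pattern: raw data lands in a staging table first, then this single
    statement updates matching rows and inserts new ones in the target.
    """
    set_clause = ", ".join(f"t.{c} = s.{c}" for c in columns)
    col_list = ", ".join([key] + columns)
    src_list = ", ".join(f"s.{c}" for c in [key] + columns)
    return (
        f"MERGE INTO {target} t USING {staging} s ON t.{key} = s.{key} "
        f"WHEN MATCHED THEN UPDATE SET {set_clause} "
        f"WHEN NOT MATCHED THEN INSERT ({col_list}) VALUES ({src_list})"
    )

# Hypothetical finance-domain tables, matching the listing's context
sql = build_merge_sql(
    "finance.orders", "finance.orders_stg", "order_id", ["order_date", "amount"]
)
```

Keeping the transformation as warehouse-side SQL (rather than row-by-row Python) is what distinguishes ELT from classic ETL in this kind of stack.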
… with data scientists and architects on several projects. EXPERIENCE: Proven track record of success in Data Engineering teams. Experience with relational and MPP databases (Snowflake, Redshift, BigQuery, etc.). Experience with data modelling techniques (Kimball/star, Data Vault, etc.). Experience with SQL and query design on large, complex datasets. Experience with cloud and big-data tools and frameworks like Databricks/Spark, Airflow, Snowflake, etc. Expertise designing and developing with distributed data processing platforms like Databricks/Spark. Experience using ELT/ETL tools such as dbt, Fivetran, etc. Understanding of Agile delivery best practice. Good knowledge of …
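As a quick illustration of the Kimball/star modelling mentioned above: a central fact table holds additive measures plus surrogate keys into dimension tables, and typical queries aggregate a measure by a dimension attribute. The sketch below is purely illustrative; the tables, keys, and values are invented:

```python
# Dimension tables keyed by surrogate key (star-schema style)
dim_customer = {1: {"name": "Acme Ltd", "region": "EMEA"}}
dim_date = {20240131: {"year": 2024, "month": 1}}

# Fact table: foreign keys into the dimensions plus an additive measure
fact_sales = [
    {"customer_sk": 1, "date_sk": 20240131, "revenue": 250.0},
    {"customer_sk": 1, "date_sk": 20240131, "revenue": 100.0},
]

def revenue_by_region(facts, customers):
    """Aggregate an additive measure by a dimension attribute —
    the query shape a star schema is designed to make cheap."""
    totals = {}
    for f in facts:
        region = customers[f["customer_sk"]]["region"]
        totals[region] = totals.get(region, 0.0) + f["revenue"]
    return totals
```

In SQL terms this corresponds to joining the fact table to a dimension on its surrogate key and grouping by the dimension attribute; Data Vault, by contrast, would split the same data into hubs, links, and satellites.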