Market PAS platforms (e.g. OpenTWINS, DXC Assure, Sequel, IRIS). Knowledge of BI/MI tooling (e.g. Power BI, Tableau, Qlik). Familiarity with data warehouse technologies (e.g. SQL, Snowflake, Azure, Informatica). Exposure to Agile delivery and use of tools such as Jira or Azure DevOps. Certification in Project Management (PMP, PRINCE2) or Agile (Scrum Master, PMI-ACP) …
sales, and account teams to uncover real-world data needs and friction points. Build working data products, from custom Python pipelines and enriched datasets to Power BI templates and Snowflake-ready views. Support pre- and post-sales with prototypes, demos, onboarding materials, and technical discovery. Act as a trusted technical advisor, helping clients see the value in Signal's unique …
AE performance. This role requires experience with the following languages: SQL (advanced) and Jinja (advanced); and with the following tools: dbt Core, Git, and a cloud warehouse provider (e.g. Databricks, GCP, Snowflake). The following would be nice to have: Python; GitHub, Lightdash, Elementary, CircleCI, Databricks, Airflow, Kubernetes, DuckDB, Spark; data modelling techniques …
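The SQL-plus-Jinja pairing above reflects how dbt Core works: models are Jinja-templated SQL that dbt compiles to plain SQL before running it against the warehouse. A minimal, dependency-free sketch of that compile step, using Python's `string.Template` in place of Jinja (the table and variable names are hypothetical, not from the posting):

```python
from string import Template

# A dbt-style model: templated SQL that is rendered before execution.
# Real dbt uses Jinja (e.g. {{ ref('orders') }}); string.Template stands in here.
model_sql = Template("""
select
    customer_id,
    count(*) as order_count
from $source_table
where order_date >= '$start_date'
group by customer_id
""")

# "Compile" the model by substituting template variables, analogous to
# how dbt resolves refs and vars at compile time.
compiled = model_sql.substitute(
    source_table="analytics.orders",  # hypothetical warehouse table
    start_date="2024-01-01",
)

print(compiled)
```

The rendered string is ordinary SQL that could then be submitted to Databricks, BigQuery, or Snowflake.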
the Role As a Data Platform Engineer, you'll help shape and scale our data infrastructure, making analytics faster, more reliable, and cost-efficient. You'll work with AWS, Snowflake, Python, and Terraform, building tooling and onboarding new data sources. You'll collaborate closely with teams across the business, ensuring our platform is secure, scalable, and easy to use. Our …
launching and scaling data products with measurable business value. Expertise in product development lifecycles, agile methodologies, and stakeholder engagement. Strong understanding of data architectures, warehouses, and analytics platforms (e.g., Snowflake, BI tools). Ability to lead through influence across engineering, analytics, and business teams. Excellent written and verbal communication skills for technical and executive audiences. Preferred Qualifications: Bachelor's or …
backend services and React frontends. You'll utilise tools such as Terraform for infrastructure-as-code (IaC), AWS (Lambda, EC2, EKS, Step Functions, VPC, etc.) for ETL, Airflow pipelines, and Snowflake, and ensure architectural alignment with AI/ML initiatives and data-driven services. You will serve as the go-to engineer for: End-to-end data migration architecture (on-premise …
State Street Alpha, FNZ, etc.), digital drivers, cloud architecture & providers, integration tools & approaches, and artificial intelligence methods & techniques (experience in cloud computing platforms; knowledge of any or all of Snowflake, Azure, AWS and Google would be beneficial). Knowledge of the common functions in a typical data organisation. Demonstrable interest in and awareness of emerging technologies. Hands-on coding experience with SQL …
no line management responsibilities. Our platforms are built with Clojure, employ a polylith architecture, are deployed using CI/CD, heavily exploit automation, and run on AWS, GCP, k8s, Snowflake and more. We serve 9 petabytes and 77 billion objects annually, which amounts to 20 billion ad impressions across the globe. You'll play a leading role in significantly scaling …
Data Stores & Databases: Relational: PostgreSQL (including managed versions like AWS Aurora and GCP Cloud SQL); NoSQL: DynamoDB; Search: OpenSearch, Elasticsearch; Vector: Qdrant; Caching: Redis; Data Warehousing: Snowflake. What we're looking for: 4+ years designing, managing, deploying, and supporting cloud infrastructure in a production environment using major public cloud providers (one of GCP or AWS …
enables our customers to navigate the rapidly changing Digital-First world we live in. We foster strong partnerships with leading technology giants including Microsoft, AWS, Oracle, Red Hat, OutSystems, and Snowflake, ensuring that our customers are provided with the highest-quality solutions and services. We're an award-winning employer, reflecting how our employees are at the very heart of Version …
knowledge of algorithms and data structures. Degree in Computer Science or related field preferred. Highly preferred: Experience with TDD, BDD or other testing methodologies Preferred: Familiarity with PostgreSQL and Snowflake Preferred: Familiarity with Web Frameworks such as Django, Flask or FastAPI Preferred: Familiarity with event streaming platforms such as Apache Kafka Preferred: Familiarity with data pipeline platforms such as Apache …
or similar. Proficient in writing and maintaining bash scripts. Experience writing concise and illustrative documentation. Experience with Microsoft Azure and Google Cloud. Experience with data engineering and analytics products such as Snowflake, Redshift, Google Analytics, Segment, and the ELK Stack. Qualifications: Bachelor's degree in computer science, or equivalent experience combined with theoretical knowledge. What's in it For You? Flexibility & Work-Life Balance …
automation solutions Driving AI strategy development Overseeing system enhancements and API integrations Managing data engineering processes, including ETL/ELT pipelines with Azure Data Factory Leading the development of Snowflake Data Warehouse Managing and leading Data Engineers and Developers Handling IT projects Candidate requirements: Experience with Microsoft Power Platform development Knowledge of Azure Data Factory Experience with cloud data warehousing …
Improving efficiency through automating solutions - Driving AI strategy - Oversee system enhancements and API integrations - Oversee data engineering processes - ETL/ELT pipelines with Azure Data Factory - Drive development of Snowflake Data Warehouse - Team management and leading Data Engineers/Developers - Digital roadmap development - IT projects To be considered suitable you will need the following skills/experience: - Experience of development …
experience working with complex datasets. Familiarity with ETL processes, data warehousing, and reporting systems. Solid understanding of financial metrics and reporting. Experience with database management and cloud platforms (e.g., Snowflake). Excellent communication and stakeholder management skills. Understanding of Agile methodologies. Nice to Have Experience with semantic layer tools and SSRS. Knowledge of Azure DevOps. Reasonable Adjustments: Respect and equality …
aim is to transition the business from its current landscape of bespoke, custom-built systems to more modern, streamlined solutions - such as the implementation of ServiceNow and a new Snowflake data warehouse. Due to the technical nature of the role, we're looking for a Business Analyst with a strong understanding of data structures and data modelling, and hands-on …
hands-on experience with Spark. Experience building, maintaining, and debugging dbt pipelines. Strong proficiency in developing, monitoring, and debugging ETL jobs. Deep understanding of SQL and experience with Databricks, Snowflake, BigQuery, Azure, Hadoop, or CDP environments. Hands-on technical support experience, including escalation management and adherence to SLAs. Familiarity with CI/CD technologies and version control systems like Git. …
future of InsurTech with cutting-edge machine learning and AI? At Simply Business, we're not just using data; we're continuously evolving our platform with technologies like AWS, Snowflake, and Kafka to drive real value and inform company strategy. As a leading player in the market, our mission is to remain at the forefront of data engineering, ML, and …
you bring to the role: 5 years in a Sales Engineering, Solutions Engineering, Consulting or similar role within the data space, ideally with experience in modern data tools like Snowflake, Databricks, Fivetran, or Tableau. Hands-on Python scripting skills for data pipeline support. Familiarity with core data engineering concepts such as orchestration, ELT, Git, and Role-Based Access Control (RBAC) …
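The "Python scripting for data pipeline support" and ELT concepts named above can be sketched in a few lines. This is an illustrative ELT pattern only, with sqlite3 standing in for a cloud warehouse and invented table and column names: raw data is loaded first, then transformed inside the warehouse with SQL.

```python
import sqlite3

# Minimal ELT sketch: Extract happens upstream; here we Load raw rows
# as-is, then Transform in-warehouse with SQL. sqlite3 stands in for
# Snowflake/Databricks; all names are illustrative.
raw_rows = [
    ("2024-01-01", "uk", 120),
    ("2024-01-01", "us", 340),
    ("2024-01-02", "uk", 95),
]

conn = sqlite3.connect(":memory:")
conn.execute("create table raw_sales (sale_date text, region text, amount int)")

# L: load the raw records without pre-cleaning (the E-L of ELT).
conn.executemany("insert into raw_sales values (?, ?, ?)", raw_rows)

# T: transform inside the warehouse with SQL, the step that
# dbt-style tooling and orchestrators typically own.
conn.execute("""
    create table sales_by_region as
    select region, sum(amount) as total_amount
    from raw_sales
    group by region
""")

result = dict(conn.execute("select region, total_amount from sales_by_region"))
print(result)  # {'uk': 215, 'us': 340}
```

In a production setup, an orchestrator (e.g. Airflow) would schedule the load and transform steps, and RBAC on the warehouse would govern who can read the raw versus transformed tables.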
ability to work in a team-oriented environment. Nice to Have: - Familiarity with Microservices architecture and API development. - Knowledge of databases like Redis, PostgreSQL, and DWH (such as Redshift, Snowflake, etc.). We will handle your application and information related to your application in accordance with the Applicant Privacy Policy available here.
warehouses including performance tuning, query optimization and execution plan analysis · Advanced knowledge of data warehousing principles, dimensional modelling and star schema design · Hands-on experience with SQL Server and Snowflake, including their architecture, features and best practices · Familiarity with data integration tools (SSIS, ADF) and techniques (ELT, ETL) · Experience with reporting and analytical tools such as SSAS, SSRS or Power …
Oriented language). Experience building reliable, distributed applications for data processing or similar areas. Hands-on experience developing cloud applications (e.g. AWS, GCP, Azure). Experience with technologies like BigQuery and Snowflake. Preferred Qualifications: Experience writing testable and modular code. Experience working in a fast-paced environment, collaborating across teams and disciplines. Experience designing, deploying, and maintaining distributed systems. Data pipelines, data …