discovery, combining quantum-inspired physics with generative models
Real-World Impact: Every feature shipped helps scientists prioritize molecules and design better candidates, faster
Modern Stack & Challenges: Python, FastAPI, Airflow, Snowflake, Kubernetes, ML workflows, scientific infra, data engineering at scale
High Ownership, High Impact: Engineers contribute to architecture, tooling, and scientific decision-making
Interdisciplinary Team: Collaborate with chemists, physicists, ML researchers
building, debugging, and optimizing APIs, distributed systems, and/or data pipelines. Experience with modern technologies and frameworks including FastAPI, PydanticAI, Haystack AI, OpenTelemetry, Procrastinate, databases like Postgres and Snowflake, queue systems such as SQS/Kafka, and Airflow. Hands-on experience with Kubernetes, including orchestration and lifecycle maintenance; you are not an SRE but you know how things run
focused team to ensure successful implementation.
Essential Skills
Proficient in Python, with familiarity with legacy VBA and SQL code bases.
Strong understanding of cloud-based data ecosystems, including Snowflake, Databricks, and the Azure software stack.
Deep knowledge of front and middle office processes within asset management.
Solid understanding of financial instruments, trade lifecycle events, and high-level strategic thinking
to:
o Relational Databases such as SQL Server, Oracle DB, IBM DB
o Non-Relational Databases such as MongoDB, Redis
o Data Warehouse and Data Lake tools such as Snowflake, Hadoop
o File servers, NAS, Isilon, Cloud Drives
• Ability to work collaboratively with key stakeholders (Data & Analytics group, Data Strategy & Management team, Access, Network and Security Architects, various Security Engineers
one programming and/or scripting language (Go, Python or Bash). Skilled in presentation and documentation. Troubleshooting and analytical skills. Time and project management skills.
BONUS POINTS
Experience with databases (Snowflake, MariaDB, Galera, MySQL)
Experience with infrastructure and application security
Experience with disaster recovery planning and implementation
Experience working within Agile development environments
Familiarity/experience with AI/ML platforms
Nottingham, Nottinghamshire, United Kingdom Hybrid / WFH Options
Capital One (Europe) plc
your stakeholder relationships - you also break down jargon to fit your audience's needs
You're excited about getting into the details of technology & automation solutions
Any knowledge of Salesforce, Snowflake, G Suite, Google App Script, Excel programming, and Robotic Process Automation (RPA) would be desirable (not essential)
Any knowledge of AWS would be desirable (not essential)
Where and how you
building production-level data science applications
Proven track record of leading successful data science initiatives that delivered significant business impact
Experience with big data technologies and cloud platforms (AWS, Snowflake, etc.)
Strong project management skills with ability to prioritize, manage resources, and deliver complex projects on time
Exceptional communication skills with the ability to present complex findings to both technical
in data product development.
Strong understanding of data structures, algorithms, and statistical concepts.
Proficiency in Python and ETL frameworks
Deep knowledge of data pipeline architectures and products such as Snowflake or similar
Experience with delivering data products to clients via APIs
Familiarity with data visualization tools
Knowledge of locating, assessing and integrating third-party data sets
Experience with machine learning
Hightouch AEs throughout the customer journey, delivering tailored demonstrations and customer POCs based on agreed-upon requirements. Develop deep technical domain expertise in Data Activation, CDP, data warehousing (e.g., Snowflake, BigQuery, Databricks), and integration platforms, with a focus on Enterprise use cases across various industries within EMEA, including financial services, retail, SaaS, and similar enterprises. Be accountable for the exceptional
taxonomies, maturity assessments and alignment of gaps to enabling technology solutions
Appreciation/Experience in designing and governing data platform architectures (e.g. broad understanding of Informatica, Collibra, Ab Initio, Snowflake, Databricks)
Appreciation of and interest in attaining end-to-end data skills e.g., data quality, metadata, data-mesh, data security, privacy & compliance
Experience with Enterprise/platform/application (e.g.
disciplinary teams
Have interest or experience with key Technology Platforms across AI solutions (OpenAI, AWS), Cloud (Azure, AWS, Google), Supply Chain (O9, BlueYonder, Kinaxis), CRM (Dynamics, Salesforce), Data (Palantir, Snowflake), ERP (SAP S4) or others.
Have proven stakeholder management and communication skills, leveraged to influence and persuade on key architectural decisions
Keen to explore how emerging technologies can be leveraged
technology stack including:
Object-Oriented design
SOLID principles and modern design patterns
Full stack development experience including Front-end JavaScript frameworks like Angular & React
Databases and EDW technology like Snowflake
SOA & Microservices architecture implementation using REST APIs, queue-based messaging patterns; exposure to Mulesoft/Kong is a plus
On-premise/cloud-based infrastructures, SDLC pipelines, and deployments/
familiarity with Python-related programming languages (e.g. Pyspark, Polars) is beneficial
Proficiency in SQL for data extraction, transformation, and manipulation is beneficial
Experience with data lakehouse paradigms (e.g. Databricks, Snowflake, implementations from major cloud providers) is beneficial
Exposure to structured and unstructured data storage solutions in some capacity (e.g. SQL, Postgres, MongoDB, AWS S3) is beneficial
Experience working in an
a Serverless Lambda backend on AWS. We're using TypeScript top to bottom, and Apollo GraphQL sits in the middle. We keep our data in DynamoDB, Postgres and Snowflake for our data lake, using one or the other as best suits a particular workload. You can learn more about our tech stack at While we're looking for talented
The Role
Responsibilities:
Collaborate with business analysts and business users to understand technical specifications and business requirements.
Develop and optimize data pipelines, models, and schemas using Python, SQL Server, Snowflake, Databricks, and other modern tools.
Design data architecture to support data integration, analysis, and machine learning workflows.
Implement data quality checks, validation procedures, and data governance standards to ensure data …
attention to detail. Self-driven, collaborative, and capable of working with minimal guidance.
Proficiency in Python and SQL programming.
Experience with relational databases (SQL Server) and data warehousing solutions (Snowflake, Azure Databricks).
Knowledge of ETL processes, scheduling tools (Autosys, Control-M), and CI/CD pipelines.
Familiarity with data validation, quality checks, and data governance.
Preferred, but not required
Data Solutions Architect - GCP role at EXL
management business are seeking a Power BI enthusiast to join their growing team in Central London, with 3 days per week in the office. Following the successful implementation of Snowflake as their single source of truth data warehouse, they're now growing their Data Team and need to bolster the Power BI experience within the team. This hire will
ll Do Collaborate with a cross-functional team to design and develop scalable, cloud-native software solutions leveraging Azure. Architect and implement solutions using Python, .NET (C#), Angular, and Snowflake for data management. Lead code reviews and provide mentorship to elevate team performance and code quality. Foster a DevOps culture by implementing CI/CD pipelines, test automation, and performance