either Java, Go (Golang), Python, or Scala – multiple would be even better! • Experience with cloud platforms, e.g. AWS, Azure, GCP (Google Cloud) • Experience with data warehouses, e.g. BigQuery, Redshift, Snowflake • Ability to work collaboratively • Open to new AI tools and technologies • Could come from roles like Senior Backend Engineer or Senior Full Stack Engineer They offer a great package on …
for data modelling and analysis Power BI experience in a business-facing environment Nice to have: Python for data analysis or scripting Familiarity with cloud data environments (e.g. Azure, Snowflake, Databricks) If you're a BI developer with London Market experience looking to make a real impact, this is a rare opportunity to help shape a data-driven future from …
and data warehousing principles Experience working with business users, analysts, and other developers in a collaborative setup Nice to have: Exposure to cloud data environments (e.g. Azure SQL, Synapse, Snowflake) Familiarity with ETL tooling, SSIS, or modern orchestration platforms Understanding of how database structures support BI and reporting tools (e.g. Power BI) This is a high-impact role in a …
You Bring: 5+ years of experience in data analytics or reporting Strong skills in Power BI or Tableau, SQL, and Python Experience with cloud platforms (Azure preferred), Alteryx, and Snowflake Hands-on experience with data migration and integration across ERP platforms Essential: Bachelor’s degree in Data Science, Computer Science, Statistics, or a related field Detail-oriented with excellent analytical …
South East London, England, United Kingdom Hybrid / WFH Options
MVF
effectively (Git, Bitbucket) The terminal is your second home! Demonstrable experience of building data pipelines Clear ability to prioritise tasks and drive them through to completion Nice to have: Snowflake Python experience Terraform CI/CD …
visits to the Basingstoke office, depending on location. In addition, it would be advantageous for the candidate to have experience with the following technologies: Experience with Spark (AWS Glue) Snowflake Native Applications or AWS Marketplace solutions Experience using Trusted Execution Environments (TEEs) Kafka Automated cloud infrastructure provisioning Basic understanding of encryption protocols We are committed to using the best technology …
NumPy) and deep expertise in SQL for building robust data extraction, transformation, and analysis pipelines. Hands-on experience with big data processing frameworks such as Apache Spark, Databricks, or Snowflake, with a focus on scalability and performance optimization Familiarity with graph databases (e.g., Neo4j, Memgraph) or search platforms (e.g., Elasticsearch, OpenSearch) to support complex data relationships and querying needs Solid …
offs, collaborative design, and proactive roadmap planning, with an openness to exploring innovative solutions like generative AI where appropriate. We work with an Airflow/AWS/Fivetran/Snowflake/Looker stack and typically use Python and Docker in our pipelines. You’ll need to be highly proficient in these or similar tools and comfortable navigating a modern data … the full data lifecycle and ELT patterns Comfortable evaluating both business and technical requirements Skilled at working with large datasets and optimising data flows Experience with Airflow, AWS, Fivetran, Snowflake, Docker (or similar) Strong in Python, SQL, and cloud platforms (AWS or comparable) Experienced in handling real-time data pipelines Experienced in evolving data pipelines over time to meet new …
South East London, England, United Kingdom Hybrid / WFH Options
SGI
datasets (e.g. MSCI), manage schema changes, and ensure data quality Optimise existing model code and build new components in a clean, maintainable way Enhance data pipelines and tooling using Snowflake, AWS, and Python Work directly with investment professionals to scope and deliver solutions What we’re looking for: Strong Python engineering skills, including use of data libraries like pandas and …
Shoreham-By-Sea, West Sussex, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
complex systems impact assessments Experience with ETL processes, data warehousing concepts, and reporting system design Knowledge of database platforms (SQL Server, Oracle, MySQL, PostgreSQL, or similar) and ideally warehouse solutions (Snowflake) Experience working with data transformation tools, including setting test parameters for quality. Systems Integration & Architecture Deep understanding of enterprise system architectures and data flow dependencies Experience with both proprietary custom …
Southampton, Hampshire, United Kingdom Hybrid / WFH Options
gen2fund.com
manipulate data. Experience with object-oriented language development is beneficial, including integrations with SOAP and REST based APIs. Familiarity with reporting and analytics tools: Qlik, Microsoft Excel, Power BI, Synapse, Snowflake, or similar. Self-motivated, proactive, with a strong sense of ownership, initiative, and problem-solving abilities. Excellent communication skills, with the ability to translate complex technical concepts to non-technical …
South East London, England, United Kingdom Hybrid / WFH Options
Roc Search
AI Engineering to build intelligent services and working alongside Data Scientists to deploy scalable AI-driven features and applications. Familiarity with cloud-based AI/ML services such as Snowflake Cortex or DOMO AI would be a strong advantage. Key Responsibilities Design and implement robust, scalable backend systems using Python as the primary language. Apply object-oriented principles and design …
new data stack could include Python. Experience building and maintaining backend systems, APIs, and data services. Deep proficiency with SQL and experience working with production data stores (PostgreSQL, Redshift, Snowflake, DynamoDB, etc.). Hands-on experience with data pipelines, ETL/ELT processes, and/or event-driven architectures. Comfort working with and transforming time-series or other noisy sensor …
in data architecture, including data modeling, warehousing, real-time and batch processing, and big data frameworks. Proficiency with modern data tools and technologies such as Spark, Databricks, Kafka, or Snowflake (bonus). Knowledge of cloud security, networking, and cost optimization as it relates to data platforms. Experience in total cost of ownership estimation and managing its impact on deliverables. Familiarity …