London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
…Engineer or in a similar data engineering/BI role
- Advanced SQL skills with hands-on experience using dbt for data modeling
- Strong familiarity with modern data platforms like Snowflake, Looker, AWS/GCP, Tableau, or Airflow
- Experience with version control tools (e.g., Git)
- Ability to design, build, and document scalable, reliable data models
- Comfortable gathering business requirements and translating …
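Several of the listings here pair dbt with Snowflake for data modelling. As a rough, non-authoritative sketch of what that combination looks like in practice, here is a minimal dbt Python model (supported in dbt 1.3+ on the Snowflake adapter, where ref() returns a Snowpark DataFrame). The upstream model stg_orders and all column names are hypothetical; most dbt models are written in SQL, but the Python flavour is shown to keep every example in one language.

```python
# models/marts/customer_orders.py - a minimal dbt Python model (hypothetical names).
# On the Snowflake adapter, dbt passes a Snowpark session and ref() returns
# a Snowpark DataFrame rather than a SQL relation.
import snowflake.snowpark.functions as F


def model(dbt, session):
    # Materialise the result as a table in the warehouse.
    dbt.config(materialized="table")

    # ref() resolves the upstream staging model, like {{ ref() }} in SQL models.
    orders = dbt.ref("stg_orders")

    # Aggregate per customer; dbt writes the returned DataFrame back to Snowflake.
    return orders.group_by("customer_id").agg(
        F.count("order_id").alias("order_count"),
        F.sum("amount").alias("lifetime_value"),
    )
```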
…ideally within fintech or cloud-native organisations (AWS preferred)
- Strong technical background in data engineering, analytics, or data science
- Experience with modern data stacks (e.g., SQL, dbt, Airflow, Snowflake, Looker/Power BI) and AI/ML tooling (e.g., Python, MLflow, MLOps)
- A track record of building and managing high-performing data teams
- Strategic thinking and ability to …
Modeling data for a civil service department replacing a legacy HR system
Experience and qualifications
Technical:
- 3+ years' experience in data or software engineering
- Knowledge of Python, SQL, Databricks, Snowflake, and major cloud platforms (AWS/Azure/GCP)
- Ability to learn quickly and adapt to new technologies and sectors
- Understanding of data engineering best practices and system design
- Strong …
…practices and procedures within the department.
Required Qualifications
- Bachelor's degree with at least 5 years of experience, or equivalent
- In-depth knowledge and expertise in data engineering, including:
  - Snowflake (data warehousing and performance tuning)
  - Informatica (ETL/ELT development and orchestration) - nice to have
  - Python (data processing and scripting) - required
  - AWS (data services such as S3, Glue, Redshift, Lambda …
…Spark Streaming, Kinesis)
- Familiarity with schema design and semi-structured data formats
- Exposure to containerisation, graph databases, or machine learning concepts
- Proficiency with cloud-native data tools (BigQuery, Redshift, Snowflake)
- Enthusiasm for learning and experimenting with new technologies
Why Join Capco
- Deliver high-impact technology solutions for Tier 1 financial institutions
- Work in a collaborative, flat, and entrepreneurial consulting culture …
…meet compliance standards.
Mentor: Upskill other platform engineers, data engineers, and AI engineers to deliver and build adoption of your team's initiatives
Our Tech Stack
- Cloud Data Warehouse: Snowflake
- AWS Data Solutions: Kinesis, SNS, SQS, S3, ECS, Lambda
- Data Governance & Quality: Collate & Monte Carlo
- Infrastructure as Code: Terraform
- Data Integration & Transformation: Python, dbt, Fivetran, Airflow
- CI/CD: GitHub …
…tools.
- Understanding of Agile methodologies
Additional Skills
- Experience mentoring or supporting team development
- Knowledge of Azure SQL DB, Data Factory, Data Lake, Logic Apps, Databricks (Spark SQL), and Snowflake is advantageous …
…Need to Succeed
- Strong skills in Python and SQL
- Demonstrable hands-on experience in the AWS cloud
- Batch and streaming data ingestion and transformation (Airflow, Glue, Lambda, Snowflake Data Loader, Fivetran, Spark, Hive, etc.)
- Apply agile thinking to your work, delivering in iterations that incrementally build on what went before
- Excellent problem-solving and analytical skills
- Good …
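The batch ingestion stack named in the listing above (Airflow orchestrating loads into Snowflake) can be sketched as a small DAG. This is a minimal illustration under stated assumptions, not a production pipeline: the task bodies are placeholders, every name is hypothetical, and a real deployment would typically use the Amazon and Snowflake provider operators rather than bare PythonOperators.

```python
# A minimal daily batch-ingestion DAG (hypothetical names throughout).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_to_s3(ds, **_):
    # Placeholder: pull the day's export and write it to s3://example-bucket/raw/{ds}.csv
    print(f"extracting batch for {ds}")


def load_into_snowflake(ds, **_):
    # Placeholder: in practice this would run a COPY INTO via the Snowflake
    # provider's hook or operator, e.g.
    #   COPY INTO raw.orders FROM @raw_stage/raw/{ds}.csv FILE_FORMAT = (TYPE = CSV)
    print(f"loading batch for {ds}")


with DAG(
    dag_id="daily_orders_ingest",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # one run per execution date
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_to_s3", python_callable=extract_to_s3)
    load = PythonOperator(task_id="load_into_snowflake", python_callable=load_into_snowflake)

    extract >> load  # load runs only after extraction succeeds
```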
Technical Skills: Proven expertise in designing, building, and operating data pipelines, warehouses, and scalable data architectures. Deep hands-on experience with modern data stacks. Our tech includes Python, SQL, Snowflake, Apache Iceberg, AWS S3, PostgreSQL, Airflow, dbt, and Apache Spark, deployed via AWS, Docker, and Terraform. Experience with similar technologies is essential.
Coaching & Growth Mindset: Passion for developing others through …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
…use of machine learning.
Key Focus Areas
- Own and execute enterprise data strategy
- Build and lead a multi-disciplinary data & AI team
- Drive modern data platform development (dbt, Airflow, Snowflake, Looker/Power BI)
- Deliver business-critical analytics and reporting
- Support responsible AI/ML initiatives
- Define data governance, privacy, and compliance frameworks
What We're Looking For
- Proven data …
…SQL.
- Vast experience in data modelling using tools such as Erwin, PowerDesigner, SQLDBM, or Sparx EA
- Minimum 10 years' experience using databases such as Oracle, SQL Server, Snowflake, or any other OLTP and OLAP databases
- Minimum 5 years' experience with reporting tools: Power BI, Business Objects, Tableau, or OBI
- Understanding of the Master Data Management technology landscape, processes, and …
…oil and gas.
Preferred Qualifications:
• Certifications in relevant technologies or methodologies
• Proven experience in building, operating, and supporting robust, performant databases and data pipelines
• Experience with Databricks and Snowflake
• Solid understanding of web performance optimization, security, and best practices
• Experience supporting Power BI dashboards …
…as Terraform or Ansible for deployment and infrastructure management
Hands-on experience with:
- ETL/ELT orchestration and pipeline tools (Airflow, Airbyte, dbt, etc.)
- Data warehousing tools and platforms (Snowflake, Iceberg, etc.)
- SQL databases, particularly MySQL
Desired Experience:
- Experience with cloud-based services, particularly AWS
- Proven ability to manage stakeholders and their expectations, and to explain complex problems or solutions in a …
…your own ideas - your voice will be heard.
Qualifications: Degree in Computer Science, Information Technology, or a related field.
Skills & Experience:
- 3-5 years' SQL experience (bonus: NoSQL or Snowflake)
- 2-3 years of hands-on Python (scripting and development)
- Experience in a fast-paced startup or agile environment
- Strong background in schema design and dimensional data modeling …
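The "schema design and dimensional data modeling" requirement in the listing above usually means star schemas: narrow fact tables of measures joined to descriptive dimension tables. A minimal sketch follows, issued through the snowflake-connector-python package. Every identifier and connection parameter is illustrative only, and note that Snowflake treats PRIMARY KEY and FOREIGN KEY constraints as informational metadata rather than enforcing them.

```python
# Star-schema sketch executed via snowflake-connector-python (illustrative names).
import snowflake.connector

DDL = [
    # Dimension: one row per customer, surrogate key as the join key.
    """CREATE TABLE IF NOT EXISTS dim_customer (
           customer_key INTEGER IDENTITY PRIMARY KEY,
           customer_id  VARCHAR,
           region       VARCHAR
       )""",
    # Dimension: one row per calendar date, keyed as yyyymmdd (e.g. 20240131).
    """CREATE TABLE IF NOT EXISTS dim_date (
           date_key   INTEGER PRIMARY KEY,
           full_date  DATE,
           is_weekend BOOLEAN
       )""",
    # Fact: one row per order line - measures plus foreign keys only.
    """CREATE TABLE IF NOT EXISTS fact_orders (
           customer_key INTEGER REFERENCES dim_customer (customer_key),
           date_key     INTEGER REFERENCES dim_date (date_key),
           quantity     INTEGER,
           amount       NUMBER(12, 2)
       )""",
]

# Connection parameters are placeholders; real code would read them from
# configuration or a secrets manager, never hard-code them.
with snowflake.connector.connect(
    account="example_account", user="example_user", password="...",
    warehouse="example_wh", database="analytics", schema="marts",
) as conn:
    cur = conn.cursor()
    for statement in DDL:
        cur.execute(statement)
```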
…Market PAS platforms (e.g., OpenTWINS, DXC Assure, Sequel, IRIS)
- Knowledge of BI/MI tooling (e.g., Power BI, Tableau, Qlik)
- Familiarity with data warehouse technologies (e.g., SQL, Snowflake, Azure, Informatica)
- Exposure to Agile delivery and use of tools such as Jira or Azure DevOps …
Must-Have:
- 2+ years' experience in an analytics or data consultancy role
- Proficiency in SQL and data modelling (preferably with dbt)
- Hands-on experience with cloud data warehouses (BigQuery, Snowflake, Redshift)
- Familiarity with BI tools (Looker, Power BI, Tableau, etc.)
- Excellent communication skills - able to simplify technical concepts for non-technical stakeholders
Nice-to-Have:
- Experience working in client-facing …
City of London, London, United Kingdom Hybrid / WFH Options
Anvil Analytical
…role.
- Advanced Power BI skills, including DAX, Power Query (M), and custom visuals
- Strong experience in data transformation and modelling
- Proven ability in data integration across multiple sources (Snowflake, SQL, APIs, Excel, etc.)
- Experience working in or with the defence, government, or public sector is highly desirable
- Knowledge of Data Warehousing concepts and practices
- Familiarity with Agile methodologies …
…as: Hadoop, Kafka, Apache Spark, Apache Flink, and object, relational, and NoSQL data stores
- Hands-on experience with big data application development and cloud data warehousing (e.g. Hadoop, Spark, Redshift, Snowflake, GCP BigQuery)
- Expertise in building data architectures that support batch and streaming paradigms
- Experience with standards such as JSON, XML, YAML, Avro, Parquet
- Strong communication skills
- Open to learning new …
Central London, London, United Kingdom Hybrid / WFH Options
Anson McCade
…for large-scale platforms and diverse stakeholder groups
- Strong data modelling skills across OLTP and OLAP systems, with experience building datamarts and warehouses
- Proficiency with cloud-native tools (e.g. Snowflake, Databricks, Azure Data Services)
- High-level SQL skills and experience with tools like SSMS, SSIS, or SSRS
- Strong understanding of data governance, RBAC, and information security practices
- Able to …
…ETL tools, data migration, and data cleansing methodologies
- Moderate to advanced SQL skills, with experience writing complex queries
- Experience integrating with cloud-based data warehouses/data lakes (e.g., Snowflake, AWS, Databricks, BigQuery) and data analytics tools (e.g., Tableau)
- Excellent communication skills, with the ability to explain complex technical concepts to non-technical stakeholders
- Ability to thrive in …
…data engineering roles, preferably for a customer-facing data product
- Expertise in designing and implementing large-scale data processing systems with data tooling such as Spark, Kafka, Airflow, dbt, Snowflake, Databricks, or similar
- Strong programming skills in languages such as SQL, Python, Go, or Scala
- Demonstrable and effective use of AI tooling in your development process …
- Experience in managing a team of Data Engineers
- Experience with data modelling, data warehousing, and building ETL pipelines
- Experience with AWS (S3, EKS, EC2, RDS) or similar cloud services, plus Snowflake, Fivetran, Airbyte, dbt, Docker, and Argo
- Experience in SQL, Python, and Terraform
- Experience building data pipelines and applications to stream and process datasets
- Robust understanding of DevOps principles is required …