London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
…share options
Hybrid working - 1 day a week in a central London office
High-growth scale-up with a strong mission and serious funding
Modern tech stack: Python, SQL, Snowflake, Apache Iceberg, AWS, Airflow, dbt, Spark
Work cross-functionally with engineering, product, analytics, and data science leaders
What You'll Be Doing
Lead, mentor, and grow a high-impact team…
…experience with:
Cloud platforms (AWS/Azure)
Data Engineering (Airflow/DBT, Spark)
DevSecOps practices
Additional highly valued skills include:
Terraform
Python/Java
AWS/Azure Data Engineering
Snowflake/Databricks
You may be assessed on key skills relevant for success in this role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology…
Cambridge, Cambridgeshire, United Kingdom Hybrid / WFH Options
Softwire
Modeling data for a civil service department replacing a legacy HR system
Experience and qualifications
Technical
3+ years' experience in data or software engineering
Knowledge of Python, SQL, Databricks, Snowflake, and major cloud platforms (AWS/Azure/GCP)
Ability to learn quickly and adapt to new technologies and sectors
Understanding of data engineering best practices and system design
Strong…
Cardiff, South Glamorgan, Wales, United Kingdom Hybrid / WFH Options
Octad Recruitment Consultants (Octad Ltd )
…engineering experience (IaaS/PaaS), including Infrastructure as Code.
Strong SQL skills and proficiency in Python or PySpark.
Built or maintained data lakes/warehouses using Synapse, Fabric, Databricks, Snowflake, or Redshift.
Experience hardening cloud environments (NSGs, identity, Defender).
Demonstrated automation of backups, CI/CD deployments, or DR workflows.
Nice-to-Haves: Experience with Azure OpenAI, vector…
…meet compliance standards.
Mentor: Upskill other platform engineers, data engineers, and AI engineers to deliver and build adoption of your team's initiatives
Our Tech Stack
Cloud Data Warehouse - Snowflake
AWS Data Solutions - Kinesis, SNS, SQS, S3, ECS, Lambda
Data Governance & Quality - Collate & Monte Carlo
Infrastructure as Code - Terraform
Data Integration & Transformation - Python, DBT, Fivetran, Airflow
CI/CD - GitHub…
…tools. Understanding of Agile methodologies.
Additional Skills
Experience mentoring or supporting team development.
Knowledge of Azure SQL DB, Data Factory, Data Lake, Logic Apps, Databricks (Spark SQL), and Snowflake is advantageous.
…using Python and/or another data-science scripting language.
Demonstrated experience and responsibility with data, processes, and building ETL pipelines.
Experience with cloud data warehouses such as Snowflake, Azure Data Warehouse, Amazon Redshift, and Google BigQuery.
Building visualizations using Power BI or Tableau
Experience in designing ETL/ELT solutions, preferably using tools like SSIS, Alteryx, AWS Glue…
Technical Skills: Proven expertise in designing, building, and operating data pipelines, warehouses, and scalable data architectures. Deep hands-on experience with modern data stacks. Our tech includes Python, SQL, Snowflake, Apache Iceberg, AWS S3, PostgreSQL, Airflow, dbt, and Apache Spark, deployed via AWS, Docker, and Terraform. Experience with similar technologies is essential.
Coaching & Growth Mindset: Passion for developing others through…
…Aurora, and Airflow). Proficiency with CI/CD and automation tools such as Jenkins and Terraform. Strong SQL skills with hands-on experience using data platforms such as Snowflake, Databricks, Spark, Presto, and EMR. Experience with monitoring tools like Datadog, Prometheus, Grafana, and the ELK stack. Demonstrated ability to troubleshoot and resolve complex technical issues. Excellent communication and collaboration skills…
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
…use of machine learning.
Key Focus Areas
Own and execute enterprise data strategy
Build and lead a multi-disciplinary data & AI team
Drive modern data platform development (dbt, Airflow, Snowflake, Looker/Power BI)
Deliver business-critical analytics and reporting
Support responsible AI/ML initiatives
Define data governance, privacy, and compliance frameworks
What We're Looking For
Proven data…
Bromsgrove, Worcestershire, United Kingdom Hybrid / WFH Options
Reed Technology
…deliverables
Producing and maintaining high-quality technical documentation
Championing data engineering best practices and standards across the business
Technical skills
Cloud data platforms - Azure, AWS, or GCP (Azure preferred)
Snowflake - Deep knowledge and hands-on experience
Matillion - Expertise in ETL orchestration
Data warehousing and advanced analytics
Dimensional modelling and data vault methodologies
Stakeholder engagement and cross-functional collaboration
Flexible hybrid…
…SQL.
Vast experience in data modelling using tools such as Erwin, Power Designer, SQLDBM, or Sparx EA.
Minimum 10 years' experience using databases such as Oracle, SQL Server, Snowflake, or other OLTP and OLAP databases.
Minimum 5 years' experience with reporting tools: Power BI, Business Objects, Tableau, or OBI.
Understanding of the Master Data Management technology landscape, processes, and…
…prestigious client.
About You: You'll bring technical excellence, hands-on experience, and the ability to manage multiple test activities simultaneously. We're looking for:
Expertise in testing Oracle, Snowflake, and Postgres databases and data warehouses
Experience delivering fully reconciled Facts and Dimensions with accurate end-user reports
Proficiency with reporting tools such as Oracle OAS and Microsoft Power BI…
Edgbaston, Birmingham, West Midlands (County), United Kingdom
Network IT
…relationship building to understand and meet business needs.
Experience: We are looking for a Data Engineer who has experience designing and implementing data warehouses, with strong technical competency in Snowflake (preferably certified), Azure Data Factory for building cloud ETL pipelines, Power BI, and dbt (Data Build Tool). Other elements of your experience which are desirable to our client include…
…as Terraform or Ansible for deployment and infrastructure management
Hands-on experience with:
ETL/ELT orchestration and pipeline tools (Airflow, Airbyte, DBT, etc.)
Data warehousing tools and platforms (Snowflake, Iceberg, etc.)
SQL databases, particularly MySQL
Desired Experience:
Experience with cloud-based services, particularly AWS
Proven ability to manage stakeholders and their expectations, and to explain complex problems or solutions in a…
…adept in ETL tools like Informatica, Glue, Databricks, and DataProc, with strong coding skills in Python, PySpark, and SQL. Expertise in data warehousing solutions such as Snowflake, BigQuery, Lakehouse, and Delta Lake is essential, including the ability to calculate processing costs and address performance issues. A solid understanding of DevOps and infrastructure needs is also required.
Job…
…your own ideas - your voice will be heard.
Qualifications: Degree in Computer Science, Information Technology, or a related field.
Skills & Experience:
3-5 years' SQL experience (bonus: NoSQL or Snowflake).
2-3 years of hands-on Python (scripting and development).
Experience in a fast-paced startup or agile environment.
Strong background in schema design and dimensional data modeling.…
…Experience with data lakes/lakehouses (Databricks, Unity Catalog).
Familiarity with Data Mesh, Data Fabric, and product-led data strategies.
Expertise in cloud platforms (AWS, Azure, GCP, Snowflake).
Technical Skills
Proficiency in big data tools (Apache Spark, Hadoop).
Programming knowledge (Python, R, Java) is a plus.
Understanding of ETL/ELT, SQL, NoSQL, and data visualisation…
…setup as part of foundational work.
Essential Skills
Proven experience as a Senior or Lead Data Engineer in complex, agile environments.
Strong proficiency in Airflow (or similar orchestration tools), Snowflake, AWS, Python, and SQL.
Excellent problem-solving skills and ability to work with incomplete or evolving information.
Strong communication and interpersonal skills; able to work independently and take initiative.
Additional…
…expertise by monitoring project experiences across various business domains to drive EPAM's data business
Requirements
Bachelor's degree in Computer Science, Engineering, or a related field
Extensive experience with Snowflake and/or Databricks
Familiarity with Machine Learning and Large Language Models (LLMs)
Experience in building Data Warehouses and Data Lakes
Expertise in Data Quality, Data Modeling, and Analytics
Experience…