Remote (mostly remote, with occasional travel to Hemel Hempstead)
Contract: Outside IR35
Day rate: up to £550 per day
Duration: 6 months
Start date: ASAP
Key skills: Snowflake, dbt, SQL, Python, AWS and Kimball

The client is in the process of migrating to Snowflake and therefore requires extra support. As a result, they are looking for someone from a strong SQL … Python development background with excellent working knowledge of dbt (Data Build Tool). You will undertake aspects of the development lifecycle and should be experienced in data modelling, process design, development, and testing. While the company is going through a large-scale migration, this presents an opportunity to be at the cutting edge of data engineering. … have experience in the following:
- Advanced SQL knowledge
- Snowflake (ideally certified)
- Python development
- AWS cloud experience (essential), relating to data tooling and development
- Working knowledge of dbt (Data Build Tool):
  - Develop staging, intermediate and mart models in dbt to meet analytics requirements
  - Optimise existing models to make them more reusable by following dbt best practices
  - Spot opportunities …
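The staging → intermediate → mart layering mentioned above is a standard dbt pattern. As a minimal sketch of that flow, using Python's built-in sqlite3 in place of Snowflake and dbt (table and column names here are illustrative, not from the role):

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Raw source table, as it might land from an ingestion tool: everything is text.
cur.execute("CREATE TABLE raw_orders (id TEXT, amount TEXT, placed_at TEXT)")
cur.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [("1", "19.99", "2024-01-05"), ("2", "5.00", "2024-01-06")],
)

# Staging layer: rename and cast columns (the job a dbt stg_ model does).
cur.execute("""
    CREATE VIEW stg_orders AS
    SELECT id AS order_id,
           CAST(amount AS REAL) AS order_amount,
           DATE(placed_at) AS order_date
    FROM raw_orders
""")

# Mart layer: aggregate staging into an analytics-ready fact table.
cur.execute("""
    CREATE TABLE fct_daily_orders AS
    SELECT order_date, COUNT(*) AS orders, SUM(order_amount) AS revenue
    FROM stg_orders
    GROUP BY order_date
""")

print(cur.execute("SELECT * FROM fct_daily_orders ORDER BY order_date").fetchall())
# → [('2024-01-05', 1, 19.99), ('2024-01-06', 1, 5.0)]
```

In dbt each layer would be its own SQL model file with tests and documentation; the point here is only the shape of the dependency chain, raw → staging view → mart table.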
or changes. Build abstractions that make it easy to plug in new service behaviour or data models. Ensure emulators work seamlessly with orchestration and infrastructure-as-code tools (e.g., dbt, Terraform, Airflow, CDKs). Gather and act on feedback from internal and external teams to prioritize high-impact integrations. Build usage analytics and telemetry to understand adoption patterns and developer …
meaningful outcomes. Technical expertise: Advanced Python skills for data engineering tasks. Experience building and maintaining web scraping pipelines. Strong SQL skills, with expertise in performance tuning. Strong proficiency with dbt for data transformations. Hands-on experience with Apache Airflow or Prefect. Proficiency with GitHub, GitHub Actions, and CI/CD pipelines. Nice to have: Experience with GCP (BigQuery, Dataflow, Composer …
expertise with AWS services, including Glue, Redshift, Data Catalog, and large-scale data storage solutions such as data lakes. Proficiency in ETL/ELT tools (e.g. Apache Spark, Airflow, dbt). Skilled in data processing languages such as Python, Java, and SQL. Strong knowledge of data warehousing, data lakes, and data lakehouse architectures. Excellent analytical and problem-solving skills with …
of the AWS services like Redshift, Lambda, S3, Step Functions, Batch, CloudFormation, Lake Formation, CodeBuild, CI/CD, GitHub, IAM, SQS, SNS, Aurora DB. Good experience with dbt, Apache Iceberg, Docker, Microsoft BI stack (nice to have). Experience in data warehouse design (Kimball, lakehouse, medallion and data vault) is a definite preference, as is knowledge of …
full stack data development (from ingestion to visualization). Strong expertise in Snowflake, including data modeling, warehousing, and performance optimization. Hands-on experience with ETL tools (e.g., Apache Airflow, dbt, Fivetran) and integrating data from ERP systems like NetSuite. Proficiency in SQL, Python, and/or other scripting languages for data processing and automation. Familiarity with LLM integrations (e.g., for …
engineering disciplines: Cloud Engineering, Data Engineering (not building pipelines but designing and building the framework), DevOps, MLOps/LLMOps. Often work with the following technologies: Azure, AWS, GCP; Airflow, dbt, Databricks, Snowflake, etc.; GitHub, Azure DevOps and related developer tooling and CI/CD platforms; Terraform or other Infra-as-Code; MLflow, AzureML or similar for MLOps; LangSmith, Langfuse and …
Sheffield, South Yorkshire, England, United Kingdom Hybrid / WFH Options
Vivedia Ltd
Experience with cloud platforms (AWS, Azure, GCP) and tools like Snowflake, Databricks, or BigQuery. Familiarity with streaming technologies (Kafka, Spark Streaming, Flink) is a plus. Tools & Frameworks: Airflow, dbt, Prefect, CI/CD pipelines, Terraform. Mindset: Curious, data-obsessed, and driven to create meaningful business impact. Soft Skills: Excellent communication and collaboration — translating complex technical ideas into business insight …
Strong proficiency in SQL, Python, and data modeling for analytical and operational use cases. Hands-on experience with production-grade ETL/ELT frameworks and workflow orchestration (e.g., Airflow, dbt, Talend, AWS Glue, GCP Dataform/Cloud Composer). Proven ability to design, deploy, and optimize data warehouses and lakehouse architectures using technologies like BigQuery, Redshift, Snowflake, and Databricks. Experience with …
Proven experience in a leadership or technical lead role, with official line management responsibility. Strong experience with modern data stack technologies, including Python, Snowflake, AWS (S3, EC2, Terraform), Airflow, dbt, Apache Spark, Apache Iceberg, and Postgres. Skilled in balancing technical excellence with business priorities in a fast-paced environment. Strong communication and stakeholder management skills, able to translate technical concepts …
Greater Leeds Area, United Kingdom Hybrid / WFH Options
Corecom Consulting
Iceberg. Experience with data warehouse design, ETL/ELT, and CI/CD pipelines (GitHub, CodeBuild). Knowledge of infrastructure as code, performance tuning, and data migration. Exposure to dbt, Docker, or Microsoft BI stack is a plus. What's on offer: Opportunity to work on large-scale, high-impact data projects. A collaborative, supportive environment with scope to grow …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
office once a week. Requirements: Proven experience as an Analytics Engineer or in a similar data engineering/BI role. Advanced SQL skills with hands-on experience using dbt for data modeling. Strong familiarity with modern data platforms like Snowflake, Looker, AWS/GCP, Tableau, or Airflow. Experience with version control tools (e.g., Git). Ability to design, build, and …
and enhancing data governance, this is your chance to play a vital role in a collaborative, fast-paced environment. What You'll Do: Design, build, and maintain efficient ETL pipelines, leveraging dbt or your tool of choice, to feed a BigQuery data warehouse. Transform raw data into clean, structured formats suitable for analysis. Collaborate across engineering and analytics teams to implement scalable data …
a Data Engineer. Hands-on with Databricks, Spark, Python, SQL. Cloud experience (Azure, AWS, or GCP). Strong understanding of data quality, governance, and security. Nice to Have: Delta Lake, dbt, Snowflake, Terraform, CI/CD, or DevOps exposure. You'll: Build and optimise ETL pipelines. Enable analytics and reporting teams. Drive automation and best practices. Why Apply? Modern tech stack …
databases such as PostgreSQL, DynamoDB, or OpenSearch. Problem-Solver: Comfortable working in Linux environments and confident debugging logs, scripts, and production issues. Additional Skills: Exposure to Kafka, Spark, or dbt Core, with an interest in domain-driven data contracts. Meet Citywire: We cover — and connect — all sides of the $100 trillion global asset management industry — through our news, events and …
of Python and SQL (experience with Snowflake highly desirable). Knowledge of BI tools such as Superset, Tableau, Power BI or similar is desirable. Knowledge of orchestration tools such as Airflow, dbt or Google Cloud Dataflow is a bonus. Analytical and problem-solving skills, with a deep curiosity for fraud detection. Excellent attention to detail to ensure quality of project delivery for …