meaningful outcomes. Technical expertise: Advanced Python skills for data engineering tasks. Experience building and maintaining web scraping pipelines. Strong SQL skills, with expertise in performance tuning. Strong proficiency with dbt for data transformations. Hands-on experience with Apache Airflow or Prefect. Proficiency with GitHub, GitHub Actions, and CI/CD pipelines. Nice to have: Experience with GCP (BigQuery, Dataflow, Composer, …)
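The listing above asks for Python web-scraping experience. A minimal sketch of one scraping step, using only the standard library; the page structure (plain `<h2>` headings) is invented for illustration, not any employer's actual pipeline:

```python
# Hypothetical example: collect <h2> headings from an HTML page,
# the kind of extraction step a scraping pipeline is built from.
from html.parser import HTMLParser

class TitleParser(HTMLParser):
    """Collects the text content of every <h2> element on a page."""
    def __init__(self):
        super().__init__()
        self.in_h2 = False
        self.titles = []

    def handle_starttag(self, tag, attrs):
        if tag == "h2":
            self.in_h2 = True

    def handle_endtag(self, tag):
        if tag == "h2":
            self.in_h2 = False

    def handle_data(self, data):
        if self.in_h2 and data.strip():
            self.titles.append(data.strip())

def scrape_titles(html: str) -> list[str]:
    """Return all <h2> heading texts found in the given HTML string."""
    parser = TitleParser()
    parser.feed(html)
    return parser.titles
```

In a real pipeline the HTML would come from an HTTP fetch and the results would land in a warehouse; this shows only the parse step.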
expertise with AWS services, including Glue, Redshift, Data Catalog, and large-scale data storage solutions such as data lakes. Proficiency in ETL/ELT tools (e.g., Apache Spark, Airflow, dbt). Skilled in data processing languages such as Python, Java, and SQL. Strong knowledge of data warehousing, data lakes, and data lakehouse architectures. Excellent analytical and problem-solving skills with …
of the AWS services like Redshift, Lambda, S3, Step Functions, Batch, CloudFormation, Lake Formation, CodeBuild, CI/CD, GitHub, IAM, SQS, SNS, Aurora DB - Good experience with dbt, Apache Iceberg, Docker, Microsoft BI stack (nice to have) - Experience in data warehouse design (Kimball, lakehouse/medallion, and Data Vault) is a definite preference, as is knowledge of …
full stack data development (from ingestion to visualization). Strong expertise in Snowflake, including data modeling, warehousing, and performance optimization. Hands-on experience with ETL tools (e.g., Apache Airflow, dbt, Fivetran) and integrating data from ERP systems like NetSuite. Proficiency in SQL, Python, and/or other scripting languages for data processing and automation. Familiarity with LLM integrations (e.g., for …)
engineering disciplines: Cloud Engineering, Data Engineering (not building pipelines but designing and building the framework), DevOps, MLOps/LLMOps. Often work with the following technologies: Azure, AWS, GCP; Airflow, dbt, Databricks, Snowflake, etc.; GitHub, Azure DevOps and related developer tooling and CI/CD platforms; Terraform or other Infra-as-Code; MLflow, AzureML or similar for MLOps; LangSmith, Langfuse and …
Sheffield, South Yorkshire, England, United Kingdom Hybrid / WFH Options
Vivedia Ltd
Experience with cloud platforms (AWS, Azure, GCP) and tools like Snowflake, Databricks, or BigQuery. Familiarity with streaming technologies (Kafka, Spark Streaming, Flink) is a plus. Tools & Frameworks: Airflow, dbt, Prefect, CI/CD pipelines, Terraform. Mindset: Curious, data-obsessed, and driven to create meaningful business impact. Soft Skills: Excellent communication and collaboration — translating complex technical ideas into business insight.
coursework or personal projects. Understanding of data pipelines, ETL/ELT, or data warehousing. Awareness of cloud platforms (e.g., AWS, Azure, or GCP). Understanding of modern data tools (e.g., dbt, Airflow, Snowflake, BigQuery). Version control tools (e.g., Git). What You'll Gain: Hands-on experience with a modern AI-enabled data stack in a real-world, production environment. The chance …
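The extract-transform-load pattern mentioned above can be shown in miniature. This is a toy sketch, assuming an invented record shape; the "warehouse" is just an in-memory dict standing in for a real target such as BigQuery or Snowflake:

```python
# Toy ETL flow: extract raw records, transform them into a clean
# shape, and load them into a (simulated) warehouse table.

def extract(rows):
    # Extract: pull raw records from a source; here a list stands in
    # for an API response or a file read.
    return list(rows)

def transform(rows):
    # Transform: normalise the name field and coerce amounts to float,
    # dropping incomplete records along the way.
    return [
        {"name": r["name"].strip().title(), "amount": float(r["amount"])}
        for r in rows
        if r.get("name") and r.get("amount") is not None
    ]

def load(rows, warehouse):
    # Load: append cleaned rows to the target table.
    warehouse.setdefault("sales", []).extend(rows)
    return warehouse
```

Tools like dbt or Airflow orchestrate and template steps like these at scale; the underlying extract/transform/load shape is the same.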
Greater Leeds Area, United Kingdom Hybrid / WFH Options
Corecom Consulting
Iceberg. Experience with data warehouse design, ETL/ELT, and CI/CD pipelines (GitHub, CodeBuild). Knowledge of infrastructure as code, performance tuning, and data migration. Exposure to dbt, Docker, or Microsoft BI stack is a plus. What's on offer: Opportunity to work on large-scale, high-impact data projects. A collaborative, supportive environment with scope to grow …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
office once a week. Requirements: Proven experience as an Analytics Engineer or in a similar data engineering/BI role. Advanced SQL skills with hands-on experience using dbt for data modeling. Strong familiarity with modern data platforms like Snowflake, Looker, AWS/GCP, Tableau, or Airflow. Experience with version control tools (e.g., Git). Ability to design, build, and …
and enhancing data governance, this is your chance to play a vital role in a collaborative, fast-paced environment. What You'll Do: Design, build, and maintain efficient ETL pipelines, leveraging dbt or your tool of choice, to feed a BigQuery data warehouse. Transform raw data into clean, structured formats suitable for analysis. Collaborate across engineering and analytics teams to implement scalable data …
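The "raw data into clean, structured formats" step above can be sketched in a few lines. The schema here (event_id, ts, value) is hypothetical, invented for illustration rather than taken from any actual warehouse:

```python
# Sketch of a raw -> clean transform: dedupe on a primary key and
# coerce string fields into typed columns ready for analysis.
from datetime import datetime

def clean_events(raw):
    """Deduplicate raw event dicts and cast fields to proper types."""
    seen, out = set(), []
    for r in raw:
        if r["event_id"] in seen:
            continue  # drop duplicate deliveries of the same event
        seen.add(r["event_id"])
        out.append({
            "event_id": r["event_id"],
            "ts": datetime.fromisoformat(r["ts"]),  # ISO string -> datetime
            "value": float(r["value"]),             # string -> numeric
        })
    return out
```

In a dbt/BigQuery setup this logic would typically live in a SQL model; the Python version just makes the dedupe-and-type-cast shape explicit.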
City of London, London, United Kingdom Hybrid / WFH Options
Robert Half
a Data Engineer. Hands-on with Databricks, Spark, Python, SQL. Cloud experience (Azure, AWS, or GCP). Strong understanding of data quality, governance, and security. Nice to Have: Delta Lake, dbt, Snowflake, Terraform, CI/CD, or DevOps exposure. You'll: Build and optimise ETL pipelines. Enable analytics and reporting teams. Drive automation and best practices. Why Apply? Modern tech stack …
databases such as PostgreSQL, DynamoDB, or OpenSearch. - Problem-Solver: Comfortable working in Linux environments and confident debugging logs, scripts, and production issues. - Additional Skills: Exposure to Kafka, Spark, or dbt Core, with an interest in domain-driven data contracts. Meet Citywire: We cover - and connect - all sides of the $100 trillion global asset management industry - through our news, events and …
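The log-debugging skill the listing above describes often starts with quick triage scripts. A small sketch, assuming an invented `LEVEL message` log format rather than any real application's output:

```python
# Hypothetical log triage: count ERROR-level lines in a stream of
# log lines of the (invented) form "LEVEL message".
import re

LINE = re.compile(r"^(?P<level>INFO|WARN|ERROR)\s+(?P<msg>.*)$")

def count_errors(lines):
    """Return how many lines match the format at ERROR level."""
    n = 0
    for line in lines:
        m = LINE.match(line)
        if m and m.group("level") == "ERROR":
            n += 1
    return n
```

On a Linux box the same triage is often a `grep -c '^ERROR'` one-liner; the Python version is handy once you need to aggregate or join against other data.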
of Python and SQL (experience with Snowflake highly desirable). Knowledge of BI tools such as Superset, Tableau, PowerBI or similar is desirable. Knowledge of orchestration tools such as Airflow, dbt or Google Cloud Dataflow is a bonus. Analytical and problem-solving skills, with a deep curiosity for fraud detection. Excellent attention to detail to ensure quality of project delivery for …
Senior Data Engineer - FinTech Unicorn - Python, SQL, DBT, Airflow, AWS/GCP. OB have partnered with a UK FinTech Unicorn, whose Data function is undergoing rapid growth, and in turn they are looking to grow their Data team with 2 highly skilled Data Engineers. You'll be working on shaping the company's Data function, driving Data best practices, and collaborating …