or changes. Build abstractions that make it easy to plug in new service behaviour or data models. Ensure emulators work seamlessly with orchestration and infrastructure-as-code tools (e.g., dbt, Terraform, Airflow, CDKs). Gather and act on feedback from internal and external teams to prioritize high-impact integrations. Build usage analytics and telemetry to understand adoption patterns and developer …
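To make the "plug in new service behaviour" idea concrete, here is a minimal Python sketch of a registry-based abstraction; every name in it (register, InMemoryObjectStore, build) is invented for illustration and is not taken from any real emulator codebase.

    # Hypothetical plug-in registry: new emulated services are added by
    # registering a factory, so callers never hard-code concrete classes.
    from typing import Callable, Dict

    _EMULATORS: Dict[str, Callable[[], object]] = {}

    def register(service_name: str):
        def wrap(factory: Callable[[], object]):
            _EMULATORS[service_name] = factory
            return factory
        return wrap

    @register("object-store")
    class InMemoryObjectStore:
        """Emulates a blob store so pipelines run without cloud credentials."""
        def __init__(self):
            self._blobs: Dict[str, bytes] = {}
        def put(self, key: str, data: bytes) -> None:
            self._blobs[key] = data
        def get(self, key: str) -> bytes:
            return self._blobs[key]

    def build(service_name: str) -> object:
        # Plugging in new behaviour is just another @register declaration.
        return _EMULATORS[service_name]()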
meaningful outcomes. Technical expertise: Advanced Python skills for data engineering tasks. Experience building and maintaining web scraping pipelines. Strong SQL skills, with expertise in performance tuning. Strong proficiency with dbt for data transformations. Hands-on experience with Apache Airflow or Prefect. Proficiency with GitHub, GitHub Actions, and CI/CD pipelines. Nice to have: Experience with GCP (BigQuery, Dataflow, Composer …
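As a rough illustration of the web-scraping pipelines mentioned above, the sketch below uses the widely available requests and BeautifulSoup libraries; the URL and the h2 selector are placeholders, not a real target site.

    # Placeholder scraping step: fetch a page and pull out headline text.
    import requests
    from bs4 import BeautifulSoup

    def scrape_titles(url: str) -> list[str]:
        resp = requests.get(url, timeout=10)
        resp.raise_for_status()  # fail fast so the pipeline surfaces bad fetches
        soup = BeautifulSoup(resp.text, "html.parser")
        return [tag.get_text(strip=True) for tag in soup.select("h2")]

    if __name__ == "__main__":
        print(scrape_titles("https://example.com/listings"))  # hypothetical URL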
Technical Expertise: Proficiency in programming languages such as Python, Java, or Scala. Advanced SQL skills for data transformation and performance optimization. Hands-on experience with data pipeline tools (Airflow, dbt, Kafka, or equivalent). Strong knowledge of big data processing frameworks (Apache Spark, Databricks, Flink, etc.). Cloud & Infrastructure: Experience with cloud computing platforms (AWS, Azure, Google Cloud). Familiarity …
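For the big-data frameworks this listing names, a minimal PySpark aggregation might look like the following; the column names and values are invented purely to show the shape of such a job.

    # Toy Spark batch job: aggregate an in-memory frame by day.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("toy-aggregation").getOrCreate()
    df = spark.createDataFrame(
        [("2024-01-01", "uk", 3), ("2024-01-01", "us", 5), ("2024-01-02", "uk", 2)],
        ["day", "region", "events"],
    )
    daily = df.groupBy("day").agg(F.sum("events").alias("total_events"))
    daily.show()
    spark.stop()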
expertise with AWS services, including Glue, Redshift, Data Catalog, and large-scale data storage solutions such as data lakes. Proficiency in ETL/ELT tools (e.g. Apache Spark, Airflow, dbt). Skilled in data processing languages such as Python, Java, and SQL. Strong knowledge of data warehousing, data lakes, and data lakehouse architectures. Excellent analytical and problem-solving skills with …
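The AWS Glue and Data Catalog experience called for above might, in practice, involve boto3 calls like these; start_job_run and get_tables are standard boto3 Glue operations, while the job and database names are placeholders.

    # Kick off a Glue ETL job and list catalog tables; names are hypothetical.
    import boto3

    glue = boto3.client("glue")

    run = glue.start_job_run(JobName="nightly-lake-etl")
    print("Started Glue run:", run["JobRunId"])

    tables = glue.get_tables(DatabaseName="lake_raw")
    for table in tables["TableList"]:
        print(table["Name"])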
of the AWS services like Redshift, Lambda, S3, Step Functions, Batch, CloudFormation, Lake Formation, CodeBuild, CI/CD, GitHub, IAM, SQS, SNS, Aurora DB - Good experience with DBT, Apache Iceberg, Docker, Microsoft BI stack (nice to have) - Experience in data warehouse design (Kimball and lakehouse, medallion and data vault) is a definite preference, as is knowledge of …
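Among the AWS services listed, Lambda and S3 are the simplest to sketch; the handler below, with a made-up bucket and key, shows the common pattern of landing an event payload in a raw zone for later warehouse loads.

    # Hypothetical Lambda handler: persist the incoming event to S3 as JSON.
    import json
    import boto3

    s3 = boto3.client("s3")

    def handler(event, context):
        s3.put_object(
            Bucket="raw-landing-zone",   # placeholder bucket
            Key="events/latest.json",    # placeholder key
            Body=json.dumps(event).encode("utf-8"),
        )
        return {"statusCode": 200}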
full stack data development (from ingestion to visualization). Strong expertise in Snowflake, including data modeling, warehousing, and performance optimization. Hands-on experience with ETL tools (e.g., Apache Airflow, dbt, Fivetran) and integrating data from ERP systems like NetSuite. Proficiency in SQL, Python, and/or other scripting languages for data processing and automation. Familiarity with LLM integrations (e.g., for …
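For the Snowflake work described, the official snowflake-connector-python library is typically driven as below; the account identifier, credentials, and query are placeholders.

    # Placeholder Snowflake query via the official Python connector.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_account",    # placeholder account identifier
        user="etl_user",         # placeholder credentials
        password="***",
        warehouse="ANALYTICS_WH",
    )
    try:
        cur = conn.cursor()
        cur.execute("SELECT CURRENT_VERSION()")
        print(cur.fetchone())
    finally:
        conn.close()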
Chantilly, Virginia, United States Hybrid / WFH Options
Rackner
in software engineering (backend, API, or full-stack) Proficient in Python, Java, or C# Experienced with REST APIs (FastAPI, AWS Lambda) and OpenAPI specifications Skilled in data pipeline orchestration (dbt, Apache Airflow, Apache Spark, Iceberg) Knowledgeable in federal compliance frameworks (NIST 800-53, HIPAA, FISMA High) Preferred/Bonus: Prior work with DHA, VA, or federal healthcare IT programs Experience …
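A REST endpoint of the kind referenced (FastAPI with an auto-generated OpenAPI spec) can be as small as the sketch below; the route and model are invented for illustration.

    # Minimal FastAPI service; FastAPI publishes the OpenAPI spec at /openapi.json.
    from fastapi import FastAPI
    from pydantic import BaseModel

    app = FastAPI(title="records-api")  # hypothetical service name

    class Record(BaseModel):
        id: int
        name: str

    @app.get("/records/{record_id}", response_model=Record)
    def read_record(record_id: int) -> Record:
        return Record(id=record_id, name="placeholder")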
engineering disciplines: Cloud Engineering Data Engineering (not building pipelines but designing and building the framework) DevOps MLOps/LLMOps Often work with the following technologies: Azure, AWS, GCP Airflow, dbt, Databricks, Snowflake, etc. GitHub, Azure DevOps and related developer tooling and CI/CD platforms, Terraform or other Infra-as-Code MLflow, AzureML or similar for MLOps; LangSmith, Langfuse and …
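On the MLOps side, MLflow experiment tracking usually boils down to a few calls like these; the run name, parameter, and metric are arbitrary examples.

    # Log a run to MLflow; start_run / log_param / log_metric are core APIs.
    import mlflow

    with mlflow.start_run(run_name="baseline"):   # hypothetical run name
        mlflow.log_param("learning_rate", 0.01)
        mlflow.log_metric("val_accuracy", 0.87)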
Strong proficiency in SQL, Python, and data modeling for analytical and operational use cases Hands-on experience with production-grade ETL/ELT frameworks and workflow orchestration (e.g., Airflow, dbt, Talend, AWS Glue, GCP Dataform/Cloud Composer) Proven ability to design, deploy, and optimize data warehouses and lakehouse architectures using technologies like BigQuery, Redshift, Snowflake, and Databricks Experience with …
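The workflow-orchestration experience above typically means authoring Airflow DAGs; a skeletal Airflow 2-style DAG, with made-up task logic, might look like this.

    # Skeleton Airflow DAG: one extract task feeding one load task.
    from datetime import datetime
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        print("pulling source data")   # placeholder logic

    def load():
        print("writing to warehouse")  # placeholder logic

    with DAG(
        dag_id="example_elt",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        t_extract = PythonOperator(task_id="extract", python_callable=extract)
        t_load = PythonOperator(task_id="load", python_callable=load)
        t_extract >> t_load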
Greater Leeds Area, United Kingdom Hybrid / WFH Options
Corecom Consulting
Iceberg. Experience with data warehouse design, ETL/ELT, and CI/CD pipelines (GitHub, CodeBuild). Knowledge of infrastructure as code, performance tuning, and data migration. Exposure to DBT, Docker, or Microsoft BI stack is a plus. What’s on offer: Opportunity to work on large-scale, high-impact data projects. A collaborative, supportive environment with scope to grow …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
office once a week. Requirements: Proven experience as an Analytics Engineer or in a similar data engineering/BI role Advanced SQL skills with hands-on experience using dbt for data modeling Strong familiarity with modern data platforms like Snowflake, Looker, AWS/GCP, Tableau, or Airflow Experience with version control tools (e.g., Git) Ability to design, build, and …
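For the dbt modelling mentioned, day-to-day work is usually driven through the dbt CLI; invoking it from Python via subprocess, as sketched below with a placeholder model selector, keeps this example in one language.

    # Run a single dbt model from Python; "stg_orders" is a hypothetical model.
    import subprocess

    result = subprocess.run(
        ["dbt", "run", "--select", "stg_orders"],
        capture_output=True,
        text=True,
    )
    print(result.stdout)
    result.check_returncode()  # raise if the dbt run failed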
and enhancing data governance, this is your chance to play a vital role in a collaborative, fast-paced environment. What You'll Do: Design, build, and maintain efficient ETL pipelines, leveraging DBT or your tool of choice, to feed a BigQuery data warehouse. Transform raw data into clean, structured formats suitable for analysis. Collaborate across engineering and analytics teams to implement scalable data …
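Feeding a BigQuery warehouse, as described above, commonly uses the google-cloud-bigquery client; the project, dataset, and query here are placeholders.

    # Placeholder BigQuery query via the official client library.
    from google.cloud import bigquery

    client = bigquery.Client()  # uses ambient GCP credentials
    query = "SELECT COUNT(*) AS n FROM `my_project.analytics.events`"  # hypothetical table
    for row in client.query(query).result():
        print(row.n)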
a Data Engineer Hands-on with Databricks, Spark, Python, SQL Cloud experience (Azure, AWS, or GCP) Strong understanding of data quality, governance, and security Nice to Have: Delta Lake, DBT, Snowflake, Terraform, CI/CD, or DevOps exposure You'll: Build and optimise ETL pipelines Enable analytics and reporting teams Drive automation and best practices Why Apply? Modern tech stack …
and best practices. Required Skills • 10-12 years of experience in data engineering or related roles. • Strong coding skills in Python, SQL, and at least one ETL framework (Airflow, dbt, etc.). • Hands-on experience with cloud data platforms (AWS Glue, Redshift, GCP BigQuery, Azure Synapse, etc.). • Solid understanding of data warehousing, partitioning, schema design, and optimization. • Familiarity with …
databases such as PostgreSQL, DynamoDB, or OpenSearch. -Problem-Solver: Comfortable working in Linux environments and confident debugging logs, scripts, and production issues. -Additional Skills: Exposure to Kafka, Spark, or dbt Core, with an interest in domain-driven data contracts. Meet Citywire We cover - and connect - all sides of the $100 trillion global asset management industry - through our news, events and …
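The Kafka exposure mentioned in the additional skills might look like the consumer sketch below, written against the kafka-python package; the topic and broker address are placeholders.

    # Minimal Kafka consumer loop using kafka-python; names are illustrative.
    from kafka import KafkaConsumer

    consumer = KafkaConsumer(
        "trades",                          # hypothetical topic
        bootstrap_servers="localhost:9092",
        auto_offset_reset="earliest",
    )
    for message in consumer:
        print(message.offset, message.value)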
of Python and SQL (experience with Snowflake highly desirable) Knowledge of BI tools such as Superset, Tableau, PowerBI or similar is desirable Knowledge of orchestration tools such as Airflow, DBT or Google Cloud Dataflow is a bonus Analytical and problem-solving skills, with a deep curiosity for fraud detection Excellent attention to detail to ensure quality of project delivery for …
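As a toy illustration of the fraud-detection curiosity this role asks for, the pandas sketch below flags transactions whose amount sits far from the mean; the data and threshold are invented for the example.

    # Flag outlier transaction amounts with a simple z-score rule (toy example).
    import pandas as pd

    df = pd.DataFrame({"amount": [12.0, 15.5, 14.2, 980.0, 13.1]})
    z = (df["amount"] - df["amount"].mean()) / df["amount"].std()
    df["suspicious"] = z.abs() > 1.5   # arbitrary threshold for illustration
    print(df[df["suspicious"]])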