ability (Vue, React or Angular good but not necessary); Agile. The following is DESIRABLE, not essential: AWS or GCP; buy-side experience; data tools such as Glue, Athena, Airflow, Ignite, DBT, Arrow, Iceberg, Dremio; Fixed Income performance, risk or attribution; TypeScript and Node. Role: Python Developer (Software Engineer Programmer Developer Python Fixed Income JavaScript Node Fixed Income Credit Rates Bonds ABS … to be in the office 1-2 times a week. The tech environment is very new and will likely soon include exposure to the following: Glue, Athena, Airflow, Ignite, DBT, Arrow, Iceberg, Dremio. This environment has been described as the only corporate environment with a start-up/fintech attitude towards technology. Hours are 9-5.
and scientists working across Africa, Europe, the UK and the US. ABOUT THE ROLE Sand Technologies focuses on cutting-edge cloud-based data projects, leveraging tools such as Databricks, DBT, Docker, Python, SQL, and PySpark to name a few. We work across a variety of data architectures such as Data Mesh, lakehouse, data vault and data warehouses. Our data engineers
experience building production data pipelines Advanced Python skills (NumPy, Pandas, SQLAlchemy) and expert-level SQL across multiple database platforms Hands-on experience with modern data stack tools including dbt, Airflow, and cloud data warehouses (Snowflake, BigQuery, Redshift) Strong understanding of data modelling, schema design, and building maintainable ELT/ETL pipelines Experience with cloud platforms (AWS, Azure, GCP) and
Tech Stack: Cloud Data Warehouse - Snowflake AWS Data Solutions - Kinesis, SNS, SQS, S3, ECS, Lambda Data Governance & Quality - Collate & Monte Carlo Infrastructure as Code - Terraform Data Integration & Transformation - Python, DBT, Fivetran, Airflow CI/CD - GitHub Actions/Jenkins Business Intelligence - Looker Skills & Attributes We'd Like To See: Extensive experience in data engineering, including designing and maintaining robust data
deliver value-focused data solutions We'd love to talk to you if: You've got solid experience working with Python, SQL, Spark and data pipeline tools such as dbt or Airflow You're comfortable working across cloud platforms - especially AWS (Glue, Lambda, Athena), Azure (Data Factory, Synapse), or GCP (Dataflow, BigQuery, Cloud Composer) You have a good understanding of
. Proficiency with Docker, Linux, and bash. Ability to document code, architectures, and experiments. Preferred Qualifications Experience with databases and data warehousing (Hive, Iceberg). Data transformation skills (SQL, DBT). Experience with orchestration platforms (Airflow, Argo). Knowledge of data catalogs, metadata management, vector databases, relational/object databases. Experience with Kubernetes. Understanding of computational geometry (meshes, boundary representations
Our products are recognised by industry leaders like Gartner's Magic Quadrant, Forrester Wave and Frost Radar. Our tech stack: Superset and similar data visualisation tools. ETL tools: Airflow, DBT, Airbyte, Flink, etc. Data warehousing and storage solutions: ClickHouse, Trino, S3. AWS Cloud, Kubernetes, Helm. Relevant programming languages for data engineering tasks: SQL, Python, Java, etc. What you will be
data pipelines, and ETL processes Hands-on experience with cloud platforms (GCP, AWS, Azure) and their data-specific services Proficiency in Python, SQL, and data orchestration tools (e.g., Airflow, DBT) Experience with modern data warehouse technologies (BigQuery, Snowflake, Redshift, etc.) Strong understanding of data modeling, data governance, and data quality principles Excellent communication skills with the ability to translate complex
detail and care about the features they implement. What we need from you: At least 3 years of relevant data engineering experience Strong Python and SQL skills Experience with dbt Experience with AWS Experience working with a columnar database such as Redshift Strong experience with ETL/ELT and the management of data pipelines Familiarity with Snowplow Experience with Data
reliable data-focused backend services Have hands-on experience with cloud infrastructure (GCP/AWS/Azure), infrastructure-as-code (Terraform), containerisation (Docker/k8s) and data pipelines (SQL, dbt, Airbyte) Love automation, process improvement and finding ways to help others work efficiently Are comfortable working autonomously and taking responsibility for the delivery of large technical projects Are eager to
They are also proficient in Azure Event Hub and Streaming Analytics, Managed Streaming for Apache Kafka, Azure Databricks with Spark, and other open-source technologies like Apache Airflow and dbt, Spark/Python, or Spark/Scala. Required technical and professional expertise Commercial experience as a Data Engineer or similar role, with a strong emphasis on Azure technologies. Proficiency in
Desirables: Experience with cloud-based ML infrastructure, particularly GCP (Vertex AI, BigQuery), or equivalent (e.g. AWS, Azure) Exposure to orchestration tools such as Kubeflow pipelines or Airflow Familiarity with DBT or similar tools for modelling data in data warehouses Desire to build interpretable and explainable ML models (using techniques such as SHAP) Desire to quantify the level of fairness and
and operating data pipelines, warehouses, and scalable data architectures. Deep hands-on experience with modern data stacks. Our tech includes Python, SQL, Snowflake, Apache Iceberg, AWS S3, PostgreSQL, Airflow, dbt, and Apache Spark, deployed via AWS, Docker, and Terraform. Experience with similar technologies is essential. Coaching & Growth Mindset: Passion for developing others through mentorship, feedback, and knowledge sharing. Pragmatic Problem
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
office once a week . Requirements: Proven experience as an Analytics Engineer or in a similar data engineering/BI role Advanced SQL skills with hands-on experience using dbt for data modeling Strong familiarity with modern data platforms like Snowflake, Looker, AWS/GCP, Tableau, or Airflow Experience with version control tools (e.g., Git) Ability to design, build, and
pipeline development Experience with IaC tools such as Terraform or Ansible for deployment and infrastructure management Hands-on experience with: ETL/ELT orchestration and pipeline tools (Airflow, Airbyte, DBT, etc.) Data warehousing tools and platforms (Snowflake, Iceberg, etc.) SQL databases, particularly MySQL Desired Experience: Experience with cloud-based services, particularly AWS Proven ability to manage stakeholders, their expectations and
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
and secure use of machine learning. Key Focus Areas Own and execute enterprise data strategy Build and lead a multi-disciplinary data & AI team Drive modern data platform development (dbt, Airflow, Snowflake, Looker/Power BI) Deliver business-critical analytics and reporting Support responsible AI/ML initiatives Define data governance, privacy, and compliance frameworks What We're Looking For
Employment Type: Full-Time
Salary: £120,000 - £130,000 per annum, Inc benefits
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
high-quality data assets Strong architectural acumen and software engineering fundamentals Experience driving adoption of data governance and improving data platform usage across internal teams Stack including: Snowflake, AWS, DBT, Airflow, Python, Kinesis, Terraform, CI/CD tools BENEFITS The successful Principal Data Engineer will receive the following benefits: Salary up to £107,000 Hybrid working: 2 days per week
a fast-paced startup or agile environment. Strong background in schema design and dimensional data modeling. Able to communicate data architecture clearly with internal stakeholders. Experience with Azure, Airflow, DBT, Kubernetes, GitHub. Bonus points for: open-source contributions, an active GitHub profile, and curiosity for the latest in tech. A natural problem-solver who loves making things work. Focused on
Hands-on experience building and integrating with RESTful APIs using FastAPI, Django REST Framework, or similar. -Data Workflows: Experience designing and maintaining real-time and batch data pipelines, including dbt Core and stream processing tools. -Infrastructure Know-How: Confident working with Terraform and CI/CD pipelines in a cloud-native environment. -Database Familiarity: Skilled in both SQL and NoSQL
perhaps through coursework, Kaggle competitions, or personal data projects You've shown initiative in teaching yourself new technical tools or concepts beyond what was required - such as exploring BigQuery, dbt, Airflow, Docker, or other data engineering technologies on your own time Progression This is an initial six-month engagement. If you perform well, the expectation is that you'll move
London, South East, England, United Kingdom Hybrid / WFH Options
Salt Search
customer insights Comfortable working with large datasets from sources like CRM, web analytics, product telemetry, etc. Exposure to cloud platforms (AWS, GCP, Azure) and modern data pipelines (e.g. Airflow, dbt) is a plus Soft Skills: Business-oriented thinker with strong communication skills Able to clearly explain complex models to non-technical audiences Skilled in stakeholder engagement and translating analytics into