continue to scale their data infrastructure, they're seeking a seasoned Principal Data Engineer to help design, build, and optimise a modern data platform, with a focus on orchestration (Airflow), scalable data warehouse architecture, and high-performance pipelines. The ideal candidate will bring both technical depth and strategic thinking, with the ability to communicate effectively across business and technical … warehouse solutions for BI and analytics. Define and drive the long-term architecture and data strategy in alignment with business goals. Own orchestration of ETL/ELT workflows using Apache Airflow, including scheduling, monitoring, and alerting. Collaborate with cross-functional teams (Product, Engineering, Data Science, Compliance) to define data requirements and build reliable data flows. Champion best practices … Proven experience designing and delivering enterprise data strategies. Exceptional communication and stakeholder management skills. Expertise in enterprise-grade data warehouses (Snowflake, BigQuery, Redshift). Hands-on experience with Apache Airflow (or similar orchestration tools). Strong proficiency in Python and SQL for pipeline development. Deep understanding of data architecture, dimensional modelling, and metadata management. Experience with cloud …
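The orchestration duties named above (scheduling, monitoring, and alerting for ETL/ELT workflows) all follow one pattern: run a task, retry on failure, and alert when retries are exhausted. As a rough illustration, here is a minimal pure-Python sketch of that pattern; it is a toy stand-in for what Airflow's `retries` and `on_failure_callback` settings provide, not Airflow code, and all names in it are invented.

```python
# Toy sketch of the retry-and-alert pattern that workflow orchestrators
# such as Airflow provide via task retries and failure callbacks.
# Hypothetical names; this is NOT the Airflow API.
import time

def run_with_retries(task, retries=2, delay_seconds=0, on_failure=None):
    """Run `task` up to `retries + 1` times; call `on_failure` if all attempts fail."""
    last_error = None
    for _attempt in range(retries + 1):
        try:
            return task()
        except Exception as exc:
            last_error = exc
            time.sleep(delay_seconds)
    if on_failure is not None:
        on_failure(last_error)  # e.g. page an on-call channel
    raise last_error

alerts = []            # stands in for an alerting hook (Slack, PagerDuty, ...)
calls = {"n": 0}

def flaky_extract():
    # Fails twice, then succeeds -- simulating a transient source outage.
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("source unavailable")
    return ["row1", "row2"]

rows = run_with_retries(flaky_extract, retries=2, on_failure=alerts.append)
```

In a real orchestrator the retry count, delay, and alert hook would be declared per task rather than passed to a helper, but the control flow is the same.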
with clients - Collaborating with cross-functional teams to deploy and operate solutions in production - Supporting real-time and near-real-time data analytics initiatives - Leveraging orchestration tools such as Airflow, Dagster, Azure Data Factory or Fivetran Required qualifications to be successful in this role: - Solid experience designing and delivering Snowflake-based data warehouse solutions - Strong background performing architectural assessments … Python, Java or Scala - Hands-on experience using dbt for pipeline development and transformation - Familiarity with cloud platforms such as AWS, Azure or GCP - Knowledge of orchestration tooling (e.g., Airflow, Dagster, Azure Data Factory, Fivetran) Desirable: - Experience deploying AI/ML models in production environments - Familiarity with AWS data services (e.g., S3, Glue, Kinesis, Athena) - Exposure to real-time …
City of London, London, United Kingdom Hybrid / WFH Options
Hartree Partners
pandas, xarray, SciPy/PyMC/PyTorch, or similar). Experience validating models with historical data and communicating results to non-specialists. Exposure to real-time data engineering (Kafka, Airflow, dbt). Track record turning research code into production services (CI/CD, containers, etc.). Strong SQL and data-management skills; experience querying large analytical databases (Snowflake highly desirable, but …
City of London, London, United Kingdom Hybrid / WFH Options
83data
continue to scale their data infrastructure, they’re seeking a seasoned Principal Data Engineer to help design, build, and optimise a modern data platform, with a focus on orchestration (Airflow), scalable data warehouse architecture, and high-performance pipelines. The ideal candidate will bring both technical depth and strategic thinking, with the ability to communicate effectively across business and technical … warehouse solutions for BI and analytics. Define and drive the long-term architecture and data strategy in alignment with business goals. Own orchestration of ETL/ELT workflows using Apache Airflow, including scheduling, monitoring, and alerting. Collaborate with cross-functional teams (Product, Engineering, Data Science, Compliance) to define data requirements and build reliable data flows. Champion best practices … Proven experience designing and delivering enterprise data strategies. Exceptional communication and stakeholder management skills. Expertise in enterprise-grade data warehouses (Snowflake, BigQuery, Redshift). Hands-on experience with Apache Airflow (or similar orchestration tools). Strong proficiency in Python and SQL for pipeline development. Deep understanding of data architecture, dimensional modelling, and metadata management. Experience with cloud …
London, England, United Kingdom Hybrid / WFH Options
83zero Limited
continue to scale their data infrastructure, they're seeking a seasoned Principal Data Engineer to help design, build, and optimise a modern data platform, with a focus on orchestration (Airflow), scalable data warehouse architecture, and high-performance pipelines. The ideal candidate will bring both technical depth and strategic thinking, with the ability to communicate effectively across business and technical … warehouse solutions for BI and analytics. Define and drive the long-term architecture and data strategy in alignment with business goals. Own orchestration of ETL/ELT workflows using Apache Airflow, including scheduling, monitoring, and alerting. Collaborate with cross-functional teams (Product, Engineering, Data Science, Compliance) to define data requirements and build reliable data flows. Champion best practices … Proven experience designing and delivering enterprise data strategies. Exceptional communication and stakeholder management skills. Expertise in enterprise-grade data warehouses (Snowflake, BigQuery, Redshift). Hands-on experience with Apache Airflow (or similar orchestration tools). Strong proficiency in Python and SQL for pipeline development. Deep understanding of data architecture, dimensional modelling, and metadata management. Experience with cloud …
London, England, United Kingdom Hybrid / WFH Options
83data
continue to scale their data infrastructure, they’re seeking a seasoned Principal Data Engineer to help design, build, and optimise a modern data platform, with a focus on orchestration (Airflow), scalable data warehouse architecture, and high-performance pipelines. The ideal candidate will bring both technical depth and strategic thinking, with the ability to communicate effectively across business and technical … warehouse solutions for BI and analytics. Define and drive the long-term architecture and data strategy in alignment with business goals. Own orchestration of ETL/ELT workflows using Apache Airflow, including scheduling, monitoring, and alerting. Collaborate with cross-functional teams (Product, Engineering, Data Science, Compliance) to define data requirements and build reliable data flows. Champion best practices … Proven experience designing and delivering enterprise data strategies. Exceptional communication and stakeholder management skills. Expertise in enterprise-grade data warehouses (Snowflake, BigQuery, Redshift). Hands-on experience with Apache Airflow (or similar orchestration tools). Strong proficiency in Python and SQL for pipeline development. Deep understanding of data architecture, dimensional modelling, and metadata management. Experience with cloud …
Experience in using modern data architectures, such as lakehouse. Experience with CI/CD pipelines and version control systems like Git. Knowledge of ETL tools and technologies such as Apache Airflow, Informatica, or Talend. Knowledge of data governance and best practices in data management. Familiarity with cloud platforms and services such as AWS, Azure, or GCP for deploying … and managing data solutions. Strong problem-solving and analytical skills with the ability to diagnose and resolve complex data-related issues. SQL (for database management and querying) Apache Spark (for distributed data processing) Apache Spark Streaming, Kafka or similar (for real-time data streaming) Experience using data tools in at least one cloud service - AWS, Azure or GCP …
City of London, London, United Kingdom Hybrid / WFH Options
OTA Recruitment
modern data modelling practices, analytics tooling, and interactive dashboard development in Power BI and Plotly/Dash. Key responsibilities: Designing and maintaining robust data transformation pipelines (ELT) using SQL, Apache Airflow, or similar tools. Building and optimising data models that power dashboards and analytical tools. Developing clear, insightful, and interactive dashboards and reports using Power BI and Plotly … with data modelling techniques (e.g. dimensional, star/snowflake schemas) and analytics layer design to support business intelligence and self-serve reporting. Proficiency in analytics engineering tools such as Airflow, SQL, and version control systems like Git. Hands-on experience developing dashboards and reports using Power BI, Plotly/Dash, or other modern visualisation tools. Strong understanding of data …
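The dimensional modelling mentioned above separates facts (measures such as revenue) from dimensions (descriptive attributes such as product category), with dashboards reading aggregates off fact-dimension joins. A minimal sketch of a one-fact, one-dimension star schema using Python's built-in SQLite engine; every table and column name here is invented for illustration:

```python
# Minimal star-schema sketch: one fact table keyed to one dimension table.
# Uses the stdlib sqlite3 in-memory engine; all names are illustrative.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_product (
        product_key  INTEGER PRIMARY KEY,   -- surrogate key
        product_name TEXT,
        category     TEXT
    );
    CREATE TABLE fact_sales (
        product_key INTEGER REFERENCES dim_product(product_key),
        quantity    INTEGER,
        revenue     REAL
    );
""")
con.executemany("INSERT INTO dim_product VALUES (?, ?, ?)",
                [(1, "Widget", "Hardware"), (2, "Gizmo", "Hardware")])
con.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(1, 3, 30.0), (1, 1, 10.0), (2, 5, 100.0)])

# The analytics layer exposes aggregates over fact-dimension joins,
# which is what BI tools such as Power BI ultimately query.
rows = con.execute("""
    SELECT d.category, SUM(f.revenue) AS total_revenue
    FROM fact_sales f JOIN dim_product d USING (product_key)
    GROUP BY d.category
""").fetchall()
# rows -> [("Hardware", 140.0)]
```

A snowflake schema differs only in normalising the dimension tables further (e.g. splitting `category` into its own table).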
data architectures, such as lakehouse. Experience with CI/CD pipelines, version control systems like Git, and containerization (e.g., Docker). Experience with ETL tools and technologies such as Apache Airflow, Informatica, or Talend. Strong understanding of data governance and best practices in data management. Experience with cloud platforms and services such as AWS, Azure, or GCP for … deploying and managing data solutions. Strong problem-solving and analytical skills with the ability to diagnose and resolve complex data-related issues. SQL (for database management and querying) Apache Spark (for distributed data processing) Apache Spark Streaming, Kafka or similar (for real-time data streaming) Experience using data tools in at least one cloud service - AWS, Azure or …
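The real-time streaming work referenced above (Spark Streaming, Kafka, or similar) largely reduces to windowed aggregation over an unbounded event sequence. A toy stdlib sketch of a tumbling-window count illustrates the core idea; it is not Spark or Kafka code, and the event shape is invented:

```python
# Toy tumbling-window aggregation -- the core operation behind stream
# processing in Spark Streaming / Kafka consumer apps. Stdlib only;
# event format (timestamp_seconds, event_key) is invented for the demo.
from collections import defaultdict

def tumbling_window_counts(events, window_seconds):
    """events: iterable of (timestamp, key). Returns {window_start: {key: count}}."""
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        # Each event belongs to exactly one non-overlapping window.
        window_start = ts - (ts % window_seconds)
        windows[window_start][key] += 1
    return {w: dict(counts) for w, counts in windows.items()}

events = [(0, "click"), (3, "view"), (9, "click"), (12, "click")]
counts = tumbling_window_counts(events, window_seconds=10)
# counts -> {0: {"click": 2, "view": 1}, 10: {"click": 1}}
```

Real engines add the hard parts this sketch omits: out-of-order events, watermarks, and fault-tolerant state.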
focus on automation and data process improvement. Demonstrated experience in designing and implementing automation frameworks and solutions for data pipelines and transformations. Strong understanding of data processing frameworks (e.g., Apache Spark, Apache Kafka) and database technologies (e.g., SQL, NoSQL). Expertise in programming languages relevant to data engineering (e.g., Python, SQL). Hands-on data preparation activities using … with web-scraping frameworks (Scrapy, Beautiful Soup, or similar). Familiarity with cloud data platforms (e.g., AWS, Azure, Google Cloud) is a plus. Hands-on experience required with Airflow/Astronomer and dbt. Practical knowledge of graph and vector databases. Excellent problem-solving skills and ability to think strategically about data and its role in improving organizational efficiency. Strong …
London, England, United Kingdom Hybrid / WFH Options
Endava
solutions aligned with business objectives. Key Responsibilities Data Pipeline Development Architect, implement and maintain real-time and batch data pipelines to handle large datasets efficiently. Employ frameworks such as Apache Spark, Databricks, Snowflake or Airflow to automate ingestion, transformation, and delivery. Data Integration & Transformation Work with Data Analysts to understand source-to-target mappings and quality requirements. Build … security measures (RBAC, encryption) and ensure regulatory compliance (GDPR). Document data lineage and recommend improvements for data ownership and stewardship. Qualifications Programming: Python, SQL, Scala, Java. Big Data: Apache Spark, Hadoop, Databricks, Snowflake, etc. Cloud: AWS (Glue, Redshift), Azure (Synapse, Data Factory, Fabric), GCP (BigQuery, Dataflow). Data Modelling & Storage: Relational (PostgreSQL, SQL Server), NoSQL (MongoDB, Cassandra), Dimensional …
Strong problem-solving skills and ability to work in an Agile/Scrum environment. Preferred Qualifications AWS Certified Data Analytics – Specialty or AWS Certified Solutions Architect – Associate. Experience with Airflow for workflow orchestration. Exposure to big data frameworks such as Apache Spark, Hadoop, or Presto. Hands-on experience with machine learning pipelines and AI/ML data engineering …
and SQL for data pipelines Experience with modern cloud data warehouses (like AWS Redshift, GCP BigQuery, Azure Synapse or Snowflake) Strong communication skills and fluency in English Experience with Apache Spark (in both batch and streaming) Experience with a job orchestrator (Airflow, Google Cloud Composer, Flyte, Prefect, Dagster) Hands-on experience with AWS Experience with dbt *Typeform drives …
writing clean, maintainable code in SQL or Python. Experience with Scala, Java, or similar languages is a plus. Hands-on experience with data pipeline orchestration tools such as Apache Airflow and Azure DevOps. Strong knowledge of cloud-based data engineering, particularly in AWS environments. Experience in data quality assessment (profiling, anomaly detection) and data documentation (schemas …
London, England, United Kingdom Hybrid / WFH Options
Flutter
fully documented and meet appropriate standards for security, resilience and operational support. Skills & Experience Required Essential: Hands-on experience developing data pipelines in Databricks, with a strong understanding of Apache Spark and Delta Lake. Proficient in Python for data transformation and automation tasks. Solid understanding of AWS services, especially S3, Transfer Family, IAM, and VPC networking. Experience integrating data … Terraform (CDKtf) and AWS CDK with TypeScript. Ability to clearly document technical solutions and communicate with both technical and non-technical stakeholders. Desirable: Experience with job orchestration tools (e.g., Airflow, AWS Step Functions) Exposure to finance data structures or ERP systems (e.g., Oracle Fusion) Familiarity with CI/CD pipelines and deployment strategies in a cloud environment Monitoring and …
London, England, United Kingdom Hybrid / WFH Options
Owen Thomas
fintech data scale) Connection pool tuning (e.g., PGBouncer, connection limits) Can you design pipelines (ETL/ELT) with Postgres as a core data store? Bonus: Familiarity with tools like Airflow, dbt, or event-driven architectures (Kafka, Pub/Sub). 3. Cloud and Infrastructure Postgres in cloud (AWS RDS, Aurora, GCP CloudSQL, or self-hosted) Good understanding of backups …
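The connection pool tuning mentioned above (PGBouncer, connection limits) comes down to capping how many database connections exist and reusing them across callers. A toy bounded pool over SQLite sketches the acquire/release and back-pressure mechanics; it is not PGBouncer configuration, and SQLite merely stands in for Postgres:

```python
# Toy bounded connection pool showing the reuse/limit mechanics that
# PGBouncer provides in front of Postgres. SQLite stands in for Postgres;
# class and method names are invented for the demo.
import queue
import sqlite3

class ConnectionPool:
    def __init__(self, max_size):
        # Pre-open a fixed number of connections -- the hard cap
        # a pooler enforces on the database server.
        self._pool = queue.Queue(maxsize=max_size)
        for _ in range(max_size):
            self._pool.put(sqlite3.connect(":memory:"))

    def acquire(self, timeout=1.0):
        # Blocks (up to `timeout`) when every connection is checked out;
        # this back-pressure is how a pool protects the database.
        return self._pool.get(timeout=timeout)

    def release(self, conn):
        self._pool.put(conn)

pool = ConnectionPool(max_size=2)
c1 = pool.acquire()
c2 = pool.acquire()
exhausted = False
try:
    pool.acquire(timeout=0.01)  # pool empty: times out instead of opening more
except queue.Empty:
    exhausted = True
pool.release(c1)
pool.release(c2)
```

Real poolers add what this omits: transaction-level pooling modes, health checks, and per-database limits.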
building production data pipelines Advanced Python skills (NumPy, Pandas, SQLAlchemy) and expert-level SQL across multiple database platforms Hands-on experience with modern data stack tools including dbt, Airflow, and cloud data warehouses (Snowflake, BigQuery, Redshift) Strong understanding of data modelling, schema design, and building maintainable ELT/ETL pipelines Experience with cloud platforms (AWS, Azure, GCP) and …
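A key part of the "maintainable ELT/ETL pipelines" requirement above is idempotency: re-running the same load must not duplicate rows. A minimal sketch of an upsert-style load using SQLite's `ON CONFLICT` clause, which plays the role a warehouse `MERGE` statement or a dbt incremental model plays in practice; the table and column names are invented:

```python
# Idempotent load step: re-running the same batch must not duplicate rows.
# SQLite's ON CONFLICT upsert stands in for a warehouse MERGE; all names
# are invented for illustration.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE users (
        user_id INTEGER PRIMARY KEY,   -- natural/business key of the load
        email   TEXT
    )
""")

batch = [(1, "a@example.com"), (2, "b@example.com")]

def load(rows):
    # Insert new keys, overwrite existing ones -- safe to re-run.
    con.executemany(
        """INSERT INTO users (user_id, email) VALUES (?, ?)
           ON CONFLICT(user_id) DO UPDATE SET email = excluded.email""",
        rows,
    )

load(batch)
load(batch)  # a retried or replayed run changes nothing
count = con.execute("SELECT COUNT(*) FROM users").fetchone()[0]
# count -> 2
```

The same property is what lets an orchestrator retry a failed load without manual cleanup.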
and implementation of distributed data solutions using API and microservice-based architecture. Deep understanding of ETL/ELT architecture, streaming, and event-driven processing; familiarity with tools like dbt, Airflow, Kafka, or equivalents. Familiarity with mid-sized firm tech stacks, especially in financial services, including systems such as NetSuite, Salesforce, and Addepar. Experience with Atlassian Jira or Microsoft DevOps and …
City of London, London, United Kingdom Hybrid / WFH Options
SGI
for deployment and workflow orchestration Solid understanding of financial data and modelling techniques (preferred) Excellent analytical, communication, and problem-solving skills Experience with data engineering & ETL tools such as Apache Airflow or custom ETL scripts. Strong problem-solving skills with a keen analytical mindset, especially in handling large data sets and complex data transformations. Strong experience in setting …
in at least one of the big 3 cloud ML stacks (AWS, Azure, GCP). Hands-on experience with open-source ETL and data pipeline orchestration tools such as Apache Airflow and NiFi. Experience with large-scale/Big Data technologies such as Hadoop, Spark, Hive, Impala, PrestoDB, Kafka. Experience with workflow orchestration tools like Apache Airflow. …