IR35. Immediate start. 12 month contract. Essential: Been to school in the UK; Data Ingestion of APIs; GCP based (Google Cloud Platform); Snowflake; BigQuery; DBT; Semantic layer (Cube/Looker). Desirable: Airflow (Apache Airflow …
London (City of London), South East England, United Kingdom
Vallum Associates
Technical Expertise: Solid experience in Python programming, particularly using data manipulation and processing libraries such as Pandas, NumPy, and Apache Spark. Hands-on experience with open-source data frameworks like Apache Spark, Apache Kafka, and Apache Airflow. Strong proficiency in SQL, including advanced query development and performance tuning. Good understanding of distributed computing principles and big … automation pipelines. Experience working with relational databases such as PostgreSQL, MySQL, or equivalent platforms. Skilled in using containerization technologies including Docker and Kubernetes. Experience with workflow orchestration tools like Apache Airflow or Dagster. Familiar with streaming data pipelines and real-time analytics solutions.
SQL Server. Good proficiency in any OOP language (Python, Java, C#). Experience developing reliable and efficient data pipeline solutions. Experience with on-prem or cloud data integration systems (e.g. Apache NiFi, Apache Airflow, AWS Glue). Familiarity with CI/CD pipelines and DevOps practices. Excellent problem-solving and communication skills. Bachelor's degree in Computer Science or …
delivering end-to-end AI/ML projects. Nice to Have: Exposure to LLMs (Large Language Models), generative AI, or transformer architectures. Experience with data engineering tools (Spark, Airflow, Snowflake). Prior experience in fintech, healthtech, or similar domains is a plus.
data engineering tasks. Experience building and maintaining web scraping pipelines. Strong SQL skills, with expertise in performance tuning. Strong proficiency with dbt for data transformations. Hands-on experience with Apache Airflow or Prefect. Proficiency with GitHub, GitHub Actions, and CI/CD pipelines. Nice to have: Experience with GCP (BigQuery, Dataflow, Composer, Pub/Sub) or AWS. Familiarity …
Profile: Proven experience as a Data Engineer, with strong expertise in designing and managing large-scale data systems. Hands-on proficiency with modern data technologies such as Spark, Kafka, Airflow, or dbt. Strong SQL skills and experience with cloud platforms (Azure preferred). Solid programming background in Python, Scala, or Java. Knowledge of data warehousing solutions (e.g. Snowflake, BigQuery …
Strong knowledge of algorithms, design patterns, OOP, threading, multiprocessing, etc. Experience with SQL, NoSQL, or tick databases. Experience working in a Unix environment and git. Familiarity with Kafka, Docker, Airflow, Luigi. Strong communication skills in verbal and written English. Domain knowledge in futures & swaps is a plus. Highly competitive compensation and bonus structure. Meritocratic environment with ample opportunity for …
following engineering disciplines: Cloud Engineering; Data Engineering (not building pipelines but designing and building the framework); DevOps; MLOps/LLMOps. Often work with the following technologies: Azure, AWS, GCP; Airflow, dbt, Databricks, Snowflake, etc.; GitHub, Azure DevOps and related developer tooling and CI/CD platforms; Terraform or other Infra-as-Code; MLflow, AzureML or similar for MLOps; LangSmith …
East London, London, United Kingdom Hybrid / WFH Options
InfinityQuest Ltd,
Hybrid (3 days onsite & 2 days remote). Role Type: 6 Months Contract with possibility of extensions. Mandatory Skills: Python, Postgres SQL, Azure Databricks, AWS (S3), Git, Azure DevOps CI/CD, Apache Airflow, Energy Trading experience. Job Description: Data Engineer (Python enterprise developer): 6+ years of experience in Python scripting. Proficient in developing applications in Python. Exposed to Python …
platforms. Proven experience in a leadership or technical lead role, with official line management responsibility. Strong experience with modern data stack technologies, including Python, Snowflake, AWS (S3, EC2, Terraform), Airflow, dbt, Apache Spark, Apache Iceberg, and Postgres. Skilled in balancing technical excellence with business priorities in a fast-paced environment. Strong communication and stakeholder management skills, able …
record in full stack data development (from ingestion to visualization). Strong expertise in Snowflake, including data modeling, warehousing, and performance optimization. Hands-on experience with ETL tools (e.g., Apache Airflow, dbt, Fivetran) and integrating data from ERP systems like NetSuite. Proficiency in SQL, Python, and/or other scripting languages for data processing and automation. Familiarity with …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
/BI role. Advanced SQL skills with hands-on experience using dbt for data modeling. Strong familiarity with modern data platforms like Snowflake, Looker, AWS/GCP, Tableau, or Airflow. Experience with version control tools (e.g., Git). Ability to design, build, and document scalable, reliable data models. Comfortable gathering business requirements and translating them into data architecture. Strong problem …
facing use. Skills & Qualifications: Proven expertise in data engineering and platform strategy with end-to-end delivery in production. Strong proficiency in Python and SQL; Terraform (IaC); dbt (transformations); Airflow (orchestration); Looker (or equivalent BI). Cloud experience on GCP (preferred) or AWS/Azure. Deep knowledge of modern data architectures: Data Warehouses, Data Lakes, Lakehouses. Solid grounding in …
Deep understanding of data warehousing concepts, ETL/ELT pipelines and dimensional modelling. Proficiency in advanced programming languages (Python/PySpark, SQL). Experience in data pipeline orchestration (e.g. Airflow, Data Factory). Familiarity with DevOps and CI/CD practices (Git, Azure DevOps etc.). Ability to communicate technical concepts to both technical and non-technical audiences. Proven experience in …
City of London, London, United Kingdom Hybrid / WFH Options
Kubrick Group
or GCP native stacks). Experience with platform observability and CI/CD for data platforms. Hands-on experience with modern data engineering tools such as dbt, Fivetran, Matillion or Airflow. History of supporting pre-sales activities in a product or consultancy-based business. What Kubrick offers: A fast-moving and fast-growth business which is doing something seriously innovative …
Databricks Engineer. London, hybrid, 3 days per week on-site. 6 Months+. UMBRELLA only, Inside IR35. Key Responsibilities: Design, develop, and maintain ETL/ELT pipelines using Airflow for orchestration and scheduling. Build and manage data transformation workflows in DBT running on Databricks. Optimize data models in Delta Lake for performance, scalability, and cost efficiency. Collaborate with … sharing. Required Skills & Qualifications: 3-6 years of experience as a Data Engineer (or similar). Hands-on expertise with: DBT (dbt-core, dbt-databricks adapter, testing & documentation); Apache Airflow (DAG design, operators, scheduling, dependencies); Databricks (Spark, SQL, Delta Lake, job clusters, SQL Warehouses). Strong SQL skills and understanding of data modeling (Kimball, Data Vault …
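The responsibilities above (Airflow orchestrating dbt transformations on Databricks) can be sketched as a minimal DAG. This is a hedged illustration, not the employer's actual pipeline: the DAG id, schedule, and project path are hypothetical, and Databricks connection details are assumed to live in the `profiles.yml` used by the dbt-databricks adapter.

```python
# Minimal sketch of an Airflow DAG that runs dbt models on Databricks,
# then runs dbt tests. Assumes Airflow 2.x and a dbt project configured
# with the dbt-databricks adapter; names and paths are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_databricks_daily",      # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                  # Airflow handles the scheduling
    catchup=False,
) as dag:
    # Build the models; the adapter reads Databricks credentials from profiles.yml
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/analytics",  # hypothetical path
    )
    # Run dbt tests only after the models have built successfully
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/analytics",
    )
    dbt_run >> dbt_test  # dependency: run before test
```

The `dbt_run >> dbt_test` line is the "DAG design, operators, scheduling, dependencies" skill in miniature: each dbt command is wrapped in an operator, and the bit-shift syntax declares the execution order. This file is pipeline configuration and only executes inside a deployed Airflow environment.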
Senior Data Engineer - FinTech Unicorn - Python, SQL, DBT, Airflow, AWS/GCP. OB have partnered with a UK FinTech Unicorn whose Data function is undergoing rapid growth, and who in turn are looking to grow their Data team with 2 highly skilled Data Engineers. You'll be working on shaping the company's Data function, driving Data best practices, and …
data engineering. Strong Python & SQL skills. Experience with Google Cloud or similar. Familiarity with unit testing, Git, and Agile. Desirable: Experience with bioinformatics or scientific datasets. Knowledge of Nextflow, Airflow, NLP, and Docker. Exposure to AI/ML applications. If the role aligns with your skills and experience, please apply with your updated CV.