robust way possible! Diverse training opportunities and social benefits (e.g. UK pension scheme). What do you offer? Strong hands-on experience working with modern Big Data technologies such as Apache Spark, Trino, Apache Kafka, Apache Hadoop, Apache HBase, Apache NiFi, Apache Airflow, OpenSearch. Proficiency in cloud-native technologies such as containerization and Kubernetes.
the success of the data department overall. TLA works with the modern data stack, utilising Snowflake for our data warehouse, dbt to transform data across our medallion architecture, and Apache Airflow for orchestration. Microsoft Azure is our choice of cloud provider for hosting infrastructure. Within the role you will be hands-on with all these exciting technologies. Many … Nice-to-Have Skills: Experience with both batch and near real-time data pipelines Familiarity with Infrastructure as Code (Terraform) Experience with dbt and medallion architecture patterns Knowledge of Apache Airflow or similar orchestration tools Azure cloud platform experience Why Join TLA? TLA is a fast-moving, innovative digital business that partners with some of the biggest automotive
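The medallion flow this listing describes (raw "bronze" data refined through a cleaned "silver" layer into aggregated "gold" tables) can be sketched in plain Python. The field names and layer logic below are invented for illustration; they are not taken from TLA's actual dbt models.

```python
# Illustrative medallion-architecture sketch (hypothetical data, not TLA's schema).
# Bronze: raw ingested rows, as-is. Silver: cleaned and typed. Gold: aggregates.

raw_rows = [
    {"dealer": "A", "price": "12000"},
    {"dealer": "a ", "price": "9500"},   # messy casing/whitespace, fixed in silver
    {"dealer": "B", "price": None},      # bad record, dropped in silver
]

def to_silver(rows):
    """Clean and type bronze rows: normalise dealer names, cast prices, drop nulls."""
    out = []
    for r in rows:
        if r["price"] is None:
            continue
        out.append({"dealer": r["dealer"].strip().upper(), "price": int(r["price"])})
    return out

def to_gold(rows):
    """Aggregate silver rows into a per-dealer average price."""
    totals = {}
    for r in rows:
        totals.setdefault(r["dealer"], []).append(r["price"])
    return {d: sum(p) / len(p) for d, p in totals.items()}

silver = to_silver(raw_rows)
gold = to_gold(silver)
print(gold)  # {'A': 10750.0}
```

In a real deployment each layer would be a dbt model, with Airflow scheduling the bronze-to-silver-to-gold runs in order.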
IR35. Immediate start. 12 month contract. Essential Been to school in the UK Data Ingestion of APIs GCP based (Google Cloud Platform) Snowflake BigQuery DBT Semantic layer (Cube/Looker) Desirable Airflow (Apache Airflow)
Technical Expertise: Solid experience in Python programming, particularly using data manipulation and processing libraries such as Pandas, NumPy, and Apache Spark. Hands-on experience with open-source data frameworks like Apache Spark, Apache Kafka, and Apache Airflow. Strong proficiency in SQL, including advanced query development and performance tuning. Good understanding of distributed computing principles and big … automation pipelines. Experience working with relational databases such as PostgreSQL, MySQL, or equivalent platforms. Skilled in using containerization technologies including Docker and Kubernetes. Experience with workflow orchestration tools like Apache Airflow or Dagster. Familiar with streaming data pipelines and real-time analytics solutions.
london (city of london), south east england, united kingdom
Vallum Associates
SQL Server Good proficiency in any OOP language (Python, Java, C#) Experience developing reliable and efficient data pipeline solutions Experience with on-prem or cloud data integration systems (e.g. Apache NiFi, Apache Airflow, AWS Glue) Familiarity with CI/CD pipelines and DevOps practices Excellent problem-solving and communication skills Bachelor's degree in Computer Science or
delivering end-to-end AI/ML projects. Nice to Have: Exposure to LLMs (Large Language Models), generative AI, or transformer architectures. Experience with data engineering tools (Spark, Airflow, Snowflake). Prior experience in fintech, healthtech, or similar domains is a plus.
data engineering tasks. Experience building and maintaining web scraping pipelines. Strong SQL skills, with expertise in performance tuning. Strong proficiency with dbt for data transformations. Hands-on experience with Apache Airflow or Prefect. Proficiency with GitHub, GitHub Actions, and CI/CD pipelines. Nice to have: Experience with GCP (BigQuery, Dataflow, Composer, Pub/Sub) or AWS. Familiarity
Profile: Proven experience as a Data Engineer, with strong expertise in designing and managing large-scale data systems. Hands-on proficiency with modern data technologies such as Spark, Kafka, Airflow, or dbt. Strong SQL skills and experience with cloud platforms (Azure preferred). Solid programming background in Python, Scala, or Java. Knowledge of data warehousing solutions (e.g. Snowflake, BigQuery
Strong knowledge of algorithms, design patterns, OOP, threading, multiprocessing, etc. Experience with SQL, NoSQL, or tick databases Experience working in a Unix environment and git Familiarity with Kafka, Docker, Airflow, Luigi Strong communication skills in verbal and written English. Domain knowledge in futures & swaps is a plus Highly competitive compensation and bonus structure Meritocratic environment with ample opportunity for
following engineering disciplines: Cloud Engineering Data Engineering (not building pipelines but designing and building the framework) DevOps MLOps/LLMOps Often work with the following technologies: Azure, AWS, GCP Airflow, dbt, Databricks, Snowflake, etc. GitHub, Azure DevOps and related developer tooling and CI/CD platforms, Terraform or other Infra-as-Code MLflow, AzureML or similar for MLOps; LangSmith
East London, London, United Kingdom Hybrid / WFH Options
InfinityQuest Ltd,
Hybrid) (3 days onsite & 2 days remote) Role Type: 6 Months Contract with possibility of extensions Mandatory Skills: Python, Postgres SQL, Azure Databricks, AWS (S3), Git, Azure DevOps CICD, Apache Airflow, Energy Trading experience Job Description: Data Engineer (Python enterprise developer): 6+ years of experience in python scripting. Proficient in developing applications in Python language. Exposed to python
platforms. Proven experience in a leadership or technical lead role, with official line management responsibility. Strong experience with modern data stack technologies, including Python, Snowflake, AWS (S3, EC2, Terraform), Airflow, dbt, Apache Spark, Apache Iceberg, and Postgres. Skilled in balancing technical excellence with business priorities in a fast-paced environment. Strong communication and stakeholder management skills, able
record in full stack data development (from ingestion to visualization). Strong expertise in Snowflake, including data modeling, warehousing, and performance optimization. Hands-on experience with ETL tools (e.g., Apache Airflow, dbt, Fivetran) and integrating data from ERP systems like NetSuite. Proficiency in SQL, Python, and/or other scripting languages for data processing and automation. Familiarity with
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
/BI role Advanced SQL skills with hands-on experience using dbt for data modeling Strong familiarity with modern data platforms like Snowflake, Looker, AWS/GCP, Tableau, or Airflow Experience with version control tools (e.g., Git) Ability to design, build, and document scalable, reliable data models Comfortable gathering business requirements and translating them into data architecture Strong problem
facing use. Skills & Qualifications Proven expertise in data engineering and platform strategy with end-to-end delivery in production. Strong proficiency in Python and SQL; Terraform (IaC); dbt (transformations); Airflow (orchestration); Looker (or equivalent BI). Cloud experience on GCP (preferred) or AWS/Azure. Deep knowledge of modern data architectures: Data Warehouses, Data Lakes, Lakehouses Solid grounding in
or personal projects Understanding of data pipelines, ETL/ELT, or data warehousing Awareness of cloud platforms (e.g., AWS, Azure, or GCP) Understanding of modern data tools (e.g., dbt, Airflow, Snowflake, BigQuery) Version control tools (e.g., Git) What You’ll Gain Hands-on experience with a modern AI-enabled data stack in a real-world, production environment The chance
software development lifecycle, from conception to deployment. Capable of conceptualizing and implementing software architectures spanning multiple technologies and platforms. Technology stack Python Flask Java Spring JavaScript BigQuery Redis ElasticSearch Airflow Google Cloud Platform Kubernetes Docker Voted "Best Places to Work," our culture is driven by self-starters, team players, and visionaries. Headquartered in Los Angeles, California, the company operates
Deep understanding of data warehousing concepts, ETL/ELT pipelines and dimensional modelling Proficiency in advanced programming languages (Python/PySpark, SQL) Experience in data pipeline orchestration (e.g. Airflow, Data Factory) Familiarity with DevOps and CI/CD practices (Git, Azure DevOps etc) Ability to communicate technical concepts to both technical and non-technical audiences Proven experience in
Manchester, Lancashire, England, United Kingdom Hybrid / WFH Options
Client Server Ltd
As a Senior Data Engineer you will take ownership of the data platform, optimising it for scalability to ensure successful client onboarding. You'll use modern tools (such as Airflow, Prefect, Dagster or AWS Step Functions) for ETL design and orchestration, work on transformation logic to clean, validate and enrich data (including handling missing values, standardising formats and duplication
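The transformation work this role describes (filling missing values, standardising formats, and handling duplicates) can be sketched in plain Python. The record shape and field names below are invented for illustration, not taken from the employer's pipeline.

```python
# Hypothetical cleaning step: standardise formats, fill missing values, de-duplicate.
def clean_records(records, default_country="GB"):
    seen = set()
    cleaned = []
    for rec in records:
        email = (rec.get("email") or "").strip().lower()  # standardise format
        if not email or email in seen:                    # drop blanks and duplicates
            continue
        seen.add(email)
        cleaned.append({
            "email": email,
            "country": rec.get("country") or default_country,  # fill missing value
        })
    return cleaned

rows = [
    {"email": " Alice@Example.com ", "country": None},
    {"email": "alice@example.com", "country": "GB"},  # duplicate after normalisation
    {"email": "bob@example.com", "country": "FR"},
]
print(clean_records(rows))
```

In production this logic would typically live in a task run by an orchestrator such as Airflow or Dagster, operating on batches rather than in-memory lists.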
deep expertise in ETL and data warehousing (especially BigQuery). Hands-on competence with Looker (LookML) or similar BI tools. Strong SQL skills and familiarity with orchestration tools like Airflow (or similar). A thoughtful approach to data modeling, query optimization, and performance tuning. Excellent analytical problem-solving skills and the ability to drive projects in a fast-paced