robust way possible! Diverse training opportunities and social benefits (e.g. UK pension scheme) What do you offer? Strong hands-on experience working with modern Big Data technologies such as Apache Spark, Trino, Apache Kafka, Apache Hadoop, Apache HBase, Apache NiFi, Apache Airflow, OpenSearch Proficiency in cloud-native technologies such as containerization and Kubernetes Strong More ❯
production issues. Optimize applications for performance and responsiveness. Stay Up to Date with Technology: Keep yourself and the team updated on the latest Python technologies, frameworks, and tools like Apache Spark, Databricks, Apache Pulsar, Apache Airflow, Temporal, and Apache Flink, sharing knowledge and suggesting improvements. Documentation: Contribute to clear and concise documentation for software, processes, and … Experience with cloud platforms like AWS, GCP, or Azure. DevOps Tools: Familiarity with containerization (Docker) and infrastructure automation tools like Terraform or Ansible. Real-time Data Streaming: Experience with Apache Pulsar or similar systems for real-time messaging and stream processing is a plus. Data Engineering: Experience with Apache Spark, Databricks, or similar big data platforms for processing … large datasets, building data pipelines, and machine learning workflows. Workflow Orchestration: Familiarity with tools like Apache Airflow or Temporal for managing workflows and scheduling jobs in distributed systems. Stream Processing: Experience with Apache Flink or other stream processing frameworks is a plus. Desired Skills Asynchronous Programming: Familiarity with asynchronous programming tools like Celery or asyncio. Frontend Knowledge: Exposure More ❯
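The "Desired Skills" above mention asynchronous programming with Celery or asyncio. For readers unfamiliar with the latter, here is a minimal, hedged sketch using only Python's standard library; the task names and delays are invented, and `asyncio.sleep` stands in for real I/O such as an HTTP request:

```python
import asyncio
import time

async def fetch(name: str, delay: float) -> str:
    # Stand-in for an I/O-bound call (e.g. an HTTP request or DB query).
    await asyncio.sleep(delay)
    return f"{name} done"

async def main() -> list[str]:
    # Run both "requests" concurrently; total time is roughly the longest
    # delay, not the sum of the delays.
    return await asyncio.gather(fetch("a", 0.2), fetch("b", 0.2))

start = time.perf_counter()
results = asyncio.run(main())
elapsed = time.perf_counter() - start
print(results)
```

Run sequentially these two calls would take about 0.4 s; `asyncio.gather` overlaps the waits so the whole script finishes in roughly 0.2 s.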
London, South East, England, United Kingdom Hybrid/Remote Options
Involved Solutions
customer data Continuously improve existing systems, introducing new technologies and methodologies that enhance efficiency, scalability, and cost optimisation Essential Skills for the Senior Data Engineer: Proficient with Databricks and Apache Spark, including performance tuning and advanced concepts such as Delta Lake and streaming Strong programming skills in Python with experience in software engineering principles, version control, unit testing and More ❯
South West London, London, United Kingdom Hybrid/Remote Options
ARC IT Recruitment Ltd
AWS Platform Build: Demonstrable experience designing and building modern data platforms in AWS. ETL/Orchestration Expertise: Expertise in ETL/ELT design and data orchestration, specifically with Apache Airflow. SQL Mastery: Strong SQL skills with significant experience in query tuning and performance optimisation. Programming Proficiency: Proficiency in Python and Bash (for data processing, scripting, and automation). More ❯
Azure, or GCP, with hands-on experience in cloud-based data services. Proficiency in SQL and Python for data manipulation and transformation. Experience with modern data engineering tools, including Apache Spark, Kafka, and Airflow. Strong understanding of data modelling, schema design, and data warehousing concepts. Familiarity with data governance, privacy, and compliance frameworks (e.g., GDPR, ISO27001). Hands-on More ❯
skills We're excited if you have 7+ years of experience delivering multi-tier, highly scalable, distributed web applications Experience with distributed computing frameworks: Hive/Hadoop, Apache Spark, Kafka, Airflow Working with programming languages Python, Java, SQL. Experience building ETL (Extraction, Transformation and Loading) solutions using PySpark Experience in SQL/NoSQL database design Deep More ❯
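Several of these listings ask for experience "building ETL (Extraction, Transformation and Loading) solutions using PySpark". As a rough, dependency-free illustration of what the extract-transform-load pattern means, the sketch below uses only Python's standard library, with `sqlite3` standing in for the warehouse; the data, table, and column names are hypothetical, and a real PySpark pipeline would distribute the transform step across a cluster:

```python
import sqlite3

# Extract: raw order rows as they might arrive from a source system
# (hypothetical data; a real pipeline would read from an API, file, or database).
raw_orders = [
    ("ord-1", "alice", "19.99"),
    ("ord-2", "bob", "5.00"),
    ("ord-3", "alice", "7.50"),
]

# Transform: parse the string amounts and aggregate spend per customer.
totals: dict[str, float] = {}
for order_id, customer, amount in raw_orders:
    totals[customer] = totals.get(customer, 0.0) + float(amount)

# Load: write the aggregated result into a target table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer_spend (customer TEXT PRIMARY KEY, total REAL)")
conn.executemany("INSERT INTO customer_spend VALUES (?, ?)", totals.items())
conn.commit()

print(conn.execute("SELECT customer, total FROM customer_spend ORDER BY customer").fetchall())
```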
AWS (S3, Lambda, Glue, Redshift) and/or Azure (Data Lake, Synapse). Programming & Scripting: Proficiency in Python, SQL, PySpark etc. ETL/ELT & Streaming: Expertise in technologies like Apache Airflow, Glue, Kafka, Informatica, EventBridge etc. Industrial Data Integration: Familiarity with OT data schema originating from OSIsoft PI, SCADA, MES, and Historian systems. Information Modeling: Experience in defining semantic More ❯
and Responsibilities While in this position your duties may include but are not limited to: Support the design, development, and maintenance of scalable data pipelines using tools such as Apache Airflow, dbt, or Azure Data Factory. Learn how to ingest, transform, and load data from a variety of sources, including APIs, databases, and flat files. Assist in the setup More ❯
East London, London, United Kingdom Hybrid/Remote Options
Client Server
London occasionally. About you : You have strong Python backend software engineer skills You have experience working with large data sets You have experience of using PySpark and ideally also Apache Spark You believe in automating wherever possible You're a collaborative problem solver with great communication skills Other technology in the stack includes: FastAPI, Django, Airflow, Kafka, ETL, CI More ❯
Google Cloud, Databricks) are a strong plus Technical Skills: • Proficiency in SQL and experience with relational databases (e.g., MySQL, PostgreSQL) • Familiarity with data pipeline and workflow management tools (e.g., Apache Airflow) • Experience with programming languages such as Python, Java, or Scala. Python is highly preferred • Basic understanding of cloud platforms and services (e.g., AWS, Azure, Google Cloud) • Knowledge of More ❯
Liverpool, North West England, United Kingdom Hybrid/Remote Options
Intuita - Vacancies
Azure Storage, Medallion Architecture, and working with data formats such as JSON, CSV, and Parquet. • Strong understanding of IT concepts, including security, IAM, Key Vault, and networking. • Exposure to Apache Airflow and DBT is a bonus. • Familiarity with agile principles and practices. • Experience with Azure DevOps pipelines. The "Nice to Haves": • Certification in Azure or related technologies. • Experience with More ❯
at least one cloud data platform (e.g. AWS, Azure, Google Cloud) and big data technologies (e.g., Hadoop, Spark). · Strong knowledge of data workflow solutions like Azure Data Factory, Apache NiFi, Apache Airflow etc · Good knowledge of stream and batch processing solutions like Apache Flink, Apache Kafka · Good knowledge of log management, monitoring, and analytics More ❯
Gloucester, Gloucestershire, South West, United Kingdom
YT Technologies
frameworks Confident using Git and working within Agile/SCRUM teams Experience mentoring junior developers Knowledge of Oracle/relational databases, MongoDB, and GitLab CI/CD Familiarity with Apache NiFi, JavaScript/TypeScript, and React Experience with Elasticsearch, Kibana, Hibernate, and the Atlassian suite (Bitbucket, Jira, Confluence) Desirable; Experience with JSF (PrimeFaces) Knowledge of AWS and cloud-ready More ❯
City Of Westminster, London, United Kingdom Hybrid/Remote Options
Additional Resources
of Kubernetes, Docker, and cloud-native data ecosystems. Demonstrable experience with Infrastructure as Code tools (Terraform, Ansible). Hands-on experience with PostgreSQL and familiarity with lakehouse technologies (e.g. Apache Parquet, Delta Tables). Exposure to Spark, Databricks, and data lake/lakehouse environments. Understanding of Agile development methods, CI/CD pipelines, GitHub, and automated testing. Practical experience More ❯
London (Westminster), South East England, United Kingdom Hybrid/Remote Options
Lloyds Bank
machine learning models for fraud detection, credit risk, customer segmentation, and behavioural analytics using scalable frameworks like TensorFlow, PyTorch, and XGBoost. • Engineer robust data pipelines and ML workflows using Apache Spark, Vertex AI, and CI/CD tooling to ensure seamless model delivery and monitoring. • Apply advanced techniques in deep learning, natural language processing (NLP), and statistical modelling to More ❯
London, South East, England, United Kingdom Hybrid/Remote Options
Lorien
data storytelling and operational insights. Optimise data workflows across cloud and on-prem environments, ensuring performance and reliability. Skills & Experience: Strong experience in ETL pipeline development using tools like Apache Airflow, Informatica, or similar. Advanced SQL skills and experience with large-scale relational and cloud-based databases. Hands-on experience with Tableau for data visualisation and dashboarding. Exposure to More ❯
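Several of these roles call for "advanced SQL skills" and "query tuning and performance optimisation". As a small, hedged illustration of what tuning can look like in practice, the sketch below uses Python's stdlib `sqlite3` (the table, column, and index names are made up) to show how adding an index on a filtered column changes the query plan from a full table scan to an index search:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, user_id INTEGER, payload TEXT)")
conn.executemany(
    "INSERT INTO events (user_id, payload) VALUES (?, ?)",
    [(i % 100, "x") for i in range(1000)],
)

explain = "EXPLAIN QUERY PLAN SELECT COUNT(*) FROM events WHERE user_id = 42"

# Without an index, SQLite must scan the whole table.
plan_before = conn.execute(explain).fetchone()[-1]
print(plan_before)  # a full scan of "events"

# Add an index on the filtered column, then re-check the plan.
conn.execute("CREATE INDEX idx_events_user ON events (user_id)")
plan_after = conn.execute(explain).fetchone()[-1]
print(plan_after)  # an index search using idx_events_user
```

The same query now touches only the matching index entries instead of every row; on large production tables the equivalent change (an index matched to the predicate) is often the single biggest optimisation available.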
Hucclecote, Gloucestershire, United Kingdom Hybrid/Remote Options
Omega Resource Group
GitLab) Contributing across the software development lifecycle from requirements to deployment Tech Stack Includes: Java, Python, Linux, Git, JUnit, GitLab CI/CD, Oracle, MongoDB, JavaScript/TypeScript, React, Apache NiFi, Elasticsearch, Kibana, AWS, Hibernate, Atlassian Suite What's on Offer: Hybrid working and flexible schedules (4xFlex) Ongoing training and career development Exciting projects within the UK's secure More ❯
Skilled with Git version control. Experience with Agile/SCRUM. Database experience (Oracle, relational, and/or MongoDB). CI/CD pipelines (GitLab). Familiarity with Apache NiFi, Hibernate, Elasticsearch, and Kibana. Front-end experience with JavaScript/TypeScript & React. Proficiency with Atlassian Suite (Jira, Confluence, Bitbucket). Experience mentoring junior engineers. Desirable skills and qualifications: Experience More ❯
coding principles JavaScript, TypeScript, Vue.js, SingleSPA. Experience - UI Development, UI testing (e.g. Selenium), Data Integration, CI/CD Hands-on experience in web services (REST, SOAP, WSDL etc.), using Apache Commons Suite & Maven, SQL databases such as Oracle, MySQL, PostgreSQL etc. Hands-on experience in utilizing Spring Framework (Core, MVC, Integration and Data) Experience with Big Data/Hadoop More ❯
scientists Happy to work onsite in central Bristol most of the time Nice to have: Experience in trading, betting or energy markets AWS data engineering tools, Docker, DevOps or Apache suite Analytical background and data modelling experience Why this role? Private medical, dental & life assurance 30 days annual leave + public holidays Cycle-to-Work scheme Annual company ski More ❯
or another language such as Python Good knowledge of developing in a Linux environment Working knowledge of Git version control and GitLab CI/CD pipelines Experience working with Apache NiFi Some exposure to front-end elements like JavaScript, TypeScript or React Some data interrogation with Elasticsearch and Kibana Exposure to working with Atlassian products Looking for a role More ❯
teams to build scalable data pipelines and contribute to digital transformation initiatives across government departments. Key Responsibilities Design, develop and maintain robust data pipelines using PostgreSQL and Airflow or Apache Spark Collaborate with frontend/backend developers using Node.js or React Implement best practices in data modelling, ETL processes and performance optimisation Contribute to containerised deployments (Docker/Kubernetes More ❯