usable formats, and load it into data warehouses, data lakes, or lakehouses. Big Data Technologies: Utilize big data technologies such as Spark, Kafka, and Flink for distributed data processing and analytics. Cloud Platforms: Deploy and manage data solutions on cloud platforms such as AWS, Azure, or Google Cloud Platform.
CI/CD pipelines. Big Data & Data Engineering: Strong background in processing large datasets and building data pipelines using platforms like Apache Spark, Databricks, Apache Flink, or similar big data tools. Experience with batch and stream processing. Security: In-depth knowledge of security practices in cloud environments, including identity management.
ETL processes, and database design. Demonstrated ability to architect solutions for big data challenges. Preferred Qualifications: Experience with real-time data processing (Kafka, Kinesis, Flink). Knowledge of containerization and infrastructure-as-code (Docker, Kubernetes, Terraform). Familiarity with MLOps practices and tools (MLflow, Kubeflow, etc.). Experience with data governance frameworks.
managing real-time data pipelines, with a track record across multiple initiatives. Expertise in developing data backbones using distributed streaming platforms (Kafka, Spark Streaming, Flink, etc.). Experience working with cloud platforms such as AWS, GCP, or Azure for real-time data ingestion and storage. Programming skills in Python.
Bristol, England, United Kingdom Hybrid / WFH Options
Lloyds Bank plc
databases to build data solutions, such as SQL Server/Oracle; experience with relational and dimensional data structures. Experience in using distributed frameworks (Spark, Flink, Beam, Hadoop). Proficiency in infrastructure as code (IaC) using Terraform. Experience with CI/CD pipelines and related tools/frameworks. Containerisation
workflow and knowledge of when and how to use dedicated hardware. Significant experience with Apache Spark or any other distributed data programming framework (e.g. Flink, Hadoop, Beam). Familiarity with Databricks as a data and AI platform or the Lakehouse Architecture. Experience with data quality and/or data
AI/ML components, or interest in learning data workflows for ML applications. Bonus if you have exposure to Kafka, Spark, or Flink. Experience with data compliance regulations (GDPR). What you can expect from us: Opportunity for annual bonuses, Medical Insurance, Cycle to work scheme.
AI/ML components, or interest in learning data workflows for ML applications. Bonus if you have exposure to Kafka, Spark, or Flink. Experience with data compliance regulations (GDPR). What you can expect from us: Salary 65-75k, Opportunity for annual bonuses, Medical Insurance.
native services, container orchestration (Kubernetes, Docker), infrastructure as code (Terraform, CloudFormation, etc.), and serverless architectures. Strong experience with distributed data processing frameworks (Kafka, Spark, Flink, or similar). Proficiency in backend development using Java, Go, Python, or Scala, with a focus on writing clean, efficient, and maintainable code. Experience
advanced analytics infrastructure. Familiarity with infrastructure-as-code (IaC) tools such as Terraform or CloudFormation. Experience with modern data engineering technologies (e.g., Kafka, Spark, Flink). Why join YouLend? Award-Winning Workplace: YouLend has been recognised as one of the "Best Places to Work 2024" by the Sunday
London, England, United Kingdom Hybrid / WFH Options
Methods
automate data flows and manage complex workflows within hybrid environments. Event Streaming Experience: Utilise event-driven technologies such as Kafka, Apache NiFi, and Apache Flink to handle real-time data streams effectively. Security and Compliance: Manage security setups and access controls, incorporating tools like Keycloak to protect data integrity.
London, England, United Kingdom Hybrid / WFH Options
Apollo Solutions
with dbt, Fivetran, Apache Airflow, Data Mesh, Data Vault 2.0, Fabric, and Apache Spark. Experience working with streaming technologies such as Apache Kafka, Apache Flink, or Google Cloud Dataflow. Hands-on experience with modern data orchestration tools like Dagster or Prefect. Knowledge of data governance and cataloging tools like
the ability to translate business requirements into technical solutions. Excellent communication skills and the ability to document processes. Experience with data streaming technologies (Kafka, Flink, etc.) would be preferred. Experience with graph/vector databases would be advantageous. Relevant certifications in Azure or AWS cloud technologies could be beneficial.
You have experience with one or more higher-level Python- or Java-based data processing frameworks such as Beam, Dataflow, Crunch, Scalding, Storm, Spark, Flink, etc. You have strong Python programming abilities. Experience using pre-trained ML models is a plus. You might have worked with Docker as well.
City of London, London, United Kingdom Hybrid / WFH Options
Atarus
teams to deliver robust data solutions 💡 What You’ll Bring: Strong hands-on experience building streaming data platforms. Deep understanding of tools like Kafka, Flink, Spark Streaming, etc. Proficiency in Python, Java, or Scala. Cloud experience with AWS, GCP, or Azure. Familiarity with orchestration tools like Airflow, Kubernetes. Collaborative
data sets, both structured and unstructured. Used a range of open-source frameworks and development tools, e.g. NumPy/SciPy/Pandas, Spark, Kafka, Flink. Working knowledge of one or more relevant database technologies, e.g. Oracle, Postgres, MongoDB, ArcticDB. Proficient on Linux. Advantageous: an excellent understanding of financial markets.