London, England, United Kingdom Hybrid / WFH Options
Rein-Ton
experience as a Data Engineer with a strong background in data pipelines. Proficiency in Python, Java, or Scala, and big data technologies (e.g., Hadoop, Spark, Kafka). Experience with Databricks, Azure AI Services, and cloud platforms (AWS, Google Cloud, Azure). Solid understanding of SQL and NoSQL databases. Strong …
Greater London, England, United Kingdom Hybrid / WFH Options
Aventum Group
Python, SQL, T-SQL, SSIS. DB: Azure SQL Database, Cosmos DB, NoSQL. Methodologies: Agile, DevOps. Must-have concepts: ELT/ETL, DWH, APIs (RESTful), Spark APIs, FTP protocols, SSL, SFTP, PKI (Public Key Infrastructure), and integration testing. Management duties: Yes. We are an equal opportunity employer, and we are …
Cambourne, England, United Kingdom Hybrid / WFH Options
Remotestar
innovative features while working with a talented and fun team. Responsibilities include: Development and maintenance of Real-Time Data Processing applications using frameworks like Spark Streaming, Spark Structured Streaming, Kafka Streams, and Kafka Connect. Manipulation of streaming data, including ingestion, transformation, and aggregation. Researching and developing new technologies … within a team. Documenting processes and sharing knowledge with the team. Preferred skills: Strong knowledge of Scala. Familiarity with distributed computing frameworks such as Spark, KStreams, Kafka. Experience with Kafka and streaming frameworks. Understanding of monolithic vs. microservice architectures. Familiarity with the Apache ecosystem, including Hadoop modules (HDFS, YARN … HBase, Hive, Spark) and Apache NiFi. Experience with containerization and orchestration tools like Docker and Kubernetes. Knowledge of time-series or analytics databases such as Elasticsearch. Experience with AWS services like S3, EC2, EMR, Redshift. Familiarity with data monitoring and visualization tools such as Prometheus and Grafana. Experience …
Falls Church, Virginia, United States Hybrid / WFH Options
Epsilon Inc
technical audiences. One or more of the following certifications are desired: AWS Certified Developer, Databricks, Agile/Scrum, Python Programmer. Preferred Qualifications: Familiarity with Apache Spark or comparable distributed data processing frameworks, preferably for large-scale data transformations and analytics. Working knowledge of data governance platforms (e.g., Collibra …
London, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
and lakehouse architectures. - Knowledge of DevOps practices, including CI/CD pipelines and version control (e.g., Git). - Understanding of big data technologies (e.g., Spark, Hadoop) is a plus. …
City Of Bristol, England, United Kingdom Hybrid / WFH Options
ADLIB Recruitment | B Corp™
translate complex data concepts to cross-functional teams Bonus points for experience with: DevOps tools like Docker, Kubernetes, CI/CD Big data tools (Spark, Hadoop), ETL workflows, or high-throughput data streams Genomic data formats and tools Cold and hot storage management, ZFS/RAID systems, or tape …
London, England, United Kingdom Hybrid / WFH Options
Solirius Reply
learn, TensorFlow, XGBoost, PyTorch). Strong foundation in statistics, probability, and hypothesis testing. Experience with cloud platforms (AWS, GCP, Azure) and big data tools (Spark, Hive, Databricks, etc.) is a plus. Excellent communication and storytelling skills with the ability to explain complex concepts to non-technical stakeholders. Proven track …
Reigate, England, United Kingdom Hybrid / WFH Options
esure Group
warehouses such as Databricks/Snowflake, ideally on AWS. Strong Python experience, including deep knowledge of the Python data ecosystem, with hands-on expertise in Spark and Airflow. Hands-on experience in all phases of data modelling, from conceptualization to database optimization, supported by advanced SQL skills. Hands-on experience with implementing …
Snowflake. Understanding of cloud platform infrastructure and its impact on data architecture. Data Technology Skills: A solid understanding of big data technologies such as Apache Spark, and knowledge of Hadoop ecosystems. Knowledge of programming languages such as Python, R, or Java is beneficial. Exposure to ETL/ELT …
London, England, United Kingdom Hybrid / WFH Options
Made Tech Limited
and able to guide how one could deploy infrastructure into different environments. Knowledge of handling and transforming various data types (JSON, CSV, etc.) with Apache Spark, Databricks or Hadoop. Good understanding of possible architectures involved in modern data system design (Data Warehouse, Data Lakes, Data Meshes). Ability …
London, England, United Kingdom Hybrid / WFH Options
Aker Systems Limited
to solve complex data challenges. Proven experience leading data engineering projects or teams. Expertise in designing and building data pipelines using frameworks such as Apache Spark, Kafka, Glue, or similar. Solid understanding of data modelling concepts and experience working with both structured and semi-structured data. Strong knowledge …
Falls Church, Virginia, United States Hybrid / WFH Options
Epsilon Inc
team collaboration One or more of the following certifications are desired: AWS Certified Developer, Databricks, Agile/Scrum, Python Programmer. Preferred Qualifications: Familiarity with Apache Spark or comparable distributed data processing frameworks, preferably for large-scale data transformations and analytics. Working knowledge of data governance platforms (e.g., Collibra …
Falls Church, Virginia, United States Hybrid / WFH Options
Epsilon Inc
optimized solutions. One or more of the following certifications are desired: AWS Certified Developer, Databricks, Agile/Scrum, Python Programmer. Preferred Qualifications: Familiarity with Apache Spark or comparable distributed data processing frameworks, preferably for large-scale data transformations and analytics. Working knowledge of data governance platforms (e.g., Collibra …
London, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
of data governance, data modeling, and business intelligence best practices. Knowledge of Agile, DevOps, Git, APIs, microservices, and data pipeline development. Familiarity with Spark, Kafka, or Snowflake is a plus. Desirable Certifications: Microsoft Certified: Fabric Analytics Engineer Associate. Why Join Us? Competitive salary up to £70,000 per …
for at least the past 5 years and being a UK national or dual UK national. Experience in prominent languages such as Python, Scala, Spark, SQL. Experience working with any database technologies from an application programming perspective - Oracle, MySQL, MongoDB, etc. Experience with the design, build and maintenance …
London, England, United Kingdom Hybrid / WFH Options
Digthisdeal.com
CloudFormation). Familiarity with containerization and orchestration tools like Docker and Kubernetes. Experience in data engineering and working with big data technologies (e.g., Kafka, Spark). Familiarity with emerging technologies in AI, data science, and automation. Required Skills: Strong knowledge of TypeScript, React, and Node.js for full-stack development. …
Belfast, Northern Ireland, United Kingdom Hybrid / WFH Options
PA Consulting
have: Experience in the design and deployment of production data pipelines from ingestion to consumption within a big data architecture, using Java, Python, Scala, Spark, SQL. Perform tasks such as writing scripts, extracting data using APIs, writing SQL queries etc. Work closely with other engineering teams to integrate data …
Edinburgh, Scotland, United Kingdom Hybrid / WFH Options
BlackRock, Inc
includes: DevOps automation, idempotent deployment testing, and continuous delivery pipelines Networking and security protocols, load balancers, API Gateways ETL tooling and workflow engines (e.g., Spark, Airflow, Dagster, Flyte) Accelerated compute libraries and hardware (e.g., PyTorch, NVIDIA GPUs) Data modeling, and strategies for cleaning and validating data at scale Performance …