Greater London, England, United Kingdom Hybrid / WFH Options
Validis
Proven ability to leverage CI/CD tools to streamline data pipeline development and deployment. Experience designing and implementing ETL pipelines using tools like Apache Airflow, Luigi, Spark, or similar frameworks (familiarity is a plus). Understanding of data warehousing concepts and data modelling techniques. Experience with SQL and …
delivering moderate-to-complex data flows as part of a development team in collaboration with others. You’ll be confident using technologies such as: Apache Kafka, Apache NiFi, SAS DI Studio, or other data integration platforms. You can implement, deliver, and translate several data models, including unstructured data … and recognised standards to build solutions using various traditional or big data languages such as: SQL, PL/SQL, SAS Macro Language, Python, Scala, Apache Spark, Java, JavaScript, etc., using various tools including SAS, Hue (Hive/Impala), and Kibana (Elasticsearch). Knowledge of data management on Cloud platforms …
Manchester Area, United Kingdom Hybrid / WFH Options
The Green Recruitment Company
Support colleagues in relation to the delivery of ESG built environment solutions. Exhibit thorough expertise in IES-VE, including modules like VE Compliance, Radiance, Apache HVAC, Apache Systems, MacroFlow, MicroFlow, and Vista-Pro, and the ability to extract sustainability outputs (e.g., for BREEAM, LEED) from IES-VE. Your …
work is largely down to you. It can be entirely Back End. Otherwise, the stack includes Redux Saga, Ag-Grid, Node, TypeScript, gRPC, protobuf, Apache Ignite, Apache Airflow and AWS. As the application suite grows and advances in complexity, there is a decent amount of interaction with the …
Hertfordshire, South East, United Kingdom Hybrid / WFH Options
Hays
Directory; Novell Netware systems, Zenworks, e-Directory; Linux (SUSE) systems; SUN Solaris Unix systems, NIS+; Lotus Domino; MS Exchange and mail services; Web servers: Apache, Tomcat; Experience across the Oracle suite; Data warehouse infrastructure, data archive solutions; Storage area networks, volume management; Anti-virus software; Technical architectures and development …
Modelling. Experience with at least one or more of these programming languages: Python, Scala/Java. Experience with distributed data and computing tools, mainly Apache Spark & Kafka. Understanding of critical path approaches, how to iterate to build value, engaging with stakeholders actively at all stages. Able to deal with …
Flask, Tornado or Django, Docker. Experience working with ETL pipelines is desirable, e.g. Luigi, Airflow or Argo. Experience with big data technologies, such as Apache Spark, Hadoop, Kafka, etc. Data acquisition and development of data sets and improving data quality. Preparing data for predictive and prescriptive modelling. Hands-on …
Greater London, England, United Kingdom Hybrid / WFH Options
Validis
to leverage CI/CD tools to streamline data pipeline development and deployment. Proven expertise in designing and implementing ETL pipelines using tools like Apache Airflow, Luigi, Spark, or similar frameworks. Strong understanding of data warehousing concepts and data modelling techniques. Experience with SQL and proficiency in writing complex …
London, England, United Kingdom Hybrid / WFH Options
McGregor Boyall
Work with Hadoop, Spark, and other platforms for large-scale data processing. Real-Time Data Streaming: Develop and manage pipelines using CDC, Kafka, and Apache Spark. Database Management: Handle SQL databases like Oracle, MySQL, or PostgreSQL. Data Governance: Ensure data quality, security, and compliance with best practices. Ideal Candidate …
Manchester, England, United Kingdom Hybrid / WFH Options
Vermelo RPO
sees challenges as development opportunities, not problems. Desirable Skills: Experience of SAS Viya; Experience of SAS Visual Analytics; Experience of SQL Server; Experience with Apache Airflow; Experience using MS DevOps for workflow and CI/CD pipelines. Educated to degree standard …
London, England, United Kingdom Hybrid / WFH Options
Pioneer Search
Senior Scala Developer - Apache Spark - Urgent Requirement. Contract Length: 6 Months. IR35 status: Inside. Location: London - Hybrid working. A Senior Scala Developer with experience in Apache Spark is needed for a British consultancy organisation. You will be an integral member of the team providing technical expertise to the …
Chicago, Illinois, United States Hybrid / WFH Options
Request Technology - Robyn Honquest
required) Experience with distributed message brokers using Kafka (required). Experience with high-speed distributed computing frameworks such as AWS EMR, Hadoop, HDFS, S3, MapReduce, Apache Spark, Apache Hive, Kafka Streams, Apache Flink, etc. (required). Experience working with various types of databases, like Relational, NoSQL, Object-based, Graph …
Skills & Experience: At least 10 years' experience working with JavaScript or Python/Java. Previous experience deploying software into the cloud: EKS, Docker, Kubernetes. Apache Spark or NiFi. Microservice architecture experience. Experience with AI/ML systems.
Greater London, England, United Kingdom Hybrid / WFH Options
Understanding Recruitment
use Java (for a very small amount of scripting work). Have public cloud experience with AWS or other cloud providers. Have an understanding of Apache products such as Kafka and Flink. Good knowledge of development using CI/CD. Bonus points if you have knowledge of: Web products, Financial markets …
Luton, England, United Kingdom Hybrid / WFH Options
Ventula Consulting
models and following best practices. The ability to develop pipelines using SageMaker, MLflow or similar frameworks. Strong experience with data programming frameworks such as Apache Spark. Understanding of common Data Science and Machine Learning models, libraries and frameworks. This role provides a competitive salary plus an excellent benefits package. In …
experience in data engineering. Experienced in building ETL data pipelines. Relational database experience w/ PostgreSQL. Understanding of tech within our stack: AWS/Apache Beam/Kafka. Experience with Object-Oriented Programming. A desire to work in the commodities/trading sector. Permanent/Full-Time Employment. Hybrid …
Proven experience in MLOps and deploying machine learning models on Kubernetes. Proficiency in cloud technologies: AWS, GCP, Azure. Experience with data orchestration tools (e.g., Apache Airflow). Familiarity with Terraform for CI/CD and infrastructure as code. Strong programming skills in software development. A cloud-agnostic mindset, with …
in Computer Science, Engineering (or another related STEM subject). 5+ years' experience in data engineering; 2+ years in a leadership role. Experience working with Apache Spark, Azure Data Factory and other data pipeline tools. Strong programming skills. Impeccable communication skills. Precise attention to detail. Pioneering attitude. If you are …
testing, and maintenance of data pipelines and data storage systems on Google Cloud Platform (GCP). You will be working with technologies such as Apache Airflow, BigQuery, Python, and SQL to transform and load large data sets, ensuring high data quality and accessibility for business intelligence and analytics purposes.
teams to support the orchestration of our ETL pipelines using Airflow, and manage our tech stack including Python, Next.js, Airflow, PostgreSQL, MongoDB, Kafka and Apache Iceberg. Optimize infrastructure costs and develop strategies for efficient resource utilization. Provide critical support by monitoring services and quickly resolving production issues. Contribute to …
Glasgow, Scotland, United Kingdom Hybrid / WFH Options
MBN Solutions
technologies, techniques, and architectures to build and maintain robust data pipelines. Technological Proficiency: Experience with technologies such as Azure Data Factory, Pentaho Data Integrator, Apache Hop, etc. ETL/ELT Practices: Strong understanding of modern ETL/ELT practices, frameworks, tooling, and execution environments. Data Delivery: Knowledge of data …
environment. Familiarity with cloud-native computing concepts and experience with hybrid or private cloud platforms is advantageous. Technical expertise in a Microsoft, Red Hat, and Apache data and software engineering environment. A team-oriented individual with a dedication to engineering excellence and the ability to lead and inspire a team …