work is largely down to you. It can be entirely Back End. Otherwise, the stack includes Redux Saga, Ag-Grid, Node, TypeScript, gRPC, protobuf, Apache Ignite, Apache Airflow and AWS. As the application suite grows and advances in complexity, there is a decent amount of interaction with the more »
there is little work to do here. Experience in data-intensive applications is desirable. Other technology in the stack includes Node, gRPC, protobuf, Apache Ignite, Apache Airflow and AWS. They have a hybrid-working setup that requires the team to be in the office more »
development and deployment of large-scale data streaming pipelines in GCP. Work on data streaming POCs. Experience required: expertise in Python and Dataflow/Apache Beam; experience in handling streaming data; strong experience in database replication using message-based CDC; experience in using Kafka implementations in a secured cloud more »
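For context on the Dataflow/Apache Beam requirement above, here is a minimal sketch of a streaming Beam pipeline aimed at GCP Dataflow. It assumes Pub/Sub as the streaming source and an existing BigQuery table as the sink (the listing mentions Kafka-based CDC; beam.io.kafka.ReadFromKafka would slot in analogously); the project, subscription, bucket and table names are hypothetical placeholders, not details from the listing.

```python
# Hedged sketch only: a streaming Apache Beam pipeline for GCP Dataflow.
# Project, subscription, bucket and table names below are hypothetical.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions


def run():
    options = PipelineOptions(
        runner="DataflowRunner",
        project="my-gcp-project",
        region="europe-west2",
        temp_location="gs://my-bucket/tmp",
    )
    options.view_as(StandardOptions).streaming = True

    with beam.Pipeline(options=options) as p:
        (
            p
            # Read raw change events published to a Pub/Sub subscription.
            | "ReadEvents" >> beam.io.ReadFromPubSub(
                subscription="projects/my-gcp-project/subscriptions/cdc-events")
            # Decode each message from JSON bytes into a Python dict.
            | "Decode" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            # Append rows to an existing BigQuery table via streaming inserts.
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-gcp-project:analytics.cdc_changes",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )


if __name__ == "__main__":
    run()
```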
Nottingham, Nottinghamshire, East Midlands, United Kingdom
Microlise
computing concepts and experience working with hybrid or private cloud platforms is a plus. Demonstrable technical experience working with a Microsoft, Red Hat, and Apache data and software engineering environment. A team-oriented individual with a passion for engineered excellence and the ability to lead and motivate a team more »
Hertfordshire, South East, United Kingdom Hybrid / WFH Options
Hays
Directory; Novell NetWare systems, ZENworks, eDirectory; Linux (SUSE) systems; Sun Solaris Unix systems, NIS+; Lotus Domino; MS Exchange and mail services; web servers, Apache, Tomcat; experience across the Oracle suite; data warehouse infrastructure, data archive solutions; storage area networks, volume management; anti-virus software; technical architectures and development more »
Modelling. Experience with one or more of these programming languages: Python, Scala/Java. Experience with distributed data and computing tools, mainly Apache Spark & Kafka. Understanding of critical path approaches, how to iterate to build value, engaging with stakeholders actively at all stages. Able to deal with more »
Flask, Tornado or Django, Docker. Experience working with ETL pipelines is desirable, e.g. Luigi, Airflow or Argo. Experience with big data technologies, such as Apache Spark, Hadoop, Kafka, etc. Data acquisition and development of data sets and improving data quality. Preparing data for predictive and prescriptive modelling. Hands on more »
Greater London, England, United Kingdom Hybrid / WFH Options
Validis
to leverage CI/CD tools to streamline data pipeline development and deployment. Proven expertise in designing and implementing ETL pipelines using tools like Apache Airflow, Luigi, Spark, or similar frameworks. Strong understanding of data warehousing concepts and data modelling techniques. Experience with SQL and proficiency in writing complex more »
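To illustrate the Apache Airflow/ETL requirement in the listing above, here is a minimal sketch of a daily ETL DAG in Airflow 2.x; the dag_id, task names and the extract/transform/load callables are hypothetical placeholders rather than anything from the listing.

```python
# Hedged sketch only: a minimal daily ETL DAG in Apache Airflow 2.x.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    """Pull raw records from a source system (placeholder)."""


def transform():
    """Clean and reshape the extracted records (placeholder)."""


def load():
    """Write the transformed records to the warehouse (placeholder)."""


with DAG(
    dag_id="daily_etl",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Run the three steps strictly in sequence.
    extract_task >> transform_task >> load_task
```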
London, England, United Kingdom Hybrid / WFH Options
McGregor Boyall
Work with Hadoop, Spark, and other platforms for large-scale data processing. Real-Time Data Streaming: Develop and manage pipelines using CDC, Kafka, and Apache Spark. Database Management: Handle SQL databases like Oracle, MySQL, or PostgreSQL. Data Governance: Ensure data quality, security, and compliance with best practices. Ideal Candidate more »
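As a sketch of the real-time streaming stack named above (CDC events flowing through Kafka into Spark), here is a minimal PySpark Structured Streaming job that lands events as Parquet. It assumes the spark-sql-kafka connector is on the classpath, and the broker address, topic name and filesystem paths are hypothetical placeholders.

```python
# Hedged sketch only: consume CDC events from Kafka with Spark Structured Streaming
# and land them as Parquet. Broker, topic and paths are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("cdc-stream").getOrCreate()

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "orders.cdc")
    .load()
    # Kafka delivers key/value as binary; cast to strings for downstream parsing.
    .select(col("key").cast("string"), col("value").cast("string"))
)

query = (
    events.writeStream
    .format("parquet")
    .option("path", "/data/lake/orders_cdc")
    .option("checkpointLocation", "/data/checkpoints/orders_cdc")
    .start()
)
query.awaitTermination()
```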
business context. Commercially minded, thinking about ways to increase revenue & profitability. Proficiency in data manipulation tools (Python, Pandas, Spark, SQL), data visualization tools (Apache Superset, Tableau, Power BI, ggplot2), and MS Excel. A grasp of pricing strategies, market dynamics, and consumer behaviour in the online space is a plus. more »
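As a small illustration of the data-manipulation side of the role above (the commercial framing only, not any specific dataset), here is a pandas sketch that aggregates revenue by customer segment; the CSV path and column names are hypothetical.

```python
# Hedged sketch only: simple revenue aggregation with pandas.
# File name and column names are hypothetical placeholders.
import pandas as pd

orders = pd.read_csv("orders.csv")  # assumed columns: segment, units, unit_price

# Revenue per order line, then totals and average price by customer segment.
orders["revenue"] = orders["units"] * orders["unit_price"]
summary = (
    orders.groupby("segment", as_index=False)
          .agg(total_revenue=("revenue", "sum"),
               avg_unit_price=("unit_price", "mean"))
          .sort_values("total_revenue", ascending=False)
)
print(summary)
```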
Manchester Area, United Kingdom Hybrid / WFH Options
Adria Solutions Ltd
data tasks. Knowledge of CI/CD approaches for Data Platforms using Bitbucket and Bitbucket Pipelines. Knowledge of AWS data lake approaches using Athena & Apache Iceberg tables. Exposure to visualisation development using Power BI. Knowledge of MS SQL Server, SSIS, Visual Studio, and SSDT projects. Experience in a relevant more »
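For the AWS data-lake point above (Athena over Apache Iceberg tables), here is a minimal boto3 sketch that submits an Iceberg CREATE TABLE statement to Athena. The database, table and S3 locations are hypothetical placeholders, and it assumes an Athena workgroup on an engine version that supports Iceberg.

```python
# Hedged sketch only: create an Apache Iceberg table via Athena using boto3.
# Database, table, and S3 locations are hypothetical placeholders.
import boto3

athena = boto3.client("athena", region_name="eu-west-2")

ddl = """
CREATE TABLE analytics.events (
  id        bigint,
  event_ts  timestamp,
  payload   string
)
LOCATION 's3://my-data-lake/warehouse/events/'
TBLPROPERTIES ('table_type' = 'ICEBERG')
"""

response = athena.start_query_execution(
    QueryString=ddl,
    QueryExecutionContext={"Database": "analytics"},
    ResultConfiguration={"OutputLocation": "s3://my-athena-results/"},
)
print(response["QueryExecutionId"])
```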
• Troubleshooting network issues (tcpdump/Wireshark). • Scripting capabilities (SH/Bash/Python/Perl). • Configuration of common services (DNS/Apache/NGINX/Postfix/Squid/SSH/iptables). • Understanding of clustering services, enabling High Availability failover. • Experience with enterprise hardware and more »
a Senior Software Engineer for this role, you will collaborate with the founding team to contribute to the broader integration of our product with Apache Spark and keep the solution up to date and compatible with a variety of supported runtimes. Your contributions to our core solution will directly more »
Manchester, England, United Kingdom Hybrid / WFH Options
Vermelo RPO
sees challenges as development opportunities, not problems. Desirable Skills: Experience of SAS Viya; Experience of SAS Visual Analytics; Experience of SQL Server; Experience with Apache Airflow; Experience using MS DevOps for workflow and CI/CD pipelines. Educated to degree standard more »
Data Engineer to join our team and build robust data pipelines. Responsibilities: Build and maintain scalable data pipelines. Maintain the pipelines through Databricks and Apache Beam. Seamless data integration and quality. Data storage solutions using GCP (other cloud experience will still be considered). Best practices in data management more »