London, England, United Kingdom Hybrid / WFH Options
McGregor Boyall
Work with Hadoop, Spark, and other platforms for large-scale data processing.
Real-Time Data Streaming: Develop and manage pipelines using CDC, Kafka, and Apache Spark.
Database Management: Handle SQL databases like Oracle, MySQL, or PostgreSQL.
Data Governance: Ensure data quality, security, and compliance with best practices.
Ideal Candidate
an easy and developer-friendly platform to create, manage and monitor their own Kafka infrastructure
Help drive our stream processing platform, whether it be Apache Flink, Spark, or another relevant technology
Collaborate with cross-functional teams to design, develop, and implement scalable and reliable solutions
Troubleshoot and resolve complex
libraries. Knowledge of Azure or other cloud services and ability to implement solutions utilizing them.
Familiarity with databases (e.g., MySQL, MongoDB), web servers (e.g. Apache) and UI/UX design.
Building and scaling infrastructure services using Microsoft Azure
Experience of using core cloud application infrastructure services including identity platforms
South East London, England, United Kingdom Hybrid / WFH Options
Hunter Bond
s Degree in Computer Science, Engineering (or other related STEM subject)
5+ years' experience in data engineering
2+ years in a leadership role
Experience working with Apache Spark, Azure Data Factory and other data pipeline tools
Strong programming skills
Impeccable communication skills
Precise attention to detail
Pioneering attitude
If you are a Lead Data Engineer and
Agile software development and system architecture within the Telco OSS domain, with preferred experience in Network GIS (Hexagon, IQ Geo) and workflow tooling (Appian, Apache Airflow). Strong understanding of platform and product dynamics, including Platform Engineering and its relevance to OSS. Extensive background in DevOps practices, encompassing test
South East London, England, United Kingdom Hybrid / WFH Options
Modo Energy
teams to support the orchestration of our ETL pipelines using Airflow and manage our tech stack including Python, Next.js, Airflow, PostgreSQL, MongoDB, Kafka and Apache Iceberg.
Optimize infrastructure costs and develop strategies for efficient resource utilization.
Provide critical support by monitoring services and quickly resolving production issues.
Contribute to the development and
step functions and ECS services.
Strong understanding of AWS ecosystems like Lambdas, step functions and ECS services.
Experience with data stack technologies, such as Apache Iceberg & Spark.
Exposure to Apache Airflow, Prefect, Dagster, DBT.
Expertise in data analysis with exposure to data services (such as Glue, Lake Formation
Greater London, England, United Kingdom Hybrid / WFH Options
CommuniTech Recruitment Group
AWS ecosystems like Lambdas, step functions and ECS services.
Experience of Dremio is a nice-to-have.
Experience with data stack technologies, such as Apache Iceberg & Spark.
Exposure to Apache Airflow, Prefect, Dagster, DBT.
Expertise in data analysis with exposure to data services (such as Glue, Lake Formation
to develop unit test cases. Help in backlog grooming.
Key skills:
Extensive experience in developing big data pipelines in the cloud using big data technologies such as Apache Spark
Expertise in performing complex data transformations using Spark SQL queries
Experience in orchestrating data pipelines using Apache Airflow
Proficiency in Git-based
GoLang.
- Significant experience with Hadoop, Spark and other distributed processing platforms and frameworks.
- Experience working with open table/storage formats like Delta Lake, Apache Iceberg or Apache Hudi.
- Experience of developing and managing real-time data streaming pipelines using change data capture (CDC), Kafka and Apache
pipelines
Know your way around Unix-based operating systems
Experience working with any major cloud provider (AWS, GCP, Azure)
Fluency in English
Experience using Apache Airflow
Experience using Docker
Experience using Apache Spark
Benefits: Salary £40-50K per annum dependent on skills and experience
25 days annual
in cloud development (ideally AWS)
Knowledge and ideally hands-on experience with data streaming, event-based architectures and Kafka
Strong communication and interpersonal skills
Experience with Apache Spark or Apache Flink would be ideal, but not essential
Please note, this role is unable to provide sponsorship.
If this role sounds of interest
Job Description
Lead Data Engineer (Director) - Individual contributor - Azure, Data Factory, Databricks, Apache Spark - London Based
I am hiring for a Lead Data Engineer for a crucial role within one of my Investment Bank clients in London. This role is at Director level as they require a very senior candidate …
Responsibilities:
Leading data engineering practices
Support current applications
Introduce AI practices to the team/project
Communicate key successes with stakeholders
Key Skills:
Azure Databricks
Apache Spark
Data science, AI, ML
Certifications or continued upskilling/contribution to blog posts within Data & AI beneficial but not essential.
This is a full … UK without sponsorship, if you are interested please apply or email me directly - aaron.dhammi@nicollcurtin.com
Lead Data Engineer (Director) - Individual contributor - Azure, Data Factory, Databricks, Apache Spark - London Based
Recent and proven experience of using Red Hat Linux (or other Unix flavours) including scripting in a commercial environment
Experience supporting applications (Java, .NET, Apache, IIS)
Desirable:
Knowledge of Microsoft Windows Server
Experience working within the financial industry
Experience working within an ITIL framework
Experience working with
understanding of networking and IP packet structure
Experience working in a DevOps team designing, developing and supporting solutions
Experience in website development using Apache and PHP
Project People is acting as an Employment Business in relation to this vacancy.
complete before business approval
Oversee release management
Maintain technology & security standards
Stakeholder management
Required Skills:
5+ years of development experience
Familiarity with TypeScript, React, Apache NiFi, GraphQL, or Postgres is desirable.
Proficiency in Atlassian's Jira and Confluence is desirable.
Strong communication and time management skills
Due to the
independently or as part of a team and operate to tight deadlines
Sense of humour please
Advantageous But Not a Must:
Great knowledge of Apache, specifically mod_rewrite
Good working knowledge of Linux, including the command line
Server administration experience
Ecommerce experience
Smarty
Competitive Salary
Career Growth and Financial Stability
systems that incorporate various data backends, query languages and ORM frameworks.
Experience designing and building ETL pipelines built around libraries and frameworks like Pandas and Apache Spark.
Strong API design skills and a familiarity with building web applications.
A proponent of great testing, first-class observability and automating everything.
Familiarity with security principles
stakeholders
Willingness to pick up and learn new technologies and frameworks
Nice to have:
Knowledge of databases, SQL
Familiarity with Boost ASIO
Familiarity with data serialization formats such as Apache Arrow/Parquet, Google Protocol Buffers, FlatBuffers
Experience with gRPC, HTTP/REST and WebSocket protocols
Experience with Google Cloud/AWS and/or containerization
expertise in developing and optimising ETL pipelines.
Version Control: Experience with Git for code collaboration and change tracking.
Data Pipeline Tools: Proficiency with tools such as Apache Airflow.
Cloud Platforms: Familiarity with AWS, Azure, Snowflake, and GCP.
Visualisation: Tableau or Power BI
Delivery Tools: Familiarity with agile backlogs, code repositories, automated builds, testing, and releases.
making, making trade-offs explicit and understandable to others
REQUIREMENTS
7+ years' coding experience, including 3 years in a dedicated ML Engineering role
2+ years' experience with Apache Spark
Experience working with GB+ scale data
Experience with deployed ML services
Experience deploying multiple ML projects across different environments
Productionisation experience in at least one cloud infrastructure
engineers of varying levels of experience. Flexibility and willingness to adapt to new software and techniques.
Nice to Have
Experience working with projects in Apache Spark, Databricks or similar. Expert cloud platform knowledge, e.g. Azure
What will be your key responsibilities? A technical expert and leader on the Petcare
CMake
Proficiency in developing cross-platform SDKs for Windows, macOS, Linux, WebAssembly and Embedded Platforms
Knowledge of machine learning frameworks such as ONNX Runtime or Apache TVM
Experience deploying and optimising real-time embedded audio algorithms
Familiarity with audio codecs, audio formats and audio streaming protocols is preferred
Willingness to