pipeline solutions for the ingestion, transformation, and serving of data, as well as solutions for the orchestration of pipeline components (e.g. AWS Step Functions, Apache Airflow). Good understanding of data modelling, algorithms, and data transformation techniques to work with data platforms. Working knowledge of cloud development practices (AWS …
testing, and maintenance of data pipelines and data storage systems on Google Cloud Platform (GCP). You will be working with technologies such as Apache Airflow, BigQuery, Python, and SQL to transform and load large data sets, ensuring high data quality and accessibility for business intelligence and analytics purposes. …
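The listing above describes a transform-and-load workflow with a data-quality gate. As a minimal, dependency-free sketch of that pattern (all names are illustrative; a real pipeline would use the BigQuery client library inside an Airflow task):

```python
# Simplified stand-in for a transform-and-load step with a data-quality
# gate. Illustrative only: no GCP or Airflow dependencies; the "table"
# is an in-memory list rather than a BigQuery table.

def transform(rows):
    """Normalise raw records: cast amounts to float, tidy region codes."""
    return [
        {"id": r["id"], "amount": float(r["amount"]),
         "region": r["region"].strip().upper()}
        for r in rows
    ]

def quality_check(rows):
    """Reject the batch if any record lacks an id or has a negative amount."""
    return all(r["id"] is not None and r["amount"] >= 0 for r in rows)

def load(rows, table):
    """Stand-in for a load job: append the cleaned rows to the target."""
    table.extend(rows)
    return len(rows)

raw = [{"id": 1, "amount": "10.5", "region": " emea "},
       {"id": 2, "amount": "3", "region": "apac"}]
table = []
clean = transform(raw)
loaded = load(clean, table) if quality_check(clean) else 0
```

The quality gate runs before the load so a bad batch never reaches the serving table, which is the usual ordering in orchestrated pipelines.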
London, England, United Kingdom Hybrid / WFH Options
Aventum Group
Profisee), Snowflake Data Integration, Azure Service Bus, Delta Lake, BigQuery, Azure DevOps, Azure Monitor, Azure Data Factory, SQL Server, Azure Data Lake Storage, Azure App Service, Apache Airflow, Apache Iceberg, Apache Spark, Apache Hudi, Apache Kafka, Power BI, Azure ML is a plus. Experience with Azure …
Microservice frameworks • working knowledge of client-side web technologies (React, JavaScript) • experience with messaging frameworks (like Tibco, Kafka) • experience with web servers running Tomcat, Apache • exposure to Azure Cloud services (like Azure AKS, CI/CD) • knowledge of open-source market edge technologies like cache frameworks, monitoring tools etc. …
in Computer Science, Engineering (or another related STEM subject). 5+ years' experience in data engineering, 2+ years in a leadership role. Experience working with Apache Spark, Azure Data Factory and other data pipeline tools. Strong programming skills. Impeccable communication skills. Precise attention to detail. Pioneering attitude. If you are …
development (ideally AWS). Knowledge and ideally hands-on experience with data streaming, event-based architectures and Kafka. Strong communication and interpersonal skills. Experience with Apache Spark or Apache Flink would be ideal, but not essential. Please note, this role is unable to provide sponsorship. If this role sounds …
an easy and developer-friendly platform to create, manage and monitor their own Kafka infrastructure. Help drive our stream processing platform, whether it be Apache Flink, Spark, or another relevant technology. Collaborate with cross-functional teams to design, develop, and implement scalable and reliable solutions. Troubleshoot and resolve complex …
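The stream-processing work described above typically centres on windowed aggregation over a keyed event stream. A toy, single-process sketch of a tumbling-window count (what Flink or Spark Structured Streaming would do distributed and fault-tolerantly; function and field names are illustrative):

```python
# Toy tumbling-window aggregation: count events per key within fixed,
# non-overlapping windows. Single-process illustration only.
from collections import defaultdict

def tumbling_window_counts(events, window_seconds):
    """events: iterable of (timestamp, key) pairs.

    Returns {(window_start, key): count} for fixed windows of
    window_seconds. Window start is the timestamp rounded down to
    the nearest window boundary.
    """
    counts = defaultdict(int)
    for ts, key in events:
        window_start = ts - (ts % window_seconds)
        counts[(window_start, key)] += 1
    return dict(counts)

events = [(0, "click"), (3, "click"), (7, "view"), (12, "click")]
result = tumbling_window_counts(events, 10)
```

A real Flink job would express the same thing with a keyed stream and a `TumblingEventTimeWindows` assigner, plus watermarks to handle late-arriving events, which this sketch omits.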
Lead Data Engineer (Director) - Individual contributor - Azure, Data Factory, Databricks, Apache Spark - London Based I am hiring for a Lead Data Engineer for a crucial role within one of my Investment Bank clients in London. This role is at Director level as they require a very senior candidate to … Leading data engineering practices Support current applications Introduce AI practices to the team/project Communicate key successes with stakeholders Key Skills: Azure Databricks, Apache Spark, Data Science, AI, ML. Certifications or continued upskilling/contribution to blog posts within Data & AI beneficial but not essential. This is a full … without sponsorship. If you are interested, please apply or email me directly - aaron.dhammi@nicollcurtin.com Lead Data Engineer (Director) - Individual contributor - Azure, Data Factory, Databricks, Apache Spark - London Based …
engineers of varying levels of experience. Flexibility and willingness to adapt to new software and techniques. Nice to have: Experience working with projects in Apache Spark, Databricks or similar. Expert cloud platform knowledge, e.g. Azure. What will be your key responsibilities? A technical expert and leader on the Petcare …
data engineering or a similar role. > Proficiency in programming languages such as Python, Java, or Scala. > Strong experience with data processing frameworks such as Apache Spark, Apache Flink, or Hadoop. > Hands-on experience with cloud platforms such as AWS, Google Cloud, or Azure. > Experience with data warehousing technologies …
managers, to understand data requirements and deliver high-quality solutions as well as architecting data ingestion, transformation, and storage processes using tools such as Apache Spark, Azure Data Factory, and other similar technologies. Other core duties include optimizing data pipeline performance, ensuring data accuracy, reliability, and timely delivery. Requirements … Services Certifications in relevant technologies, such as Azure Data Engineer or Databricks Certified Developer. Experience with real-time data processing and streaming technologies like Apache Kafka or Azure Event Hubs. Knowledge of data visualization tools, such as Power BI or Tableau. Contributions to open-source projects or active participation …
teams to support the orchestration of our ETL pipelines using Airflow and manage our tech stack including Python, Next.js, Airflow, PostgreSQL, MongoDB, Kafka and Apache Iceberg. Optimize infrastructure costs and develop strategies for efficient resource utilization. Provide critical support by monitoring services and quickly resolving production issues. Contribute to …
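At its core, orchestrating ETL pipelines with Airflow means scheduling tasks in dependency order over a DAG. A minimal sketch of that scheduling idea, with no Airflow dependency (the task names are illustrative, not from the listing):

```python
# Sketch of DAG scheduling as done by an orchestrator like Airflow:
# given each task's upstream dependencies, produce a valid run order.

def topological_order(deps):
    """deps: {task: set of upstream tasks that must finish first}.

    Returns a list of tasks in a valid execution order, raising on cycles.
    """
    order, done = [], set()
    pending = dict(deps)
    while pending:
        # A task is ready once all of its upstream tasks have completed.
        ready = [t for t, upstream in pending.items() if upstream <= done]
        if not ready:
            raise ValueError("cycle detected in task graph")
        for t in sorted(ready):  # sorted for deterministic output
            order.append(t)
            done.add(t)
            del pending[t]
    return order

dag = {"extract": set(), "transform": {"extract"},
       "load": {"transform"}, "report": {"load"}}
order = topological_order(dag)
```

In Airflow the same dependencies would be declared with `extract >> transform >> load >> report`, and the scheduler handles readiness, retries, and backfills on top of this ordering.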
experience in data engineering. Experienced in building ETL data pipelines. Relational database experience w/PostgreSQL. Understanding of tech within our stack: AWS/Apache Beam/Kafka. Experience with Object-Oriented Programming. A desire to work in the commodities/trading sector. Permanent/Full-Time Employment. Hybrid …
there is little work to do here. Experience in data-intensive applications is desirable. Other technology in the stack includes Node, gRPC, protobuf, Apache Ignite, Apache Airflow and AWS. They have a hybrid-working set up that requires the team to be in the office …
work is largely down to you. It can be entirely Back End. Otherwise, the stack includes Redux Saga, Ag-Grid, Node, TypeScript, gRPC, protobuf, Apache Ignite, Apache Airflow and AWS. As the application suite grows and advances in complexity, there is a decent amount of interaction with the …
City of London, London, United Kingdom Hybrid / WFH Options
Client Server
to production, providing subject matter expertise on the .Net stack and contributing to technical design discussions. You'll use a range of technology including Apache Flink with Java for large scale data processing and will be able to assess and recommend new and emerging technologies, using the best tool …/CD and Infrastructure as Code (Terraform). You're familiar with other languages such as Java and are open to learning new things e.g. Apache Flink. You've worked on systems that require high throughput and low latency. You enjoy problem solving and have great communication and collaboration skills …
Recent and proven experience of using Red Hat Linux (or other Unix flavours) including scripting in a commercial environment. Experience supporting applications (Java, .NET, Apache, IIS). Desirable: Knowledge of Microsoft Windows Server; experience working within the financial industry; experience working within an ITIL framework; experience working with …
up and learn new technologies and frameworks. Nice to have: Knowledge of databases, SQL. Familiarity with Boost ASIO. Familiarity with data serialization formats such as Apache Arrow/Parquet, Google Protocol Buffers, Flatbuffers. Experience with gRPC, HTTP/REST and WebSocket protocols. Experience with Google Cloud/AWS and/…