Manchester, North West, United Kingdom Hybrid / WFH Options
N Brown Group
and practices and tools like Jira and Confluence. What technical skills you will have Experience with general Cloud products (Cloud SQL, BigQuery, Redshift, Snowflake, Apache Beam, Spark) or similar products. Experience with open-source data-stack tools such as Airflow, Airbyte, DBT, Kafka etc. Awareness of data visualisation tools…
Azure Data Lake, Azure Databricks or GCP Cloud Dataproc. Familiarity with big data technologies and distributed computing frameworks, such as Hadoop, Spark, or Apache Flink. Experience scaling an “API-Ecosystem”, designing, and implementing “API-First” integration patterns. Experience working with authentication and authorisation protocols/patterns. Other Information…
to production, providing subject matter expertise on the .Net stack and contributing to technical design discussions. You'll use a range of technology including Apache Flink with Java for large scale data processing and will be able to assess and recommend new and emerging technologies, using the best tool…
Qualifications Meaningful experience in the following technologies: Scala, SQL Experience and interest in Cloud platforms such as Azure Data Factory. Experience in Distributed Processing using Apache Spark Ability to debug using tools like Ganglia UI, expertise in optimising Spark jobs The ability to work across structured, semi-structured, and unstructured data…
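The "optimising Spark jobs" requirement above often comes down to pre-aggregating data before the shuffle (what `reduceByKey` does versus `groupByKey`). A minimal plain-Python sketch of that combiner idea, under the assumption of a simple sum aggregation; this is illustrative only, not actual Spark API code:

```python
from collections import defaultdict

def local_reduce(partition):
    """Pre-aggregate (key, value) pairs within one partition,
    mimicking the per-partition combiner Spark runs before the shuffle."""
    acc = defaultdict(int)
    for key, value in partition:
        acc[key] += value
    return list(acc.items())

def shuffle_and_merge(partitions):
    """Merge the already-reduced partition outputs (the post-shuffle step)."""
    merged = defaultdict(int)
    for partition in partitions:
        for key, value in local_reduce(partition):
            merged[key] += value
    return dict(merged)

# Toy data split across two "partitions" (names and values are made up).
partitions = [
    [("a", 1), ("b", 2), ("a", 3)],   # partition 0
    [("b", 4), ("a", 5)],             # partition 1
]
totals = shuffle_and_merge(partitions)
```

Because only one pre-reduced pair per key leaves each partition, far less data crosses the network than if every raw record were shuffled, which is the usual first win when tuning a Spark job.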
Databricks • Must Have Hands-on experience on at least 2 Hyperscalers (GCP/AWS/Azure platforms) and specifically in Big Data processing services (Apache Spark, Beam or equivalent). • In-depth knowledge of key technologies like Big Query/Redshift/Synapse/Pub Sub/Kinesis/…
Snowflake, or Databricks. Hands-on experience on at least 2 Hyperscalers (GCP/AWS/Azure platforms) and specifically in Big Data processing services (Apache Spark, Beam, or equivalent). In-depth knowledge of key technologies like Big Query, Redshift, Synapse, Pub Sub, Kinesis, MQ, Event Hubs, Kafka, Dataflow…
multithreading, database access, performance tuning and design patterns. Experience in a diverse set of technologies including SQL, Spring, Spring Boot, Hibernate, JPA, JUnit, Mockito, Apache Spark, Storm and related technologies. Practical experience in developing software products/solutions that are deployed on cloud (as PaaS, SaaS) using a client…
with AWS services such as S3 and Glue for data storage and processing. Familiarity with Kafka for real-time data streaming. Experience with PySpark, Apache Beam, and Airflow for building data pipelines. Proficiency in SQL and experience with databases including SQL Server, Redshift, and MongoDB. Experience with BI…
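Pipelines of the PySpark/Beam/Airflow kind listed above are, at heart, composed extract → transform → load stages. A hedged pure-Python sketch of that composition (all names and the toy data are illustrative; no framework APIs are used):

```python
def extract(rows):
    # Parse raw CSV-style rows into dicts (stand-in for a source read).
    return [dict(zip(("user", "amount"), r.split(","))) for r in rows]

def transform(records):
    # Drop blank amounts and cast to float, the kind of step a
    # Beam ParDo or a PySpark map/filter would perform.
    return [{**r, "amount": float(r["amount"])}
            for r in records if r["amount"].strip()]

def load(records):
    # Aggregate per user (stand-in for a sink write).
    totals = {}
    for r in records:
        totals[r["user"]] = totals.get(r["user"], 0.0) + r["amount"]
    return totals

raw = ["alice,10.5", "bob,3", "alice,2.5", "carol,"]
result = load(transform(extract(raw)))
```

Keeping each stage a pure function over records is what makes the same logic portable between a local run, a PySpark job, and a Beam pipeline.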
is flexibility on this. Requirements: Proven experience implementing Spark and Big Data solutions Database Management experience using SQL and NoSQL Proficiency with tools like Apache Kafka, Spark, Airflow Strong level of programming experience using Python Expert knowledge of AWS services like API Gateway, Lambda, Redshift, Glue, EMR, etc. Strong…
Cambridgeshire, United Kingdom Hybrid / WFH Options
Jefferson Frank
to facilitate smooth deployments to production environments. Desired Skills and Experience: * Expertise in DBT (Data Build Tool) for transforming and modelling data. * Familiarity with Apache Airflow for orchestrating complex data workflows. * Proven experience as a Data Engineer, with a strong portfolio of successful data pipeline projects. Proficiency in Python…
Luton, England, United Kingdom Hybrid / WFH Options
Ventula Consulting
experience as a Data Engineer using AWS or Azure. Hands-on experience with database engines such as SQL. Prior experience orchestrating data pipelines with Apache Airflow or similar tools. Practical experience building data applications in Python, Scala, or SQL. Understanding of Data Warehousing/Dimensional Modelling concepts. Strong knowledge…
testing, and maintenance of data pipelines and data storage systems on Google Cloud Platform (GCP). You will be working with technologies such as Apache Airflow, BigQuery, Python, and SQL to transform and load large data sets, ensuring high data quality and accessibility for business intelligence and analytics purposes.
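Orchestrators such as Apache Airflow model a pipeline as a DAG and run each task only after its dependencies finish. A small plain-Python illustration of that scheduling idea using Kahn's topological sort (the task names are made up and this is not the Airflow API):

```python
def topological_order(deps):
    """Return tasks ordered so every dependency runs first (Kahn's algorithm).
    `deps` maps task -> set of tasks it depends on."""
    deps = {t: set(d) for t, d in deps.items()}  # copy so we can mutate
    order = []
    ready = sorted(t for t, d in deps.items() if not d)  # no dependencies
    while ready:
        task = ready.pop(0)
        order.append(task)
        for t, d in deps.items():
            if task in d:
                d.remove(task)
                if not d and t not in order and t not in ready:
                    ready.append(t)  # all dependencies satisfied
    return order

# A toy ELT graph like one an Airflow DAG might express (names hypothetical):
dag = {
    "extract": set(),
    "load_bigquery": {"extract"},
    "transform_sql": {"load_bigquery"},
    "quality_check": {"transform_sql"},
}
run_order = topological_order(dag)
```

Airflow adds retries, scheduling, and parallel execution on top, but the dependency-ordering core is exactly this.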
Greater London, England, United Kingdom Hybrid / WFH Options
Understanding Recruitment
use Java (for a very small amount of scripting work) Have public cloud experience with AWS or other cloud providers Have an understanding of Apache products such as Kafka and Flink Good knowledge of development using CI/CD Bonus points if you have knowledge of: Web products Financial markets…
Exposure to TCP/IP networking and familiarity with the FIX messaging protocol is a plus. Experience with financial and exchange data, SQL, Pandas, Apache Airflow, JavaScript, HTML, CSS, Vue.js/Vuetify is beneficial. Additional Benefits: Competitive salary and performance-based bonuses. Comprehensive benefits package including health insurance, retirement…
London, England, United Kingdom Hybrid / WFH Options
Austin Fraser
a plus: Cutting-Edge Tech: Experience with containerisation, Kubernetes, and observability platforms. Workflow Wizardry: Familiarity with data orchestration tools like Airflow and ETL with Apache Beam. Data Visionary: Knowledge of DataVault (DV2) and data management concepts. Location: Our opportunities are available in London Victoria and Bracknell. Choose the work…
City Of Peterborough, England, United Kingdom Hybrid / WFH Options
The ONE Group Ltd
CI/CD. Experienced in optimising data for analytical and dashboarding tools like AWS QuickSight or Power BI. Familiarity with orchestration tools such as Apache Airflow for workflow scheduling and monitoring. Previous team lead experience in cloud-based data and analytics projects, preferably with AWS services. Prior involvement in…
engineers of varying levels of experience. Flexibility and willingness to adapt to new software and techniques. Nice to Have Experience working with projects in Apache Spark, Databricks or similar. Expert cloud platform knowledge, e.g. Azure What will be your key responsibilities? A technical expert and leader on the Petcare…
LIN Buses Serial Buses (RS485/RS232 etc.) SPI/I2C Python Go XML JSON HTML CSS Web backend servers (Angular, Django, NodeJS, React, Apache or similar) Web Sockets IP video and video routing Familiarity with Systems serving Real Time Information via Web Sockets Use of DDS and interfacing…