junior members of the team and influencing them with your vision. Our tech stacks vary between products (such as OracleDB, MongoDB, Elasticsearch and Hadoop for data storage) and a mixture of commercial-off-the-shelf products and custom applications. We embrace a DevSecOps (Development, Security, and Operations) mindset …
relevant experience in building DW/BI systems · Demonstrated ability in data modelling, ETL development, and data warehousing. · Strong experience with Big Data technologies (Hadoop, Hive, HBase, Pig, Spark, etc.) · Expertise in a BI solution like Power BI · Hands-on experience in modelling databases (particularly NoSQL), working on indexes …
modern data engineering technology stack compatible with AWS. Experience with web scraping and other data ingestion methods and tools. Knowledge of distributed computing frameworks (Hadoop, Spark, Hive, Presto). Experience with data orchestration tools (Airflow, Orchestra, Azkaban). Expertise in cloud data warehousing and core data modelling concepts. Proficiency …
to plan work to maximize the team's productivity and effectiveness. • Deep understanding of the AI development lifecycle. • Proficiency in big data technologies like Hadoop, Spark, or similar frameworks. • Excellent skills in data visualization and interpretation. • Demonstrated history of successfully delivering high-quality, data-driven solutions, including deploying production …
Science or related field. 4+ years' experience as a hands-on Data Engineer. Proficiency in Python, Java, or Scala programming. Familiarity with big data frameworks (Spark, Hadoop, etc.) Nice to have: cloud migration experience with AWS, Azure or GCP. If this role has piqued your interest and you would like to …
best-of-breed Java toolsets - focused on microservices architectures, powerful front-end and back-end frameworks, RESTful services, and everything from NoSQL databases like MongoDB and big-data platforms like Hadoop, through high-performance data grids like Hazelcast, to multi-node relational systems. You will be working in a Scrum team with cross-functional skills in …
Months Location - Hybrid (2 days a week) JD: Experience of working with a Streaming & Batch technology stack – Confluent Kafka, MongoDB, StreamSets, IBM CDC, Hive, Hadoop, API, Informatica, Airflow, and other similar technologies. SME-level skills and experience of designing/architecting test automation solutions; ability to creatively problem-solve is …
London, England, United Kingdom Hybrid / WFH Options
Global Relay
JMeter or similar tools · Web services technology such as REST, JSON or Thrift · Testing web applications with Selenium WebDriver · Big data technology such as Hadoop, MongoDB, Kafka or SQL · Network principles and protocols such as HTTP, TLS and TCP · Continuous integration systems such as Jenkins or Bamboo · Continuous delivery …
for Data & Analytics Platforms Experience with relational databases like Oracle, NoSQL databases and/or Big Data technologies (e.g. Oracle, SQL Server, Postgres, Spark, Hadoop, other open source). Must have experience in Data Security Solutions (Identity and Access Management and Data Security Access Management). Must have experience of …
such as Python, C#, .NET and/or JavaScript is highly desirable. Experience with cloud platforms (e.g., Azure) and data technologies (e.g., SQL, NoSQL, Hadoop, Spark). PLEASE NOTE: You must have either UK citizenship or indefinite leave to remain in the UK. Due to the high volume of …
Manchester, England, United Kingdom Hybrid / WFH Options
Lorien
DynamoDB/etc.) Solid understanding of data governance principles and how to implement these across the business Knowledge of Big Data technology (Spark/Hadoop/etc.) Excellent communication skills across various levels of stakeholders Benefits: Salary available £120,000 Bonus scheme Enhanced pension contribution available Genuine opportunity to …
the space of data and AI technologies and business scenarios. Strong understanding of cutting-edge and legacy Big Data and AI technologies such as Hadoop, Spark, OpenAI and Claude, as well as architectures and domains such as Computer Vision, NLP, Neural Networks, Machine Learning, Generative AI, Data Warehouse and …
Months Location: Milton Keynes Key responsibilities: Demonstrable experience as a Kafka Developer (ideally Kafka Streams). Hands-on experience in Big Data technologies (Hadoop, Hue, Hive, Impala, Spark) and stronger within Kafka. Knowledge and experience using key-value databases. Experience developing microservices using Spring. Design and …
London, England, United Kingdom Hybrid / WFH Options
Global Relay
a Global Relay DevOps engineer you will be integrated with a software engineering team to develop on-premise ('on-prem') solutions, including working with Hadoop-based technologies. Your role will involve designing, implementing and supporting automated, scalable solutions. Your contribution will have an immediate impact by enabling efficient delivery … them from reoccurring. Deployments: writing and running deployment automation using Helm, Ansible, or other configuration management systems Platform Integration: with technologies such as Hadoop and Kubernetes Some of the technologies that you will interact with include: Containerisation and virtualisation: Docker, Kubernetes, VMware Operating systems: Linux Build and deployment … Jenkins, Bitbucket, Maven, Helm Instrumentation and monitoring: Loki, Prometheus, Grafana, Mimir Languages and frameworks: Bash, Java, Groovy, Go, Python Big data technologies: Cassandra, ArangoDB, Hadoop, Kafka, MongoDB, Ceph Where you have knowledge gaps, training and mentoring will be provided. About You: You have an automation-first mindset. You enjoy …
Dayrate: £450 Job Purpose and Primary Objectives We are seeking a highly experienced Kafka Developer with expertise in Kafka Streams and Big Data technologies (Hadoop, Hue, Hive, Impala, Spark). The ideal candidate will have strong knowledge of key-value databases and experience in developing microservices using Spring. The … time data services for various 24/7 applications with high-performance requirements. Key Skills/Knowledge Solid experience in Big Data technologies, specifically Hadoop, Hive, Java and Spark/Scala. Advanced SQL knowledge for testing changes and replicating code functionality. Proficient with code repositories like Git and …
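The posting above asks for advanced SQL used to test changes by replicating code functionality. A minimal sketch of that workflow, using an in-memory SQLite table as a stand-in for the Hive/Impala tables the role describes (the dataset, table schema, and function names are hypothetical):

```python
import sqlite3

# Toy event data; in the posting's context this would live in Hive/Impala tables.
events = [("alice", 3), ("bob", 5), ("alice", 2), ("bob", 1)]

def totals_in_code(rows):
    """Code-side aggregation (stand-in for a Spark/Scala transformation)."""
    out = {}
    for username, amount in rows:
        out[username] = out.get(username, 0) + amount
    return out

def totals_in_sql(rows):
    """The same logic replicated in SQL to cross-check the transformation."""
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE events (username TEXT, amount INTEGER)")
    con.executemany("INSERT INTO events VALUES (?, ?)", rows)
    result = dict(con.execute(
        "SELECT username, SUM(amount) FROM events GROUP BY username"))
    con.close()
    return result

# Both paths give {'alice': 5, 'bob': 6}; a mismatch would flag a regression.
assert totals_in_code(events) == totals_in_sql(events)
```

Writing the check both ways is the point: when a pipeline change lands, the SQL replica gives an independent answer to diff against.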
integrity can be maintained as part of business improvement plans which affect the organisation Review, manage and lead the development of data frameworks (e.g. Hadoop) and analysis of data to ensure accuracy of sources and data resilience Communications and Engagement Identify and understand the business needs, prioritise, and design … recognising entities in free text Experience creating and developing SQL Server queries and/or stored procedures Experience using and developing data frameworks (e.g. Hadoop) Experience developing and scripting dashboards and data visualisations using tools such as QlikView, Qlik Sense and Tableau Experience interpreting and analysing complex data sets from …
Milton Keynes, Buckinghamshire, South East, United Kingdom
Maclean Moore Ltd
Months Location: Milton Keynes Key responsibilities: Demonstrable experience as a Kafka Developer (ideally Kafka Streams). Hands-on experience in Big Data technologies (Hadoop, Hue, Hive, Impala, Spark) and stronger within Kafka. Knowledge and experience using key-value databases. Experience developing microservices using Spring. Design and … data services for different applications, usually 24/7 applications with high performance requirements. Key skills/knowledge/experience: Big Data: solid Hadoop, Hive and Spark/Scala experience. Advanced SQL knowledge - able to test changes and issues properly, replicating the code functionality into SQL …
As an IT Specialist, you'll need broad expertise across various areas of the technology/software domain. Proficiency in AWS or Big Data, Hadoop or other SQL databases, Lucene, Spark, web app development (JavaScript, Node.js), Docker, Jenkins, Git, Python, or Ruby would be highly beneficial. Key Responsibilities: Meet …
london, south east england, United Kingdom Hybrid / WFH Options
Solirius Consulting
or Django, Docker Experience working with ETL pipelines is desirable, e.g. Luigi, Airflow or Argo Experience with big data technologies such as Apache Spark, Hadoop, Kafka, etc. Data acquisition and development of data sets and improving data quality Preparing data for predictive and prescriptive modelling Hands-on coding experience …
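Several of these postings ask for ETL pipeline experience (Luigi, Airflow, Argo). A minimal, library-free sketch of the extract-transform-load pattern those tools orchestrate; the function names and sample records are hypothetical, and in practice each stage would run as a scheduled task with declared dependencies:

```python
import json

def extract():
    # Stand-in for pulling raw records from an API, file drop, or web scrape.
    return ['{"city": "London", "temp_c": 14}',
            '{"city": "Leeds", "temp_c": null}']

def transform(raw_records):
    # Parse and drop incomplete rows - the data-quality step the postings mention.
    rows = [json.loads(r) for r in raw_records]
    return [r for r in rows if r["temp_c"] is not None]

def load(rows, target):
    # Stand-in for writing to a warehouse table; here, an in-memory list.
    target.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract()), warehouse)
# warehouse now holds the one complete record; loaded == 1
```

Orchestrators add what this sketch lacks: scheduling, retries, and backfills around exactly this stage structure.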
London, England, United Kingdom Hybrid / WFH Options
Morgan McKinley
learn). Understanding of database technologies (ETL) and SQL proficiency for data manipulation, data mining and querying. Knowledge of Big Data tools (Spark or Hadoop a plus). Power BI, dashboard design/development. Regulatory Awareness/Compliance: uphold regulatory/compliance requirements relevant to your role, escalating areas …